Point of View

Before you “git some AI,” be practical about the problems you’ll solve with it and how

August 13, 2021

The value of any emerging technology is in how we apply it to solve real-world problems. The journey from emerging to accepted, fit for purpose, and broadly utilized is critical to realizing value. Artificial intelligence (AI) and its myriad permutations hold this potential but have stumbled on the path from a science project to business value.

In a discussion at the HFS OneOffice™ Digital Symposium, Elena Christopher, SVP HFS Research, explored ways to move beyond buzzy AI to practical AI with a select panel of experts:

  • Martin Caupin, Lead Data Scientist and head of the US AI Lab, BNP Paribas CIB
  • Elena Christopher, SVP, HFS Research (Moderator)
  • Rupinder Dhillon, Chief Data Officer and SVP of Data and AI, Hudson’s Bay Company
  • Mike Hobday, interim CEO, AntWorks
  • Tom Reuner, SVP and IT Services leader, HFS Research
Frame a business problem as a prediction problem to craft an AI use case

Solving business problems starts with understanding the problems and then devising solutions. With emerging technologies like AI, the temptation is often to start with the cool tech and figure out where it goes. AI-driven solutions may make sense initially, but ultimately the problem needs to dictate the solution. Elena likened this to Holly Hunter in the film Raising Arizona insisting, “I’m gonna git me a toddler!” Too many enterprises have gone down the path of “I’m gonna git me some AI!” without adequate thought about what to apply the cool tech to or what problems it could solve.

It is essential to target problems that business teams are actively attempting to solve; doing so fosters participation and collaboration and ensures the team focuses on the right problems for AI to address. Teams can clarify problem statements by conducting business interviews to cement a consistent understanding, as BNP did when setting up its AI Lab. Hudson’s Bay trains its users to identify “prediction problems” that AI can potentially help solve. In addition, a network effect is possible in global organizations, where large business problems straddling geographies can benefit from economies of scale, both in the impact of the problem and in the prioritization of the solution.

The right operational organization is key to enhancing collaboration and driving value by aligning the right technology experts. Hudson’s Bay combined its RPA and AI teams to drive intelligent automation. The collaboration helps drive specificity in selecting the right AI components, such as machine learning (ML) or natural language processing (NLP), to address different parts of a business problem, a level of detail Martin Caupin (BNP Paribas CIB) indicated BNP pursues.

Overcoming trust deficit, operational risks, unclear outcomes, and enterprise culture is critical to advancing AI value creation

HFS Research has repeatedly shown that the biggest barrier to success with emerging tech is some variation of an unwillingness to change. The sentiment could be driven by a trust deficit, a lack of understanding, or simple inertia. One strategy to overcome it is to partner with a business evangelist with positive AI-use-case experience who can drive the education and encouragement needed to expand AI participation and adoption.

Another key operational risk is the talent to support both implementation and steady-state operations. The competition for emerging-tech talent is ramping up, indicated Martin, and a multi-dimensional strategy is needed to address it, including reskilling internal staff, reengineering HR processes to support progressive career pathing, and creating a more flexible, innovative, attractive culture. In addition to talent, Mike Hobday (AntWorks) alluded to the suboptimal use of data: wasteful hoarding, not sharing useful data across the enterprise, or simply failing to apply it to make smart decisions all have a detrimental effect on business operations.

The lack of clear success metrics can stall progress. While enterprises may identify the business problems, Tom Reuner (HFS) suggested that the commensurate outcomes often go undefined, which is akin to having a roadmap without a destination. Business sponsors must invest as much time in defining outcomes as they do in framing the problems. It is equally critical to establish how outcomes will be measured so that measurement can be incorporated into the overall AI solution.

Successful AI implementations are business-driven. However, enterprise users often consider AI a technology-driven solution, with algorithms delivered by their IT teams. Nothing could be further from reality. Rupinder Dhillon (Hudson’s Bay Company) suggested algorithms may deliver 10% to 25% of the value, but it is the business reengineering that brings it home. Martin indicated it is important to frame AI not as an enterprise application but as a capability embedded into enterprise processes that the broader organization can use. Rupinder highlighted that AI is a business transformation driven by enterprise culture, not a technology initiative.

Avoiding pilot purgatory is key to what good AI looks like

AI, as an emerging technology, starts its enterprise journey as a pilot or a proof of concept (POC) to solve a business-driven use case. To taste success, the use case must emerge from the POC with verifiable and positive results. Rupinder’s recipe for moving the use case to reality includes three elements: key performance indicators (KPIs), the combination of the right skills and data sets, and a roadmap to productionalize and scale the use case.

The definition of project KPIs is key to understanding what “good” looks like. KPIs invariably must have a set of quantitative and qualitative metrics that not only show the go-forward performance of the AI use case but can be compared to the legacy process. This allows for a clear understanding of the true impact of the AI use case.

To deliver successfully, teams must recruit AI technology skills (ML, RPA, NLP, etc.) with the right level of experience and expertise. And because data is a key driver of AI, the use case must benefit from the selection of the right data set. This combination of skills and data is paramount to the actual execution and efficacy of a use case.

Lastly, there must be a roadmap to convert the use case into a self-sustaining product at a scale defined by its potential installed base. A use case that cannot be productionalized and scaled will never graduate from POC. It is essential that such a roadmap is crafted as part of the POC and tested frequently.

The Bottom Line: AI is not magic in a box; rather it is the application of ML, NLP, smart analytics, automation, and more to enable a business-driven strategy to accelerate and maximize value generation through efficient and effective use of data.

Business participation in use case identification, development, and implementation; an organizational construct that brings the right skills and right data together; and an enterprise culture that is agile and flexible to support a new business modality (process and technology) are essential to enabling AI practically in any enterprise. So, let’s go git us some AI!

Watch the Being Practical about AI session

You can read other POVs and a comprehensive ebook about the Symposium, plus watch video highlights of the two-day event, here.
