
Fact vs Fiction: Finance Use of AI

March 12, 2020

Contributor: Rob van der Meulen

To capture the transformative opportunities of artificial intelligence (AI) for finance and analytics, separate the real barriers to AI adoption from the excuses.

Finance leaders overwhelmed by AI hype or information overload can’t capture the benefits that AI offers for transforming key finance activities, from forecasting and planning to decision support. 

“We hear a range of hype and scare stories about AI, but three stand out,” says Clement Christensen, Senior Principal, Advisory, Gartner. “One, that AI or machine learning (ML) can solve any problem. Two, that implementing AI requires a data scientist and huge funding. Three, that the days of traditional forecasting and human intervention are over. None of these claims is true, yet they can still hinder the progress of AI adoption.”


Numerous AI pilots already promise huge productivity gains, and AI tools, platforms and cloud services are fast becoming better, less expensive and more accessible. Don’t let misconceptions cloud your decisions about your use of AI.

Read more: Financial Forecasters Should Beware 3 Machine Learning Myths

Perceived barriers to adoption

Three-quarters of financial planning and analysis (FP&A) leaders cite four main barriers to adoption: poor data, insufficient use cases, inadequate talent and skills, and lack of business buy-in. “Often these are problems of perception or approach rather than actual obstacles,” says Christensen.

Data quality

Avoid the potential trap of trying to create a “single source of truth.” It’s a nearly impossible task that often makes data less useful for decision making. Instead, aim for “sufficient sources of truth” — that is, enough to make a timely business decision.

Use cases

Start with business problems, not use cases. Too often, the first question finance leaders ask about AI is “What are other finance teams doing?” This approach often yields redundant technology in search of a problem. Instead, first identify a (potentially obvious) business problem — ideally with input from the business — and only then ask how AI can help.

“Your data scientists, engineers and analysts will be hyperfocused on developing new capabilities. Don’t underestimate the likelihood that your AI pilots will become solutions in search of a problem. This has been a common problem for early adopters,” says Christensen.

Data scientists

Finance teams face a paradox: They feel unprepared to embark on AI projects without a data scientist but often feel unable to make the business case for hiring a data scientist without proven AI use cases.

In reality, many on the finance team have enough experience with tools such as R and Python to make headway on realistic use cases. At one company, for example, two employees, neither of them a data scientist, working part time developed an ML pilot in just six months that cut accounts receivable settlement time by 40%.


The business doesn’t care about correlation coefficients and p-values. Communicate and emphasize the business context: Why does AI augmentation solve a business problem? How does it affect the output? How accurate is the output?

Actual barriers to adoption

Not all AI obstacles are perceived. Here are some of the most common real issues:

Use of poor-quality data

To ensure quality, standardize business-critical data that is common and shared across the organization. At most companies, this will be data that you report externally and data that supports your core, strategic KPIs. This approach helps surface missing, incomplete and duplicative data. 
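Surfacing missing, incomplete and duplicative records can start as a very small script. The following standard-library Python sketch uses invented field names and rows purely to illustrate the idea:

```python
# Hypothetical sketch: flagging incomplete and duplicate rows in a
# standardized dataset. Field names and rows are invented examples.
from collections import Counter

REQUIRED = ("entity_id", "period", "revenue")

rows = [
    {"entity_id": "E1", "period": "2020-Q1", "revenue": 100},
    {"entity_id": "E1", "period": "2020-Q1", "revenue": 100},  # duplicate
    {"entity_id": "E2", "period": "2020-Q1"},                  # missing field
    {"entity_id": "E3", "period": "2020-Q1", "revenue": 250},
]

# Rows missing any business-critical field
incomplete = [r for r in rows if any(k not in r for k in REQUIRED)]

# Keys that appear more than once are duplicates to reconcile
keys = Counter((r["entity_id"], r.get("period")) for r in rows)
duplicates = [k for k, n in keys.items() if n > 1]
```

Checks like these can run on the standardized, business-critical slice of data first, which is where the payoff per record is highest.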

Then create an organized data environment to handle that data effectively, while deploying other data storage solutions to manage nonstandard data. A simple rule to remember: 80% of your analytics will come from 20% of your data. But the remaining 20% of your analytics, the exploratory, AI-driven kind, will come from the other 80% of your data.

Human biases

Data scientists can introduce cognitive bias into the AI models they create, based on previously used projections or even anecdotal personal experience. If you don’t monitor for and remove such biases, outputs may be skewed, exposing the organization to financial and regulatory risks. Diversity in teams and datasets helps to identify biases, validate outputs and train a model objectively, as does a framework for testing models for bias.
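A bias-testing framework can begin with a simple disparity check: compare the model’s outcomes across groups and flag gaps that exceed a tolerance. The groups, scores and thresholds below are invented assumptions, used only to illustrate the pattern:

```python
# Hypothetical sketch of a basic bias check: compare a model's
# approval rate across two groups. All values here are invented.
records = [
    # (group, model score)
    ("region_a", 0.82), ("region_a", 0.67), ("region_a", 0.71), ("region_a", 0.40),
    ("region_b", 0.58), ("region_b", 0.35), ("region_b", 0.61), ("region_b", 0.30),
]

APPROVAL_THRESHOLD = 0.6  # score above which the model "approves"
MAX_RATE_GAP = 0.25       # tolerated gap before escalating for review

def approval_rate(group):
    """Share of a group's cases that the model approves."""
    scores = [s for g, s in records if g == group]
    return sum(s > APPROVAL_THRESHOLD for s in scores) / len(scores)

gap = abs(approval_rate("region_a") - approval_rate("region_b"))
needs_review = gap > MAX_RATE_GAP  # route the model to a human reviewer
```

A check this simple will not prove a model fair, but routinely running it, and escalating flagged models for human review, turns bias monitoring from an intention into a process.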

Fear of job losses

One study claims that 43% of workers cite the fear of losing their jobs to AI as their top cause of workplace stress. Do not underestimate the potency of this fear. It leads to adoption resistance, fewer opportunities to deploy technologies and, ultimately, lower business performance.

Identify ways to ease employee fears in the near term while building understanding and the digital dexterity to complement AI with human judgment. Create new roles, such as an FP&A analyst or forecasting analyst responsible for handling the anomalies and exceptions that AI surfaces, to give finance team members opportunities to learn and improve their skills.

Read more: How AI Will Transform Financial Management Applications
