Banking is an exciting place to be right now, especially if you’re a digital bank. The industry that was built around the physical distribution of money has moved into the digital age and—as traditional banks come to grips with the subsequent shift in the balance of power from bank to consumer—digital challenger banks have a clear advantage. They can be faster, cheaper, and more agile than their larger counterparts.

In this new world, branch-based activities are declining and Conversational AI is playing a leading role in the transformation of consumer-facing banking. When applied properly, Conversational AI creates a natural, seamless interaction between a human and a computer (whether by text or voice). It leverages Natural Language Processing (NLP) to learn continuously from the data it’s exposed to—data being the operative word.

Excellent Conversational AI can only be built on a foundation of excellent digital data. While traditional banks have masses of customer data at their disposal, access to data may be one area where de novo banks will need help.

Data is king in Conversational AI

Conversational AI applications rely on data, both content and context, to engage users. Georgian Partners put it succinctly, explaining that content is the information exchanged during natural dialogue with customers, while context enables conversational applications to anticipate user needs.

Conversational AI can also be likened to a prediction engine. The model makes an educated guess at the expected response based on the data it has seen; the more data, the more accurate the guess. Once the response meets a confidence threshold, it is provided to the user.
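
To make the idea concrete, here is a minimal, hypothetical sketch of that confidence-threshold logic in Python. The intent names, threshold value, and fallback message are assumptions for illustration, not Finn AI’s production system.

# Hypothetical sketch only: pick the top-scoring intent and answer only if
# the model is confident enough; otherwise ask the user to clarify.
from dataclasses import dataclass

@dataclass
class Prediction:
    intent: str
    confidence: float

CONFIDENCE_THRESHOLD = 0.75  # assumed value for illustration

def respond(predictions: list[Prediction]) -> str:
    best = max(predictions, key=lambda p: p.confidence)
    if best.confidence >= CONFIDENCE_THRESHOLD:
        return f"Routing to the handler for intent '{best.intent}'"
    # Below the threshold the safer move is to ask, not guess.
    return "Sorry, I'm not sure what you meant. Could you rephrase that?"

print(respond([Prediction("pay_bill", 0.91), Prediction("check_balance", 0.06)]))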

The key to high-performing models is the data. Data sets vary in both quantity and quality, and each combination produces a different result:

  • Large quantity of good quality data = High-performing AI
  • Small quantity of good quality data = Good performance on a smaller set of well-defined goals, but difficult to generalize and scale
  • Large quantity of poor quality data = An AI that confuses different concepts and returns inaccurate results

The model also needs data to understand what users want to do in a particular context, at a particular point in time. To teach it this, we label subsets of sentences, individual sentences, and complete conversations, mapping how they fit in sequence to one another. How we label depends on the specific problem we want to solve.
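
As a purely illustrative example (the field names and intent labels below are assumptions, not Finn AI’s schema), labeled data for a banking assistant might look something like this:

# Illustrative labeled data only; field names and intent labels are made up.

# Individual sentences labeled with the goal (intent) they express.
labeled_utterances = [
    {"text": "Can I pay a bill?",           "intent": "pay_bill"},
    {"text": "How much is in my checking?", "intent": "check_balance"},
    {"text": "Show me how bill pay works",  "intent": "bill_pay_instructions"},
]

# A complete conversation labeled as a sequence, so the model can learn
# how the turns fit together in context.
labeled_conversation = {
    "conversation_id": "demo-001",
    "turns": [
        {"speaker": "user", "text": "Can I pay a bill?",   "intent": "pay_bill"},
        {"speaker": "bot",  "text": "Sure, which biller would you like to pay?"},
        {"speaker": "user", "text": "My electricity bill", "intent": "select_biller"},
    ],
}

print(len(labeled_utterances), "labeled utterances,",
      len(labeled_conversation["turns"]), "labeled turns")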

Once data is labeled, we need to evaluate how the model performs in the lab, adapt as required, and then evaluate in the real world (and continue to adapt as things change).
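
A minimal lab-evaluation sketch might look like the following, assuming scikit-learn and a tiny, made-up data set; a production pipeline would be far more involved.

# Illustrative only: hold out part of the labeled data, train a simple
# intent classifier, and check accuracy before anything reaches real users.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

texts = ["can i pay a bill", "pay my hydro bill", "what's my balance",
         "how much money do i have", "how do bill payments work",
         "show me how to pay a bill"]
intents = ["pay_bill", "pay_bill", "check_balance",
           "check_balance", "bill_pay_instructions", "bill_pay_instructions"]

X_train, X_test, y_train, y_test = train_test_split(
    texts, intents, test_size=0.33, random_state=42)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(X_train, y_train)
print("Lab accuracy:", model.score(X_test, y_test))  # adapt, retrain, re-evaluate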

Simplified process for data labeling

What happens when data is bad?

When there is any level of ambiguity in an utterance, the AI can get confused if the data isn’t rich enough. For example, if a user asks the question, “Can I pay a bill?”, they could be trying to achieve one of three goals:

  1. Instructions on bill payments: The user may be asking if this is the right place to pay a bill and need instructions on how to do so
  2. Pay a bill right now: The user expects to be taken to the bill payment page
  3. Account balance: The user might be asking if there is enough money in their account to cover a bill payment transaction

If data is not well defined (such as being tagged with incorrect labels), the result will be disappointing to the user. In the scenario above, the best user experience would be to have the AI model return options conversationally, for example:

“Paying a bill is simple. I can help you with that right now. The balance in your checking account is $145. Would you like to use this to pay a bill?”
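
A hypothetical sketch of that behavior: rather than guessing a single intent, the handler composes one response that covers the instructions, the action, and the balance. The balance lookup below is a made-up stand-in for a real core-banking or data-aggregator call.

# Hypothetical sketch: when "Can I pay a bill?" could mean several things,
# compose one response that covers instructions, the action, and the balance.
def get_checking_balance(user_id: str) -> float:
    # Made-up stand-in for a real core-banking or data-aggregator lookup.
    return 145.00

def handle_bill_question(user_id: str) -> str:
    balance = get_checking_balance(user_id)
    return ("Paying a bill is simple. I can help you with that right now. "
            f"The balance in your checking account is ${balance:.2f}. "
            "Would you like to use this to pay a bill?")

print(handle_bill_question("demo-user"))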

Training a Conversational AI model to understand the data for a transaction like this is not straightforward. Building these models in-house is time-consuming, expensive, and close to impossible for de novo banks. The good news is that you don’t need to build it all yourself. You can partner with a data aggregator to help make sense of in-house data, and you can also partner with a Conversational AI solution provider such as Finn AI, which can manage your data and train the model.

Data aggregators

A data aggregation partner can help cleanse, categorize, and classify all the transaction data you have at your disposal. One way banks use data aggregation is to turn cryptic transaction strings on online statements into clearer, more useful descriptions. For example:

Raw data: COSTCxx 04ROCHESTER XXX726 XXX-XXX-1189 XXX027

Aggregated data: Costco

Like Finn AI, some data aggregators are vertical-specific and focus on the financial services industry. They have expertise in taking complex, raw user data and molding it into cohesive, intelligible content for banks to leverage in their Conversational AI models. In the bill-paying scenario above, a data aggregator would have provided the checking account balance that was presented to the end user.
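
As a rough illustration of the kind of cleansing involved, mapping the Costco descriptor from above to a clean merchant name might look like the sketch below. The matching rules and categories are invented for the example, not any aggregator’s actual logic.

# Invented rules for illustration only: map a noisy raw transaction
# descriptor to a clean merchant name and spending category.
import re

MERCHANT_RULES = [
    (re.compile(r"^COSTC", re.IGNORECASE),            ("Costco", "Wholesale")),
    (re.compile(r"^(STARBUCKS|SBUX)", re.IGNORECASE), ("Starbucks", "Coffee shops")),
]

def aggregate(raw_descriptor: str) -> tuple[str, str]:
    for pattern, (merchant, category) in MERCHANT_RULES:
        if pattern.search(raw_descriptor):
            return merchant, category
    return raw_descriptor, "Uncategorized"

print(aggregate("COSTCxx 04ROCHESTER XXX726 XXX-XXX-1189 XXX027"))  # ('Costco', 'Wholesale')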

TD MySpend uses data aggregation to categorize the items their customers spend money on

While data aggregators make transaction data more palatable, there’s more to customer-facing Conversational AI models than good labeling alone. Standardized labeling won’t work for every customer. Some data aggregators group transactions into broad categories, such as ‘coffee shops’, but most don’t go deeper than that; for example, they may not break the data down to the sub-category of ‘merchant’. So although a customer could learn how much they’ve spent on coffee over a specific period, they could not break that spend down by merchant, e.g. ‘Starbucks’.
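
The difference matters in practice. A small, hypothetical sketch (the transactions and category names are made up) shows category-level totals versus the deeper merchant-level breakdown:

# Made-up transactions, for illustration: category-level totals answer
# "how much on coffee?"; the merchant-level breakdown answers "how much
# of that was Starbucks?".
from collections import defaultdict

transactions = [
    {"merchant": "Starbucks",   "category": "coffee shops", "amount": 5.40},
    {"merchant": "Starbucks",   "category": "coffee shops", "amount": 4.85},
    {"merchant": "Blue Bottle", "category": "coffee shops", "amount": 6.10},
]

by_category = defaultdict(float)
by_merchant = defaultdict(float)
for t in transactions:
    by_category[t["category"]] += t["amount"]
    by_merchant[(t["category"], t["merchant"])] += t["amount"]

print(dict(by_category))  # total spend on coffee shops (~16.35)
print(dict(by_merchant))  # per-merchant breakdown (Starbucks ~10.25, Blue Bottle ~6.10)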

That’s where a full-service AI provider can help. Finn AI partners with data aggregators to deliver high-performance solutions to banking customers. Working together, Finn AI can draw on the aggregator’s data and labeling to create customizations for customer deployments.

Vertical-specific Conversational AI partners

Data for Conversational AI is hard to come by and expensive to source. For de novo banks, a full-service AI provider is often the fastest, most efficient route to a high-performing solution. By choosing a partner like Finn AI that specializes in banking, you can save a lot of time structuring your data, labeling intents, and building the taxonomy for your AI model.

This singular focus on banking supports an understanding of the language and nuances of your industry. The products and processes from bank to bank may differ, but the ways people interact with their bank and their money are the same. At Finn AI, we’re able to aggregate these interactions across all of the banks we work with—anonymizing and replicating interactions to create significant efficiencies.

If you’re exploring AI for your digital bank, talk to Finn AI to learn how you can get access to great data and the processes you’ll need to deploy useful customer-facing Conversational AI.

Learn more about the most common pitfalls and best practices when deploying Conversational AI assistants for your bank. Get the free eBook: Building Conversational AI for Digital-Only Challenger Banks.
Kenneth Conroy
Dr. Kenneth Conroy is the Vice-President of Data Science at Finn AI. He leads the development of our proprietary NLP system and leverages machine learning to enable intelligent communication through turn-based conversational flow. When he is not busy leading the team of data scientists, Ken enjoys speaking about the application of AI at events and taking his newborn Boston Terrier on long walks on the beach.