Conversational AI goes mainstream at Capital One bank
Capital One is one of the largest banks in the world, with revenue over $27 billion. They are also behind the ubiquitous “What’s in your wallet?” television commercials.
To take a deep dive, I invited Capital One’s head of conversational AI, Ken Dodelin, as my guest on the CXOTalk series of conversations with business and technology innovators.
Dodelin describes how the company has embraced AI in customer service. He says, intelligent assistants “interact with customers in a conversation. Customers can speak or text in natural language and interface with Capital One through an AI. Sometimes it’s our own, which we call Eno. Sometimes it’s through a parent AI, like an Alexa, Cortana, or something like that.”
Conversational AI enables great flexibility for users at the cost of significant back-end complexity. Traditional point-and-click user interfaces – whether on desktop or mobile – present users with a fixed set of choices, represented by buttons and clickable areas. In contrast, conversational, or natural language, user interfaces are open-ended. Users can ask anything, without the constraints imposed by specific pathways defined on the screen.
Although this flexibility is highly beneficial to users, it presents a significant challenge to developers. For the system to work as users expect, the NLP (natural language processing) engine must correctly understand nuanced spoken or written commands from users.
To present open-ended choices realistically, conversational AI must first interpret language correctly. Then, the system must take action that makes sense to the user, based on his or her requests. Presenting users with helpful, non-trivial answers is a distinct challenge from accurately interpreting the nuance of language input.
User adoption rests on the system both understanding the user and delivering useful responses to questions. The ultimate goal of conversational AI is mimicking human customer service agents so well that users cannot tell they are interacting with a computer system. (To learn more about this, check out the Turing test.)
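The two-step pattern described above – interpret the utterance, then act on it – can be sketched in a few lines. This is a purely hypothetical illustration using keyword rules; production systems like the ones discussed here rely on trained NLP models, and the intent names and responses below are invented for the example.

```python
def classify_intent(utterance: str) -> str:
    """Step 1: map a free-form utterance to a coarse intent label.
    (Keyword matching stands in for a real NLP intent model.)"""
    text = utterance.lower()
    if "balance" in text:
        return "check_balance"
    if "pay" in text or "👍" in text:  # even an emoji can signal intent
        return "pay_bill"
    return "unknown"

def handle(intent: str) -> str:
    """Step 2: turn the recognized intent into a useful response."""
    responses = {
        "check_balance": "Here is your current balance.",
        "pay_bill": "Okay, I've scheduled your payment.",
        "unknown": "Sorry, I didn't understand. Could you rephrase?",
    }
    return responses[intent]

print(handle(classify_intent("What's my balance?")))
print(handle(classify_intent("👍")))
```

The hard part in practice is step 1: replacing the keyword rules with a model that handles the nuance and ambiguity of natural language.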
To gain insight into how Capital One is deploying conversational AI and intelligent agents, watch the entire video above and read the edited summary comments that follow.
Why is conversational AI so important to Capital One?
Ken Dodelin: Capital One is a very tech-forward company. We’re the first bank to go to the cloud. We’re the first, or one of the first, banks to have an API-based infrastructure that powers all of our digital experiences. We’re always looking for ways to use technology to improve the customer experience.
It used to be that you could either come to our graphical user interface, which would be an app or a website, where we, in some ways, try to guess what it is you’re looking for. You have a finite number of buttons and links to click on or tap on. In the conversational interface, it’s whatever is on your mind. You can text in natural language, and then we’re able to start with that rather than starting with all the guesswork. It’s a great complement to GUI experiences, and it’s also a great way to interact with customers without having to pull them into our uber website or app.
What are the business reasons to deploy conversational AI?
Ken Dodelin: Well, there are fantastic business and customer benefits to conversational AI. There are three things.
The first one is the recent emergence of natural language processing technology. Not too many years ago, we just couldn't understand what the intent of customers' spoken or typed utterances was in the way that we can today. That advancement has enabled these experiences to reach a threshold where they're useful.
The second thing is the availability of data. Our ability to get answers and connect into our API infrastructure, the same one that powers our websites and apps, is at a place now where we can use real-time context to adapt the conversation, so it is more conversational and not just a predetermined conversation that wouldn’t feel very human.
Then the third thing, in addition to the NLP and the data, is the proliferation of Internet of Things devices. In the not-too-distant past, you interacted with Capital One pretty much through a website or an app, or else you called the call center. Now we have things like Alexa that provide voice-enabled touchpoints with the customer within their car, in their living room, or wherever they put those connected devices.
Also, since we can interact with customers simply through natural language, we don't have to bring a GUI with us wherever we go. We can go to places like text messaging, messaging apps, and other similar channels where the interaction is all just natural language. Even emojis. Who would have thought you'd be paying your bill with a thumbs-up emoji, but that's what we've enabled, and customers have gravitated there.
Where is the technology headed?
Ken Dodelin: Natural language processing is very good at deriving a customer's intent, where you can then go and give them a response. There are a lot of advances starting to take place in dialog management and natural language generation. These types of things are still in the formative stage, but they allow conversations to move beyond a simple question-and-answer exchange toward a conversation that can go in many directions, which is closer to how human conversations go.
Disclosure: This video is part of a series in which IPsoft hosted eleven CXOTalk Forums, as a paid engagement, to conduct interviews with senior executives on topics related to cognitive computing and digital labor.