Contact centre agents have no easy task dealing with escalating levels of financial vulnerability, a job made even more difficult by the emotionally charged nature of debt-related conversations. However, the emergence of AI technology is changing how contact centres help customers, supporting agents in identifying vulnerability and providing personalised assistance.
Financial vulnerability can show up in various ways, such as changes in spending patterns, missed payments, excessive debt, or sudden life events that shake up a customer's financial stability. It is important for both the customer and the financial institution to recognise these signs early.
Identifying a Financially Vulnerable Customer
But how can an agent tell that a customer is vulnerable while engaging with them in a live conversation? A human agent speaking on the phone or in person can pick up on a customer's fragility through intuition and cues such as tone of voice. With digital text messaging, it is trickier. You can't see the person to read their body language, and you can't hear them to judge their tone. This leaves the words themselves.
A sentence can mean one thing on the surface while another meaning lies beneath. Sarcasm and exaggeration are hard to read in plain text, not to mention the outright lies that people sometimes tell.
So how do you spot vulnerability in messages?
Using AI to Detect Financial Vulnerability
The secret lies in how you train your large language model (LLM). A general-purpose LLM will not give you the detail and control you need in the specific industry of credit and collections. You cannot let an uncontrolled LLM, such as ChatGPT, loose on your customers.
What you need to do is train a Custom Language Model (CLM) focused on the language of credit and collections. This training is done on real-life conversations that have taken place in contact centres, with personal details stripped out for security and privacy reasons.
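As a rough illustration of the kind of pre-processing involved, the Python sketch below shows how obvious personal details might be stripped from historical transcripts before they are used for training. The patterns and placeholder tags are invented for the example; a production system would use a far more robust anonymisation pipeline.

```python
import re

# Illustrative only: a handful of regex patterns standing in for a proper
# PII-redaction pipeline run before transcripts are used for CLM training.
REDACTION_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"(?:\+44\s?|\b0)\d{4}\s?\d{6}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "SORT_CODE": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
}

def anonymise(transcript: str) -> str:
    """Replace obvious personal identifiers with placeholder tags."""
    for label, pattern in REDACTION_PATTERNS.items():
        transcript = pattern.sub(f"<{label}>", transcript)
    return transcript

if __name__ == "__main__":
    raw = "Hi, it's John, call me on 07700 900123 or email john@example.com"
    print(anonymise(raw))
    # -> "Hi, it's John, call me on <PHONE> or email <EMAIL>"
```

Note that names are left untouched here; real anonymisation would also handle names, addresses and account numbers.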
Monitoring Conversations using NLU and Intent Recognition
The AI system uses the CLM and Natural Language Understanding (NLU) technology to interpret what a person is saying.
With Intent Recognition (understanding what a person is really saying), the system is trained to identify keywords and speech patterns that indicate financial distress. Alerts can be triggered, for example, by statements that contradict historical data, changes in syntax, repeated responses, or frequent mentions of late payments, overdrafts, or job loss.
Sometimes, a customer will simply share their circumstances, such as: “I can’t pay today as I am sick, and my wife has lost her job.” The bots will recognise words like ‘sick’ and ‘lost her job’ and know that they indicate a vulnerability.
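As a simplified sketch of how such cues might be picked up, a rule-based pass over an incoming message could look something like the following. The cue phrases and labels are invented for illustration; they stand in for the trained intent model described above, not replace it.

```python
# Illustrative, rule-based stand-in for a trained vulnerability-intent model.
# Cue phrases and labels are invented for the example.
VULNERABILITY_CUES = {
    "Life Events Vulnerability": ["lost her job", "lost my job", "sick", "bereavement", "divorce"],
    "Financial Distress": ["can't pay", "cannot pay", "overdraft", "late payment", "missed payment"],
}

def detect_vulnerability(message: str) -> list[str]:
    """Return the vulnerability labels whose cue phrases appear in the message."""
    text = message.lower()
    return [label for label, cues in VULNERABILITY_CUES.items()
            if any(cue in text for cue in cues)]

print(detect_vulnerability("I can't pay today as I am sick, and my wife has lost her job."))
# -> ['Life Events Vulnerability', 'Financial Distress']
```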
Spending Pattern Analysis
The AI uses the customer's account data to better understand their situation. The system can track spending patterns across accounts and spot unusual or erratic spending behaviour, such as a sudden spike in credit card transactions, which is automatically flagged for review.
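A minimal sketch of the idea, using a simple baseline-and-deviation rule. The threshold and the daily spend figures are invented; a real system would use far richer models and live account feeds.

```python
from statistics import mean, stdev

def flag_unusual_spend(daily_spend: list[float], threshold: float = 3.0) -> bool:
    """Flag the most recent day if it sits well above the customer's own baseline."""
    history, latest = daily_spend[:-1], daily_spend[-1]
    baseline, spread = mean(history), stdev(history)
    # Treat a spend more than `threshold` standard deviations above baseline as unusual.
    return latest > baseline + threshold * spread

# Hypothetical daily card spend (GBP): steady, then a sudden spike.
spend = [42.0, 38.5, 55.0, 40.0, 47.5, 390.0]
if flag_unusual_spend(spend):
    print("Unusual spending pattern - flag account for review")
```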
AI as an Intelligent Assistant
The same AI technology is used by both chatbots and agents to identify vulnerability. Chatbots are trained to direct the conversation down the best path to help the customer, which may well mean handing them over to an agent.
With an agent, the AI is an intelligent assistant that empowers them with knowledge as they engage with customers.
Real-Time Alerts - Labelling Vulnerability Intents
As mentioned above, the AI used in customer engagement is trained to identify intents (or propensities). This capability is especially powerful in sensitive situations where people are angry, anxious and stressed. The bots are good at spotting these vulnerability cues and labelling them on the agent's screen so the agent can take the best action. Labels such as 'Life Events Vulnerability' are tagged against the customer, helping agents make the best choices when responding.
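In practice, the detected label might be pushed to the agent desktop as a small structured alert. A hedged sketch of what that payload could look like follows; the field names and the delivery function are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class VulnerabilityAlert:
    """A label the AI attaches to a live conversation for the agent's screen."""
    customer_id: str
    label: str                                          # e.g. 'Life Events Vulnerability'
    evidence: list[str] = field(default_factory=list)   # cue phrases that triggered it
    detected_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def push_to_agent_desktop(alert: VulnerabilityAlert) -> None:
    # Hypothetical delivery step; a real platform would publish this to the agent UI.
    print(f"[ALERT] {alert.label} for customer {alert.customer_id}: {alert.evidence}")

push_to_agent_desktop(
    VulnerabilityAlert("cust-1042", "Life Events Vulnerability", ["sick", "lost her job"])
)
```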
Response Guidance and Speed
Conversational AI doesn't stop at detection; it helps agents by suggesting conversation responses or providing links to resources they can use to guide their interactions with vulnerable customers. This speeds up query resolution and lifts some of the pressure off the agent, while keeping responses accurate and in line with brand tone.
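One simple way to picture the guidance step is a lookup from the detected label to suggested wording and resources, which the agent can accept or adapt. The suggestions and links below are placeholders, not real guidance content.

```python
# Placeholder guidance table: detected label -> suggested reply and resources.
RESPONSE_GUIDANCE = {
    "Life Events Vulnerability": {
        "suggested_reply": ("I'm sorry to hear that. Let's look at what support "
                            "options are available for your situation."),
        "resources": ["https://example.com/payment-holiday-options"],
    },
    "Financial Distress": {
        "suggested_reply": ("Thank you for letting me know. We can review an "
                            "affordable repayment plan together."),
        "resources": ["https://example.com/affordability-assessment"],
    },
}

def guidance_for(labels: list[str]) -> list[dict]:
    """Return the guidance entries for whichever labels were detected."""
    return [RESPONSE_GUIDANCE[label] for label in labels if label in RESPONSE_GUIDANCE]

for entry in guidance_for(["Life Events Vulnerability"]):
    print(entry["suggested_reply"])
```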
Personalisation with API Integration
By linking via an API to backend systems, the AI gives agents the data to provide tailored support based on individual circumstances.
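The integration itself is typically a straightforward API call from the conversational platform to the institution's system of record. A sketch, assuming a hypothetical REST endpoint and using the requests library:

```python
import requests

# Hypothetical backend endpoint; in reality this would be the institution's own API.
ACCOUNT_API = "https://api.example-bank.internal/v1/customers/{customer_id}/summary"

def fetch_customer_context(customer_id: str, token: str) -> dict:
    """Pull account context so the agent sees the customer's real situation."""
    response = requests.get(
        ACCOUNT_API.format(customer_id=customer_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()  # e.g. balances, arrears status, existing payment plans
```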
Conversation Highlights and Summaries
The AI can summarise the customer conversation for the agent so they don't have to read the whole conversation before they take over from a chatbot or a colleague. This speeds up the process and provides a better experience for the customer.
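A crude extractive version of the idea might simply surface the customer messages that carry vulnerability cues plus the most recent turns; a real platform would use the CLM itself to generate the summary. The conversation data below is invented for the example.

```python
def summarise_handover(turns: list[dict], cue_words: list[str], recent: int = 3) -> list[str]:
    """Pick out customer turns containing vulnerability cues, plus the latest turns."""
    flagged = [t["text"] for t in turns
               if t["speaker"] == "customer"
               and any(cue in t["text"].lower() for cue in cue_words)]
    latest = [t["text"] for t in turns[-recent:]]
    # Preserve order and drop duplicates between the two lists.
    return list(dict.fromkeys(flagged + latest))

turns = [
    {"speaker": "bot", "text": "Hi, how can I help today?"},
    {"speaker": "customer", "text": "I can't pay today as I am sick."},
    {"speaker": "bot", "text": "I'm sorry to hear that."},
    {"speaker": "customer", "text": "My wife has also lost her job."},
]
print(summarise_handover(turns, ["sick", "lost her job", "can't pay"]))
```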
AI can Help Relieve the Emotional Toll on Agents
Contact centre agents in collections and financial services are under pressure, dealing day in and day out with anxious and angry people. This takes its toll, and it's no surprise that the sector has a high turnover rate.
By helping agents detect and appropriately respond to vulnerable customers, AI systems lift the heavy burden off the agents and lower their anxiety levels. It’s like having an expert standing alongside them and guiding them through.
When agents are happier and feel more empowered, morale improves, attrition goes down, and retention goes up.
AI Chatbots Trained to Answer Empathetically
Although we are focussing on how AI helps agents in this article, it is worth noting that AI chatbots can be trained to help financially vulnerable customers with empathy and efficiency. Many people prefer speaking via a digital messaging channel as it eliminates the embarrassment factor of being in debt, among other reasons.
Compliance, Regulations and Consumer Duty
AI can help enterprises remain compliant with regulations, especially those around how you treat customers. One example is the Financial Conduct Authority's (FCA) Consumer Duty in the UK (What is the FCA Consumer Duty?).
Under 'The Four Outcomes' of Consumer Duty, it states that consumer support is necessary throughout the customer journey and that businesses should make sure the product or service meets customers' needs and that they understand it, especially vulnerable customers. How can you design a conversational AI platform that stays within the lines of Consumer Duty? You have to build compliance into the fabric of the AI bot's design and operation, for example by ensuring that the chatbots' interactions with customers are fair, clear, and not misleading. Bots can be programmed to simplify complex financial language and explain industry-specific terms so that customers understand.
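One concrete way to picture that is a plain-language glossary pass over outgoing bot replies, so any industry term is followed by a short explanation. The glossary entries below are examples, not a complete or authoritative list.

```python
# Example plain-language glossary applied to outgoing bot messages.
GLOSSARY = {
    "arrears": "money you owe that should already have been paid",
    "default notice": "a formal warning that you have broken the credit agreement",
    "forbearance": "temporary changes to your payments to give you breathing space",
}

def add_plain_language(reply: str) -> str:
    """Append a short explanation for each glossary term used in the reply."""
    lowered = reply.lower()
    for term, explanation in GLOSSARY.items():
        if term in lowered:
            reply += f"\n({term.capitalize()}: {explanation}.)"
    return reply

print(add_plain_language("Your account is in arrears, so we can discuss forbearance options."))
```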
Having these regulations encoded into the AI chatbots and the intelligent assistants helps the agents themselves remain compliant.
Evidencing Compliance
Collections companies need a 'paper trail' - concrete evidence - that they are attending to the needs of vulnerable customers. A conversational AI messaging platform records every action taken by the agents and bots that operate on it, and every customer conversation transcript is available for inspection.
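At its simplest, that evidence is an append-only log of every bot and agent action alongside the transcript, timestamped so it can be produced on request. The sketch below shows the shape of such a record; the file name, field names and action labels are hypothetical, and a real platform would write to durable, tamper-evident storage.

```python
import json
from datetime import datetime, timezone

AUDIT_LOG = "audit_trail.jsonl"  # append-only log, one JSON record per line

def record_event(conversation_id: str, actor: str, action: str, detail: str) -> None:
    """Append a timestamped record of an agent or bot action to the audit trail."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "conversation_id": conversation_id,
        "actor": actor,        # 'bot', 'agent', or a specific agent ID
        "action": action,      # e.g. 'vulnerability_label_applied'
        "detail": detail,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as log:
        log.write(json.dumps(event) + "\n")

record_event("conv-8831", "bot", "vulnerability_label_applied", "Life Events Vulnerability")
```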
Conclusion
AI is changing the way financial institutions detect and support financially vulnerable customers. Its ability to analyse data, detect early signs of distress, and provide personalised assistance helps agents to deliver better care.
While conversational AI is a powerful tool, remember that empathy, compassion, and the human touch are irreplaceable in helping vulnerable customers. AI should complement, not replace, the human aspect of customer service. Agents should always be ready to listen, understand, and provide emotional support in conjunction with the insights offered by AI.