DIS 2019: Lendtech OpenWrks says it can predict divorce and relationship breakdown
Open banking and lendtech enabler OpenWrks says it can detect vulnerable customers before the event that makes them vulnerable even happens, using algorithms that predict life events such as divorce or relationship breakdown.
Through the data provided by open banking, OpenWrks says it can alert lenders to these pending personal life events by looking at account-opening patterns and how people move money around.
It says it can also tell when a gambling habit is turning into a health issue. This predictive insight offers a way to find customers who don’t actively declare their vulnerabilities.
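OpenWrks has not set out how its algorithms work, but the general idea of screening open banking transactions for warning signs can be illustrated with a toy calculation. The sketch below is purely hypothetical: the merchant categories, the 25% threshold and the flag_gambling_risk helper are assumptions for illustration, not OpenWrks’ method.

```python
from collections import defaultdict
from datetime import date

# Hypothetical transactions: (date, amount, merchant_category).
# Categories are illustrative, not a real open banking taxonomy.
transactions = [
    (date(2019, 4, 2), -1200.00, "rent"),
    (date(2019, 4, 5), -45.00, "gambling"),
    (date(2019, 4, 12), -180.00, "gambling"),
    (date(2019, 4, 20), -60.00, "groceries"),
    (date(2019, 5, 6), -400.00, "gambling"),
    (date(2019, 5, 15), -70.00, "groceries"),
]

def gambling_share_by_month(txns):
    """Return each month's gambling spend as a share of total outgoings."""
    spend = defaultdict(float)
    gambling = defaultdict(float)
    for when, amount, category in txns:
        if amount < 0:  # outgoing payments only
            month = (when.year, when.month)
            spend[month] += -amount
            if category == "gambling":
                gambling[month] += -amount
    return {m: gambling[m] / spend[m] for m in spend}

def flag_gambling_risk(txns, threshold=0.25):
    """Flag months where gambling exceeds an (assumed) share of total spend."""
    shares = gambling_share_by_month(txns)
    return {m: share for m, share in shares.items() if share >= threshold}

print(flag_gambling_risk(transactions))
# {(2019, 5): 0.85...} -- May's gambling spend dominates that month's outgoings
```

A real system would presumably combine many such signals over longer periods; the point here is only that simple ratios over categorised transaction data can surface customers who never declare a vulnerability themselves.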
But OpenWrks’ CEO, Oliver Betts, raises the moral questions tied to these sorts of data insights, asking an audience at UK Finance’s Digital Innovation Summit 2019 whether they think the data should be used at all.
“I think being able to detect people on the margins or those at risk of falling further into debt is a great thing, because we can try to get them to pay it back more effectively and sooner,” says Betts.
But Betts uses the example of gambling, which is a legal activity in the UK, and wonders whether we should therefore treat it as a vulnerability, even if it does put someone’s finances at risk. The same goes for a divorce or relationship breakdown: these life events are personal and perfectly legal, even if they will inevitably affect a person’s financial situation.
Fellow panellist Richard Johnson, Computershare Loan Services’ head of strategy & proposition, says his company is not brave enough to fully digitise the process of detecting vulnerable customers. Instead, it will digitise a preliminary stage, which then leads to a shorter, more informed call.
Read more: AI: Understanding bias and opportunities in financial services
As well as detecting vulnerable customers, the panel also discussed innate biases when FinTech Futures put the question to them. Betts accepted that there is sound research showing biases in algorithms against people of colour (POC). He refers to the insurance industry, where the Financial Conduct Authority (FCA) launched an investigation into insurers’ pricing structures and how the algorithms behind them indirectly discriminated against POC.
Betts says all he and his company can do is “provide more information which shows a more granular reality” of someone’s financial circumstances. “We can’t stop a lender from falling into these biases,” says Betts, adding that the best OpenWrks can do is supply a better-informed data set to counteract them.
For Betts, innate biases “are a potential consequence of optimisation in pricing”, which certainly seems to have been the case in the insurance market.
Talking more broadly about the data that lenders who don’t use open banking currently rely on to issue loans, Betts says he “can’t believe we only lend to people on the basis of three months of transaction data”. In his eyes, institutions should use April as a starting point to get a more accurate reading, and look at as much as 24 months of financial transactions.
“We can miss quite a lot,” says Betts, pointing out the annual and quarterly patterns that can’t be detected from a three-month bank statement request.
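The point about window length is easy to see with a toy calculation. The figures and the average_monthly_income helper below are invented for illustration: a borrower with an annual bonus looks very different depending on whether a three-month window happens to contain the bonus month, whereas a 24-month view averages the pattern in.

```python
# Hypothetical income stream: 2,000 base salary each month plus a
# 6,000 annual bonus in months 12 and 24. All figures are invented.
monthly_income = [2000 + (6000 if month % 12 == 0 else 0)
                  for month in range(1, 25)]

def average_monthly_income(incomes, window, offset=0):
    """Average income over a `window`-month slice ending `offset` months ago."""
    end = len(incomes) - offset
    recent = incomes[end - window:end]
    return sum(recent) / len(recent)

print(average_monthly_income(monthly_income, 3))            # 4000.0 (window catches the bonus)
print(average_monthly_income(monthly_income, 3, offset=1))  # 2000.0 (window just misses it)
print(average_monthly_income(monthly_income, 24))           # 2500.0 (annual pattern smoothed in)
```

The same sensitivity applies to quarterly outgoings such as tax or insurance bills, which is the kind of pattern a three-month statement request can miss entirely.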
Read on: True diversity or tokenism