The real-time data conundrum facing banks
Providing the best possible online experience to their customers is key for banks, especially at a time when many fintech firms are offering innovative banking solutions.
But, unlike the digital natives, incumbent banks find themselves stuck with legacy systems and their back office may not be able to support the huge amount of data needed to power the online banking services used by their customers.
That’s why some banks are now looking at alternative solutions to store and exploit real-time data.
“If we want to exploit the data at scale, we need to be able to do that in platforms and solutions in specific domains,” said Mark White, global consulting technology CTO at Deloitte, speaking at the DataStax Accelerate conference in Maryland, US, in May. White has served clients in financial services, among other industries, and told the audience that banks could no longer exploit that data at scale on their own.
Several US bank representatives were in the audience this year, and it’s easy to see why they were all ears. A North America banking operations survey published by Accenture in 2018 showed that improving the customer experience was the top strategic priority for 74 percent of bank operations leaders.
Unfortunately, the authors of the survey added, “in most North American banks, traditional approaches to operations aren’t well aligned with that vision”. The survey stated that centralised back-office operations are too slow, inefficient, and inflexible to meet customer expectations, and the data needed to personalise interactions remains trapped in disparate legacy systems.
The problem is not limited to US banks. In fact, banks around the globe built their mainframes decades ago at a time when customers did not have access to internet banking, or even mobile apps.
But times have changed. Customers now expect real-time access to all their accounts and products, and a seamless journey across channels – whether from an ATM, the bank’s website or a smartphone app. This puts a lot of pressure on banks’ legacy systems, as serving millions of customers this way generates a huge amount of data to store.
For the customer, though, an old IT framework that is not designed to run at the right scale, or that suffers from high latency, may well be reason enough to move to a newer, more agile banking provider.
This is the problem one of the largest Polish banks faced until Braintri, a fintech firm which creates technologically advanced solutions for banks, payment institutions, and other financial organisations, came up with a solution.
The Polish bank built its mainframe system in the 1990s, shortly after the collapse of the Soviet Union. Unlike in many other countries, though, banks in Poland often serve as marketplaces where customers can not only see their latest statement, but also book a parking space or even buy an insurance product. As a result, it is not uncommon for Polish customers to log into their mobile app six or seven times a day.
“On top of it, when people look at their bank statement online, they usually want to see their transaction history over the last couple of years and not only over the last three months,” said Maciej Stępień, CEO and co-founder of Braintri, during a presentation delivered at the conference. “All of this generates a huge amount of data that adds pressure on the old legacy system of the bank.”
To solve this issue, Braintri built a solution that takes the data from the core banking system in real time. The data – which is split between two data centres in that particular case – is then made available to external systems through compatible APIs.
Their solution is built on top of DataStax Enterprise and works around the idea of change data capture (CDC). In practice, this means that anything that changes in the bank’s legacy system gets automatically pushed into a scalable Apache Cassandra cluster – the open source distributed datastore that DataStax Enterprise is built on.
Instead of having the bank back office doing all the work, the DataStax datastore receives all the updated data that will help power the online banking system.
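The CDC flow described above can be sketched in a few lines of Python. This is an illustrative assumption of how such a pipeline might translate a change captured from the mainframe into an upsert against a Cassandra-style datastore – the event shape, table and column names are invented for the example and are not Braintri’s actual schema.

```python
# Hypothetical sketch of a change-data-capture (CDC) step: a row-level
# change from the core banking system is rendered as a parameterised
# CQL-style INSERT. All names here are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class ChangeEvent:
    # One row-level change captured from the mainframe's transaction log.
    table: str
    key: dict     # primary-key columns, e.g. {"account_id": ..., "tx_id": ...}
    values: dict  # changed columns, e.g. {"amount": ..., "currency": ...}


def to_upsert(event: ChangeEvent) -> tuple:
    """Render a CDC event as a parameterised INSERT statement.

    In Cassandra, INSERT is an upsert, so replaying the same event twice
    leaves the datastore in the same state (idempotent by design).
    """
    columns = list(event.key) + list(event.values)
    placeholders = ", ".join("?" for _ in columns)
    cql = (
        f"INSERT INTO {event.table} "
        f"({', '.join(columns)}) VALUES ({placeholders})"
    )
    params = list(event.key.values()) + list(event.values.values())
    return cql, params


# With a real driver, session.execute(cql, params) would apply the change;
# here we only build the statement.
event = ChangeEvent(
    table="transactions_by_account",
    key={"account_id": "PL61-1090", "tx_id": "42"},
    values={"amount": "125.00", "currency": "PLN"},
)
cql, params = to_upsert(event)
```

Because the statement is an upsert keyed on the primary key, a re-delivered change event simply overwrites the same row rather than duplicating it – one reason CDC pipelines pair naturally with Cassandra-style datastores.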
As to why Braintri opted for the DataStax database over other solutions available on the market, Wojciech Zatorski, COO and co-founder of the firm, pointed to the ease with which the fintech could scale the Cassandra cluster.
“With a mainframe system, at some point, firms need to move to a larger machine,” he said. “For other database providers, scaling up is quite difficult because they need to move to a bigger machine, or to clusters that are difficult to manage, even more so if institutions are dealing with multiple data centres.”
This is where Zatorski believes that DataStax comes in, as “you only add a node to the cluster and you don’t need to do much; the replication kicks in automatically.”
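The automatic replication Zatorski describes is declared at the keyspace level in Cassandra. A hypothetical configuration for a deployment split across two data centres, as in the Polish bank’s case, might look like the following – the keyspace name, data centre names and replica counts are illustrative assumptions, not the bank’s actual setup.

```sql
-- Hypothetical keyspace: NetworkTopologyStrategy keeps replicas in each
-- of two data centres, so data written in one is automatically
-- replicated to the other. Names and counts are illustrative only.
CREATE KEYSPACE online_banking
  WITH replication = {
    'class': 'NetworkTopologyStrategy',
    'dc_primary': 3,    -- three replicas in the first data centre
    'dc_secondary': 3   -- three replicas in the second
  };
```

Once a keyspace is defined this way, a node joining either data centre takes on its share of the replicas without per-table reconfiguration, which is the “you don’t need to do much” behaviour quoted above.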
However, the solution developed by Braintri only offers a transitional layer on top of banks’ mainframe systems. Banks will therefore keep their mainframes for the time being rather than undertaking a full migration of their data.
But whether banks are willing to opt for such a solution remains to be seen. After all, one of the main concerns banks have raised in the past is the extra layer of complexity this type of datastore brings to an already complex IT system.
Patrick Callaghan, solutions architect at DataStax, acknowledges this concern. “Adding an extra layer can be a disadvantage. However, this solution allows banks to take off a huge amount of processing from their mainframe system. This can help them save millions of dollars in a year depending on how much they would spend without this extra layer.”
By Cécile Sourbes, freelance writer and editorial contributor to FinTech Futures