Two keys to protecting the privacy of the underbanked in Asia
On 12 November 2018, the Monetary Authority of Singapore (MAS) released a set of principles for the responsible use of artificial intelligence (AI) and data analytics in finance. Dubbed FEAT, for fairness, ethics, accountability and transparency, the principles mark a watershed moment for both financial institutions and fintechs.
With the rising power of bleeding-edge technologies like AI and automation, stakeholders are beginning to realise that consumers deserve protection. AI's role in finance, in other words, needs terms and conditions that we as an industry adhere to.
But companies should not simply wait for these principles to arrive top-down from government agencies and professional organisations. In a consumer environment where people are increasingly security- and privacy-conscious, brands need to take it upon themselves to develop a guiding light not just for how they apply their technology, but for how every aspect of their business is handled. They need to scrutinise their product development, their operations, their customer service, and everything in between with a focus on privacy, especially when working with vulnerable groups like the underbanked, as most of us do.
To put it simply, the underbanked lack a voice. Most of them live off the grid, work in the informal economy, and get by on a hand-to-mouth existence in which they do not even have enough to meet the minimum balance banks require. So while the underbanked occupy an entirely different world from the one most banking leaders and fintech entrepreneurs in Asia Pacific are accustomed to, we need to put ourselves in their shoes in order to be that missing voice and develop fully thought-out policies that protect their privacy.
I share my own two guiding policies here in the hope that we continue to focus not only on growth in serving the underbanked but also, and much more, on how we achieve it.
Ask for only what you need
Whenever you download an app from the Play Store or the App Store, it will ask for permission to access different parts of your phone, from your calendar and contacts to your SMS and storage. Because most consumers, eager to use the app's functionality, will agree to whatever is asked, many developers push for blanket permissions. That is, they ask for access to every conceivable part of your phone, even when they have no immediate need for it. Why would an e-commerce app, for example, need access to the body sensors on your phone?
Asking for blanket permissions is a violation of personal privacy, particularly when the people granting them, such as underbanked consumers, have had little education in this space. Many companies in the digital finance space are especially guilty of this practice, with some consumer lenders even requiring full access to a customer's online social networks and email accounts.
Rather than force this kind of intrusion upon consumers, fintechs should ask for only what they need to perform their functions. Nothing more, nothing less. In CredoLab's case, for example, we extract smartphone data that is completely anonymised in order to credit-score our consumers. This approach ensures that underbanked consumers who want access to digital finance and other instruments do not need to compromise their privacy.
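To make that principle concrete, here is a minimal sketch of what it can look like on Android, using the platform's standard runtime-permission APIs. The activity, the feature, and the choice of READ_CALENDAR are hypothetical stand-ins, not a description of any real app: the point is simply that a single permission is requested at the moment the feature that needs it is used, rather than as a blanket list at install time.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Hypothetical activity: requests exactly one permission, at the moment
// the feature needing it is used, instead of a blanket list up front.
class LoanApplicationActivity : AppCompatActivity() {

    private val requestCodeCalendar = 42

    fun onUserStartsCalendarFeature() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.READ_CALENDAR
        ) == PackageManager.PERMISSION_GRANTED

        if (!alreadyGranted) {
            // Ask for the single permission this feature needs, and nothing more.
            ActivityCompat.requestPermissions(
                this,
                arrayOf(Manifest.permission.READ_CALENDAR),
                requestCodeCalendar
            )
        }
    }
}
```

Requesting permissions this narrowly also makes it far easier to explain to a first-time smartphone user exactly why each one is needed.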
Empower rather than identify
Ever wonder why there is such controversy when hackers release information like a list of user emails? A person's email, along with seemingly trivial data points such as a home address or phone number, can be used to identify that person, potentially enabling far more nefarious violations of privacy. In the wake of the 2017 Equifax data breach, for example, millions of customers had sensitive personal data accessed, leaving them at risk of financial fraud and even blackmail or extortion.
Despite the dangers of non-anonymised data, many companies still extract and store these data points. One major class of offenders is alternative credit-scoring companies that scrape social media data, including your real name and other personal information, as part of their credit-evaluation process. These methods expose the underbanked to enormous risk in the event of a data breach. The better approach is to anonymise data at every stage of your company's service, from extraction to evaluation, so that no one can ever tie it to an individual and the integrity of customer data is never compromised.
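As an illustration only, and not a description of CredoLab's actual pipeline, the hypothetical Kotlin helper below shows one common building block of anonymisation at the point of extraction: replacing a raw identifier on the device with a salted one-way SHA-256 hash, so that only an unlinkable token travels onward. A real deployment would need stronger pseudonymisation than this, since a salted hash of a low-entropy value such as an email address can still be broken by dictionary guessing if the salt leaks.

```kotlin
import java.security.MessageDigest

// Hypothetical on-device helper: replaces a raw identifier with a salted
// one-way SHA-256 hash before it leaves the phone, so downstream scoring
// works on a stable token rather than on the identifier itself.
fun anonymise(rawIdentifier: String, salt: String): String {
    val digest = MessageDigest.getInstance("SHA-256")
    val hashed = digest.digest((salt + rawIdentifier).toByteArray(Charsets.UTF_8))
    return hashed.joinToString("") { "%02x".format(it) }
}

fun main() {
    // The raw value is never stored or transmitted; only the token is.
    val token = anonymise("user@example.com", salt = "per-install-random-salt")
    println(token)
}
```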
These are just two best practices in what is sure to be an ongoing evolution, and debate, over how we can best protect the privacy of consumers, especially vulnerable groups like the underbanked. It is my hope that by talking more about the privacy rights of underbanked consumers, we can get past the notion that there must be trade-offs between functionality and security, and instead see them as two ideals that should go hand in hand.
By Peter Barcak, founder and CEO, CredoLab