Data structures hampering banks’ ability to monitor risk
The credit crisis highlighted the inability of financial institutions to aggregate risk exposures and concentrations quickly or efficiently enough. In particular, it demonstrated the difficulty of aggregating enterprise data without losing the significance of the underlying source data in the process.
According to a new white paper from Wolters Kluwer Financial Services, one of the key issues faced by data architects tasked with creating a unified data management infrastructure is the fact that operations in different countries often have different internal systems. As well as the obvious issue of inconsistent data formats, these systems often lack common vocabularies and definitions. This creates a “considerable obstacle to achieving a group-level view or consolidated reports needed to meet specific regulatory requirements”, the paper says.
“The usage of data within a financial institution differs from team to team, and this is complicated by regulatory differences – for example, risk departments following Basel bank capitalisation rules will measure probability of default over a 12-month period, whereas under IFRS the finance department will look over the life of a product, such as a 20-year mortgage,” said Wolfgang Prinz, vice president of product management at Wolters Kluwer Financial Services. “The issues raised in this paper show the paramount importance of firms having a standard data architecture that can make this kind of distinction, so that the data is meaningful to all potential recipients while retaining the underlying consistency that keeps all users working from the same base figures.”
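To make the horizon distinction concrete, here is a minimal sketch of how a shared data model might serve both views from the same base figure. The class and field names are hypothetical, and the lifetime calculation assumes a constant annual PD with independent years – a simplification for illustration, not the paper’s method:

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    """Canonical record shared by the risk and finance departments."""
    exposure_id: str
    annual_pd: float      # probability of default over one year
    remaining_years: int  # remaining life of the product

def basel_pd(exp: Exposure) -> float:
    """Risk view: 12-month probability of default (Basel-style)."""
    return exp.annual_pd

def lifetime_pd(exp: Exposure) -> float:
    """Finance view: probability of at least one default over the
    product's remaining life (IFRS-style), assuming a constant
    annual PD and independence between years."""
    return 1.0 - (1.0 - exp.annual_pd) ** exp.remaining_years

mortgage = Exposure("MTG-001", annual_pd=0.01, remaining_years=20)
print(f"12-month PD: {basel_pd(mortgage):.2%}")    # 1.00%
print(f"Lifetime PD: {lifetime_pd(mortgage):.2%}")  # 18.21%
```

Both views derive from the same stored figure, which is the kind of consistency the paper argues for: each department applies its own regulatory lens without drifting away from a common base.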
The paper, Data Management – a Finance, Risk and Regulatory Perspective, considers how firms should implement a best-practice data management solution that ensures consistent data across the enterprise, and explores the benefits such an approach can offer, including:
- Efficiency benefits from the data itself: Within individual business units and geographical entities, a standardised data infrastructure can be an enabler for more tailored access to data and analysis. With a common vocabulary in place, individual entities can be sure of the data sets they are identifying, and can monitor them appropriately and interpret responses on the basis of who’s asking the question, whether it’s the finance, risk, trading, settlement or another department (see the vocabulary sketch after this list).
- Economies of scale: Adoption of a standard data infrastructure across domains means that both internal developers and third-party application vendors can implement solutions across the financial institution’s various business, regulatory or geographical silos. Because interfacing to the data infrastructure is consistent, successful projects can be expanded across the enterprise without in essence starting from scratch, providing significant cost savings.
- Faster response times: A standardised approach to data management across activities can yield improvements in response times, addressing the emerging demand for faster access to risk and other information that can aid in the decision-making process. Firms are seeing the value of looking at broad exposures in a timely manner, and an automated set of processes based on standardised data can be the catalyst for achieving this.
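As a rough illustration of the first benefit above, a common vocabulary can be as simple as a shared data dictionary that maps each department’s local field names onto canonical definitions, so every user works from the same base data. All names below are hypothetical, not drawn from the paper:

```python
# Canonical vocabulary: one definition per field, shared group-wide.
CANONICAL_FIELDS = {
    "counterparty_id": "Unique identifier of the counterparty",
    "gross_exposure": "Exposure before netting and collateral, in EUR",
}

# Department-specific synonyms mapped onto the canonical vocabulary.
DEPARTMENT_SYNONYMS = {
    "risk": {"cpty": "counterparty_id", "ead_gross": "gross_exposure"},
    "finance": {"client_ref": "counterparty_id", "balance": "gross_exposure"},
    "settlement": {"party_id": "counterparty_id"},
}

def resolve(department: str, local_name: str) -> str:
    """Translate a department's local field name to the canonical one."""
    canonical = DEPARTMENT_SYNONYMS.get(department, {}).get(local_name)
    if canonical is None or canonical not in CANONICAL_FIELDS:
        raise KeyError(f"{local_name!r} is not defined for {department!r}")
    return canonical

# Risk and finance ask in their own terms but reach the same field.
assert resolve("risk", "cpty") == resolve("finance", "client_ref")
```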