By Michael Chow, GoldenSource
As volumes of data continue to increase exponentially, organisations across all industries, including financial services, are wrestling with the problem of transforming massive heaps of bits and bytes into decision-enabling information.
A McKinsey Global Institute (MGI) study on the topic cites some staggering statistics that speak to the magnitude of the problem:
- "40% projected growth in global data generated per year vs. 5% growth in global IT spending"
- "30 billion pieces of content shared on Facebook every month"
- "235 terabytes were collected by the US Library of Congress in April 2011 alone"
- "15 out of 17 sectors have more data stored PER COMPANY than the US Library of Congress"
The challenge for financial organisations is transforming such monumental amounts of data into insightful, actionable information. At first glance, this seems symptomatic of the never-ending saga of "doing more with the same or less." However, advances in technology are enabling firms to capture, store, manage and analyse datasets previously too unwieldy for traditional solutions; this capability is what underlies "big data."
Plummeting storage prices allow for more digital data capture; approximately $600 can now store the world's entire commercial music library. Combined with intelligent algorithms, this cheap capacity enables an organisation to make sense of the deluge of data. As a result, there is an unprecedented and justified level of enthusiasm about the potential of big data, and the term's current usage has moved far beyond its original, scientific definition. Today's data ecosystem demands the ability to distinguish between areas where emerging technologies are necessary and those where proven technologies can successfully address big data-like problems.
The reality in capital markets is that a great deal of valuable data lives in silos spread across lines of business, geographies and product desks. The further challenge is that this data cannot easily be normalised, related and interpreted for analysis. In this post-securities-master world, the task is not only to create a "golden copy" of instrument data but also to integrate related datasets, including positions, transactions, customers and counterparties. The ultimate goal is to unlock the value at the intersection of these domains; this is exactly the kind of problem big data can solve.
The broader definition of big data allows advanced relational database technologies to address the problem head on. These technologies deliver performance and scalability beyond the requirements of even the largest global investment banks, exchanges and regulatory agencies. Such advanced relational technologies are even suited to monitoring systemic risk, the mother of all enterprise data management (EDM) challenges. At its core, the relational data model enables a 360 view of enterprise data: it allows a user to define objects and every relationship between instruments, entities and transactions.
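To illustrate that relational 360 view, the sketch below builds a toy schema linking instruments, counterparties and trades, then answers a cross-domain question with a single join. All table names, column names and figures are hypothetical, not drawn from any particular EDM product.

```python
import sqlite3

# Hypothetical minimal schema: instruments, counterparties, and the
# trades that relate them. Names and figures are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE counterparty (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE instrument   (id INTEGER PRIMARY KEY, symbol TEXT);
CREATE TABLE trade (
    id INTEGER PRIMARY KEY,
    instrument_id   INTEGER REFERENCES instrument(id),
    counterparty_id INTEGER REFERENCES counterparty(id),
    notional REAL
);
""")
conn.executemany("INSERT INTO counterparty VALUES (?, ?)",
                 [(1, "Acme Bank"), (2, "Globex Capital")])
conn.executemany("INSERT INTO instrument VALUES (?, ?)",
                 [(1, "XYZ 5Y CDS"), (2, "ABC Corp Bond")])
conn.executemany("INSERT INTO trade VALUES (?, ?, ?, ?)",
                 [(1, 1, 1, 10e6), (2, 2, 1, 5e6), (3, 2, 2, 7e6)])

# Total exposure per counterparty across all instruments: the kind of
# cross-domain question a consolidated relational model makes trivial.
rows = conn.execute("""
    SELECT c.name, SUM(t.notional)
    FROM trade t
    JOIN counterparty c ON c.id = t.counterparty_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme Bank', 15000000.0), ('Globex Capital', 7000000.0)]
```

The same query pattern extends naturally to positions, customers and any other domain once the relationships are modelled explicitly.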
In the next few years, it will be critical for firms to take a 360 view of their data to eliminate potential blind spots and strengthen operational efficiency. For example, customer profitability analysis can be conducted for mutual funds to segment and target profitability pools, and to increase per-customer profitability through cross-selling and up-selling insights.
In addition, a robust counterparty hierarchy will be necessary to enable firms to roll up exposure quickly. Without it, they won't be able to answer critical questions such as, "What is my exposure to Lehman, AIG or MF Global?"
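A minimal sketch of such a roll-up, using made-up entities and exposure figures: each child entity points to its parent, and exposures are aggregated at the ultimate parent so the firm-wide question can be answered in one pass.

```python
# Hypothetical counterparty hierarchy: child -> parent links.
# Entities and exposure figures below are illustrative only ($m).
parent = {
    "Lehman Brothers Intl Europe": "Lehman Brothers Holdings",
    "Lehman Brothers Special Financing": "Lehman Brothers Holdings",
    "AIG Financial Products": "AIG",
}
exposure = {
    "Lehman Brothers Intl Europe": 40.0,
    "Lehman Brothers Special Financing": 25.0,
    "AIG Financial Products": 60.0,
    "AIG": 10.0,
}

def ultimate_parent(entity: str) -> str:
    """Walk child -> parent links to the top of the hierarchy."""
    while entity in parent:
        entity = parent[entity]
    return entity

# Roll every entity's exposure up to its ultimate parent.
rollup = {}
for entity, amount in exposure.items():
    top = ultimate_parent(entity)
    rollup[top] = rollup.get(top, 0.0) + amount

print(rollup)  # {'Lehman Brothers Holdings': 65.0, 'AIG': 70.0}
```

Without a maintained hierarchy, the `parent` links simply don't exist, and the firm-level figure cannot be computed on demand.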
The ability to consolidate multiple versions of the same data yields non-trivial cost savings and prevents expensive errors caused by differing datasets on opposing sides of a process. These applications, combined with data standards and semantics to homogenise complex trading environments, enable a firm to unlock the value of structured data by achieving a single version of the truth.
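The "differing datasets on opposing sides of a process" problem can be sketched as a simple field-level reconciliation. The records and field names below are hypothetical: two silos hold their own copy of the same instrument, and the pass flags any field where the copies diverge before it can cause a break.

```python
# Hypothetical copies of the same instrument record held in two silos.
silo_a = {"isin": "US0000000001", "coupon": 5.25, "maturity": "2030-06-15"}
silo_b = {"isin": "US0000000001", "coupon": 5.25, "maturity": "2030-06-16"}

# Flag every field on which the two copies disagree.
breaks = {
    field: (silo_a[field], silo_b[field])
    for field in silo_a
    if silo_a[field] != silo_b[field]
}
print(breaks)  # {'maturity': ('2030-06-15', '2030-06-16')}
```

A golden copy removes the need for this kind of pairwise reconciliation: both sides of the process read the same record.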
Can big data deliver great business results, or will firms need to employ a "smart data" model? Whichever definition you subscribe to, the debate around data is a positive sign that firms are becoming more aware of their data assets and more intent on extracting value from them. Harnessing the latent potential of large datasets in a structured, purposeful manner will undoubtedly yield lucrative opportunities at a time when cheap debt and exotic instruments have lost much of their steam.