The concept of big data is becoming a reality as financial firms question whether it will be a data management problem or an opportunity to commercialise in-house data. Backing up the potential of big data, the role of chief data officer (CDO) is emerging and technology vendors are looking to fulfil the big data brief.
Panel participants at last week’s FISD Real-Time Technology Roundtable in London, hosted by BT and sponsored by CME Group, Equinix and Sybase, debated the challenges and potential of big data, how it can be identified and managed, and who will calm what was described as the data tsunami.
Deciphering how big big data could be, speakers and panellists at the FISD event offered insights into big data, and some of the networking and data management technology solutions that are being built to support expanding volumes of data generated in capital markets and increasingly scrutinised by regulators worldwide.
Considering the ‘problem or opportunity’ aspect of big data, consensus among panel members suggested big data, well managed, will create commercial opportunities to identify and sell specific data sets and types, but all agreed that delivering big data projects has inherent problems.
During the panel discussion, Stuart Grant, EMEA business development manager, financial services, at Sybase, described the biggest big data problem as knowledge, an issue underlined by a show of hands from the audience that indicated that few had a good grasp of big data, while many were keen to understand the concept.
Grant also noted the need for a uniform approach across business and IT if big data projects are to succeed over the long term, but warned that many organisations have data gaps, particularly missing metadata.
Andrew Delaney, A-Team Group editor-in-chief, reported ongoing research by A-Team indicating a wide variety of views in the market on what big data is all about. These range from simple multi-data source implementations, to high-volume data management and enterprise-wide management of structured and unstructured data sources.
But whatever the terminology used to describe big data, the pitfalls remain the same. Among them, Delaney described the challenge of data variety, saying: “This is an orchestration issue. There is legacy data, silos of data, data from different sources and formats, data from different locations and data that is updated at different times. Then there is structured and unstructured data. How can a snapshot of data be delivered on demand to the business when all the data is different? Pulling it all together for a report is a big problem.”
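The orchestration problem Delaney describes — reconciling records that arrive in different formats, from different sources, at different times — can be sketched in a few lines. The feeds, field names and values below are hypothetical, purely to illustrate normalising varied schemas into a single on-demand snapshot:

```python
from datetime import datetime

# Two hypothetical feeds with different schemas and update times.
feed_a = [
    {"ric": "VOD.L", "px": 132.5, "ts": "2012-03-01T10:00:00"},
    {"ric": "BARC.L", "px": 245.1, "ts": "2012-03-01T10:05:00"},
]
feed_b = [
    {"symbol": "VOD.L", "last_price": 132.7, "updated": "2012-03-01T10:02:00"},
]

def normalise(record):
    """Map source-specific field names onto one common schema."""
    if "ric" in record:
        return {"instrument": record["ric"], "price": record["px"],
                "time": datetime.fromisoformat(record["ts"])}
    return {"instrument": record["symbol"], "price": record["last_price"],
            "time": datetime.fromisoformat(record["updated"])}

def snapshot(*feeds):
    """Build an on-demand snapshot: the latest normalised record
    per instrument across all feeds."""
    latest = {}
    for feed in feeds:
        for raw in feed:
            rec = normalise(raw)
            key = rec["instrument"]
            if key not in latest or rec["time"] > latest[key]["time"]:
                latest[key] = rec
    return latest

snap = snapshot(feed_a, feed_b)
```

Real implementations must also handle unstructured data, legacy silos and location, but the core task — a normalisation layer plus a time-aware merge — is the same.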
Seeking to resolve the problem, the panel and audience discussed the imposition of standards on data, but concluded that these were unlikely to be successful in the short term as data is lucrative and many financial firms want standards to work in their favour. The variety of data sets could also complicate the imposition of standards, as could a shortfall in multiple regulators’ understanding of all data issues in global trading.
Despite the challenges, interest in big data is gaining traction in financial firms. Some banks are driven by fear of regulatory failure as they have little control over their data, others see the potential of using big data in trading analytics, and many see that breaking down data silos could generate revenues and save money.
As Sybase’s Grant commented: “Banks we talk to about big data see themselves as factories, so they need to innovate to gain competitive edge. They need more granular information, they need to consider using cloud technology and they need to start and follow trends such as analysing data in flow rather than storing it first and then analysing it.”
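Grant’s point about analysing data in flow — computing on each update as it arrives, rather than landing everything in a store first — can be illustrated with a simple incremental calculation. The running volume-weighted average price below is a hypothetical sketch, not a description of Sybase’s products:

```python
class StreamingVWAP:
    """Maintain a volume-weighted average price incrementally
    from a stream of trades, without storing the trades themselves."""

    def __init__(self):
        self.notional = 0.0  # running sum of price * size
        self.volume = 0.0    # running sum of size

    def update(self, price, size):
        """Fold one trade into the running state; return VWAP so far."""
        self.notional += price * size
        self.volume += size
        return self.notional / self.volume

vwap = StreamingVWAP()
vwap.update(100.0, 200)  # first trade
vwap.update(101.0, 100)  # VWAP moves towards 101
```

The state held is constant-size regardless of how many trades flow through — the essence of the store-less, in-flow analysis Grant describes.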
Big data projects are unlikely to be funded on an enterprise basis, leading Delaney to suggest they will be developed using stealth budgets won on the basis of proving the business case for certain parts of an organisation’s business.
While organisations are unlikely to take a big bang approach to big data, they are looking at data specialists for solutions. Panellist Robin Manicom, EMEA director, financial services, at data centre specialist Equinix, said: “We see data volumes continuing to surge in response to regulatory changes, mainly around market data growth for effective trading. Organisations are looking at more diverse input data, including sentiment, news, weather and digital trail data – data generated as a result of how we interact electronically with the world – as a way of gaining competitive advantage. The focus is moving beyond customer transactional analysis to customer interactional analysis. We are working with customers to find ways to store, assimilate and analyse these volumes of data. The need is to minimise the movement of data and analyse in situ.”
Fellow panellist Michael Le Lion, executive vice president international at data visualisation provider Panopticon Software, commented: “The right technology and good people are must-haves to implement successful big data projects, such as moving from end-of-day to intraday, or even real-time, reporting. If organisations wanting to do this don’t have the necessary technologies, they need to act quickly. I expect to see a lot of proof-of-concept projects and due diligence over the next three to six months. If this doesn’t happen, these projects won’t deliver.”
Reflecting on the four ‘Vs’ of big data described by Amir Halfon, Oracle senior director of technology, global financial services, at A-Team’s Data Management for Risk Analytics and Valuations conference last October – value, volume, velocity and variety – Le Lion offered a fifth ‘V’: data visualisation.
In terms of who will manage big data, Le Lion suggested the chief information officer (CIO) will develop into more of a CDO role, but this was not a view shared by all panellists. Delaney expects the CIO to continue working on internal technology infrastructure, while an additional CDO will have an external focus and consider how opportunities can be generated around a firm’s data and how it could be sold. Similarly, Grant envisages a commercial role, but not without an understanding of a firm’s data and technologies.
Technologies for data management and potentially big data were the subject of the individual speakers’ presentations at the FISD event. Chris Pickles, BT head of industry initiatives, global banking & financial markets, addressed the issue of rising volumes of market data coupled with the need to reduce its cost. He proposed the use of unified communications that can combine high-volume market data and other types of data in fat pipes, reducing the cost of market data delivered separately through several skinny pipes.
Martin Millstam, solutions consultant at Exegy, a provider of ticker plant solutions, discussed the need for constant innovation in capital markets prone to volatility and increased messaging. Technologies he noted that could support the resulting data tsunami include InfiniBand and 10GigE networks, and multi-core processors and field-programmable gate array (FPGA) chips.
On a wider scale, Darren Lewis, head of CityVision development at real-time data systems supplier Arcontech, suggested a technical approach to data that could go some way towards meeting market demand for lower market data costs, freedom to chop and change between data vendors, and savings from desktop consolidation.