Big data represents a huge challenge but offers a world of opportunities
The irresistible rise of big data continues in the securities services world. Several senior market participants have identified big data as one of the next big things to affect their industry. If the conversations undertaken in preparation for this article are representative of ongoing industry thinking, that is increasingly the case.
Securities services providers will concede that big data is a gigantic chore and an expensive one, given the need to continue investing in technology, systems and processes. Planning for capacity is a fundamental element of any IT solution. But it also represents a business opportunity, or even business opportunities, as new ways of thinking are generated by new ways of doing.
But for securities services providers, what does big data mean? It is certainly a catchy, if increasingly hackneyed, phrase, but what lies beneath in the day-to-day reality? The answer is nothing if not multi-layered.
Craig Bell, global head of information services at BNP Paribas Securities Services (BNPP), summarises the pressing technical issues. Big data frameworks address the limitations and challenges typically associated with conventional data storage solutions, he says. These are commonly known as the five Vs: volume, velocity, variety, value and veracity.
• Volume is no longer an issue as working in petabytes has become the norm.
• With regard to velocity, processing speeds are significantly more powerful, which is key as more use cases require real-time data processing.
• With variety, the issue is no longer just structured financial data but unstructured data like documents and images.
• Value derives from the ability to store, aggregate and link data from many sources, both structured and unstructured, which can start to yield previously unavailable insights. Hence data now begins to generate genuine value in its own right.
• Veracity – that is, the correctness and accuracy of information – has never been more important. Any information management framework must have embedded within it core principles of data quality, data governance (lineage and transparency) and metadata management, along with considerations for privacy and legal concerns.
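The "value" point above is worth making concrete. A toy illustration, using invented field names rather than any real BNPP data model: joining a structured trade record with metadata extracted from an unstructured source (say, scanned confirmations) on a shared key surfaces an insight that neither source holds alone.

```python
# Toy sketch of linking structured and unstructured sources.
# All field names and values here are invented for illustration.

# Structured source: trade records from a booking system
trades = [{"trade_id": "T1", "notional": 5_000_000},
          {"trade_id": "T2", "notional": 1_200_000}]

# Metadata extracted from scanned confirmations (an unstructured source)
docs = [{"trade_id": "T1", "confirmation_signed": True},
        {"trade_id": "T2", "confirmation_signed": False}]

# Index the document metadata by the shared key
by_id = {d["trade_id"]: d for d in docs}

# Join on trade_id: which trades have an incomplete paper trail?
unconfirmed = [t for t in trades
               if not by_id.get(t["trade_id"], {}).get("confirmation_signed")]
print([t["trade_id"] for t in unconfirmed])  # ['T2']
```

The join itself is trivial; the operational difficulty the article points to lies in extracting reliable keys and fields from unstructured material in the first place, which is where the veracity and governance principles come in.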
“By fundamentally changing the approach to delivery of reporting and dashboarding solutions through the use of big data technologies we are bringing down time to market by aligning agile delivery closer to the business”
The work involved in employing big data effectively is Herculean. That said, there are clear positive developments. Bell, for instance, is not alone when he says that implementing a big data framework represents a business opportunity for his institution. “As such, we have identified many different use cases which will benefit from the enhanced functionality, agility and power that a big data framework brings which within BNPP we have called our Data Hub,” Bell says.
While some unnamed institutions have taken the approach of simply loading all their data into a data lake in the hope of finding additional value in it later, BNPP applies strict governance to the use of the Data Hub. Use cases are carefully reviewed to assess whether the Data Hub is in fact the right data and technology framework with which to deliver them, he says.
“One such use case which we are developing is the creation of our data as a service product,” Bell continues. “Within this we take post-operational data from multiple sources, aggregate it, enrich it and create new analysis and corresponding data sets for our clients to consume. This process of investment data management represents a real challenge for our clients and is one that many of our asset-owner clients are faced with. Our aim is to provide a common client experience to our clients when consuming their data (typically sourced across many platforms) and act as a single point of access to their aggregated data.”
Bell helpfully introduces several metrics to illustrate the points he is making: “Another use case where we are harnessing the full extent of the processing power of the Data Hub is where we monitor intra-day the counterparty and liquidity risk arising out of over-the-counter (OTC) derivatives exposures. It involves the calculation of 300m prices for 1,200 OTC contracts. As part of using the Data Hub, processing times have been reduced from seven hours to under five minutes.”
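The figures Bell quotes are easy to sanity-check. A back-of-the-envelope calculation, using only the numbers in the article (nothing here describes BNPP's actual implementation):

```python
# Back-of-the-envelope check on the workload Bell describes:
# 300 million prices across 1,200 OTC contracts, with the runtime
# cut from seven hours to under five minutes.

def workload_summary(total_prices, contracts, old_secs, new_secs):
    """Return the per-contract scenario count and the effective speedup."""
    return {
        "scenarios_per_contract": total_prices // contracts,
        "speedup": old_secs / new_secs,
    }

stats = workload_summary(300_000_000, 1_200, 7 * 3600, 5 * 60)

# 300m prices over 1,200 contracts = 250,000 revaluation scenarios each
print(stats["scenarios_per_contract"])  # 250000

# Seven hours down to five minutes is an ~84x reduction, i.e. roughly
# 84-way effective parallelism (plus in-memory processing) is needed
# to account for the improvement
print(round(stats["speedup"]))  # 84
```

In other words, each contract is revalued under a quarter of a million scenarios, and the new framework delivers an improvement of nearly two orders of magnitude, which is consistent with moving from a sequential batch process to a distributed, in-memory one.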
There is more to it than simply increasing the speed of operations. Bell’s colleague, Dave Morris, global head, data shared service centre at BNPP, adds: “On top of this, by fundamentally changing the approach to delivery of reporting and dashboarding solutions through the use of big data technologies we are bringing down time to market by aligning agile delivery closer to the business. Data-sourcing challenges should not be underestimated and this is certainly an area where we have focused on ensuring the processes are industrialised.”
Are clients benefitting in any way other than having state-of-the-art IT at their custody provider? Craig Bell says yes. “Bringing together our clients’ data into a single environment offers up many opportunities,” he says. “Our vision for our investment office product is that it provides a collection of ‘data services’ sitting across our big data framework. See it as an ecosystem of service outputs, where an institutional investor can become a data explorer by visualising and interacting with their data.
“We also envisage that it will help with the burden of things like regulatory reporting which at the heart of it is a data management challenge due to having to source data from many disparate sources, ensure coherence and provide a clean and consistent output,” he says.
“This is the key,” says Sarj Panesar, global head of business development, insurance, at Société Générale Securities Services (SGSS). “The technology provides the tools to deliver key insights, trends, and accurate and timely reporting,” he adds.
“We are collating, holding, integrating and reporting on more data points than ever before. The adage that knowledge is power has never been truer. Regulatory reporting regimes such as Solvency II, packaged retail investment and insurance products (PRIIPs) and MiFID II have required more data points to be collected and reported. What this has shown us is that you do need an open architecture to enrich the data you have. Every day we are delivering data and reporting to our clients to ensure that they can run their businesses.
“Yes, it is additional work and it is endless amounts of data gathering and processing,” adds Panesar. “But the ability to integrate the data using new data management tools is already yielding positive results. The insights that you can deliver to your clients really do outweigh any additional work. The insights and results are only limited by your mind. The new technologies really are making a difference in terms of speed and accuracy.”
And it is clear that the additional work never stops. “We are looking at more efficient ways of generating, holding, manipulating and reporting on the data,” says Panesar. “An example is a service we are developing requiring data from numerous sources (both in-house and external) where we are using data lake technologies. This provides a very efficient way of holding and accessing the data.” The never-ending task of painting Scotland’s iconic Forth Bridge springs unbidden to mind.
“This process of investment data management represents a real challenge for our clients and is one that many of our asset owner clients are faced with”
BNPP’s Morris adds a human touch at this point, tacitly acknowledging that despite predictions of the growth of artificial intelligence and machine learning, humans will have a role to play for a while yet. “We have also had to look at the introduction of big data technology from an organisational perspective to ensure we are set up to be able to leverage this technology. This has meant working with our human resources and learning and development teams to ensure we are equipping our employees with the correct skills and hiring the right calibre of staff.”
Jamie Stevenson, managing director for data analytics at RBC Investor and Treasury Services, is another who identifies a change in mindset among clients in relation to data. “I am increasingly spending time with clients who say they need a data strategy,” he says. “We’ve had plenty of hype around big data but asset managers are now asking how they can best use it.”
Stevenson refers to three principal areas of discussion. One, using APIs (application programming interfaces) to upload and transfer files, enhancing the client experience and reducing complexity.
Two, clients are asking for access to the tools, talent and technology offered by securities services providers.
Three, insight. “The nature of what we are providing is changing,” Stevenson explains. “We don’t just deliver the files we are asked for. We look to get under the skin of the original underlying problem that clients are trying to solve. We see similar problems recurring across our client base and this helps us recommend solutions more effectively. Hearing a client’s frustration can help us change our product and delight our client.”
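The shift from file delivery to APIs that Stevenson describes can be sketched in a few lines. This is a hypothetical illustration, not RBC's actual interface: instead of waiting for an overnight file drop, a client drains a paginated endpoint on demand. The function names, cursor scheme and field names are all invented.

```python
# Hypothetical sketch of API-based data delivery: a client pulls
# positions on demand from a paginated endpoint rather than parsing
# an overnight file. Nothing here reflects a real provider's API.

def fetch_all(fetch_page):
    """Drain a paginated endpoint; fetch_page(cursor) -> (rows, next_cursor)."""
    rows, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        rows.extend(page)
        if cursor is None:
            return rows

# Stand-in for an HTTP call such as GET /v1/positions?cursor=...
# (endpoint and fields are invented for this sketch)
def demo_page(cursor):
    data = {None: ([{"isin": "XS001", "qty": 100}], "p2"),
            "p2":  ([{"isin": "XS002", "qty": 250}], None)}
    return data[cursor]

positions = fetch_all(demo_page)
print(len(positions))  # 2
```

The design point is the one Stevenson makes: the client asks for exactly the data it needs, when it needs it, and the pagination loop replaces the fragile logic of detecting, downloading and reconciling scheduled file deliveries.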