Capitalizing on the data deluge
Managing the huge amounts of data now available to the financial sector is a challenge that is being met by integrating new tools and methods with legacy systems.
The volume, complexity and pace of growth of the data used by the financial services industry are opening up new opportunities but at the same time creating many challenges, not least in the management and analysis of "alternative data".
Institutional investors and investment managers have traditionally sourced their data directly from the companies they invest in - presentations, stock market filings, press releases and the like. This data is structured, and therefore relatively easy to gather, store, manage and analyse.
Alternative data, however, is collected from non-traditional sources. It comes from website use, social media, card payments, press articles and other sources. Although much of it is structured, a lot is unstructured and therefore more difficult to handle.
"There is a huge ecosystem for alternative data, but it takes time to collect it, and this is especially difficult for small teams", says Tony Guida, Senior Investment Manager for RPMI Railpen, the railway workers' pension scheme.
As a result, investment groups have more than quadrupled their number of alternative data analysts over the past five years in order to cope, according to a study by AlternativeData.org.
Antoine Forterre, Co-CEO of Man AHL, the diversified quantitative investment manager, says the explosion in data has created a "new frontier" that is difficult to navigate. "As with any frontier, it's full of cowboys, tribes and adventurers," he says. "It's hard to manage and it's expensive. Sometimes it's legally blurry. Sometimes it's morally unclear, as we have seen in the news with stories about the misuse of personal data.
"Despite all of that, we have developed the capabilities to source data sets, scrub them, on-board them, and make sure they fit in our system. Speed is of the essence."
Despite the easy availability of new tools and methods for processing these huge data volumes, integrating them with old technology is difficult, but not impossible, as the 235-year-old Bank of Ireland has found. "We have lots of legacy technology so we need good data management practices," says Barry Green, the bank's Chief Data Officer. "So, for us it's all about building an infrastructure that allows us to be nimble."
Andrew Barnett, Chief Data Officer for Legal and General Investment Management, one of Europe's biggest asset managers and a global investor, says that "legacy technology is really expensive to turn off". His approach therefore is to leave much of the data in its old environment but use new technology, such as virtualisation, "to analyse it better than ever before and use it to create value for our distribution and investments teams".
Many financial institutions are managing their data in the cloud. Cloud service providers like Amazon Web Services, Google Cloud and Microsoft Azure offer "very sophisticated data management tools that allow you to run data analysis against complex data sets," says Andrew Eisen, head of EDM Product Management and Cloud Strategy at IHS Markit.
Companies analyse commercial and financial data to improve their sales, or, in the case of investment managers, to generate alpha for their clients. But they also analyse operational data to gauge how efficient and cost-effective they are - data on premises, staff travel, legal and compliance functions, cash management, logistics, IT and more.
"You can put in tools to understand these operational patterns of behaviour, and then address them if necessary," says Mr Eisen.
When all is said and done, though, "technology is just the enabler", he says. Effective data management is always down to "people and culture". Firms need to recruit and train the best people and ensure that corporate culture has changed to keep in step with the digital age.
Michael Imeson is Senior Content Editor, Financial Times Live; and Contributing Editor, The Banker. This article is based on a panel session at "The Data and Disruptive Technology Forum: Rethinking the Financial Services Ecosystem" organised by the Financial Times and IHS Markit in London in June 2018.