The data puzzle facing many derivatives organisations is not the volume of data they possess but how to maximise its utility.
Data is the source of some impressive statistics. Ninety percent of all data was created in the past two years, 2.5 quintillion bytes are created every day (that's 2.5 followed by 18 zeros), and there are 400,000 bytes of data for every grain of sand on earth. The numbers are mind-boggling, but they don't answer the most telling question facing financial businesses: How do you make the best use of data to create meaningful value?
The challenge facing many banks and other financial organisations is not the volume of data they possess – large banks hold thousands of pieces of information on each customer, amounting to billions of rows of data. Rather, it is how to effectively marshal these resources to maximise their utility. The key is conceptually simple: higher levels of standardisation can streamline processes, cut inefficiencies, and ensure comparability. In practice, however, and against a background of highly divergent data and IT architectures, standardisation across markets and their participants is significantly more challenging.
As they invest in digital applications – from new platforms to APIs and advanced analytics – financial market participants should ideally ensure that the data and processes that underlie these innovations are mappable across infrastructures and geographies. They understand that clean, normalised and structured data, backed by market-neutral workflows, is the key to value creation at scale. That means getting standardisation right not only internally but also across the industry.
From ISINs to legal entity identifiers (LEIs) and XBRL tagging, the journey to more standardised capital markets is well underway. However, despite progress in some pockets, much work remains to be done. The scale of the task has been highlighted by the FIA, which in November followed up on a review of post-trade settlement processes with the launch of a blueprint that recommended action to develop market standards and best practices in the trade and clearing lifecycle. As part of that effort, the FIA recommended the creation of an independent body to lead a consensus-based approach to change.
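To illustrate what a mature identifier standard looks like in practice, the sketch below validates an ISIN under ISO 6166: a two-letter country code, nine alphanumeric characters, and a Luhn check digit computed over the letter-expanded digits. This is an illustrative implementation, not drawn from any FIA material; the function name is our own.

```python
def isin_is_valid(isin: str) -> bool:
    """Validate an ISIN (ISO 6166): 2-letter country code, 9 alphanumeric
    characters, and a final Luhn check digit."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand letters to two-digit values (A=10 ... Z=35); digits stay as-is.
    expanded = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn: from the right, double every second digit and sum the digit sums.
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
        total += d // 10 + d % 10
    return total % 10 == 0
```

A well-formed ISIN such as US0378331005 passes this check, while a single transposed or altered character fails it – the kind of machine-verifiable consistency that the allocation and give-up workflows discussed below still lack.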
The catalyst for FIA action was the events of March 2020, at the beginning of the global pandemic. During those first chaotic months, amid a wholesale switch to remote working, the futures and options markets accumulated a severe backlog of unallocated and misallocated trades. Amid high trading volumes, subsequent months saw numerous days on which as many as 15 times more contracts than normal were unallocated on trade date. Drilling down into the root causes, market participants cited challenges including manual processing in T+ workflows, siloed processes, extended allocation windows, and the growth of block trading by large asset managers.
In the wake of the 2020 volatility, many market participants reflected on the piecemeal evolution of exchange-traded derivatives workflows over the past 30 years. While individual firms have made efforts to drive in-house standardisation, there has been relatively little coordinated effort across the system. As a result, activities such as fair allocation and average pricing had become significantly inefficient.
Based on the principle that fire prevention is better than firefighting, the FIA formulated a work plan to migrate the listed derivatives industry to a more efficient and resilient operating environment. In a progress report published in March, it said it had taken the first steps toward a more standardised cross-market approach. A new standards body, the Derivatives Market Institute for Standards (DMIST), was launched, attracting much attention at the recent IDX event in London. Conversations at IDX revealed progress over the past two years in the processing of give-ups and trades outstanding, but continuing challenges around allocation delays.
In its March progress report, the FIA highlighted a number of individual bottlenecks on both allocations and give-ups. These included instructions commonly given late in the day, inconsistent messaging formats and content, different CCP closing times, tight clearing windows, and limited trade date reconciliation between participants. In addition, market participants cited a lack of average pricing at many exchanges and inconsistent calculations of average prices. Static data remains highly unstandardised, partly because it emanates from so many different sources, while reconciliations are driven by manual processes.
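The average-pricing inconsistency is easy to see with a worked example. The sketch below computes a volume-weighted average price across partial fills; the rounding convention (half-up to one tick) is an assumption for illustration, since precision and residual handling are exactly the parameters on which venues currently differ and which a common standard would pin down.

```python
from decimal import Decimal, ROUND_HALF_UP

def average_price(fills, tick=Decimal("0.01")):
    """Volume-weighted average price over a list of (quantity, price) fills.

    Rounding half-up to the tick is an illustrative choice: exchanges
    differ on precision and on how rounding residuals are allocated,
    which is why two venues can report different averages for the
    same set of fills.
    """
    quantity = sum(q for q, _ in fills)
    notional = sum(Decimal(q) * p for q, p in fills)
    return (notional / quantity).quantize(tick, rounding=ROUND_HALF_UP)

# Three partial fills of one parent order.
fills = [(100, Decimal("101.37")),
         (250, Decimal("101.41")),
         (150, Decimal("101.39"))]
# Exact average is 101.396; under this convention it reports as 101.40.
```

A venue that truncates to the tick, or carries more decimal places, would report a different figure for the same trades, so counterparties reconciling against each other see breaks that no party actually caused.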
Looking to the coming year, there is much work to be done to introduce the standardised approaches that will enable the derivatives market to operate efficiently. The good news is that, led by the FIA, a plan is beginning to emerge. High on the agenda is action to introduce common standards for allocations and give-ups. Proposals include a maximum period for instructions to be sent to brokers and for confirmation of those instructions, as well as the potential for clearing houses to standardise cut-off times between the end of trading and clearing, including the extension of clearing windows. The next steps include moving towards consensus around suggested time frames.
These are small steps on a bigger journey toward a more reliable and predictable marketplace. Still to be tackled are average pricing standards, the format and content of allocation and give-up messaging, and common data taxonomies – essential for many regulatory obligations. The good news is that the direction of travel is set. Now all that is required is concerted action, robust decision-making, and collaboration across the market that will drive consensus.