Having recently made the move from ‘Fintech’ to ‘Regtech’, a great deal feels familiar. I cannot help but draw parallels between the two fields and wonder whether any lessons can be taken from the past.

Most would tell you that Fintech emerged from the financial crisis of 2008, driven by consumer demand for lending and a rise in disruptive, innovative technology. In capital markets, though, the beginnings of Fintech can be traced back to the large-scale electronification of the exchanges in the late 1990s. This created the opportunity for technologists and Independent Software Vendors (ISVs) to satisfy the needs of a then relatively analogue and resistant trading community.

When the London International Financial Futures Exchange (LIFFE) closed its open-outcry floor and opened up its Application Programming Interface (API) to the vendor community, there was, implausible as it seems now, an initial hesitancy to accept that this new world was here to stay and represented an advance. The change provided a window of opportunity for firms that could both provide innovative software services and apply the market understanding to build relevant tools to steer their clients through the new infrastructure landscape. With banks and brokers largely preoccupied with the critical task of transforming back- and middle-office systems, it was a golden period for tech start-ups that could offer solutions in this space.

Those days resonate strongly with me right now, as there is clearly another window of opportunity at play. I am seeing the same energy, momentum and creativity applied to solutions for post-trade risk and compliance that once went into tools for trading and execution. As before, many banks, brokers, hedge funds and asset managers are not fully prepared, partly through a lack of in-house knowledge of new and more complex regulations. Furthermore, many market participants face the challenge of managing multiple legacy systems and internal teams that look after fragmented pieces of the regulatory puzzle. They are therefore looking to the vendor community’s expertise and specialist focus to ensure they can meet ever-changing obligations easily and efficiently. They are right to do so: the cost of building their own solutions can spiral, and the implications of getting it wrong can be severe.

Fines under MiFID have been steadily increasing, and there is little to suggest the trend will reverse once the EMIR rewrite and MiFID II come into force.

[Figure: fines graph]

So, are there any other past lessons we can take to guide us through the coming years in Regtech? I have observed three obvious commonalities so far.


Data quality


In the front office, maintaining high-quality static and reference data has been an underlying theme over the years, essential for trading continuity and accurate portfolio analysis. The size of the task grew exponentially in line with client demand for additional markets and asset classes. Getting this right allowed us to detect valuable outliers and serve our clients with predictive analysis and risk insights. Over time the same tools will have relevance and value in the back office, particularly around transaction reporting, but it is apparent there is a lot of work to do before informed insight from these tools can be realised. I have been surprised by the general acceptance of low reconciliation rates, which underlines the importance of getting your data right at source.

Latency


In the exchange-traded space, the one constant over the years has been the need to reduce execution and market-data latency, whether for competitive edge or for price formation. The conversation has moved from seconds to milliseconds, microseconds and now nanoseconds as the technology arms race has evolved, and companies that have not adapted have fallen by the wayside. I can see a similar pattern emerging in the back office. In the transaction-reporting space, history tells us that the current T+1 reporting obligation will quickly move to ‘as near to real-time as possible’, and reporting firms should ensure they have the architecture ready early to accommodate the change. Requirements for increased granularity in timestamping and the inevitable adoption of distributed ledger technologies only add long-term weight to this argument.

Normalisation


Supplying in-depth native connectivity to futures exchanges such as LIFFE, Eurex, CME and ICE was a differentiator at first, but globalisation, exchange consolidation and client demand quickly expanded the scope of expected connectivity. You had to think carefully about how best to use in-house engineering teams in order to remain competitive. Vendors and brokers began to engage specialist vendors who could normalise global markets to fill these connectivity gaps, and further Fintech firms were born from this requirement. In the post-trade world, a similar theme is again emerging. As regulation captures more market participants and volumes grow, the ability to accept data in a unified, ingestible format is paramount; without it you risk a potentially crippling operational and development overhead. Regtech firms need to think carefully about where their true value lies and whether strategic partnerships are the path to it.

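To make the normalisation point concrete, here is a minimal sketch in Python of mapping trade records from two sources onto one common schema. The venue names and field names are entirely hypothetical, chosen only for illustration; real venue formats and regulatory field sets are far richer.

```python
# Hypothetical illustration: normalising venue-specific trade records
# into one common schema so downstream systems handle a single format.

def normalise(record: dict) -> dict:
    """Map a venue-specific trade record onto a common schema."""
    if record["source"] == "venue_a":
        return {
            "trade_id": record["TradeID"],
            "instrument": record["Sym"],
            "quantity": int(record["Qty"]),
            "price": float(record["Px"]),
        }
    if record["source"] == "venue_b":
        return {
            "trade_id": record["id"],
            "instrument": record["instrument_code"],
            "quantity": int(record["size"]),
            "price": float(record["price"]),
        }
    raise ValueError(f"Unknown source: {record['source']}")

# Two differently shaped records (hypothetical formats)
venue_a = {"source": "venue_a", "TradeID": "T1", "Sym": "FGBL",
           "Qty": "5", "Px": "131.25"}
venue_b = {"source": "venue_b", "id": "T2", "instrument_code": "FGBL",
           "size": "3", "price": "131.24"}

trades = [normalise(r) for r in (venue_a, venue_b)]
```

Once everything arrives in one shape, reporting, reconciliation and enrichment logic need only be written once, regardless of origin.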

In conclusion, far from regulation stifling innovation, this seems to me a golden period for technology in the back office, and it is only the beginning. Data quality will improve, latency will fall, and the scope and expectations placed on vendors will grow. We should leverage the lessons from the front office to ensure we gain meaningful insights and fulfil our ultimate objective of contributing to a safer marketplace.