Trading automation, regulations, and systemic risk
The financial services industry has always pursued technical supremacy. After all, a faster trading platform, lower-latency transactions or better financial modelling means competitive edge. But after years of financial crisis and attempted reforms to improve the transparency and understanding of risk exposure in financial services, we seem as much in the dark as ever about the risks the financial system is exposed to, writes Said Tabet.
I think this comes down to a few key things, which I’ll explore in this article. First, the nature of automation in financial services itself generates systemic risk, through repetition, perpetuation and interconnectedness. Second, slowly evolving ‘good enough for compliance’ approaches to governance and data management don’t go far enough in delivering real insight into where any given financial institution – never mind the market – stands from a risk perspective today.
Our aspiration has to be that better data management and governance, and greater transparency and risk assessment in financial services, could deliver an overall barometer for the stability of the market – even as algorithms promise further waves of success for high-frequency trading firms.
The finance industry is going through significant change even as the pressure and scrutiny on these institutions remain at an all-time high. New regulations and changing processes dictated by the Bank of England and the Government are rippling through the industry, including Basel III, Living Wills and Recovery and Resolution Plans. When you look at how we are complying with these regulations, you see that the innovators behind the technology that underpins our financial institutions are working hard to keep pace. Increasingly automated, intelligent and data-driven, financial systems are doing more, faster, and often with very little overhaul of banks’ underlying infrastructures.
Alongside this, there is some anxiety about the risk that comes with a technology-driven approach. The European Systemic Risk Board, much like the US and other national regulators, has already made its concerns about fast-developing new technologies very clear.
While technology innovation is a good thing and should be encouraged, its implementation and application in financial services require supervision and regulation. With new developments enabling virtually zero-latency transactions, we face brand new challenges, with the potential for future risks of a completely different nature, complexity and time horizon than those we have seen to date. For example, with more complex configurations and new technologies, failure in mission-critical systems can affect the market within seconds. The Knight Capital crash is a warning: in less than an hour, the firm lost $440 million after an erroneous computer algorithm sent errant orders that exchanges declined to cancel. Technology has the potential to drive our financial institutions forward, but also to cause significant damage when not managed well.
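One lesson widely drawn from incidents like Knight Capital is the value of automated pre-trade risk controls that latch into a halted state faster than a human could react. The sketch below is purely illustrative – the class name, thresholds and latching behaviour are my assumptions, not a description of any firm’s actual controls – showing a minimal kill switch that blocks order flow once the order rate over a sliding window exceeds a hard limit.

```python
from collections import deque
import time


class KillSwitch:
    """Illustrative pre-trade risk control (not a production design):
    blocks order flow when the order rate over a sliding time window
    exceeds a hard limit, then latches until a human intervenes."""

    def __init__(self, max_orders_per_sec=100, window_sec=1.0):
        self.max_orders = max_orders_per_sec
        self.window = window_sec
        self.timestamps = deque()
        self.halted = False

    def allow(self, now=None):
        """Return True if the next order may be sent."""
        if self.halted:
            return False
        now = time.monotonic() if now is None else now
        self.timestamps.append(now)
        # Drop timestamps that have fallen outside the sliding window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        if len(self.timestamps) > self.max_orders:
            # Latch: an erroneous algorithm should not be able to resume
            # sending orders on its own.
            self.halted = True
            return False
        return True
```

The key design choice is that the switch latches: once tripped, every subsequent order is rejected until an operator explicitly resets it, which is what prevents a runaway algorithm from doing an hour of damage in seconds.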
The global interconnectedness of financial systems and the increase in automated trading, coupled with complex financial products and high-performance, low-cost computing, create the potential for risk that is challenging to mitigate using traditional industry practices: risks that we can’t measure, in an environment and threat matrix likely to become even more complex over time.
Data is at the centre of risk assessment in financial services, but it requires quality, integrity, security, semantics, context, traceability and transparency to become useful insight. To deliver all of these requirements in a disciplined and consistent fashion, we need standards, industry best practice, and effective and efficient regulation and supervision. For example, knowing credit default swap spreads, prices and other data used primarily to infer correlations is not enough on its own to measure risk. As the 2008 crash showed, too little insight was available to make informed decisions, and assumptions made on the basis of limited data helped cause a huge crisis in the market.
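To make the correlation point concrete, here is a minimal sketch (all function names and the sample figures are illustrative) of the kind of statistic the paragraph refers to: the Pearson correlation of day-over-day CDS spread changes between two reference entities. A high value computed in calm markets says nothing about how the two names will behave jointly in a crisis – which is precisely why this figure alone is not a risk measure.

```python
import math


def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


def spread_change_correlation(spreads_a, spreads_b):
    """Correlation of day-over-day CDS spread changes for two names.

    This is the sort of number routinely inferred from market data;
    it captures co-movement under normal conditions but not tail
    dependence, so it cannot stand in for a full risk assessment."""
    da = [b - a for a, b in zip(spreads_a, spreads_a[1:])]
    db = [b - a for a, b in zip(spreads_b, spreads_b[1:])]
    return pearson(da, db)
```

For instance, two spread histories whose daily moves are proportional yield a correlation of exactly 1.0, regardless of how differently the two names might behave under stress.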
Using data in all its forms to measure risk – and perhaps even to drive a system that accurately measures the vulnerability of the global financial system – is the key to protecting financial institutions from any number of threats, from cyber risks to more traditional fraud, by spotting patterns of behaviour and suspicious activity in real time. When we consider potential breaches of high-frequency trading platforms and the hijacking of sophisticated algorithms, the risks are extremely high, and operational cyber security must be addressed effectively. Strong controls and continuous monitoring environments based on smart data analytics are needed to keep track of threats in real time.
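As a toy illustration of real-time monitoring of the kind described above – the class, window size and threshold are all hypothetical choices of mine, not a reference to any particular product – a simple streaming check can flag a transaction amount that deviates sharply from its recent rolling baseline:

```python
from collections import deque
import math


class StreamMonitor:
    """Illustrative real-time monitor: flags a transaction amount as
    suspicious when it deviates from the rolling mean of recent
    observations by more than `threshold` standard deviations."""

    def __init__(self, window=100, threshold=4.0):
        self.values = deque(maxlen=window)  # rolling history of amounts
        self.threshold = threshold

    def observe(self, amount):
        """Record one observation; return True if it looks anomalous."""
        flagged = False
        if len(self.values) >= 10:  # need a minimal baseline first
            n = len(self.values)
            mean = sum(self.values) / n
            var = sum((v - mean) ** 2 for v in self.values) / n
            std = math.sqrt(var)
            if std > 0 and abs(amount - mean) > self.threshold * std:
                flagged = True
        self.values.append(amount)
        return flagged
```

Production fraud and intrusion detection is of course far richer than a rolling z-score, but the structure – maintain a baseline, score each event against it as it arrives, alert within the same tick – is the essence of the continuous monitoring the article calls for.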
Advanced frameworks are being developed to apply new machine learning technologies to this volume, variety and velocity of data, using today’s high-performance computing resources. But before these insights can be gleaned, an understanding of the network and the data on it – particularly where the data resides and how it is secured – is crucial.
This is particularly key in times of crisis, when financial institutions need to react in real-time. They need to know where critical data is, and who owns it. They need to know who the key stakeholders are, how their data relates to external data within and across counterparties, and jurisdictions. They need to have actionable context-aware information. For example, can you recognise when an attack is taking place and, if so, can you protect your most sensitive data from it and ensure that customer facing services remain unaffected, in real time?
Today’s systems still lag behind technological advances, industry standards and best practices. Recent stress tests in Europe and the USA – and news that they are to become tougher in the coming months – demonstrate that many of the most important firms are still failing to track their data, often for elementary reasons such as poor governance, weak risk management and ineffective controls.
Financial institutions must understand the data they hold, the applications they run, where data is stored and how it is secured, before we can gain more solid insight into a business’s – and indeed the sector’s – risk exposure. Management of the IT infrastructure will be vital in driving innovation, as it is a key enabler of future change.
In broader terms, regulatory and global supervision is needed to support transparency and build trust. Industry standards and best practice need to catch up with the pace of technological developments.
A more holistic view of the available data may help us take a view of the longer-term risks to the industry, perhaps even succeeding where most economists have failed: predicting, or at least giving early warning of, the next big financial crisis.