In a world dominated by turbulence and interdependency, fragility and complexity have become the main new factors shaping the global economy and driving financial performance. The unprecedented challenges affecting the global economy are a source of both opportunities and threats, yet traditional analytics technology is insufficient when it comes to capturing the new drivers of value creation. The ‘New Normal’ of a more uncertain world requires a different kind of approach and a new set of analytical tools. The recently developed Quantitative Complexity Theory (QCT) addresses these challenges by reflecting the interplay between these drivers, providing new insights and guidance to financial markets.
Systemic risk is poorly defined and generally poorly understood, and there is very little research on the subject. This leaves the door open to regulatory discretion, which can compound such risks further. The idea, therefore, is to approach the problem from a totally different angle: instead of trying to measure the ‘degree of riskiness’ or exposure of a system, measure its resilience. Resilience is the ability of a system to absorb shocks and destabilizing events, such as financial contagion, conflicts, catastrophic events, or the sudden loss of major suppliers or customers. In a turbulent economy, a resilient business has a better chance of survival.
A simple and schematic example of such a system is illustrated below, showing the interactions between a bank and its ‘ecosystem’. The state of health of a bank – essentially its rating – is equivalent to that of its entire ecosystem. The idea, therefore, is to infer how resilient a bank is from the resilience of its entire client base, not from its balance sheet alone.
The problem is that conventional risk assessment and rating technology cannot easily be applied to ensembles of hundreds of thousands or millions of corporations. Rating them one by one is not a problem – this is what the rating agencies already do – the issue is rating the entire set as a single entity whose components interact with each other. Today’s economy is highly interdependent, and because of this the state of health of a system (a market, an industry sector) cannot be determined from that of its components. Corporations may individually appear healthy and yet the system they form may still be fragile. Analyzing market statistics can be very misleading, to say the least, and conventional analytics technology and ‘linear thinking’ can lead to equally misleading conclusions. Moreover, as the number of system components increases, the number of so-called ‘modes of behaviour’ also increases. This means that as the turbulence and uncertainty of markets grow, more complex systems can develop the capacity to suddenly produce surprising behaviour. A system of apparently healthy components may conceal far-from-obvious concentrations of fragility. The whole can be greater than the sum of the parts, but it can also be much less. This is precisely why it is so important to get the big picture, and it is the main motivation behind systemic analyses such as the one described here.
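To make the ‘healthy parts, fragile whole’ point concrete, here is a minimal toy simulation (in no way the QCT method itself – the network and all figures are invented for illustration). Sixteen firms each hold a comfortable capital buffer, so every one of them looks healthy in isolation, yet a single chain of exposures lets one shock topple the entire system:

```python
import numpy as np

n = 16
capital = np.full(n, 10.0)              # every firm starts individually healthy
exposure = np.zeros((n, n))
for i in range(n):
    exposure[i, (i + 1) % n] = 12.0     # firm i is owed 12 by its neighbour

failed = np.zeros(n, dtype=bool)
capital[0] -= 11.0                      # an external shock wipes out firm 0

while True:
    newly_failed = (capital < 0) & ~failed
    if not newly_failed.any():
        break
    failed |= newly_failed
    # creditors of the newly failed firms write off what they are owed
    capital -= exposure[:, newly_failed].sum(axis=1)

print(f"{failed.sum()} of {n} individually healthy firms failed")
```

No balance sheet in this toy example reveals the fragility; it lives entirely in the structure of the interdependencies, which is exactly what a component-by-component rating misses.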
Below is an example of the so-called Complexity Map, which reflects the structure and interdependencies of a system of sixteen large European banks. The size of each node is proportional to that bank’s footprint on the resilience of the system as a whole.
The above system, which contains only sixteen elements, has been analyzed on the basis of the quarterly financial statements provided by each of the banks. Each bank, in turn, has millions of customers which compete and collaborate with each other, forming a huge dynamic network; visualizing such a network is practically impossible.
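The algorithm behind the Complexity Map is Ontonix’s proprietary OntoNet™ engine and is not reproduced here. As a hedged sketch of the general idea, one can approximate an interdependency map from quarterly statements by measuring how strongly the banks’ reported figures move together, using synthetic data in place of the real statements:

```python
import numpy as np

rng = np.random.default_rng(7)
banks = [f"Bank_{i:02d}" for i in range(16)]
quarters = 20
market = rng.normal(size=quarters)                # a shared market factor
# synthetic quarterly metric (e.g. net income), partly driven by the factor
data = 0.6 * market[None, :] + 0.4 * rng.normal(size=(16, quarters))

corr = np.corrcoef(data)                          # 16 x 16 dependency proxy
np.fill_diagonal(corr, 0.0)
links = np.abs(corr) > 0.5                        # keep only strong couplings

# 'footprint': how strongly each node is coupled to the rest of the system
footprint = np.abs(corr * links).sum(axis=1)
for name, f in sorted(zip(banks, footprint), key=lambda x: -x[1]):
    print(f"{name}: footprint {f:.2f}")
```

The surviving links would be drawn as the edges of the map and the footprint would scale the nodes; the actual OntoNet™ measures of interdependency and footprint are, again, proprietary.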
The availability of massive computing power, together with new approaches such as Quantitative Complexity Theory, allows us to analyze, for example, the quarterly financial statements of thousands of corporations, identifying the interdependencies between each and every one of them and measuring the resilience of the system as a whole. A systemic resilience analysis is extremely interesting to banks, regulators, governments and, of course, investors, because it answers questions that conventional, component-by-component analysis cannot.
As a next step, Ontonix, CINECA and QBT are working on a Massively Parallel Processing version of OntoNet™, which will be able to harness the full extent of the immense compute power of CINECA’s Fermi supercomputer.
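While OntoNet™’s internals are not public, it is easy to see why such a workload parallelizes naturally: with N companies there are N·(N−1)/2 pairwise interdependencies to estimate, and each estimate is independent of the others. The hedged sketch below distributes a toy version of that computation across all available cores; on a machine such as Fermi the same pattern scales across thousands of nodes:

```python
from itertools import combinations
from multiprocessing import Pool

import numpy as np

N, T = 200, 40                                    # companies x quarters (toy size)
rng = np.random.default_rng(0)
series = rng.normal(size=(N, T))                  # synthetic quarterly metrics

def dependency(pair):
    """One independent unit of work: the coupling between two companies."""
    i, j = pair
    return i, j, abs(np.corrcoef(series[i], series[j])[0, 1])

if __name__ == "__main__":
    with Pool() as pool:
        results = pool.map(dependency, combinations(range(N), 2))
    print(f"computed {len(results)} pairwise dependencies in parallel")
```

With N in the thousands or millions, as in the analyses described above, the quadratic growth of the pair count is precisely what makes supercomputer-class hardware relevant.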
Systemic resilience analysis brings to light another fundamental and new issue. When he served on the Banking, Finance and Urban Affairs Committee in 1984, Stewart McKinney claimed that certain super-huge companies are “Too Big To Fail”. He was wrong: super-huge companies and banks have failed, and without early warning. Size matters, but not always; it all depends on the context. Today, in a turbulent, uncertain and inter-connected economy, the problem is not so much size as complexity. The new paradigm is “Too Complex To Survive”, and the enemy is excessive complexity. Why? Because, as argued above, highly complex systems can suddenly produce surprising behaviour and conceal hidden concentrations of fragility.
ABOUT THE AUTHOR – Jacek Marczyk, author of nine books on uncertainty and complexity management, developed the Quantitative Complexity Theory (QCT) and Quantitative Complexity Management (QCM) methodologies in 2003, together with a new complexity-based theory of risk and rating. In 2005 he founded Ontonix, a company delivering complexity-based early-warning solutions with particular emphasis on systemic aspects and turbulent economic regimes. In 2009 he developed a complexity- and resilience-based rating system for businesses and financial products, and in 2013 he introduced the Global Financial Complexity and Resilience Indices. Since 2015 he has been Executive Chairman of Singapore-based Universal Ratings, which focuses on Resistance-to-Shocks ratings for financial products, corporations and countries.
Welcome to The Resilience Post, Jacek!
A pleasure!