In a world dominated by turbulence and interdependency, fragility and complexity are the new factors shaping the global economy and driving financial performance. The unprecedented challenges affecting the global economy are a source of both opportunities and threats, but traditional analytics technology cannot capture the new drivers of value creation. The ‘New Normal’ of a more uncertain world requires a different kind of approach and a new set of analytical tools. To address these challenges and to provide guidance to financial markets, the recently developed Quantitative Complexity Theory (QCT) reflects the interplay between these drivers and provides new insights and knowledge.

Systemic risk is poorly defined and generally poorly understood, and there is very little research on the subject. This leaves the door open to regulatory discretion, which can compound the risk further. The idea, therefore, is to approach the problem from a totally different angle: instead of trying to measure the ‘degree of riskiness’ or exposure, measure the resilience of a given system. Resilience is the ability of a system to absorb shocks and destabilizing events, such as financial contagion, conflicts, catastrophic events, or the sudden loss of major suppliers or customers. In a turbulent economy, a resilient business has a better chance of survival.

A simple, schematic example of such a system is illustrated below, showing the interactions between a bank and its ‘ecosystem’. The state of health of a bank – essentially its rating – is equivalent to that of its entire ecosystem. The idea, therefore, is to infer how resilient a bank is from the resilience of its entire client base, not just from its balance sheet.

The problem is that conventional risk assessment and rating technology cannot easily be applied to ensembles of hundreds of thousands or millions of corporations. Rating them one by one is not a problem – this is what the rating agencies already do. The issue is rating the entire set as a single entity whose components interact with each other. Today’s economy is highly interdependent, so the state of health of a system (a market or an industry sector) cannot be determined from that of its components: corporations may individually appear healthy while the system they form is fragile. Analyzing market statistics can therefore be very misleading, to say the least, and conventional analytics technology and ‘linear thinking’ can lead to equally misleading conclusions. Moreover, as the number of system components increases, so does the number of so-called ‘modes of behaviour’. This means that as the turbulence and uncertainty of markets grow, complex systems can develop the capacity to suddenly produce surprising behaviour, and a system of apparently healthy components may conceal far-from-obvious concentrations of fragility. The whole can be greater than the sum of the parts, but it can also be much less. This is precisely why it is so important to actually get the big picture, and it is the main motivation behind systemic analyses such as the one described here.
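To see why individually healthy components can form a fragile whole, consider a minimal common-shock simulation – a standard one-factor model, invented here purely for illustration and unrelated to QCT. Each firm fails with only about 1% probability on its own, yet a shared market shock makes joint failures vastly more likely than independence would suggest:

```python
import random

random.seed(42)

def simulate(n_firms=10, n_trials=100_000, rho=0.9):
    """Each firm fails when its latent 'stress' exceeds a threshold.
    Stress mixes a shared market shock (weight rho) with idiosyncratic
    noise, so each firm is individually healthy (~1% failure probability)
    while failures are strongly correlated across the system."""
    systemic = 0        # trials in which half or more of the firms fail together
    failures_total = 0
    for _ in range(n_trials):
        market = random.gauss(0.0, 1.0)            # shock common to all firms
        failures = 0
        for _ in range(n_firms):
            stress = rho * market + (1 - rho ** 2) ** 0.5 * random.gauss(0.0, 1.0)
            if stress > 2.33:                      # ~1% tail of a standard normal
                failures += 1
        failures_total += failures
        if failures >= n_firms // 2:
            systemic += 1
    # per-firm failure rate and probability of a joint (systemic) failure
    return failures_total / (n_trials * n_firms), systemic / n_trials

p_firm, p_systemic = simulate()
# If the firms were independent, 5+ of 10 failing at 1% each would have a
# probability on the order of 1e-8; the shared shock makes it many orders
# of magnitude larger, even though each firm's individual risk is unchanged.
```

Even with every firm’s individual risk held constant, the correlation alone produces joint-failure probabilities that no component-by-component rating would reveal.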

Below is an example of the so-called Complexity Map, which reflects the structure and interdependencies of a system of sixteen large European banks. The size of each node is proportional to that bank’s footprint on the resilience of the system as a whole.

The above system, which contains only sixteen elements, was analyzed based on the quarterly financial statements provided by each bank. Each bank, in turn, has millions of customers which compete and collaborate with each other, forming a huge dynamic network. Visualizing such a network is practically impossible.
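What identifying interdependencies from financial statements can mean in practice is easiest to see in miniature. The sketch below is a deliberately naive stand-in – a plain correlation network, not the proprietary OntoNet™ method: two variables are linked when their quarterly series move together, and the density of the resulting graph gives a crude measure of interdependency. The firms and figures are invented:

```python
from itertools import combinations

def correlation(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def interdependency_map(series, threshold=0.7):
    """Link two variables when |correlation| >= threshold and return the
    edge list plus the graph density: 0 means no coupling at all, 1 means
    every variable is tied to every other one."""
    names = list(series)
    edges = [(a, b) for a, b in combinations(names, 2)
             if abs(correlation(series[a], series[b])) >= threshold]
    possible = len(names) * (len(names) - 1) / 2
    return edges, len(edges) / possible

# Invented quarterly series (revenue-like figures for four firms)
data = {
    "bank_A": [100, 104, 109, 115, 121],
    "bank_B": [50, 53, 55, 58, 61],       # rises with bank_A
    "firm_C": [80, 76, 81, 74, 79],       # mostly noise
    "firm_D": [200, 195, 190, 184, 178],  # declining, mirrors bank_A
}
edges, density = interdependency_map(data)
# edges -> the three strongly coupled pairs; density -> 0.5
```

On this toy data the map links the three strongly coupled series and reports a density of 0.5 – already a more informative picture than any of the four series inspected in isolation.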

Massive computing power, combined with new approaches such as Quantitative Complexity Theory, allows us to analyze, for example, the quarterly financial statements of thousands of corporations, identify the interdependencies between every pair of them, and measure the resilience of the system. A systemic resilience analysis is extremely interesting to banks, regulators, governments and, of course, investors, because it allows us to answer the following questions:

  • What is the overall state of health of a given system (market or industry sector)?
  • How resilient/fragile is the system?
  • What is its degree of interdependency?
  • Which components of the system make it fragile and vulnerable?
  • Which are the dominant components of the system and what is their footprint?
  • Which are really ‘systemic’?
  • How far is the system from potentially critical states?
  • How much market uncertainty can the system absorb while remaining stable?
  • What are the most likely modes of failure and failure propagation?
Recently, Ontonix, in collaboration with the Bologna-based supercomputing centre CINECA and the Chiasso-based QBT, analyzed a system of over 3,400 corporations listed on Wall Street using its complexity and resilience management tool OntoNet™. The objective of the study was to determine the resilience of the US economy based on the analysis of a large set of representative corporations spanning all industrial sectors. The Balance Sheet, Cash Flow and Income Statement of each company were analyzed. The corresponding number of variables is over 260,000, while the number of interdependencies is well above 10 billion; the analysis runs in just under two hours. The resilience of this system of corporations, based on the available information, is 86%. This encouraging result means that the system in question is able to absorb a large increase in uncertainty while keeping its structure intact – from a practical point of view, the system is quite stable. Resilience is, of course, independent of performance: a company may be very profitable and, at the same time, quite fragile. Moreover, given the dynamic and unsteady nature of the global economy, things can change very quickly, so similar analyses should be performed at least quarterly.
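The idea of resilience as shock absorption can be made concrete with a toy percolation-style proxy – emphatically not the QCT measure computed by OntoNet™: hit a dependency network with random shocks that each knock out a fraction of its firms, and record how often the bulk of the system stays connected. The two ten-firm topologies below (a hub-dominated star and an evenly meshed network) are invented for illustration:

```python
import random

def largest_component(nodes, edges):
    """Size of the largest connected component (iterative DFS)."""
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        seen.add(start)
        stack, size = [start], 0
        while stack:
            v = stack.pop()
            size += 1
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best

def resilience_proxy(nodes, edges, n_trials=2000, shock=0.2, seed=0):
    """Fraction of random shocks - each knocking out a share `shock` of
    the firms - after which a majority of the system stays connected."""
    rng = random.Random(seed)
    k = max(1, int(len(nodes) * shock))
    survived = 0
    for _ in range(n_trials):
        dead = set(rng.sample(nodes, k))
        alive = [v for v in nodes if v not in dead]
        live_edges = [(a, b) for a, b in edges if a not in dead and b not in dead]
        if largest_component(alive, live_edges) >= len(nodes) / 2:
            survived += 1
    return survived / n_trials

n = 10
firms = list(range(n))
hub_edges = [(0, i) for i in range(1, n)]               # star: firm 0 dominates
mesh_edges = ([(i, (i + 1) % n) for i in range(n)]
              + [(i, (i + 2) % n) for i in range(n)])   # no dominant firm

r_hub = resilience_proxy(firms, hub_edges)
r_mesh = resilience_proxy(firms, mesh_edges)
# The mesh absorbs every shock (resilience 1.0); the star survives only
# when the hub itself is spared, so its resilience drops to roughly 0.8.
```

The evenly meshed system rides out every shock, while the star’s resilience collapses to the survival probability of its hub – a minimal illustration of why a single dominant component, rather than sheer size, is what makes a system ‘systemic’.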


As a next step, Ontonix, CINECA and QBT are developing a Massively Parallel Processing version of OntoNet™ which will be able to harness the full extent of the immense compute power of CINECA’s Fermi supercomputer.

This will make it possible, for example, to analyze the over 40,000 companies listed on all of the world’s markets as a single system, providing a measure of the resilience of the entire global economy and of its capacity to survive future crises. Such a problem involves over 4 million variables; on a machine such as Fermi the analysis will run in less than one day. An even more involved application is the analysis of the above-mentioned ecosystems of bank clients, which can easily involve tens of millions of variables. The bottom line, however, is that today this kind of analysis is possible and within reach of banks, corporations, analysts, regulators, governments and investors.

Systemic resilience analysis brings to light another fundamental new issue. When he served on the Banking, Finance and Urban Affairs Committee in 1984, Stewart McKinney claimed that certain super-huge companies are “Too Big To Fail”. He was wrong: super-huge companies and banks have failed, and without early warning. Size matters, but not always – it all depends on the context. Today, in a turbulent, uncertain and interconnected economy, the problem is not so much size as complexity. The new paradigm is “Too Complex To Survive”. The enemy is excessive complexity. Why?

  • Highly complex systems are intrinsically hazardous systems.
  • Highly complex systems often run in degraded mode.
  • Catastrophe is always just around the corner.
The US economy, based on the above analysis, has reached a relative complexity of just over 50%. It is therefore still quite far from a “Too Complex To Survive” situation. In today’s economic context, in order to establish the real footprint of a company or a bank – i.e. to determine whether it really is ‘systemic’ – it is necessary to quantify how much complexity it contributes to the system, not the size of its balance sheet.


ABOUT THE AUTHOR - Jacek Marczyk, author of nine books on uncertainty and complexity management, developed the Quantitative Complexity Theory (QCT) and Quantitative Complexity Management (QCM) methodologies in 2003, together with a new complexity-based theory of risk and rating. In 2005 he founded Ontonix, a company delivering complexity-based early-warning solutions with particular emphasis on systemic aspects and turbulent economic regimes. In 2009 he developed a complexity- and resilience-based rating system for businesses and financial products, and in 2013 he introduced the Global Financial Complexity and Resilience Indices. Since 2015 he has been Executive Chairman of Singapore-based Universal Ratings, which focuses on Resistance-to-Shocks ratings for financial products, corporations and countries.
