In the era of a globalized and interconnected economy, it is becoming clear that all corporations, banks and countries form part of a huge system of systems. As turbulence increases and the possibility of shocks and extreme events rises, the importance of a systems perspective of the economy becomes evident. Because the global economy is increasingly fragile and highly interconnected, stresses and traumas propagate very quickly and can lead to a huge number of possible, often surprising, outcomes. This number increases with the complexity of the system. However, the concept of ‘systemic risk’, even though it has become popular during the current economic meltdown, is very difficult to define. By ‘systemic’ we refer, of course, to anything that can have repercussions (damage, consequences) at system level. For example, the so-called Too-Big-To-Fail (TBTF) companies are thought to be of systemic importance, as their collapse could affect the economy severely. The TBTF concept, however, is becoming less significant, and a new idea – Too-Complex-To-Survive – is gaining popularity precisely because of what a systems approach can teach us. Since excessive complexity is a formidable source of vulnerability (exposure), and because the global economy is increasingly complex, a systems approach is mandatory. Conventional pre-crisis techniques of risk assessment, management and rating were conceived in an almost turbulence-free world and are not applicable to the new situation, as the current crisis so eloquently demonstrates. This article illustrates a new approach, based on the quantification of systemic complexity, which allows one to better understand the dynamics and functioning of systems.
Systemic risks are not well defined and are a generally poorly understood concept. This leaves the door open to regulatory discretion, which can compound these risks further. The idea, therefore, is to approach the problem from a totally different angle. Instead of trying to measure risk – a non-physical quantity whose estimation mandates the construction of a model – the idea is to measure the resilience of the system directly. Clearly, model uncertainty has a dramatic impact on the results of a model and on the consequences of its usage. The advantage of measuring system resilience stems from two key points. First of all, resilience is a physical quantity which measures a system’s resistance to shocks: it can be measured. Second, the computation may be performed without actually building a model, via a model-free technique. In practice, all that is needed is periodically sampled data which reflects the functioning of the system in question. Good examples are cash flow, ratios or income statement-type data, which all corporations possess. The most important things in a model are those it doesn’t contain, and when it comes to building models of complex systems the danger of leaving out important items increases rapidly with complexity. This is the main argument behind using model-free techniques.
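To make the model-free idea concrete, here is a minimal sketch of how a resilience-style score could be computed directly from periodically sampled data, with no model of the business in between. This is an illustrative toy proxy based on Shannon entropy of the samples (orderly data scores high, disordered data scores low); it is not the proprietary OntoNet computation, and the cash-flow series below are invented for the example.

```python
import math
import random

def shannon_entropy(samples, lo=0.0, hi=200.0, bins=20):
    """Shannon entropy (bits) of a sample, binned on a fixed range."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        idx = min(max(int((x - lo) / width), 0), bins - 1)  # clamp outliers
        counts[idx] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def resilience_proxy(samples, bins=20):
    """Toy score in [0, 1]: high = orderly behaviour, low = disorder."""
    return 1.0 - shannon_entropy(samples, bins=bins) / math.log2(bins)

# Hypothetical sampled cash-flow data for two businesses.
random.seed(0)
steady  = [100 + random.gauss(0, 2)  for _ in range(400)]   # calm operations
erratic = [100 + random.gauss(0, 40) for _ in range(400)]   # turbulent operations

print(round(resilience_proxy(steady), 2), round(resilience_proxy(erratic), 2))
```

The point of the sketch is the workflow, not the particular metric: sampled data goes in, a measured (not modelled) score comes out, and the turbulent business scores visibly lower than the calm one.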
A fundamental source of system fragility is excessive complexity. Corporations with excessively complex business models can be shown to be intrinsically fragile. It is for this reason that resilient businesses are generally simpler and leaner. It is intuitive that, all things being equal, a simpler solution is preferred if it does the job. Clearly, complexity and resilience are intimately related, which is why complexity reduction is a means of increasing resilience. The Quantitative Complexity Management (QCM) technology has shown how the complexity of a business is also a new and holistic Key Performance Indicator, and how it can be used not only to measure its resilience but also to establish metrics of its governability (controllability) and stability.
A bank is as healthy as the ecosystem of its clients, which can number from hundreds of thousands to millions. These clients themselves constitute a highly interconnected system, and they provide the bank with two types of information:
- Balance Sheet data (corporate clients)
- Transaction-based data (corporate and retail clients)
Data in Balance Sheets is, generally speaking, subjective. In fact, the same business may be reflected in a multitude of Balance Sheets compatible with the accepted accounting standards. Transactional data, on the other hand, is objective, as it reflects real operations (deposits, withdrawals, purchases of stocks, loans, transfers of salaries, etc.). In any event, both types of information may help a bank infer the state of health of its client base and, therefore, its own situation from a resilience, stability and sustainability standpoint. At a macro level, a complexity map of an ecosystem of corporate clients of a bank is shown below.

Complexity Map of ecosystem of 1000+ corporate clients of a retail bank
In an interconnected and complex economy corporations form networks, or ecosystems. In the case of banks, telecoms or insurance companies these ecosystems can be huge, as they may contain millions of clients. Moreover, such ecosystems change constantly as companies compete, cooperate and default while new companies are being formed. Rapid change and complex dynamics are their main characteristics. Like all companies, banks, telecoms and insurance companies exist thanks to the ecosystems of their respective clients and, to a large degree, it may be said that the ‘state of health’ of their respective businesses depends on that of their client ecosystems. Large corporate clients of, say, a bank depend in turn on their own ecosystems of clients. In practice, we’re talking of a system of systems of immense proportions. The complexity and degree of interdependency between its components determine many interesting characteristics:
- resilience – capacity to absorb destabilizing events and survive turbulence
- speed of propagation of contagion, shocks
- weak points, hubs
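The last point – weak points and hubs – is the easiest to illustrate. In a dependency network, hubs are the most connected nodes and therefore the likeliest channels of rapid contagion. A minimal stdlib sketch, using a purely hypothetical set of sector-to-sector links (the links below are invented for illustration, not taken from the study described in this article):

```python
from collections import defaultdict

# Hypothetical dependency links between market sectors (illustrative only).
links = [
    ("Banks", "Real Estate"), ("Banks", "Insurance"),
    ("Banks", "Capital Goods"), ("Banks", "Energy"),
    ("Energy", "Utilities"), ("Energy", "Transportation"),
    ("Capital Goods", "Materials"), ("Capital Goods", "Transportation"),
]

# Count how many links touch each node (degree).
degree = defaultdict(int)
for a, b in links:
    degree[a] += 1
    degree[b] += 1

# Hubs: the most connected nodes, i.e. the likely fast channels of contagion.
hubs = sorted(degree, key=degree.get, reverse=True)[:3]
print(hubs)
```

Real analyses would weight the links and use more refined centrality measures, but even this crude degree count already points at where a shock would spread fastest.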
The clients of banks, insurance companies or telecoms fall into two main categories: retail and corporate. An experiment has been performed in which an ecosystem of over 1000 corporate clients of a retail bank was analysed. The goal was to establish its stability and resilience.

Complexity Map of ecosystem of 1000+ corporate clients of a retail bank. Each node contains tens or hundreds of corporations. The system has approximately 100,000 variables.
The 1000+ clients span 24 market sectors:
- Automobiles & Components
- Banks
- Capital Goods
- Commercial & Professional Services
- Consumer Durables & Apparel
- Diversified Financials
- Energy
- Food & Staples Retailing
- Food, Beverage & Tobacco
- Health Care Equipment & Services
- Hotels Restaurants & Leisure
- Household & Personal Products
- Insurance
- Materials
- Media
- Pharmaceuticals & Biotechnology
- Real Estate
- Retailing
- Semiconductors & Semiconductor Equipment
- Software & Services
- Technology Hardware & Equipment
- Telecommunication Services
- Transportation
- Utilities
Analysis has been performed using quarterly balance sheet data but it may also be performed using, for example, monthly transactional data.
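How an inter-dependency map can emerge from such periodic data may be sketched as follows: compute pairwise correlations between sector-level series and keep only the strong couplings as links. This is a deliberately simplified illustration (the actual QCM/OntoNet processing is different and far more sophisticated), and every series below is hypothetical.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical quarterly revenue indices for three sectors (40 quarters).
random.seed(1)
quarters = 40
capital_goods = [100 + 2 * t + random.gauss(0, 1) for t in range(quarters)]
transport     = [50 + t + random.gauss(0, 1) for t in range(quarters)]   # tracks Capital Goods
media         = [random.gauss(0, 5) for _ in range(quarters)]            # unrelated

series = {"Capital Goods": capital_goods, "Transportation": transport, "Media": media}
names = list(series)

# Keep only strong inter-dependencies as edges of the map.
edges = []
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = pearson(series[a], series[b])
        if abs(r) > 0.7:
            edges.append((a, b, round(r, 2)))
print(edges)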
The Complexity Map of the ecosystem illustrates the structure of inter-dependencies between the various industry sectors. Large nodes correspond to sectors with a larger footprint on the ecosystem. In the case in question, the structure of said ecosystem is ‘dominated’ by corporations belonging to the following sectors:
- Capital goods
- Commercial & professional services
- Transportation
- Healthcare equipment & services
- Real estate
- Semiconductors and semiconductor equipment
Clients from these sectors form the structural backbone of the ecosystem in terms of its stability (robustness, or resilience), not of its performance. Evidently, the bank in question is quite aware of the performance of its clients. The goal of the study was mainly to identify the overall stability, or resilience, of the system of 1000+ clients and, specifically, to measure the resilience of each sector.
The results are reported in the bar chart illustrated below. It may be noted that the most resilient sectors in the case in question are:
- Capital Goods
- Commercial & Professional Services
- Health Care Equipment & Services
- Technology Hardware & Equipment
- Transportation
- Real Estate
The resilience of the above sectors is above 70% (see top part of the bar chart). The following sectors, with resilience below 50%, are the most fragile:
- Utilities
- Semiconductors & Semiconductor Equipment
- Hotels Restaurants & Leisure
- Consumer Durables & Apparel
The overall resilience of the entire ecosystem is just over 72% while average resilience is approximately 64%.

Resilience by sector of ecosystem of 1000+ corporate clients of a retail bank
Optimum Complexity’s objective, in the near term, is essentially to remove these limits of scale and to take the analysis to a global level. With world-class computational resources – our parallel version of OntoNet runs on supercomputers – and in collaboration with data providers, we are planning to map the entire global economy on a daily basis, so as to offer systemic analyses from a multitude of perspectives, such as:
- Market segment
- Geography
- Size/revenue
- Stock markets
- Rating
Our analysis shall become increasingly comprehensive and broad in scope, not just broad in scale. Currently there are approximately 45,000 listed companies across 59 stock exchanges – we’re speaking of over 5 million variables. As an example, in 2014 an analysis of a system of 3,400 companies listed on Wall Street was performed. The analysis was based on quarterly balance sheet information, involved more than 260,000 variables and took 4 hours to run on a supercomputer. Analysing 45,000 companies and the corresponding systemic risks is a totally different animal.
As an example of how intricate a system of companies can be, consider the map of interactions illustrated below. It contains 1,300 companies, each represented by a square node on the diagonal of the map, with more than 500,000 interactions between these nodes.

The larger nodes represent companies that have a dominant footprint in terms of resilience (risk) of the whole system. Their number is quite small compared to the size of the ecosystem but they indicate where potential criticalities affecting the entire system can be found. This is what large supply chains look like (by the way, they should be called “supply networks”).
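A quick back-of-the-envelope calculation shows just how dense this relatively small map already is, and why the 45,000-company case is a different animal. The figures below are the ones quoted in the text; the comparison at the end simply scales the pair count up.

```python
# Density of the interaction map described above: 1300 nodes, 500,000 links.
nodes = 1300
interactions = 500_000
possible = nodes * (nodes - 1)        # ordered node pairs, excluding self-links
density = interactions / possible
print(f"{density:.1%} of all possible links are active")

# Scaling up to ~45,000 listed companies multiplies the possible pairs
# by roughly (45_000 / 1_300) ** 2, i.e. about 1200x.
growth = (45_000 * 44_999) / possible
print(f"~{growth:.0f}x more potential interdependencies")
```

Almost a third of all possible pairwise links are active in the 1,300-node case; at global scale the space of potential interdependencies grows by three orders of magnitude, which is why supercomputing resources become necessary.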
The above map of this relatively small system – 1,300 nodes, 500,000 interdependencies – illustrates why it is so difficult to understand the dynamics of the global economy and of the global financial system. Unless specific tools are used, it will not be possible to make credible and realistic (realistic, not precise!) forecasts, performance objectives or risk assessments on a systemic scale. It so happens that today the most insidious form of risk is systemic risk – a complexity-induced risk.
Given that the economy is turbulent and punctuated by destabilizing events, which will grow in intensity and frequency, similar analyses will have to be performed on at least a quarterly basis in order to be useful. If an investor invests in, say, the telecom sector, they may want to know how healthy that particular sector is. The old approach was to take the top companies, check their ratings, look at market statistics and forecasts, and ask a few experts for their (subjective) opinions. Today we can analyse the entire sector, taking into account all company-to-company interactions, and offer a complete, systemic picture in a matter of hours. Most importantly, this can be done based on objective data, not on the subjective opinions of experts.
See more posts from Jacek Marczyk (Ontonixqcm) at ontonixqcm.blog