Sharing economy gets boost with new ISO international committee

ISO has recently established a technical committee to support this new business model in reaching its full potential.

Standardization can reduce these woes and harness the benefits that such a business model can bring, by providing internationally agreed ways of working that take everyone's needs into account.
In 2017 ISO stepped in, bringing together some of the world’s leading experts on the subject to develop high-level international guidance and a foundation for future standards in the form of IWA 27, Guiding principles and framework for the sharing economy.

A lot has changed in the sharing economy in the ten or so years since the likes of Airbnb and Uber were launched. Then, there were just a handful of platforms; now there are literally thousands, some doing better than others. A few are going bankrupt, while others are worth a fortune, such as Uber, which was recently valued at USD 120 billion.

The sharing economy was born, at least in part, with the spirit of creating communities and reducing over-consumption. While some of that remains, there has also been a sharp shift of focus towards price and convenience, bringing with it as many opportunities as challenges. Consumers may pay less and get new forms of goods, services or experiences, but questions are sometimes raised over privacy, reliability or trustworthiness. There are also issues related to working conditions, which provide convenience for some and precarity for others. Some believe that issues such as these are preventing the sharing economy from reaching its full potential.

Read entire post Sharing economy gets boost with new ISO international committee | Clare Naden | ISO.org

Can we reinvent democracy for the long term?

“The origin of civil government,” wrote David Hume in 1739, is that “men are not able radically to cure, either in themselves or others, that narrowness of soul, which makes them prefer the present to the remote.” The Scottish philosopher was convinced that the institutions of government – such as political representatives and parliamentary debates – would serve to temper our impulsive and selfish desires, and foster society’s long-term interests and welfare.

Today Hume’s view appears little more than wishful thinking, since it is so startlingly clear that our political systems have become a cause of rampant short-termism rather than a cure for it. Many politicians can barely see beyond the next election, and dance to the tune of the latest opinion poll or tweet. Governments typically prefer quick fixes, such as putting more criminals behind bars rather than dealing with the deeper social and economic causes of crime. Nations bicker around international conference tables, focused on their near-term interests, while the planet burns and species disappear.

As the 24/7 news media pumps out the latest twist in the Brexit negotiations or obsesses over a throwaway comment from the US president, the myopia of modern democratic politics is all too obvious. So is there an antidote to this political presentism that pushes the interests of future generations permanently beyond the horizon?

Read entire post Can we reinvent democracy for the long term? | Roman Krznaric | BBC

 

Urban resilience: Why should we pay more attention?

Think cities — how they form, prosper, interconnect, and yield exponential gains on all fronts. There are numerous reasons why cities are created: colonial ambitions; sea connectivity; position on ancient trade routes, including the slave trade; centres of learning; economic growth; sites of administrative and cultural centres; and religious importance. Thus, there are reasons galore why cities are formed, but very few for why they disappear at the drop of a hat.

However, climatic events can cause catastrophe to cities that can render them grounded in minutes

Changes in the structure of national and local economies, poor infrastructure, rising pollution levels and a lack of physical safety lead to the decline of cities at a glacial pace. However, climatic events can cause catastrophes that can render cities grounded in minutes. The floods of Mumbai and Chennai, the Nepal earthquake and the Uttarakhand floods are a few such instances where our cities, many hundreds of years old, became paralysed and inhospitable. Cities are at real risk.

By one estimate, every year around 46 million people in cities are at risk from flooding from storm surges in the East Asia region alone. Many coastal cities, particularly in Asia, are staring at the risk of submersion due to rising sea levels. More than 1,000 people died and 45 million people suffered losses in terms of livelihood, homes and services in 2017, when severe floods hit south and south-east Asian cities, including Dhaka, Mumbai and Chennai.

Read entire post Urban resilience: Why should we pay more attention? | DEVASHISH DHAR | OrfOnline

Seven reasons why the world is improving

The late Swedish academic Hans Rosling identified a worrying trend: not only do many people across advanced economies have no idea that the world is becoming a much better place, but many actually think the opposite.

This is no wonder, when the news focuses on reporting catastrophes, terrorist attacks, wars and famines.

Who wants to hear about the fact that every day some 200,000 people around the world are lifted above the $2-a-day poverty line? Or that every day more than 300,000 people get access to electricity and clean water for the first time? These stories of people in low-income countries simply don't make for exciting news coverage. But, as Rosling pointed out in his book Factfulness, it's important to put all the bad news in perspective.

While it is true that globalisation has put some downward pressure on middle-class wages in advanced economies in recent decades, it has also helped lift hundreds of millions of people above the global poverty line – a development that has mostly occurred in South-East Asia.

Read entire post Seven reasons why the world is improving | Julius Probst | BBC

In the face of extreme weather events, Canada must plan for resiliency

As we consider the federal government’s latest budget, we need to pay attention to the investments we are making to build a strong and resilient economic foundation; they are more important than ever.

Posted on The Globe and Mail  |  By Charles Brindamour

Finance Minister Bill Morneau delivering the federal budget in the House of Commons in Ottawa on Feb.27, 2018
Every goal we have as Canadians depends on the careful stewardship of the land we share. Climate change is dramatically changing the equation. To protect our shared goals and dreams, we need to plan for resiliency now in the face of an increase in extreme weather events.

Canadians have first-hand experience with the ravages of a changing environment. Floods, fires and other climate-related disasters disrupt families and lives. They also endanger the growth and vitality of local economies. As Canada's largest property and casualty insurer, we have dealt with the challenge of climate change for some time. In recent years, the majority of home insurance claims we have resolved with Canadians have been related to severe or extreme weather.

This year’s federal budget is another step forward in ensuring we are prepared to manage the risks that lie ahead.

Read entire article In the face of extreme weather events, Canada must plan for resiliency | The Globe and Mail

Payment card industry in Vietnam – A systemic risks analysis

From a recent article by the Vietnam Chamber of Commerce and Industry:

Cash remains king in Vietnam but credit card issuers are predicting an imminent boom in the card market with the number of Vietnamese cardholders potentially growing by 10 times the current number of nearly one million.

Nguyen Thu Ha, chairperson of the Vietnam Card Association under the Vietnam Banking Association, said that from both the macro-economic and banking perspectives, the domestic card market is considered very strong given rising incomes among the country's 82 million people, rapid economic growth and an improving legal system.

Increasing tourist arrivals and the influx of money remitted home by overseas Vietnamese would also facilitate credit card growth, Ha said at a conference touting the potential for electronic payments in Hanoi last week.

So, in theory, things look great for the Vietnamese payment card industry. Let's see if this is confirmed by a systemic Resistance to Shocks analysis. In other words, instead of analyzing, for example, a single bank issuing credit cards, we will analyze a total of forty banks as a system. The analysis has been performed using publicly available data. The data in question is the following (the number of parameters is 35; all data refers to 2016):
Total number of cards
Number of Domestic Debit cards
Number of International Debit cards
Number of Domestic Credit cards
Number of International Credit cards
Number of Domestic Prepaid cards
Number of International Prepaid cards
Number of Other Cards
Total Cards Revenue
Domestic Debit Card Revenue
International Debit Card Revenue
Domestic Credit Card Revenue
International Credit Card Revenue
Domestic Prepaid cards Revenue
International Prepaid cards Revenue
Other Cards Revenue
Total Card Payment Revenue
International Card payment revenue at card accepting units
International Card payment revenue at ATMs
Domestic Card payment revenue at card accepting units
Turnover of Cash Advances by Domestic Cards at POS
Domestic Card payment revenue at ATMs
Number of ATM until 31/12/2015
Number of ATM until 31/12/2016
Number of POS until 31/12/2015
Number of POS until 31/12/2016
Cash Withdrawal
Cash Transfer
Revenue spending at the Card Accepting Unit
Contract Payment Revenue
Other Domestic Cards Revenue
International Card Payment Revenue at Card Accepting Units
Online Payment of International Cards at Card Accepting Units
Domestic Card Payment Revenue at Card Accepting Units
Online Payment of Domestic Cards at Card Accepting Units

The corresponding Complexity Map is illustrated below:

The analysis reveals a very high Resistance to Shocks (RtS), namely 95.6%, which corresponds to a five-star rating. It is interesting to note that this situation hinges on the following parameters: number of ATMs, cash withdrawals and number of international debit cards. Together, these top-ranked parameters are responsible for nearly 38% of the overall state of health of the system. Any policies aiming at improving or strengthening the payment card industry in Vietnam should target these parameters first.

The complete ranking of parameters in terms of how they impact the situation is reported in the chart below (values are in %).

It is also interesting to analyze the system of forty card issuing banks. The Complexity Map, based on the above data per bank, is illustrated below:

Again, the RtS rating is very high, a staggering 99% with a five-star rating. One must remember, however, that this is an analysis based on payment cards data alone. The curious thing is that the degree of interdependency of this system of forty banks is 87%. This is extremely high. If one looks at the map one realizes that it is very dense. What this comes down to is quite evident – every bank is correlated to almost every other bank. This is not good if the system is exposed to a shock as its effects would propagate very quickly throughout the entire network.
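The interdependency density and hub structure described above can be approximated with a simple correlation-graph sketch. To be clear, this is not the proprietary QCM/RtS computation: the random placeholder data, the 0.3 threshold and the density measure are all assumptions made purely for illustration.

```python
# Hypothetical sketch: approximating a "Complexity Map" as a correlation
# graph over banks. Placeholder data stands in for the real 40 banks x 35
# card-industry parameters; the threshold is an assumed significance cutoff.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(35, 40))          # rows: card parameters, cols: banks

corr = np.corrcoef(data, rowvar=False)    # 40 x 40 bank-to-bank correlations
np.fill_diagonal(corr, 0.0)               # ignore self-correlation

threshold = 0.3                           # assumed cutoff for drawing a link
links = np.abs(corr) > threshold          # adjacency matrix of the "map"

n = links.shape[0]
density = links.sum() / (n * (n - 1))     # "degree of interdependency"
degree = links.sum(axis=0)                # number of links per bank
hubs = np.argsort(degree)[::-1][:5]       # most interconnected banks = hubs

print(f"interdependency: {density:.0%}, top hub indices: {hubs}")
```

A very dense adjacency matrix (density close to 1) corresponds to the situation described in the text, where a shock at any node propagates quickly through the whole network.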
In terms of systemic risk, the banks which are situated at the top of the bar chart shown below are the ones that are most exposed (values in %).

The banks which are exposed the most when it comes to systemic risks are Eximbank, Maritime bank, Ocean bank, VPBANK and CBBank. In case of a shock, these banks will be most vulnerable. In fact, note that they are also the hubs of the Complexity Map, i.e. they have the largest number of inter-dependencies with the other banks.
Based on the 2016 revenue and systemic exposure of each bank, the footprint (weight) of each bank on the system is indicated below (VPBANK has a value of 1 as it has the highest revenue and is taken as reference).

What is evident is that VPBANK is critical to the system (i.e. payment card industry). It has the highest revenue, which, combined with a high systemic exposure (number 3 in the previous ranking) turns it into a hub. Any actions, aiming at the improvement, or growth of this particular sector, should be targeted at the banks at the top of this chart.

Rating the Bitcoin – When new technologies meet

Bitcoin is a cryptocurrency and worldwide payment system. It is the first decentralized digital currency – the system works without a central repository or single administrator – and was introduced in 2009. Unlike fiat money, Bitcoin is unique because it is decentralized and, more importantly, not under the control of bankers or financial regulators – an argument often used by Bitcoin supporters, who call the currency insulated from any kind of manipulation.
New Bitcoins are generated by a competitive and decentralized process called "mining". In this process, individuals are rewarded by the network for their services: Bitcoin miners process transactions and secure the network using specialized hardware, collecting new bitcoins in exchange. Basically, it is a high-tech exercise, which means you need sufficient computational firepower.
Bitcoins, just like traditional currencies, are traded. Recently the value of Bitcoin has been increasing very rapidly and there is much excitement in the markets. There is also talk of a potential Bitcoin bubble. Bitcoin futures have recently been approved and, unlike futures on regular markets, there is more than one settlement venue for them. This adds complexity to a cryptocurrency that is already complex in itself.
Unlike fiat money, Bitcoin is unique because it is de-centralized and, more importantly, not under the control of bankers or financial regulators.
Given this (growing) complexity, and the emergence of new cryptocurrencies such as Ethereum, Ripple, Litecoin or Monero, it is interesting to measure the complexity of Bitcoin, as well as its rating – obviously, a Resistance to Shocks rating. Over the past few years, the price of Bitcoin has been increasing, notwithstanding destabilizing events such as the Ukraine crisis, Brexit, the US elections and the Korean crisis, as well as scandals, tsunamis and the fall of oil prices.
The price of Bitcoin over the past eight years is plotted below. It shows a phenomenal acceleration over the past year.

The complexity of the dynamics of Bitcoin's price is shown in the next plot. Here we note something interesting: when complexity increases, the price goes down (this starts in 2013); when complexity decreases, the price goes up again. This is clearly visible after 2015. At present, as Bitcoin is skyrocketing, its complexity is dipping.
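The author's QCT complexity measure is proprietary, so as a rough stand-in one could compute a rolling Shannon-entropy proxy on daily log returns and compare its direction with the price, in the spirit of the inverse relationship described above. Everything below – the synthetic price series, the 90-day window, the 20-bin histogram – is an assumption for illustration, not the actual Bitcoin analysis.

```python
# Hedged sketch: a rolling entropy of daily returns as a crude complexity
# proxy. Synthetic data only; the real QCT measure is not reproduced here.
import numpy as np

rng = np.random.default_rng(1)
# Fake geometric-random-walk "price" series of 2000 daily observations.
price = 100 * np.exp(np.cumsum(rng.normal(0.001, 0.02, 2000)))
returns = np.diff(np.log(price))

def rolling_entropy(x, window=90, bins=20):
    """Shannon entropy of the return distribution in each trailing window."""
    out = np.full(x.size, np.nan)
    for i in range(window, x.size):
        hist, _ = np.histogram(x[i - window:i], bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        out[i] = -(p * np.log(p)).sum()
    return out

proxy = rolling_entropy(returns)
# A falling proxy alongside a rising price would echo the pattern described.
```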

The Resistance to Shocks Rating of Bitcoin is depicted in the last chart, below. The rating has a very high value most of the time, close to 100%, which corresponds to a five-star rating. The minimum value of 80% – a four-star rating – was reached in 2016; however, the rating has since risen rapidly back to 90% and above. For the moment, things look pretty solid.

Given that, unlike corporations, currencies (and cryptocurrencies) react quickly, the RtS rating of Bitcoin is issued on a daily basis. The goal is to capture the dynamics of a rapidly changing economy. This is why the above plot is continuous.
The above analysis is unique. Bitcoin is a high-tech cryptocurrency, and RtS ratings are provided by an equally high-tech rating robot. While conventional currencies can be manipulated, not to mention simply printed, credit rating agencies are known for opaque rating practices, not to mention conflicts of interest. What this short article illustrates is how leading-edge technologies can join forces in a context devoid of regulators, administrators and bankers – and, most importantly, one where manipulations take place on a daily basis.

Jacek Marczyk

Author of nine books on uncertainty and Complexity Management, Jacek developed the Quantitative Complexity Theory (QCT), a new complexity-based theory of risk and rating, in 2003. In 2005 he founded Ontonix, a company delivering complexity-based early-warning solutions with particular emphasis on systemic aspects and turbulent economic regimes. Read more publications by Jacek

Setting standards for good governance in the latest ISOfocus

Abuse of office for private gains. Trust undermined. Poor governance can have disastrous consequences. It can also threaten market integrity, distort competition and endanger economic development.

How can organizations improve good governance? In its November/December 2017 issue, ISOfocus gives an overview of the most interesting, important and complex changes needed to implement and sustain good governance. It looks at ways to improve business practices and policies and where ISO standards can contribute.

This edition offers coverage of key issues ranging from risk management and business continuity to sustainable procurement. It also provides a complete picture of ISO 37001 on anti-bribery management systems, particularly useful in today’s governance matters.

Updated throughout, this November/December 2017 issue contains testimonials from some of today’s most important companies, highlighting why ISO standards are good for business, what key considerations are needed for implementing them and their role in building a trusted, resilient organization. Along the way, it illuminates many key benefits thus far overlooked.

Ryanair, a fragile complex giant

The laws of systemantics are said to be pseudo-science. Fair enough. A few of these laws are listed below. They apply to highly complex systems. Think of these laws and then think of the EU, the Euro or, most recently, Ryanair.
  1. Le Chatelier’s Principle: Complex systems tend to oppose their own proper function. As systems grow in complexity, they tend to oppose their stated function.
  2. A complex system cannot be “made” to work. It either works or it doesn’t.
  3. A complex system that works is invariably found to have evolved from a simple system that works.
  4. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
  5. The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
  6. The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.
  7. A complex system can fail in an infinite number of ways. One is pilot shortage.
  8. The larger the system, the greater the probability of unexpected failure.
  9. As systems grow in size, they tend to lose basic functions.
  10. Colossal systems foster colossal errors.

Ryanair has cancelled thousands of flights because of pilot shortage

So much for planning, risk management, huge IT investments, Big Data, etc., etc. It is the infantile linear and flat thinking that induces many people to imagine that Big Data, AI and other old technologies will miraculously solve problems inherent to high and uncontrolled growth of complexity, the greatest impediment of sustainable development.

Excessive complexity makes things fragile. Recall the Principle of Fragility:

Complexity × Uncertainty = Fragility

What this means is that a highly complex, or excessively articulated, business will experience high fragility (low resilience) in an uncertain or turbulent economy. Sound familiar?
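The principle above is qualitative, but a toy numerical reading makes the point. In this sketch, complexity and uncertainty are arbitrarily normalized to [0, 1]; the scale and the specific values are invented for illustration.

```python
# Toy reading of the Fragility Principle: Complexity x Uncertainty = Fragility.
# Inputs are normalized to [0, 1]; the numbers are illustrative assumptions.
def fragility(complexity: float, uncertainty: float) -> float:
    return complexity * uncertainty

calm = fragility(complexity=0.9, uncertainty=0.1)       # complex business, calm economy
turbulent = fragility(complexity=0.9, uncertainty=0.8)  # same business, turbulent economy

assert turbulent > calm  # the same complexity becomes fragility under turbulence
```

The same highly articulated business is eight times more fragile in the turbulent scenario, which is the text's point: complexity is the exposure, and turbulence is what cashes it in.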

Dear Ryanair, instead of using old and outdated technology to manage your business, reach out for something a bit more modern, a bit more cutting edge: Quantitative Complexity Management. Oh, and remember, Too Big To Fail no longer works. Today it is Too Complex to Survive.


ABOUT THE AUTHOR – Jacek Marczyk, author of nine books on uncertainty and complexity management, developed the Quantitative Complexity Theory (QCT) and Quantitative Complexity Management (QCM) methodologies, a new complexity-based theory of risk and rating, in 2003. In 2005 he founded Ontonix, a company delivering complexity-based early-warning solutions with particular emphasis on systemic aspects and turbulent economic regimes. He introduced the Global Financial Complexity and Resilience Indices in 2013. Since 2015 he has been Executive Chairman of Singapore-based Universal Ratings. Read more publications by Jacek Marczyk

Why ICT systems implode

Remember the recent British Airways IT meltdown? Problems with the power supply are believed to have caused it. Note the word "believed". According to the source in question, when the system came back online, it did so in an uncontrolled manner, damaging the IT system and initiating a sequence of events that plunged it into a state of chaos.

Others say that the cause of the meltdown was outsourcing. By the way, why would one outsource – to another continent – a really critical part of one's business? British Airways blames an engineer who supposedly didn't follow the right procedures for restarting the system after a power failure.

The data culture behind contemporary ICT systems belongs to the Stone Age

Finding the single cause, or set of unfortunate circumstances, may not be possible, and we may never know why this happened in the first place. But even if they do identify the trigger event, what will happen is that they will put a fix in place so that that particular trigger event never happens again – until another glitch appears and grounds tens of thousands of passengers, causing losses of hundreds of millions.

Meltdown

The BA IT meltdown is a delicious example of what linear, three-dimensional thinking is all about. People insist on putting in place super-complex systems – generally ICT infrastructures – without focusing on two keywords:

SYSTEM

COMPLEXITY

How can you possibly neglect the two key attributes of something that is critical to one’s business and, most importantly, reputation?

By neglecting complexity – that is, by not measuring it from day one and not using it as a design attribute and objective – one risks putting in place solutions that are in close proximity to what is known as critical complexity. Critical complexity is basically the edge of chaos – one small glitch and all hell breaks loose.

Oh, incidentally, we know how to measure the complexity, and the critical complexity, of any kind of system. That is not the issue. The issue is one of culture, and this brings us to another embarrassing point, which is:

DATA

One would imagine that in our digital age corporations and businesses would be drowning in data. There is even talk of Big Data! Well, big or not, most of today's businesses are incapable of putting together a small table of numbers that are critical to their business and that are monitored with a reasonable frequency.


Also by Jacek Marczyk!

Why is Resilience (in economics) such a difficult concept to grasp? – Jacek Marczyk explains why high resilience capacity doesn’t necessarily mean high performance.

When will AI become less artificial and more intelligent? – What can we do to take AI to another level, to provoke a quantum leap?

Systemic Resilience Analysis: Supercomputers provide new tools for regulators, investors and governments – Discover Quantitative Complexity Theory, a different approach and a new set of analytical tools to address modern day challenges.


Ontonix deals mainly with very large corporations and helps them solve the so-called Extreme Problems, providing pre-alarms or early warnings of systemic failure. However, in order to do that, we need data. Often it is a small amount of data, a few tens of kilobytes. Imagine this dialog:

Q: Do you have a list of business critical KPIs (Key Performance Indicators) that you monitor on, say, a weekly or monthly basis?

A: What do you mean? What kind of KPIs are you referring to?

Q: Data which reflects the functioning of your business, data that your CEO has on his desk every Monday morning. You know, strategic kind of data.

A: Could you provide us with examples?

Q: Sure we can do that. But are you saying that you actually don’t know what YOUR critical KPIs are?

A: Well, no, not really.

Q: So you don’t even know how these KPIs are correlated, do you?

A: We’ve never thought of it like that.

The list of KPIs is submitted to the (large) corporation, which spends hundreds of millions of dollars on ICT every year. This is how the dialog continues.

Q: Have you received our list of KPIs?

A: Yes, and we’ve shown it to our IT governance, IT architecture guys, the accounting department, the HR department….

Q: And?

A: There are problems retrieving this kind of data. We would need to interrogate different databases and approach different individuals. In some cases we wouldn't even know whom to ask.

Q: So you don’t know your critical KPIs, you don’t monitor them, you don’t know if they are independent or not and you’re not concerned. And you call this risk management?
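The exercise the dialog is driving at is genuinely small: a table of monthly KPI values and its correlation matrix. The sketch below uses invented KPI names and random placeholder numbers; any real list of business-critical KPIs would do.

```python
# Minimal sketch of a "critical KPI" table and its correlation structure.
# KPI names and 24 months of values are fabricated for illustration only.
import numpy as np

kpis = ["revenue", "orders", "churn_rate", "support_tickets"]
rng = np.random.default_rng(2)
monthly = rng.normal(size=(24, len(kpis)))   # rows: months, cols: KPIs

corr = np.corrcoef(monthly, rowvar=False)    # pairwise KPI correlations
for i in range(len(kpis)):
    for j in range(i + 1, len(kpis)):
        print(f"{kpis[i]} vs {kpis[j]}: {corr[i, j]:+.2f}")
```

A few tens of kilobytes of such data, monitored regularly, is all the dialog is asking for; knowing which KPIs move together is the first step towards any systemic analysis.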

The point is simple

It is not sufficient to purchase plenty of hardware and software, go to Big Data conferences or get excited about the Internet of Things if your data culture belongs to the Stone Age. If you are managing a complex system, then think in systemic terms – and monitor its complexity. If you think that compliance is more important than innovation, or that the world is linear and Gaussian, then you'd better brace yourself for your very own IT meltdown. Coming soon to a corporation near you.

Next week: Ryanair, a fragile complex giant



Why is Resilience (in economics) such a difficult concept to grasp?

Resilience, put in layman's terms, is the capacity to withstand shocks or impacts. For an engineer it is a very useful characteristic of materials, just like Young's modulus, the Poisson ratio or the coefficient of thermal expansion.

High resilience doesn't necessarily mean high performance

But high resilience doesn't necessarily mean high performance, or vice versa. Take carbon fibres, for example. They can have a Young's modulus of 700 gigapascals (GPa) and a tensile strength of 20 GPa, while steel has a Young's modulus of 200 GPa and a tensile strength of 1-2 GPa. And yet carbon fibres (as well as alloys with a high carbon content) are very fragile, while steel is, in general, ductile.

Basically, carbon fibres have fantastic performance in terms of stiffness and strength but respond very poorly to impacts and shocks.

What has all this got to do with economics?

Our economy is extremely turbulent (and this is only the beginning!) and chaotic, which means that it is dominated by shocks and, sometimes, by extreme events – like the unexpected failure of a huge bank or corporation, the default of a country which needs to be bailed out, like Ireland, Greece or Portugal, or natural events such as tsunamis. Such extreme events send shock waves into the global economy which, by virtue of its interconnectedness, propagates them very quickly.

This can cause problems to numerous businesses even on the other side of the globe. Basically, the economy is a super-huge dynamic and densely interconnected network in which the nodes are corporations, banks, countries and even single individuals (depending on the level of detail we are willing to go to). It so happens that today, very frequently, bad things happen at the nodes of this network. The network is in a state of permanent fibrillation. It appears that the intensity of this fibrillation will increase, as will the number of extreme events.

Basically, our global economy will become more and more turbulent. By the way, we use the word 'turbulence' with nonchalance, but it is an extremely complex phenomenon in fluid dynamics with very involved mathematics behind it – luckily, people somehow get it! And that's good. What is not so good is that people don't get the concept of resilience. And resilience is a very important concept, not just in engineering but also in economics. This is because in turbulence it is high resilience that may mean the difference between survival and collapse. High resilience can in fact be seen as a sort of stability. It is not necessary to have high performance to be resilient (or stable). In general, these two attributes of a system are independent.

To explain this difficult (some say it is counter-intuitive) concept, let us consider Formula 1 cars: extreme performance, for very short periods of time, extreme sensitivity to small defects with, often, extreme consequences. Sometimes, it is better to sacrifice performance and gain resilience but this is not always possible. In Formula 1 there is no place for compromise. Winning is the only thing that counts.

But let’s get back to resilience versus performance

Let’s try to reinforce the fact that the two are independent. Suppose a doctor analyzes blood and concentrates on the levels of cholesterol and, say, glucose. You can have the following combinations (this is of course a highly simplified picture):

Cholesterol: high, glucose: low
Cholesterol: low, glucose: high
Cholesterol: low, glucose: low
Cholesterol: high, glucose: high

You don’t need to have high cholesterol to have high glucose concentration. And you don’t need to have low glucose levels to have low levels of cholesterol.

Considering, say, the economy of a country, we can have the following conditions:

Performance: high, resilience: low
Performance: low, resilience: high
Performance: low, resilience: low
Performance: high, resilience: high

Just because the German economy performs better than those of many other countries, it doesn't mean it is also more resilient. This is certainly not intuitive, but there are many examples in which simplistic linear thinking and intuition fail. Where were all the experts just before the sub-prime bubble exploded?

Who rates ratings?

The economy is a dynamic system that is far too complex for us to understand. Human nature is extremely complex, and billions of irrational humans form the economy. How can such a system ever be thought to be efficient, in equilibrium and stable, as many prominent economists have claimed? Yet this system, like every other natural or man-made system, must respect the non-negotiable laws of physics, even if those laws are unknown at a given time.
Among the instruments that enabled the 2008 crisis were sophisticated math models and ratings. There was nothing in those models that would even hint at catastrophe, because models can only tell you what you hard-wire into them. The construction of placebo-generating models has led to a Panglossian approach to finance and economics which excludes extreme events and catastrophes, allowing bubbles to grow and Ponzi schemes to flourish. Failure was simply not contemplated in the models. And there are no model-building laws that would force one to contemplate it.

There was nothing in those models that would even hint at catastrophe, because models can only tell you what you hard-wire into them.

Models are based on assumptions

Hence they are disputable and, at the same time, provide an enormous margin for manoeuvre. And, when needed, impunity.

There are no universally accepted laws for building math models or rating schemes. Sure, you can dream up an equation and claim that it provides a basis for pricing some derivative, and then have people invest based on it. You cannot be held accountable simply for using an equation that one day implodes. You cannot take mathematics to court, but you can put in prison an engineer or a doctor who is responsible for the loss of lives. Why is that? Because physics is not an opinion, while financial mathematics, together with its underlying assumptions, is. Just because you manipulate equations according to strict rules doesn’t mean you’re doing science. You could just as well be playing an extravagant video game with no relevance to, or reflection of, anything that is physical and really exists.

The fact that we are still unable to fix the mess, even though everything went off the rails almost ten years (and many trillions of dollars) ago, just goes to show how little we understand the economy, its systemic aspects and its dynamics.

We must radically change our approach

When you face a super-complex system which you don’t understand – the crisis proves that we understand the economy very little – do you model very precisely a tiny subset thereof or do you try to get a global coarse picture of the situation? Isn’t it true that the closer you look the less you see?

The Principle of Incompatibility (L. Zadeh, UC Berkeley) states that high precision is incompatible with high complexity. This means that the economy – which is evidently highly complex – cannot be modelled precisely, and that all effort to squeeze decimals out of math models is futile, even though it sometimes gets you into the Nobel zone. In actual fact, the more complex the models one conceives, the more assumptions one must make. And that means more risk and, at the same time, more freedom to steer a model in a desired direction.

From a practical and physical standpoint, what is the difference between AAA and AA+? Is it correct (and ethical) to have over 20 rating classes?

So we need a change of paradigm

Less hair-splitting, less fiddling with decimals, unlikely probability distributions or Brownian motion. Things have become very complex, and we must place science, not mathematical alchemy, at the centre of our thinking.

The Probability of Default (PoD) of a company is the central concept behind a rating, and ratings are a key link between the markets and investors. Their importance cannot be overstated. However, the PoD is not a physical quantity, and there exist many ways of computing it. Each method has its own assumptions – the degrees of freedom are phenomenal. Not only is the PoD a non-physical quantity, it is also highly subjective. In fact, rating agencies themselves claim that ratings are merely opinions. In mechanical engineering, by contrast, things like mass, strength, energy, stiffness or margin of safety are computed according to non-negotiable laws of physics which are the same all over the world. The PoD does not obey any such laws.
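To illustrate how method-dependent the PoD is, here is a minimal sketch (the firm's numbers and all coefficients are hypothetical) of two common approaches applied to the same company – a Merton-style structural model and a simple logistic score on leverage – which yield materially different "probabilities":

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2))

def pod_merton(assets, debt, mu, sigma, horizon=1.0):
    """Merton-style structural PoD: default if asset value falls
    below the debt level at the horizon."""
    dd = (math.log(assets / debt) + (mu - 0.5 * sigma**2) * horizon) \
         / (sigma * math.sqrt(horizon))
    return norm_cdf(-dd)

def pod_logistic(assets, debt, a=-6.0, b=3.0):
    """Logistic score on the leverage ratio. The coefficients a and b
    are hypothetical -- every agency fits its own."""
    leverage = debt / assets
    return 1.0 / (1.0 + math.exp(-(a + b * leverage)))

# Same hypothetical firm, two very different answers:
assets, debt = 100.0, 70.0
p1 = pod_merton(assets, debt, mu=0.05, sigma=0.25)
p2 = pod_logistic(assets, debt)
print(f"Merton PoD:   {p1:.2%}")
print(f"Logistic PoD: {p2:.2%}")
```

Neither number is "the" probability of default; each is an artefact of its model's assumptions, which is precisely the point.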

It may have become some sort of standard, but it is not the result of any law of physics. This means the PoD must be replaced by something more rational and relevant: something that not only has its roots in physics, but is also more in line with the turbulent character of our times.

Let’s not forget that ratings were conceived a century ago

The world was very different then. Conventional business intelligence and analytics technology has become dangerously outdated and, most importantly, it is not well suited to a turbulent economy.

As the complexity of the economy increases, traditional analytics produces results of increasing irrelevance. Mathematically correct but irrelevant. Markets are not efficient. In nature there is no such thing as equilibrium.

So, beyond a PoD-based rating, we propose a complexity- and resilience-based rating. High complexity is, in all likelihood, the most evident and dramatic characteristic of the economy, and indeed the hallmark of our times. Resilience is the capacity to withstand extreme events and is a measurable physical quantity – there are, for example, standard tests in engineering to determine the resilience of materials – and a resilience rating is applicable to companies, stocks, portfolios, funds, systems of companies or national economies. In our turbulent economy, which is fast, uncertain and highly interdependent, extreme and sudden events are becoming quite common. Such events will become more frequent and more intense, exposing fragile businesses to apparently unrelated events originating thousands of kilometres away. It is good to be resilient.

An impact test to measure the resilience (fragility) of a material

Despite their bad reputation, conflicts of interest and lawsuits, rating agencies will probably continue to flourish. What can be done at this point is to provide a mechanism which allows investors to check how ‘solid’ a given rating actually is – in other words, how trustworthy it is. This is how it can be done. Rating agencies typically use the fundamentals (balance sheet, income statement, cash flow, ratios, etc.) to establish a rating based on a set of calculations. This is the ‘scientific’ part of the process. Then comes the subjective human component, in the form of interviews with the management of the rated company, after which analysts decide on a rating based on their experience, sensations, benchmarks, scorings, etc. The process is so subjective that two rating agencies will not always agree on a rating for the same company. Even two analysts in the same agency can disagree on a rating!

Ratings really are opinions, not science

However, the same fundamentals can be used to compute a resilience rating which does not involve humans in the loop. The results, as indicated in the table below, can generally be split into four cases:

Conventional rating   Resilience rating

High                  High

Low                   Low

High                  Low

Low                   High

The cases in which the two ratings disagree are of course the most interesting – in particular the case in which a company is awarded an investment-grade rating while, at the same time, its resilience is low. In other words, a highly rated yet fragile company.
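The four cases in the table can be sketched as a trivial classifier. The labels and boolean thresholds here are hypothetical – for instance, a high conventional rating could mean investment grade, and high resilience a complexity-based score above some cut-off:

```python
def rating_quadrant(conventional_high: bool, resilience_high: bool) -> str:
    """Map the two ratings onto the four cases in the table above."""
    if conventional_high and resilience_high:
        return "robust and well rated"
    if not conventional_high and not resilience_high:
        return "weak and fragile"
    if conventional_high and not resilience_high:
        # The most interesting case: investment grade, yet fragile.
        return "highly rated but fragile"
    return "lowly rated but resilient"

print(rating_quadrant(True, False))   # the case an investor most needs flagged
```

Such a check adds no new data: it simply cross-examines the conventional rating against a second, independently computed quantity derived from the same fundamentals.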

Before you invest, wouldn’t you want to know? Given that rating agencies are unregulated, how about a second, truly independent opinion?


ABOUT THE AUTHOR – Jacek Marczyk, author of nine books on uncertainty and complexity management, developed the Quantitative Complexity Theory (QCT) and Quantitative Complexity Management (QCM) methodologies in 2003, along with a new complexity-based theory of risk and rating. In 2005 he founded Ontonix, a company delivering complexity-based early-warning solutions with particular emphasis on systemic aspects and turbulent economic regimes. He introduced the Global Financial Complexity and Resilience Indices in 2013, and since 2015 has been Executive Chairman of Singapore-based Universal Ratings.