How to ‘fix’ standard deviations

Standard deviations are a popular and often useful measure of dispersion. It is worth remembering, however, that a standard deviation is simply the root-mean-square deviation from the mean. It also takes no account of the shape of the probability density function (this is done better using, for example, entropy, which is a more versatile measure of dispersion).

Standard deviations, however, may be ‘adjusted’ to take into account an interesting aspect of data, namely complexity. Let’s look at an example. Say you have a portfolio of 28 stocks, all of which are independent (i.e. uncorrelated). In that case the complexity map of the portfolio is the one shown below.

One computes the standard deviation of each stock and may then use it to measure the volatility of the portfolio or other measures of risk. Suppose now that some of the stocks are indeed correlated, so that the complexity map becomes the one below.
Stocks 5 and 7, for example, are correlated with numerous other stocks, while 3, 6 and 25 are uncorrelated. This is reflected in the Portfolio Complexity Profile (or Portfolio Complexity Spectrum) which ranks the complexity footprint of each stock in the portfolio. This is illustrated below.
Stock 7 has a footprint of just over 17% while stock 5 is responsible for nearly 15% of the complexity of the portfolio.

Clearly, just as in the previous case, one can calculate the standard deviations of all stocks one by one. In the first case, however, all the stocks were uncorrelated; here some of them are. The two cases are obviously different, particularly from a structural point of view. The question, then, is this: why not use the information in the Complexity Profile to ‘adjust’ standard deviations by adding a correction originating from complexity? A stock that is heavily correlated with other stocks in a portfolio can be more ‘dangerous’ than an uncorrelated one. Ordinarily, it is the job of covariance to express this:

Covariance(i,j) = Correlation(i,j) x STD(i) x STD(j)
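As a quick illustration of this relation, a covariance matrix can be assembled directly from a correlation matrix and a vector of standard deviations. The numbers below are hypothetical (only stock 7’s standard deviation of 3.81 comes from the example in this post):

```python
import numpy as np

# Hypothetical correlation matrix for a small 3-stock portfolio
corr = np.array([[1.0, 0.6, 0.0],
                 [0.6, 1.0, 0.2],
                 [0.0, 0.2, 1.0]])

# Hypothetical standard deviations (3.81 is stock 7's value from the text)
std = np.array([3.81, 2.50, 1.20])

# Covariance(i, j) = Correlation(i, j) x STD(i) x STD(j)
cov = corr * np.outer(std, std)

print(cov[0, 1])  # 0.6 * 3.81 * 2.50 = 5.715
```

The element-wise product with the outer product of the standard deviations applies the formula to every pair (i, j) at once.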
But why not take this into account at the standard-deviation level as well? One simple way to accomplish this is the following:
Adjusted STD = (1 + Complexity contribution) x STD
Basically, stocks that increase portfolio complexity see their standard deviations corrected (increased) by a complexity-based factor. The (ranked) result is illustrated below.
The bar chart below shows the complexity-induced corrections of standard deviations:
For example, the standard deviation of the biggest complexity contributor – stock 7 – which is 3.81, is incremented by 17.1% (its complexity footprint) to yield a value of 4.46. The norm of the original covariance matrix is 58.21, while the ‘corrected’ covariance matrix has a norm of 68.15.
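The adjustment can be sketched in a few lines of code. Stock 7’s footprint (17.1%) and standard deviation (3.81) are taken from the text; the values for the other stocks are hypothetical, since the full Complexity Profile is only shown graphically:

```python
# Complexity footprints as fractions of total portfolio complexity.
# Stock 7's value (0.171) is from the text; the others are assumed.
footprint = {7: 0.171, 5: 0.148, 3: 0.0}

# Original standard deviations (stock 7's value is from the text).
std = {7: 3.81, 5: 2.90, 3: 1.50}

# Adjusted STD = (1 + Complexity contribution) x STD
adjusted = {k: (1 + footprint[k]) * std[k] for k in std}

print(round(adjusted[7], 2))  # 3.81 x 1.171 = 4.46, as in the text
```

Note that an uncorrelated stock (footprint 0, like stock 3 here) keeps its original standard deviation, which is consistent with the first, fully uncorrelated portfolio.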

Portfolio complexity, a factor that is typically neglected when analyzing or designing a portfolio (a covariance matrix is a poor substitute), ‘increases’ standard deviations, eloquently illustrating the concept of complexity-induced risk.

Classical statistics may produce overly optimistic results if complexity is neglected. In reality, every system has some degree of complexity, which is invisible to conventional analytics, and so there is often more risk than one may think.

Published by

ontonixqcm

