Today, with 1 TB of RAM, our OntoNet QCM engine can analyze systems of up to 1 million nodes (variables). Consider that an L1 + L2 description yields 1300 × 500 = 650,000 variables. This means we can easily analyze even large proteins composed of hundreds of amino acids and hundreds of thousands of atoms. An example Complexity Map of a protein is shown here.
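The capacity claim above is a simple multiplication; a back-of-the-envelope check (the 1300 and 500 factors and the 1-million-node limit come from the text, while the variable names are illustrative) looks like this:

```python
# Back-of-the-envelope check of the variable counts quoted above.
# The factor names are illustrative assumptions, not OntoNet terminology.

L1_ENTITIES = 1300    # size of the L1 description (from the text)
L2_ENTITIES = 500     # size of the L2 description (from the text)
CAPACITY = 1_000_000  # stated OntoNet node limit with 1 TB of RAM

variables = L1_ENTITIES * L2_ENTITIES
print(variables)              # 650000
print(variables <= CAPACITY)  # True: well within the engine's stated limit
```

So an L1 + L2 protein description at this scale uses roughly 65% of the stated 1-million-node capacity.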
[Figure: supercomputer specifications — peak performance 125 petaflops (125,000 Tflops); LINPACK benchmark 93 petaflops]
- Measure the complexity of DNA. We know that it is complex; it is time to find out how complex.
- Identify the genes that are the main drivers of DNA complexity.
- Measure, in cbits (complexity bits), how much information is encoded in the DNA.
- Find out which genes are the hubs of the DNA.
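To give the flavour of scoring how much information a sequence encodes, here is a minimal sketch using Shannon entropy over k-mers as an illustrative stand-in. This is an assumption for demonstration only: QCM's cbit measure is a different quantity, and the function below is not part of OntoNet.

```python
from collections import Counter
from math import log2

def shannon_entropy_bits(seq: str, k: int = 3) -> float:
    """Shannon entropy, in bits per k-mer, of a DNA string.

    Illustrative stand-in only: QCM's cbits are a different,
    complexity-based measure; this merely shows the general idea
    of quantifying information content in a sequence.
    """
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(kmers)
    n = len(kmers)
    # Standard Shannon entropy over the empirical k-mer distribution.
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform sequence carries no information; a varied one carries more.
print(shannon_entropy_bits("AAAAAAAA", k=1))
print(shannon_entropy_bits("ACGTACGTAACCGGTT", k=1))
```

A hub analysis, by contrast, would operate on the Complexity Map itself, ranking genes by how many significant interdependencies they participate in.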
ABOUT THE AUTHOR – Jacek Marczyk, author of nine books on uncertainty and complexity management, developed the Quantitative Complexity Theory (QCT) and Quantitative Complexity Management (QCM) methodologies in 2003, along with a new complexity-based theory of risk and rating. In 2005 he founded Ontonix, a company delivering complexity-based early-warning solutions with particular emphasis on systemic aspects and turbulent economic regimes. He introduced the Global Financial Complexity and Resilience Indices in 2013. Since 2015 he has been Executive Chairman of Singapore-based Universal Ratings.