Much of the past underperformance in the major-project context has been attributed to limited information: it has been posited that cost and time uncertainties could not be mitigated because of data scarcity. This, in turn, spurred information-seeking efforts. Now, arguably, the opposite is needed: restrictions to limit the production and dissemination of data. Information is also becoming more complex, intense and interdependent, so that ever more effort is required to process even elementary information without causing informational overload, or running into computational illiteracy and the human brain's incapacity to deal with complex problems.
Unlike the enigmatic Laplace's Demon, human cognition is unable to deal with interdependent, non-linear data. The sequential problem-solving we are accustomed to hardly exists in the major-project environment either. Individuals have a strong aptitude for understanding and mimicking processes, but only so long as those processes are consistent and can be transferred into procedural knowledge; the deficiency is abated only when interfaces and causation chains are unambiguous and simple. Otherwise, humans compensate for their limited analytical capabilities with inference, which works only where relationships are non-arbitrary, within a familiar knowledge domain or system. An engineer, for example, may work out how to operate unfamiliar equipment on the basis of past contact with similar machinery. It would be unreasonable, however, to ask an estimator for a reliable quote for a prototypical solution.
Added to this, the quality of information does not necessarily match its quantity; the former often has to be assessed ad hoc by the decision-maker. This leads to cognitively exhausting, often subconscious, data assessments in which actors must not only decode the meaning of the data itself but also verify its impact on the entire system and its potential interfaces. Inability to cope with such analytical pressure may lead to what Wurman calls information anxiety: the product of the ever-widening gap between what a person understands and what they feel they should understand. To avoid this outcome, individuals are often inclined to make collective decisions, or at least to alleviate the decisional pressure by pooling the analytical effort. It is common to consult others, or to negotiate what is right or wrong, when difficulties arise.
Considering the major-project context, with potentially thousands of stakeholders, decisions and analyses to deal with, it is reasonable to assume that this environment exceeds what any individual can comprehend. It seems equally unreasonable to expect accurate estimation in such a setting. Whenever we engage more deeply with one issue, we become blind to everything else in the background; this is generally known as the attentional blink, along with its associated theories. Under such conditions rational decision-making cannot take place: attention is simply too fragmentary to allow for sensible choices. This is why estimation efforts so often fail.
Computational limitations and the lack of cognitive ability to sustain attention for prolonged periods also force decision-makers to seek coping mechanisms. Conscious behavioural responses predominantly revolve around information hierarchisation and intentional adaptation: project professionals apply subjective filters to information, assessing its relevance and importance against experiential criteria. Cost or time estimates, for example, then cease to rely on objective data and begin to rest on the feelings and values of the decision-maker.
Withdrawal strategies, on the other hand, are applied to reduce the informational load to a minimum. One typical subconscious response to information overload is stress, confusion and a feeling of incompetence. Another is selective reception, or selective attention, which is triggered at the point of overload and nullifies the individual's ability to absorb or understand new stimuli. In parallel with these reactions, or independently of them, decision-makers resort to heuristics: firstly to make decisions quickly and at less cognitive expense, and secondly to avoid or abate the two previous responses. Gigerenzer calls such heuristics fast, because the computational steps are limited, and frugal, because the information needed to make a decision is minimal. Most estimates are compiled in this manner when too little is known or too much information has to be processed.
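The fast-and-frugal idea can be illustrated with a short sketch of one such heuristic, "take-the-best": cues are checked one at a time in order of assumed validity, and the first cue that discriminates between the options decides, so most of the available information is never processed at all. The cue names, their ordering and the supplier data below are entirely hypothetical, chosen only to show the mechanics.

```python
def take_the_best(option_a, option_b, cues):
    """Choose between two options using ordered binary cues.

    cues: cue names, ordered from most to least (assumed) valid.
    option_a / option_b: dicts mapping cue name -> 0 or 1.
    Returns the chosen option and how many cues were inspected.
    """
    for inspected, cue in enumerate(cues, start=1):
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:  # first discriminating cue decides; stop searching
            return (option_a if a > b else option_b), inspected
    return option_a, len(cues)  # no cue discriminates: default choice

# Hypothetical example: which of two suppliers to shortlist.
cues = ["delivered_on_time_before", "has_local_presence", "iso_certified"]
supplier_x = {"delivered_on_time_before": 1, "has_local_presence": 0}
supplier_y = {"delivered_on_time_before": 1, "has_local_presence": 1}

choice, inspected = take_the_best(supplier_x, supplier_y, cues)
# The second cue already discriminates, so the third is never examined:
# the decision is "fast" (few steps) and "frugal" (little information).
```

The point of the sketch is not that estimators literally run such code, but that the stopping rule captures why heuristic judgement is cheap: search ends at the first usable signal, whatever else might be known.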
See more posts from Gregor Grzeszczyk on LinkedIn.