Regulators responded to pre-crisis analytical failures by demanding far more extensive information from financial firms and by increasing the number of highly skilled personnel devoted to risk analysis. The most notable example of this new regulatory approach is stress testing, which uses complex statistical models to assess whether large banks hold sufficient capital reserves to survive a hypothetical severe recession.
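To make the mechanics concrete, the sketch below illustrates the core arithmetic of a capital stress test: project losses and revenues under a stressed scenario, then check whether the post-stress capital ratio stays above a regulatory minimum. This is a minimal toy model, not the supervisory methodology itself; actual exercises such as the Federal Reserve's CCAR/DFAST use far richer models of revenues, losses, and balance-sheet dynamics. The function names, balance-sheet figures, and loss rate here are illustrative assumptions, and the 4.5% threshold corresponds to the Basel III CET1 minimum.

```python
from dataclasses import dataclass

@dataclass
class BankBalanceSheet:
    cet1_capital: float          # common equity tier 1 capital ($bn)
    risk_weighted_assets: float  # ($bn); held fixed here for simplicity
    loan_book: float             # total loans outstanding ($bn)

def stressed_capital_ratio(bank: BankBalanceSheet,
                           loss_rate: float,
                           pre_provision_revenue: float) -> float:
    """Project the post-stress CET1 ratio under one hypothetical scenario.

    loss_rate: cumulative credit-loss rate on the loan book over the
        stress horizon (e.g. 0.08 for a severe recession -- illustrative).
    pre_provision_revenue: net revenue earned over the horizon that can
        absorb losses before capital is depleted ($bn).
    """
    losses = loss_rate * bank.loan_book
    post_stress_capital = bank.cet1_capital + pre_provision_revenue - losses
    return post_stress_capital / bank.risk_weighted_assets

# Hypothetical bank: $120bn of CET1 capital against $1,000bn of
# risk-weighted assets and an $800bn loan book.
bank = BankBalanceSheet(cet1_capital=120.0,
                        risk_weighted_assets=1000.0,
                        loan_book=800.0)

REGULATORY_MINIMUM = 0.045  # 4.5% CET1 floor under Basel III

ratio = stressed_capital_ratio(bank, loss_rate=0.08,
                               pre_provision_revenue=15.0)
print(f"Post-stress CET1 ratio: {ratio:.1%}")
print("PASS" if ratio >= REGULATORY_MINIMUM else "FAIL: capital shortfall")
```

Even this toy version makes the section's point visible: the answer the model produces depends entirely on the scenario and loss rates fed into it, so risks outside the modeled scenario never register.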
While improving the quantity of data and the quality of analytics is a positive step, the literature on ignorance in the physical and behavioral sciences suggests that these steps will not be sufficient unless well-known cognitive biases are also addressed. Individuals consistently display overconfidence in existing knowledge and avoid focusing on areas of ignorance. Nobel laureate Daniel Kahneman observed that overconfidence bias is “built so deeply into the structure of the mind that you couldn’t change it without changing many other things.” These biases concentrate attention on whatever answers existing analytic methods produce and deflect attention away from unknowns and data anomalies.
The general bias toward overconfidence in existing knowledge and inattention to unknowns is exacerbated in financial markets by two factors. First, risks in financial markets evolve organically in ways that evade the existing risk measurement system. Second, business and professional incentives typically reward these biases when the risks involved are low-frequency, high-impact events. A strong and dynamic risk detection system therefore requires organizational structures and incentive systems designed to counteract these common cognitive biases.