82nd International Atlantic Economic Conference

October 13 - 16, 2016 | Washington, USA

The virtues of recognizing ignorance in analyzing risk

Friday, October 14, 2016: 9:40 AM
William Lang, PhD, Promontory Financial Group, New York, NY
One remarkable feature of the 2008 financial crisis is that the vast majority of risk experts were largely unaware of the extreme fragility of our financial system. Even among analysts who saw the potential for distress in the housing market, few recognized that serious problems with mortgage defaults could destabilize the global financial system.

Regulators responded to pre-crisis analytical failures by demanding much more extensive information from financial firms and by increasing the number of highly skilled personnel devoted to risk analysis. The most notable example of this new regulatory approach is stress testing, which uses complex statistical models to assess whether large banks hold sufficient capital to survive a hypothetical severe recession.
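As a rough illustration of the mechanics only, the sketch below applies hypothetical stressed loss rates to a stylized bank balance sheet and checks whether the post-stress capital ratio clears an assumed regulatory minimum. All figures, loss rates, and thresholds are invented for illustration; actual supervisory stress tests are far more elaborate (multi-year projections, risk-weighted assets, revenue models).

```python
# Toy capital stress test. Every number below is an illustrative
# assumption, not a parameter of any actual supervisory model.

STRESSED_LOSS_RATES = {            # hypothetical loss rates in a severe recession
    "residential_mortgages": 0.08,
    "commercial_loans": 0.06,
    "trading_assets": 0.15,
}

MIN_CAPITAL_RATIO = 0.045          # assumed regulatory minimum (illustrative)

def stress_test(exposures: dict, capital: float) -> tuple[float, bool]:
    """Apply stressed loss rates to each exposure class and report the
    post-stress capital ratio and whether it clears the assumed minimum.
    Simplification: the ratio uses total assets, not risk-weighted assets."""
    losses = sum(amount * STRESSED_LOSS_RATES.get(asset, 0.0)
                 for asset, amount in exposures.items())
    post_stress_capital = capital - losses
    ratio = post_stress_capital / sum(exposures.values())
    return ratio, ratio >= MIN_CAPITAL_RATIO

# Hypothetical bank: $100B of exposures, $15B of capital
bank = {"residential_mortgages": 60e9,
        "commercial_loans": 30e9,
        "trading_assets": 10e9}
ratio, passes = stress_test(bank, capital=15e9)
print(f"Post-stress capital ratio: {ratio:.2%}, passes: {passes}")
```

The sketch also shows where the paper's concern bites: the test can only stress the loss rates the modeler thought to write down, so risks outside the assumed scenario set remain invisible.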

While improving the quantity of data and the quality of analytics is a positive step, the literature on ignorance in the physical and behavioral sciences suggests that these steps will not be sufficient unless well-known cognitive biases are also addressed. Individuals consistently display overconfidence in existing knowledge and avoid focusing on areas of ignorance. Nobel laureate Daniel Kahneman has noted that overconfidence bias is “built so deeply into the structure of the mind that you couldn’t change it without changing many other things.” These biases concentrate attention on whatever answers existing analytic methods produce and deflect attention away from unknowns and data anomalies.

The general bias toward overconfidence in existing knowledge and inattention to unknowns is exacerbated in financial markets by two factors. First, risks in financial markets evolve organically in ways that evade the existing risk measurement system. Second, business and professional incentives typically reward these biases in the context of low-frequency, high-impact risks. A strong and dynamic risk detection system therefore requires organizational structures and incentive systems designed to counteract these common cognitive biases.