Mutually Assured Delusion (MAD)

By Judith Curry | Climate Etc. | November 5, 2013

Groupthink: A pattern of thought characterized by self-deception, forced manufacture of consent, and conformity to group values and ethics.

Groupthink: Collective Delusions in Organizations and Markets, by Roland Benabou, published in the Review of Economic Studies.  Benabou also has a talk (ppt slides) on this subject.

First, a definition of groupthink (from the ppt slides):

Janis’s (1972) eight symptoms [of groupthink]:

  • illusion of invulnerability
  • collective rationalization
  • belief in inherent morality
  • stereotyped views of out-groups
  • direct pressure on dissenters
  • self-censorship
  • illusion of unanimity
  • self-appointed mind guards

Sound like any groups that we know? If you are on different ‘sides’ of the AGW debate, you may be evaluating the IPCC and anthropowarmists against these criteria, or you may be evaluating the opposition against them. While both groups seem to be subject to the first 4 symptoms, I would say that the IPCC and anthropowarmists have a lock on the last 4 symptoms.

Excerpts from the paper:

To analyze these issues, I develop a model of (individually rational) collective denial and willful blindness. Agents are engaged in a joint enterprise where their final payoff will be determined by their own action and those of others, all affected by a common productivity shock. To distinguish groupthink from standard mechanisms, there are no complementarities in payoffs, nor any private signals that could give rise to herding or social learning. Each agent derives anticipatory utility from his future prospects, and consequently faces a tradeoff: he can accept the grim implications of negative public signals about the project’s value (realism) and act accordingly, or maintain hopeful beliefs by discounting, ignoring or forgetting such data (denial), at the risk of making overoptimistic decisions.

The key observation is that this tradeoff is shaped by how others deal with bad news, creating cognitive linkages. When an agent benefits from others’ over-optimism, his improved prospects make him more accepting of the bad news which they ignore. Conversely, when he is made worse off by others’ blindness to adverse signals, the increased loss attached to such news pushes him toward denial, which is then contagious. Thinking styles thus become strategic substitutes or complements, depending on the sign of externalities (not cross-partials) in the interaction payoffs. When interdependence among participants is high enough, this Mutually Assured Delusion (MAD) principle can give rise to multiple equilibria with different ‘social cognitions’ of the same reality. The same principle also implies that, in organizations where some agents have a greater impact on others’ welfare than the reverse (e.g., managers on workers), strategies of realism or denial will ‘trickle down’ the hierarchy, so that subordinates will in effect take their beliefs from the leader.

JC note: This last sentence highlights one of the problems with AGW advocacy statements by professional societies: they amplify groupthink.

[…]

The intuition for what I shall term the ‘Mutually Assured Delusion’ (MAD) principle is simple. If others’ blindness to bad news leads them to act in a way that is better for an agent than if they were well informed, it makes the news not as bad, thus reducing his own incentive to engage in denial. But if their avoidance of reality makes things worse than if they reacted appropriately to the true state of affairs, future prospects become even more ominous, increasing the incentive to look the other way and take refuge in wishful thinking. In the first case, individuals’ ways of thinking are strategic substitutes; in the latter, they are strategic complements. It is worth emphasizing that this ‘psychological multiplier’, less than 1 in the first case and greater in the second, arises even though agents’ payoffs are separable and there is no scope for social learning.
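JC note: As a toy illustration of this principle (this is not Benabou’s formal model; the cost distribution and the `spillover` and `gain` numbers below are invented for the sketch), consider agents who ignore bad news whenever the anticipatory relief outweighs their private cost of acting on distorted beliefs, with the painfulness of the news depending on how many others are already in denial:

```python
import numpy as np

rng = np.random.default_rng(0)
# Heterogeneous private costs of acting on over-optimistic beliefs.
costs = rng.uniform(1.2, 2.2, 10_000)

def best_response(x, spillover, gain):
    """Fraction of agents choosing denial when a fraction x of the others
    already ignore the bad news. With spillover > 0 (negative externality),
    others' blindness makes the news more painful; with spillover < 0
    (positive externality), their optimism cushions it."""
    pain = 1.0 + spillover * x                  # severity of facing the news
    return float(np.mean(gain * pain > costs))  # deny if relief beats the cost

def equilibrium(x0, spillover, gain, iters=200):
    """Iterate best responses from an initial 'belief climate' to a fixed point."""
    x = x0
    for _ in range(iters):
        x = best_response(x, spillover, gain)
    return round(x, 3)

# Positive spillovers: denial is self-limiting (strategic substitutes);
# every starting point lands on the same equilibrium.
print([equilibrium(x0, spillover=-0.5, gain=1.6) for x0 in (0.0, 0.5, 1.0)])

# Negative spillovers: denial is contagious (strategic complements); the same
# fundamentals support both an all-realist and an all-denial equilibrium.
print([equilibrium(x0, spillover=2.0, gain=1.0) for x0 in (0.1, 0.35, 1.0)])
```

With positive spillovers the psychological multiplier is less than 1 and the iteration contracts to a single equilibrium; with negative spillovers it exceeds 1 past a tipping point, and which ‘social cognition’ prevails depends only on where the group starts.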

Proposition 1 shows that the scope for contagion hinges on whether over-optimism has positive or negative spillovers. Examples of both types of interaction are provided below, using financial institutions as the main illustration.

Limited-stakes projects, public goods: The first scenario characterizes activities with limited downside risk, in the sense that pursuing them remains socially desirable for the organization even in the low state where the private return falls short of the cost.

High-stakes projects: The second scenario corresponds to ventures in which the downside is severe enough that persisting has negative social value for the organization. In such contexts, the greater is other players’ tendency to ignore danger signals about ‘tail risk’ and forge ahead with the strategy (accumulating yet more subprime loans and CDOs on the balance sheet, increasing leverage, setting up new off-the-books partnerships), the deeper and more widespread the losses will be if the scheme was flawed, the assets ‘toxic’, or the accounting fraudulent. Therefore, when red flags start mounting, the greater is the temptation for everyone whose future is tied to the firm’s fate to also look the other way, engage in rationalization, and ‘not think about it’.
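JC note: In terms of the toy sketch above, the limited-stakes case is the `spillover < 0` branch, where others’ over-optimism cushions the bad news, and the high-stakes case is the `spillover > 0` branch, where it deepens the losses. A hypothetical high-stakes run, reusing the invented `equilibrium` helper:

```python
# High-stakes venture with mounting red flags (made-up numbers): a modest
# initial pocket of denial tips the whole organization into blindness.
print(equilibrium(0.3, spillover=3.0, gain=1.0))   # -> 1.0
```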

The proposition’s second result shows how cognitive interdependencies (of both types) are amplified, the more closely tied an individual’s welfare is to the actions of others.

Groupthink is thus most important for closed, cohesive groups whose members perceive that they largely share a common fate and have few exit options. This is in line with Janis’ (1972) findings, but with a more operational notion of ‘cohesiveness’. Such vesting can be exogenous or arise from a prior choice to join the group, in which case wishful beliefs about its future prospects also correspond to ex-post rationalizations of a sunk decision.
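JC note: Continuing the sketch (again with invented numbers), this amplification shows up as a tipping point that falls as interdependence rises: the more closely an agent’s welfare is tied to the others, the smaller the pocket of wishful thinking needed to capture the whole group.

```python
# Smallest initial share of deniers from which denial takes over (> 0.5),
# for increasing interdependence; None means denial always dies out.
for s in (1.0, 2.0, 4.0):
    tip = min((x0 for x0 in np.linspace(0, 1, 101)
               if equilibrium(x0, spillover=s, gain=1.0) > 0.5), default=None)
    print(s, tip)
```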

A first alternative source of group error is social pressure to conform.  For instance, if agents are heard or seen by both a powerful principal (boss, group leader, government) and third parties whom he wants to influence, they may just toe the line for fear of retaliation.

Self-censorship should also not occur when agents can communicate separately with the boss, who should then want to hear both good and bad news. There are nonetheless many instances where deliberately confidential and highly credible warnings were flatly ignored, with disastrous consequences for the decision-maker.

A second important source of conformity is signaling or career concerns. Thus, when the quality of their information is unknown, agents whose opinion is at odds with most of those already expressed may keep it to themselves, for fear of appearing incompetent or lazy. Significant mistakes in group decisions can result in contexts where differential information is important, if anonymous communication or voting is not feasible.

This paper developed a model of how wishful thinking and reality denial spread through organizations and markets. In settings where others’ ignorance of bad news imposes negative externalities (lower expected payoffs, increased risk), it makes such news even worse and thus harder to accept, resulting in a contagion of willful blindness. Conversely, where over-optimism has beneficial spillovers (thus dampening the impact of adverse signals), ex-ante avoidance and ex-post distortion of information tend to be self-limiting. This mechanism of social cognition does not rely on complementarities in technology or preferences, agents herding on a subset of private signals, or exogenous biases in inference; it is also quite robust. The Mutually Assured Delusion (MAD) principle is thus broadly applicable, helping to explain corporate cultures characterized by dysfunctional groupthink or valuable group morale, why willful ignorance and delusions flow down hierarchies, and the emergence of market manias sustained by new-era thinking, followed by deep crashes.

Patterns of Denial

The paper has an Appendix D: Patterns of Denial, listing 7 patterns of denial and illustrating with examples from Space Shuttle disasters and financial crises. Here I discuss these in context of the IPCC:

1. Preposterous probabilities. The 95% confidence level is arguably an example of this, although it is not exactly clear how to interpret the 95% in the context of probabilities.

2. New paradigms: this time is different, we are smarter and have better tools. Every case also displays the typical pattern of hubris, based on claims of superior talent or human capital. The ‘we are smarter and have better tools’ is reflected in the extensive reliance on climate models, and the labeling of anyone who disagrees as a ‘denier.’

3. Escalation, failure to diversify, divest or hedge. Wishful beliefs show up not only in words but also in deeds. The most vivid current example seems to be President Obama’s ramping up of a climate program in the U.S.

4. Information avoidance, repainting red flags green and overriding alarms. The ‘pause’ and its dismissal in the AR5 are a prime example of this one.

5. Normalization of deviance, changing standards and rationales. How do organizations react when what was not supposed to happen does, with increasing frequency and severity? An example of this is the moving goal posts for the pause. A few years ago, periods of pause/cooling longer than 10–15 years were not expected; that threshold was recently bumped to 17 years by Santer et al. The start date for the pause also seems to be moving towards 2001, away from the big El Niño of 1998.

6. Reversing the burden of proof.  See my essay on Reversing the Null Hypothesis for a discussion of this issue.

7. Malleable memories: forgetting the lessons of history. This one is particularly true of arguments linking AGW and extreme weather. Often, ‘remembering’ back to the 1950s or the 1930s is all that is required.

JC comments: I find Benabou’s analysis to be very insightful. Awareness of these symptoms and patterns is the first step toward inoculating against groupthink. Encouraging dissent is key to not falling into the groupthink trap.

While the examples provided are markets and public and private sector disasters, these ideas are broadly applicable to the different social ‘realities’ surrounding anthropogenic climate change. I’ve tried to find an analogous set of examples for the ‘denial’ of, say, U.S. Republicans and some oil companies, but could only come up with examples for patterns 3, 4, and 5 of the ‘patterns of denial’. Sort of changes which foot the ‘denier’ shoe fits best.
