
PRECIS: Learning from the evidence


Learning from Evidence in a Complex World by John D. Sterman. Am J Public Health. 2006 March; 96(3): 505–514.

Precis of the Precis

A “Manhattan Project” approach to crises (experts secretly provide advice to inform policy decisions) fails when success requires behaviour change throughout society!

Measurement in science introduces distortions, delays, biases, errors, and other imperfections, some known, others unknown, maybe unknowable. Groups of experts may help overcome these challenges, but they may be thwarted even if participants receive excellent information and reason well as individuals. They are prone to use defensive routines to save face, make untested inferences seem like facts, and advocate positions while appearing to be neutral. Some groups appear functional but suffer from groupthink: the members of a group mutually reinforce their current beliefs, suppress dissent, and seal themselves off from those with different views or possible disconfirming evidence.

Our judgments, decisions and behaviours are moulded not only by evidence, but by personal traits and frames, emotions, reflexes, unconscious motivations, and other nonrational or irrational factors, and by pressures created by the systems in which we act.

Implementation of social policy affects people. They seek to achieve their own goals and act to restore the balance that has been upset by the policy. Citizens become cynical, non-compliant and actively resistant when they think that those with power and authority manipulate the policy-making process for ideological, political, or pecuniary purposes. The problem intensifies when those in authority respond by pulling harder on their policy levers, thus creating a vicious cycle.

The power of ‘the system’ to shape behaviour is an opportunity for policymakers to design enabling systems in which ‘outsiders’ can contribute to decisions and actions … and to achieve what central command and control cannot.

PRECIS

“We shall be guided by the evidence” is a common refrain in the COVID-19 pandemic. Should policy on the complex problem of a contagious epidemic be left to the scientific or technical experts who can provide the evidence? A “Manhattan Project” approach (where experts secretly provide advice to inform policy decisions) fails when success requires behaviour change throughout society (as in COVID-19). The following explains this apparent paradox.

Measurement in science introduces distortions, delays, biases, errors, and other imperfections, some known, others unknown, maybe unknowable (so, for example, no one knows the real current incidence or prevalence of COVID-19 in South Africa). Measurement is an act of selecting a fraction of the possible experiences of which we are aware. We may not even be aware of the remote issues or effects of our decisions and therefore omit important data from the evidence.

Deep learning arises when evidence not only influences our policy decisions within existing mental models of reality, but also feeds back to alter those very models. The same information, interpreted by a different model, yields different decisions and different policies!

(Of course, many issues relating to the pandemic and other complex problems are outside the remit of science – ethics, for example – and that opens another can of worms).
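To make that point about mental models concrete, here is a toy sketch of my own (not from Sterman's article): the same week of case counts, read through a linear mental model versus an exponential one, produces opposite policy decisions. Every number, model, and threshold below is invented purely for illustration.

```python
# Minimal sketch (not from the article): the same evidence, interpreted
# through two different mental models, yields two different decisions.
# All figures here are hypothetical.
import numpy as np

days = np.arange(7)                              # one week of observations
cases = np.array([10, 13, 17, 22, 29, 38, 50])   # invented daily case counts

# Mental model A: growth is linear -> fit a straight line to the counts.
linear_fit = np.polyfit(days, cases, 1)
linear_forecast = np.polyval(linear_fit, 21)     # cases expected on day 21

# Mental model B: growth is exponential -> fit a line to log(counts).
exp_fit = np.polyfit(days, np.log(cases), 1)
exp_forecast = np.exp(np.polyval(exp_fit, 21))

THRESHOLD = 500  # hypothetical capacity at which restrictions are imposed
for name, forecast in [("linear model", linear_forecast),
                       ("exponential model", exp_forecast)]:
    decision = "impose restrictions" if forecast > THRESHOLD else "monitor only"
    print(f"{name}: day-21 forecast = {forecast:.0f} -> {decision}")
```

Under the linear model the day-21 forecast stays well below the threshold (“monitor only”); under the exponential model it blows past it (“impose restrictions”). Same evidence, two models, opposite policies.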

Groups of experts may help overcome these challenges, but they can be thwarted even if the participants receive excellent information and reason well as individuals. Participants use defensive routines to save face, make untested inferences seem like facts, and advocate their positions while appearing to be neutral. Individuals may make strong attributions about other members that are not grounded in evidence or may be irrelevant (groups of experts are prone to become dysfunctional and unable to arrive at consensus).

Some groups appear functional but suffer from groupthink: members of a group mutually reinforce their current beliefs, suppress dissent, and seal themselves off from those with different views or possible disconfirming evidence. Experts and decision makers may tacitly avoid publicly testing their beliefs; sometimes they explicitly communicate that they are not open to having their mental models or decisions challenged (or say they are but are not).

Even if a team were to overcome these challenges and were united in recommending the best course of action, the implementation of its decisions is often distorted by the asymmetric distribution of information in society, private agendas, and game playing within the whole system. Obviously, implementation failures can aggravate the situation: the decision-makers who are evaluating the outcomes of their decisions may not understand the ways in which those decisions were distorted, delayed, or derailed by other actors in the system.

Finally, because error is often costly (not just in monetary terms) and many decisions are irreversible, the pressure to persist with past decisions often overrides needed change or experimentation.

Our judgments, decisions and behaviours are influenced not only by evidence, but by personal traits and frames, emotions, reflexes, unconscious motivations, and other nonrational or irrational factors, and by pressures created by the systems in which we act.

Judgments are also strongly affected by the frame in which evidence is presented and by its obvious implications (e.g. to support or counter an existing policy). Scientists are not exempt from judgmental biases. Common problems in judgments made by experts and authorities include overconfidence (underestimating uncertainty), wishful thinking (believing desired outcomes are more likely than undesired ones), and confirmation bias (seeking evidence consistent with our preconceptions and policies – so-called ‘policy-based evidence making’).

Unexpected or seemingly aberrant behaviours in others are persistently attributed to their undisciplined personal habits or qualities, attitude, or failure to follow procedure (and, therefore, they are not credible). The reaction of those in authority then becomes scapegoating, blame, and policy focused on controls to force compliance. This authoritarian reaction provokes resistance and non-compliance, which strengthens the authorities’ erroneous belief that these ‘deviants’ (including scientists with alternative views) are unreliable and require still greater monitoring and control.

Social systems contain intricate networks of feedback processes. Implementation of policy decisions affects people. They seek to achieve their own goals and act to restore the balance that has been upset by the policy. Citizens become cynical, non-compliant and actively resistant when they think that those with power and authority manipulate the policy-making process for ideological, political, or pecuniary purposes.

Their reactions also generate intended and unintended consequences. The problem intensifies when those in authority respond to the feedback by pulling harder on their policy levers, thus creating a vicious cycle.
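This vicious cycle is easy to caricature in a few lines of code. The sketch below is mine, not Sterman's, and its parameters are invented purely to show the reinforcing-loop shape: pressure breeds resistance, resistance breeds pressure, and both escalate.

```python
# Minimal sketch (not from the article) of the vicious cycle described above:
# authorities raise control pressure in proportion to observed non-compliance,
# while public resistance grows with perceived coercion.
# All parameters are invented for illustration.
pressure, resistance = 1.0, 0.2   # arbitrary starting levels
GAIN_AUTHORITY = 0.5              # how hard authorities pull the lever per unit of non-compliance
GAIN_PUBLIC = 0.4                 # how strongly coercion breeds further resistance

for step in range(10):
    non_compliance = resistance                  # resistance shows up as non-compliance
    pressure += GAIN_AUTHORITY * non_compliance  # authorities respond by tightening controls
    resistance += GAIN_PUBLIC * pressure         # tighter controls provoke more resistance
    print(f"step {step}: pressure={pressure:.2f}, resistance={resistance:.2f}")

# Both quantities escalate without limit: each side's 'corrective' response
# feeds the other, which is the signature of a reinforcing feedback loop.
```

However the gains are tuned, so long as each side keeps reacting to the other the escalation continues; the leverage lies in changing the structure of the loop, not in pulling the lever harder.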

The power of ‘the system’ to shape behaviour is an opportunity for policymakers. It is an opportunity to focus efforts where they have the highest leverage: not to control people, but to design enabling systems in which people outside the formal decision-making structures can contribute to decisions and actions … and to achieve what command and control cannot.

Roger Stewart

24 May 2020

