Psychological and Cultural Biases Distort Fracking Debates

Energy law is the product of energy politics, and politics can be emotional and contentious.  At the same time, many energy policy disputes turn on questions amenable to scientific study — the fight over acid rain in the 1980s, the long-running battle over climate change, and the current fight over hydraulic fracturing (“fracking”), to name a few.

We commonly look to academic and government studies to play a constructive role in policy debates by adding expertise and a point of view that is not tethered to a particular favored outcome. As noncombatants, third-party experts ought to be able to act as honest brokers in policy debates; however, some common psychological and cultural biases can undermine that task.

In a perfectly rational world, we would adjust our beliefs about what is true – for example, about the magnitude of the risks posed by emissions of greenhouse gases or hydraulic fracturing – as scientific study provides us with new information about those risks. Instead, we commonly assimilate new information in biased ways, crediting studies that support our positions and discrediting those that don’t.

Psychologists attribute this phenomenon to “confirmation bias,” the notion that people are motivated to defend and protect cherished beliefs, and so will assimilate and interpret new information in ways that protect those beliefs. Psychologist Raymond Nickerson describes this process as an unconscious analog to the way trial lawyers build a case: a kind of “unwitting selectivity in the acquisition and use of evidence.”

Cultural anthropologists, for their part, offer a slightly different explanation of our biased assimilation of new information. They point to our prior “cultural commitments,” which they say shape our beliefs about what is true. We are each psychologically committed to our own social identity, which in turn is tied to our group memberships and our ideology. These commitments prevent the rational processing of information on public policy matters. We rely on experts, but we trust only those experts who “share our values”; and we assess whether an expert shares our values based, in part, on the content of the expert’s opinion. This is the phenomenon that Dan Kahan and his colleagues at the Yale Law School’s Cultural Cognition Project call the “cultural cognition of risk.”

Thus, for example, a dwindling number of climate change skeptics work to discredit climate science, despite the clear scientific consensus that human activity drives climate change. These skeptics include not only those with an economic incentive to oppose climate science (such as members of the energy industry, for whom confirmation bias may be at work), but also many ideological conservatives with no such incentive (whose cultural identity may be driving their skepticism).

Similarly, opponents of fracking confidently assert that it “inevitably” leads to drinking water contamination, despite the accumulating academic studies indicating that contamination is likely very rare.  Indeed, it is not uncommon to see proponents and opponents of fracking cite the very same anecdotes and studies in support of their claims about fracking’s risks.

This problem of biased assimilation may be exacerbated by “framing effects.” We create taxonomies, or mental stories, into which we fit new information, including new information about climate change or fracking. These taxonomies include heroes and villains, imbuing our mental stories with a moral component.

It is easy to see how this kind of framing might influence our assimilation of new information about climate change or shale gas production. There is a long history of framing political conflict over energy policy as “energy versus the environment,” “people versus profits,” and “fossil fuels versus clean energy.” Often, these kinds of associations are not conscious choices; to the contrary, they are a function of how the human brain stores (and recalls) information. After we hear or read news stories about the Exxon Valdez accident and the Deepwater Horizon spill, our brains may develop neural connections between the regions that store information about oil and gas companies and those that store information about pollution.

Thus, when proponents of shale gas production tout the relative environmental benefits of clean, inexpensive natural gas, they are running headlong into these framing effects. The environmental battles of the past often pitted the forces of environmentalism against the “fossil fuels” industry, creating associations in our minds among coal, oil, and gas that impede the efforts of fracking’s proponents to draw environmental distinctions between those fuels.

All of which puts elected politicians in a difficult position. On the one hand, they are accountable to voters for their jobs, not to science; on the other, they may suspect that scientific truths will win out in the end, and that they pander to unsupported popular beliefs at their own risk. Indeed, if recent polls are any indication, the scientific consensus over climate change may already be slowly finding its way into public opinion.

But the dialogue between experts and the public over fracking is not nearly as far along. Politicians facing strong public pressure over fracking, such as Governor Cuomo of New York, ought to do their best to avoid succumbing to confirmation bias or the cultural cognition of risk by (i) recognizing that the scientific debate over fracking is much more circumspect and careful than the public debate, (ii) delegating to technical experts the task of developing the factual basis of any policy decision, and (iii) avoiding staking out a position on the risks of fracking in front of constituent groups before reviewing the scientific evidence.

In the midst of energy policy debates that emit more heat than light, it can be difficult for policymakers to muster the will to base their decisions on good science and a broad view of relative risks.  Recognizing the biases at work in the public debate can help policymakers insulate themselves from those biases, and make better policy decisions.
