Transmuting language: changing “risks” to “safety”

What is the basis of the claim by proponents of nuclear power that nuclear power plants can be operated safely?

For decades, nuclear proponents have defended the safety of nuclear plants using what they claimed were mathematically sound probability estimates of low risk. Engineers provided estimates of the probability of failure for every component in the plant’s operating systems. By chaining all of these probability estimates together, the engineers ended up with a final estimate of the overall probability that the plant could be run without endangering the public.
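The arithmetic behind such claims is worth seeing in miniature. The sketch below is purely illustrative: the component names and failure probabilities are invented, and real probabilistic risk assessments use elaborate fault and event trees rather than a four-line product. But it shows how a chain of small numbers, multiplied together under an assumption of independence, produces a vanishingly small headline figure.

```python
# Toy illustration of how component failure probabilities get chained into
# a single headline number. All names and probabilities below are invented.

# Assumed annual failure probabilities for one accident sequence:
# every one of these safety systems must fail for the accident to occur.
failure_probs = {
    "coolant_pump": 1e-3,
    "backup_pump": 1e-2,
    "emergency_diesel_generator": 5e-3,
    "operator_intervention": 1e-1,
}

# Treating the failures as independent, the sequence probability is simply
# the product of the individual probabilities.
sequence_prob = 1.0
for component, p in failure_probs.items():
    sequence_prob *= p

print(f"Estimated probability of this accident sequence: {sequence_prob:.1e} per year")
# -> 5.0e-09 per year, i.e. roughly once in 200 million reactor-years
```

The reassuringly tiny final number is only as good as the list of sequences the analysts wrote down and the assumption that the failures are independent; a single common cause, such as a flood that disables the pumps and the generators at once, breaks both.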

To no one’s surprise, the nuclear industry and government regulators insisted that this process proved mathematically that nuclear plants posed a very small threat. There were serious critics of this claim, but they struggled to make themselves heard over the industry/government juggernaut.

But the safety engineers were unable to estimate the probabilities of what actually happened in one accident after another, from Three Mile Island to Chernobyl to Fukushima. What was missing from all of the fanciful calculations of probability was any acknowledgement of the fundamental impossibility of predicting in advance everything that could go wrong, an admission that would have been gravely damaging to the industry’s public relations efforts to sell “safety.”

A new report from the EU’s European Environment Agency (EEA) demolishes the probabilistic foundations of the nuclear industry’s safety calculations. The 750-page EEA report (Late Lessons from Early Warnings) draws on twenty case studies of the launching of controversial technologies. The chapter on nuclear power, “Late lessons from Chernobyl, early warnings from Fukushima,” concludes that the industry’s claims about safety are based on a deep philosophical mistake. The report shows how the commercial nuclear industry and government regulatory agencies (AEC, NRC) have consistently transmuted the great uncertainties about operating this unforgiving technology into the reassuring certainty of probability estimates.

The failure to predict severe accidents ought to be embarrassing, to say the least, if not a deal breaker for any country considering building nuclear plants. But instead of openly admitting the philosophical unreliability of using probabilities to estimate the risk of operating such complex machines, nuclear proponents use a wonderful phrase that covers all of their sins of probabilistic omission: a “beyond design basis accident.”

The term is used, according to the Nuclear Regulatory Commission’s glossary,

as a technical way to discuss accident sequences that are possible but were not fully considered in the design process because they were judged to be too unlikely. (In that sense, they are considered beyond the scope of design-basis accidents that a nuclear facility must be designed and built to withstand.) As the regulatory process strives to be as thorough as possible, “beyond design-basis” accident sequences are analyzed to fully understand the capability of a design.

According to this definition, nuclear engineers foresaw all the accident sequences that could lead to the various “beyond design basis accidents” that have occurred, but chose not to “fully consider” them because the engineers “judged [these accident sequences] to be too unlikely.” Thus the failure was not a failure to anticipate the danger, but something entirely different: a failure of judgment about a danger that was known, but put aside as “unlikely.”

This definition raises the question of whether there is, or should be, another term to describe accidents that the engineers failed to recognize during the design process, a Dr. Seussian “beyond the beyond design basis accident.”

The term is one of those dexterous phrases that simultaneously affirms and denies the existence of a problem. If the probability estimates were correct, then there could never be a “beyond design basis accident.” But when a serious accident does occur, the term closes the discussion. Once a “beyond design basis accident” has taken place, regulatory bodies engage in ritualistic investigations that end up with recommendations for changes that bring the causes of the latest accident inside the boundaries of “design basis accidents.” Any questions about how it is possible for “beyond design basis accidents” to occur disappear into the probabilistic fog.

The EEA’s authors carefully explain the process of transmuting the uncertainties of “fundamental radiation protection science” into the “language of certainty” that describes “the regulation and operation of nuclear facilities.”

To begin with,

the nearer one gets to the fundamental science and engineering of complex technological systems, the greater the uncertainty and complexity

But there is a paradox when one turns to operating such systems:

yet the nearer one gets to regulation and operation, the greater the certainty and simplicity.

There is a vast but essentially invisible qualitative change as we move from the uncertainties of science to the world of regulation and operation: “…somewhere along this continuum, uncertainty has been translated into certainty, and risk has been translated into ‘safety’…”

Turning “risk” into “safety” is the kind of linguistic transmutation that Orwell described in 1984 as the essence of his totalitarian language Newspeak. But recognizing this transformation still leaves the question: “when, how, and why does this transformation happen?”

We fall into error because our attempts at estimating probabilities are based on prior assumptions. We cannot estimate the probabilities of things we either cannot imagine in the first place, or choose to consider “unlikely” even when we are aware of potential problems.

Given the degree of uncertainty and complexity attached to even the most tightly framed and rigorous nuclear risk assessment, attempts to weight the magnitude of accident by the expected probability of occurrence has proven problematic, since these essentially theoretical calculations can only be based on sets of pre-conditioning assumptions.
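A crude, hypothetical calculation makes the point about “pre-conditioning assumptions” concrete. The numbers below are invented; the only thing they illustrate is that the published risk figure is conditional on which failure modes the analysts chose to model, and that a single omitted mode can dominate the true risk.

```python
# Invented numbers, purely to illustrate how sensitive the headline risk
# figure is to which failure modes the analysts chose to include.

# Per-year accident probability from the sequences the design team modelled.
modelled_risk = 5e-9

# A single unmodelled common-cause event (say, a flood that disables every
# backup system at once), assigned a probability only after it has happened.
unmodelled_event = 1e-4

total_risk = modelled_risk + unmodelled_event
print(f"Modelled risk: {modelled_risk:.1e} per year")
print(f"Risk including the omitted mode: {total_risk:.1e} per year")
print(f"Underestimate: a factor of about {total_risk / modelled_risk:,.0f}")
```

However careful the arithmetic inside the model, the result says nothing about what was left outside it.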

In the politely refined language that characterizes most government reports critical of the status quo, the authors conclude that experience has shown us that probability assessment is an unreliable basis for understanding the risks of nuclear power plants:

With its failure to plan for the cascade of unexpected beyond design-base accidents, the regulatory emphasis on risk-based probabilistic assessment has proven very limited. An urgent re‑appraisal of this approach, and its real-life application seems overdue. [my emphasis]

But gaining a better understanding of the problems of relying on probability estimates is only the beginning of what the authors recommend. Their final conclusion should send shivers down the spine of anyone with a personal, financial, or political investment in building new nuclear plants:

In the context of current collective knowledge on nuclear risks, both the regulation of operating nuclear reactors and the design-base for any proposed reactor will need significant re‑evaluation. [my emphasis]

But re-evaluation by whom? The industry and government regulators who designed the current fatally flawed system? The authors of the EEA report recognize that their recommendation is highly political. In the last paragraph of the nuclear power chapter, the authors state that any re-evaluation has to include significant public participation, a level of participation that was not only not allowed, but was basically unthinkable during the industry’s first decades:

…it is clear that European public needs to play a key role in taking these critical, social, environmental and economic decisions (8). Here, public values and interests are central, and the role of public dialogue and the participatory practices that enable it are core to the building of mutual understanding between European states, governments, industry and people. If carried out in a truly involving way, the integration of public, policy, and expert scientific knowledge allows for greater accountability, transparency, and much better take-up of necessary change and improved long-term likelihood of problem resolution. This conclusion mirrors those from many chapters in this publication — from leaded petrol to nanotechnology: that wider public engagement in choosing strategic innovation pathways is essential. [my emphasis]
