Failure is the only option

My colleague Marlen Ronquillo’s Wednesday, June 8 column (“Geology-challenged pols: Curb your BNPP enthusiasm”) sketched out the main geological arguments against the resurrection of the zombie Bataan Nuclear Power Plant (BNPP): that it is located on the slope of an at least hypothetically active volcano (Mount Natib), is within potential striking distance of a definitely active volcano (Mount Pinatubo), and is near a major seismic fault (the Lubao Fault).

As I said in last week’s column on this topic, I don’t think the “geological hazard” argument against the BNPP is the strongest one available, although it is not entirely invalid. There are credible arguments on both sides: that the risk is significant, and that it is not. Thus, the only conclusion that can be drawn with certainty at this stage is that the risk is not zero, but also that it is indefinite. Geological events – earthquakes, volcanic eruptions – are notoriously unpredictable, so “uncertain but not zero” may be the best assessment available.

So why am I bringing it up here? Because it serves a useful purpose. Marlen’s argument, and especially the negative reactions to it, highlight a huge flaw in nuclear policy that scholars have only recently identified. To fully understand that, we need to backtrack a bit and discuss the theory that inadvertently exemplifies it best.

“Normal” accidents

The “normal” accident, or “systemic” accident, is an idea first presented in a 1984 book by Yale University sociologist Charles Perrow, whose work was inspired by the 1979 nuclear accident at Three Mile Island. Perrow’s theory, which has since been repeatedly demonstrated by real-world events, is that some technological systems are so complicated that multiple unexpected failures are inherent in them and cannot be designed out of them.

Perrow explained that three criteria identify the systems that will inevitably fail in one way or another: first, the system is complex; second, it is tightly interconnected with one or more other complex systems; and third, it has the potential for catastrophic failure. The only way to minimize the impact of an unavoidable significant failure in these systems is to subject them to substantial overhaul, in most cases on an ongoing basis; the only way to avoid or prevent significant failure altogether is to ditch the technology entirely. The examples Perrow gives of systems that will experience “normal accidents” include air traffic, marine traffic, chemical plants and refineries, hydroelectric dams and, in particular, nuclear power plants.

The Chernobyl nuclear disaster was a “normal” accident. UN PICTURE

One of the key points underlined by Perrow is that the human management structure of complex systems is an inseparable part of the complexity of these systems. Even systems that operate relatively autonomously ultimately do so for human purposes, so some level of human interaction is always present. What Perrow discovered is that the human factor in normal accidents is always either the point of failure or the catalyst that turns one or more small failures into a catastrophic failure.

The normal accident theory has been an important tool for anti-nuclear activists since it was presented nearly 40 years ago, but of course the same knowledge is also available to nuclear advocates, who have had just as much time to refine their counterarguments to it. So while people like me argue that every incident or accident in a nuclear power system has been historically unique because such systems are inherently prone to unpredictable failure, advocates, drawing on the same knowledge, argue that the same incident never happens twice because “lessons are learned”.

Sociotechnical systems

It was only relatively recently that researchers were able to explain why this type of conflict persists, in two papers published in 2015 and 2019. The first, by Stanford University professor François Diaz-Maurin and Professor Zora Kovacic of the Universitat Oberta de Catalunya, published in the journal Global Environmental Change in March 2015, analyzed the differences between nuclear power narratives and actual outcomes.

In other words, assertions about economics, safety, reliability and so on, whether made by industry players, governments or advocates at various levels prior to the construction and operation of nuclear power plants, are not borne out in practice. The gap is constant in the sense that it always exists, although it varies in detail from case to case.

In their own words, what the researchers found is “…a systemic inconsistency between the way the story of nuclear energy is told and the experience gained after the implementation of nuclear energy according to that story. This inconsistency is due to the inconsistent levels of observation used by different social actors endorsing different perspectives. The implementation of nuclear energy has been based on the engineering viewpoint, focusing on the operation of the power plant considered apart from the wider environmental, economic and societal implications of the adoption of this technology.” This leads them to conclude that “the nuclear power controversy can be treated as a problem of contrasting beliefs and normative values in stark disjunction with experience”.

The second paper, written by a dozen scientists from the Institute of Nuclear Energy Safety Technology of the Chinese Academy of Sciences in Hefei and published in the journal PNAS in March 2019, examined the safety factors of nuclear energy in light of the renewed interest in the technology among developing countries, mainly in Asia.

Using a different approach, they arrive at the same place as the first research team: nuclear advocacy, whether individual or institutional, is largely (but of course not entirely) based on beliefs and values, and this is even more pronounced in developing countries, largely because they are inexperienced with the technology and dependent on potential suppliers for the supporting narrative. Remember that this comes from a group of scientific experts who are by nature and profession defenders of nuclear energy: “Nuclear power plants are complex socio-technical systems, and their safety has never been fully defined. We argue that social aspects, rather than mere technical measures, must be involved in ensuring nuclear safety,” they write.

And that brings us back to the geological risk factor for the BNPP, where the arguments on both sides turn out to be matters of belief based on slightly different things. Those who see a significant risk believe there is one because of the existence of a fault line and nearby volcanoes; those who do not believe there is no significant risk because those things have posed no demonstrated threat.

If the BNPP were a warehouse, a shopping center or a confectionery, the geological risk could perhaps be dismissed out of hand. But the BNPP is a nuclear power plant, a system which, if operated, will at some point experience a normal accident. One would hope it would not be a very serious accident, but an accident of some sort is inevitable. Failure is the only option, and the only uncertainty is its magnitude and scope. Seen in that light, the geological hazard becomes relevant and strengthens the case against the BNPP, albeit to a modest extent.

[email protected]

Twitter: @benkritz
