In my post last week, examining the evidence of major tsunamis in the past that was apparently ignored by the authorities, I offered several possible explanations for this failing: complacency, hubris, or simply human nature. In the light of reports that have emerged since, I was being generous. From the New York Times last weekend:
Two independent draft research papers by leading tsunami experts — Eric Geist of the United States Geological Survey and Costas Synolakis, a professor of civil engineering at the University of Southern California — indicate that earthquakes of a magnitude down to about 7.5 can create tsunamis large enough to go over the 13-foot bluff protecting the Fukushima plant.
Mr. Synolakis called Japan’s underestimation of the tsunami risk a “cascade of stupid errors that led to the disaster” and said that relevant data was virtually impossible to overlook by anyone in the field.
The Fukushima Daiichi nuclear plant was built on a 4-meter coastal bluff and “protected” by sea walls, largely designed for typhoon waves, not tsunamis, that were 5.5 meters in height. The tsunami that struck the plant was 14 meters high.
Masataka Shimizu, the president of Tokyo Electric Power, described the tsunami as “bigger than our expectations” and put the blame for the catastrophe on Mother Nature. “We can only work on precedent, and there was no precedent,” said Tsuneo Futami, a former Tokyo Electric nuclear engineer who was the director of Fukushima Daiichi in the late 1990s. “When I headed the plant, the thought of a tsunami never crossed my mind.”
These kinds of statements (and there are many more) are totally stupefying. This is not just stupidity; it is, arguably, criminal stupidity. In 2009, one of Japan’s leading seismologists, Yukinobu Okamura, warned that research called into question the assumptions underpinning the plant’s design, telling safety review meetings convened by the Nuclear and Industrial Safety Agency that “it’s now known that [tsunamis] of a completely different scale have come before.” Some of the evidence he was referring to came from research, published in 2001, showing that the earthquake of 869, of estimated magnitude 8.3, had caused an 8-meter-high tsunami that inundated the Sendai coastal plain to a distance of more than four kilometres inland; the evidence came from the extent of the sand deposits left by the tsunami. No precedent?
Now, to be fair, in 2002 the power company had taken action – “raising the level of an electric pump near the coast by 8 inches, presumably to protect it from high water, regulators said.”
Following Mr. Okamura’s repeated warnings, the authorities only agreed to consider the issue further – the topic of tsunamis was not on the agenda for the meeting. As reported in the Financial Times: “We did not ignore tsunami issues but we thought it was not the appropriate place to talk about it,” the company said. It added, however, that it was unsure what the proper place for discussion of tsunami risk might have been. See what I mean by stupefying?
The entire approach had been to choose historical precedents that were convenient, if not relevant; to make the totally false assumption that earthquake magnitude alone determines tsunami size; to fail abysmally to respond to expert evidence; and to attempt to bluster (a euphemism that I choose for the sake of objectivity) their way out through bureaucratic excuses. “A cascade of stupid errors.”
And there’s more:
The Japanese approach, referred to in the field as “deterministic” — as opposed to “probabilistic,” or taking unknowns into account — somehow stuck, said Noboru Nakao, a consultant who was a nuclear engineer at Hitachi for 40 years and was president of Japan’s training center for operators of boiling-water reactors.
“Japanese safety rules generally are deterministic because probabilistic methods are too difficult,” Mr. Nakao said, adding that “the U.S. has a lot more risk assessment methods.”
I can’t think of a more generous description of this than complete rubbish. Probabilistic methods of risk assessment are common practice in a vast variety of situations and are hardly rocket science. They simply involve taking the known variability in the key parameters that determine the outcome of a complex phenomenon and running a large number of possible combinations to yield a statistical range of outcomes. That range – event size against the probability of its occurring – can be plotted as a curve whose shape is typical of natural events: earthquakes, storms, floods – and tsunamis. A high probability (or frequency) of “typical,” modestly sized occurrences, but with the possibility of very large, “unusual” events. This is “the long tail” of rare events that are, nevertheless, consistent with the range of known parameters that conspire to cause them. The realistic possibility (particularly given the historical and geological records) of a 14-meter tsunami would not be far down that tail.
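To make the point that this is not rocket science, here is a minimal Monte Carlo sketch in Python of the approach described above. The heavy-tailed (lognormal) distribution and its parameters are purely illustrative assumptions of mine, not fitted to any real hazard data; a real assessment would derive the distributions from the historical and geological record.

```python
import random

# Illustrative Monte Carlo sketch of probabilistic hazard assessment.
# The distribution and its parameters are hypothetical, chosen only to
# show the long-tailed shape described in the text.
random.seed(42)

def simulate_tsunami_heights(n_trials: int) -> list[float]:
    """Draw tsunami run-up heights (metres) from a heavy-tailed
    lognormal distribution: mostly modest events, rare huge ones."""
    return [random.lognormvariate(0.5, 1.0) for _ in range(n_trials)]

heights = simulate_tsunami_heights(100_000)

# Exceedance probability: chance a simulated event overtops a given
# height -- here the 5.5 m sea wall versus a 14 m wave.
for threshold in (5.5, 14.0):
    p = sum(h > threshold for h in heights) / len(heights)
    print(f"P(height > {threshold} m) = {p:.4f}")
```

Even this toy version shows the essential output: a non-negligible probability far out in the tail, exactly the kind of number that should have informed the sea-wall design.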
This is not rocket science – it is not “too difficult.” It is a methodology that many of us use routinely in our professional work; simple implementations are even available as Excel-based software. “Too difficult”? “Too difficult” when the lives of hundreds of thousands of people and the safety of a major nuclear facility are at stake? This is Japan, for crying out loud. This would be pathetic if it weren’t so utterly callous.