> “But recall that this chapter is about layering, units, hierarchies, fractal structure, and the difference between the interest of a unit and those of its subunits. So it is often the mistakes of others that benefit the rest of us—and, sadly, not them. We saw that stressors are information, in the right context. For the antifragile, harm from errors should be less than the benefits. We are talking about some, not all, errors, of course; those that do not destroy a system help prevent larger calamities. The engineer and historian of engineering Henry Petroski presents a very elegant point. Had the Titanic not had that famous accident, as fatal as it was, we would have kept building larger and larger ocean liners and the next disaster would have been even more tragic. So the people who perished were sacrificed for the greater good; they unarguably saved more lives than were lost. The story of the Titanic illustrates the difference between gains for the system and harm to some of its individual parts.
> The same can be said of the debacle of Fukushima: one can safely say that it made us aware of the problem with nuclear reactors (and small probabilities) and prevented larger catastrophes. (Note that the errors of naive stress testing and reliance on risk models were quite obvious at the time; as with the economic crisis, nobody wanted to listen.)”
Another interesting argument along the same lines: had Hiroshima and Nagasaki not been bombed, Truman would have had no grounds to stop MacArthur from using bombs a hundred times more destructive in the Korean conflict.
It's a sobering thought regardless of one's opinion on the atomic bombings in Japan. The lesson was going to be learned one way or the other, and arguably humanity got off easy.