Hanford, if you haven’t heard of it, is one of the world’s largest nuclear waste sites. The plant closed after more than forty years of plutonium production, first for World War II as part of the Manhattan Project, later as Cold War tensions ebbed and flowed (and nuclear stockpiles only flowed). In 1989 the Department of Energy took up the immense responsibility of tidying the place.
I spoke with a lot of people about the long-running efforts to remediate Hanford — the huge cost overruns, the chronic safety concerns (internal and external), the general worry that no good technological solution is within reach. (I did find differing opinions on this last point, but they were certainly the exception.) Nobody would go on the record. A sensitive issue, I was told, which is often gentle press euphemism for FUBAR.
At the end of January, the European Environment Agency released a very long report entitled “Late lessons from early warnings: science, precaution, innovation.” In brief, the report discusses how past mistakes in innovation and invention might inform a more cautious future arc of technology. A philosophical and far less wonkish discussion rooted in the same idea appeared here on March 14, 2011, three days after the Fukushima disaster. The gist:
Japan’s present disaster may or may not be transformative, but what it will be is a reminder of how often failure comes, and with consequences we should expect, but do not. I wonder if it is possible to imagine a world in which failure is normalized, part of the narrative, expected and in which we choose our strategies to return positively even when things, as they say, fall apart.
Without belaboring the point: perhaps the problem of Hanford is not the result of a few specific policies, the specific wartime urgency, the specifics of new work on new technology. Perhaps the problem of Hanford is much larger and more far-reaching.