This was back when I still believed in Big Science and the space program excited and stimulated me. But it was still worth it...
Ah, I remember being a space groupie. My two favourite model kits as a kid were a Soyuz (complete with Yuri) and an Apollo lunar lander... of course I also had a USS Enterprise and a Klingon battle cruiser with pinlights that ran off an AA cell, but the real spacecraft were more fun to build :-) I was born in '58, the year of Sputnik; space exploration and I are siblings of a kind. Like Migeru, I have lost the faith; and alas I would go even further and say I can't even feel that it was worth it. Sometimes I think about where we might be now if all that money, brains and heroic effort had gone into sustainable technology R&D. Oh, the opportunity cost... Anyway I do remember the plutonium launch and the fuss about it. And I did understand the protesters' point of view. I'll try to explain why, despite my long personal and professional embedment within Big Space Science, I felt they had a point.
Going back to a NYT summary of the controversy for reference:
Safety precautions are clearly needed because plutonium is highly toxic if inhaled or ingested, causing cancer that can be fatal.
According to the Energy Department, which made the power generators, the ways the risk has been reduced start with the plutonium itself, which is pressed into a pellet about the size of a marshmallow. The plutonium is in ceramic form, which makes it insoluble in water and unlikely to break into a fine dust that could be inhaled.
''It's similar to your ceramic kitchen cookware,'' said James A. Turi, director of the office of special nuclear applications at the Energy Department.
[...]
Mr. Turi said the Energy Department has done about 100 tests over a decade to check the safety of the plutonium units, subjecting them to a variety of conditions they might encounter in a shuttle accident. In one, a fragment of a shuttle booster rocket was attached to a rocket sled and slammed into a power generator at a velocity of 266 miles per hour, with no release of fuel. The tests, the Energy Department says, show the units are highly resistant to damage.
The nub of the controversy is how resistant. NASA has said the highest probability of launch-area release of plutonium due to a shuttle accident is less than 1 in 2,500. The anti-nuclear groups disagree, saying the odds are as great as 1 in 430.
The two sides also disagree on the medical effects of a plutonium release. The groups estimate that it could cause thousands of fatal cancers. But the space agency says so little plutonium would be released that there would be no additional cancer deaths.
NASA's health-effects estimate is much more optimistic than a Federal interagency panel that evaluated Galileo. It reported that a launching pad accident could release enough plutonium to cause 80 cancer deaths eventually. The panel said that if the probe re-entered the atmosphere as it swung by the Earth, up to 2,000 cancer deaths could be caused by released plutonium.
Steven Aftergood, a senior research analyst with the Washington-based Federation of American Scientists, said the real issue was how much risk was acceptable. ''My own judgment is that the risk is small,'' he said, ''and the scientific payoff is large.''
Boyoboy, this is a trip down Memory Lane. The first thing that jumped out at me at the time was the self-consciously cosy metaphorical language being employed: about the size of a marshmallow (what could be more innocuous, soft and squishy, sweet and childishly domestic?), it's similar to your ceramic cookware ... trying to "talk down" to the public by assuring them that the technology was analogous to familiar and harmless domestic items. [In the light of recent studies on juvenile diabetes in the US, the marshmallow was maybe not such a good choice, but this was some years ago, before the Sugar Meme Wars.] This kind of soothe-speak always raises hackles...
NASA's stubborn insistence on a "success-oriented engineering" line, i.e. persisting in a highly optimistic worst-case scenario despite the findings of the interagency panel, did nothing to boost credibility. The informed portion of the public is by now very familiar with soothing optimistic claims from experts, and with the embarrassing history of error or deliberate falsehood in those claims. The Unsinkable Titanic, guaranteed to be proof against all collisions at sea. Thalidomide [cf Dark Remedy for the remarkable story of the coverup and the whistleblower Frances Kelsey of the FDA, who put her career on the line to protect pregnant women from the drug's teratogenic side effects; she was later awarded a civilian service medal by Pres. Kennedy, a rather different treatment from what she would receive today, I fancy]... The Challenger crash with its sordid backstory of launch schedules forced to suit a PR agenda and defective O-rings [Feynman tells the story very well in his autobiography]... my cultural hero E. Tufte discusses the impact of PowerPoint on NASA institutional culture and its relation to the loss of Columbia... In fact NASA has dodged quite a few bullets over the years:
The Mercury flights witnessed three emergencies, all related to reentry and splashdown. The hatch on MR-4 was blown prematurely after splashdown and astronaut Gus Grissom had to exit hurriedly and swim clear of his sinking craft. An emergency on MA-6 made it necessary for John Glenn to reenter the atmosphere uncertain if his heat shield was sufficiently secure to remain in place. On MA-7 Scott Carpenter overshot the designated splashdown area and a full hour intervened before ground control could be sure that he had landed and exited his spacecraft safely.
The Gemini and Apollo programs also had their share of difficulties. On Gemini 8, astronauts Armstrong and Scott experienced the first emergency to occur in space. After docking with an Agena rocket, the vehicle began to spin out of control. The crewmembers were able to escape by firing their retrorockets, returning to Earth 2 days ahead of schedule. On Apollo 11, the first moon landing, Commander Neil Armstrong was forced to take over control of the lunar module to avoid descending into a giant crater; a crater near-miss was also experienced by the crew of Apollo 16. On returning to Earth, the crew of Apollo 15 experienced a rough landing when one of their vehicular parachutes failed to deploy during final descent. The most critical U.S. emergency to date occurred on Apollo 13. With the spacecraft almost a quarter of a million miles from Earth, an oxygen tank exploded. The astronauts moved to the lunar lander for emergency return to Earth. Again, possible damage to the heat shield added to the concern during reentry.
There were also some dangerous situations during the Skylab series. Skylab 1 (unmanned) arrived in orbit with its meteorite/thermal shield torn away, with a solar wing broken off, and with the second solar wing jammed. The Skylab 2 crew had the unenviable job of trying to correct these problems so that the main habitat could be made operational. Astronaut Paul Weitz engaged in hazardous extravehicular activity in an unsuccessful attempt to release the jammed solar panel, after which Commander Charles Conrad attempted for 4 hr to dock with the damaged Skylab. The docking finally succeeded, but the crew never really knew until the end of the mission whether or not they would be able to undock for the return trip home. Working in extreme heat, the crew managed to deploy a parasol to shield the vehicle from the Sun, allowing the Skylab missions to proceed. On Skylab 3 an emergency flight home was contemplated for a time when a leak was detected in the command-module thruster.
Anyway, the moral of the story is that everyone knows NASA makes mistakes. Sometimes big ones.
But no one got really bent outta shape -- politically -- over NASA's history of near-misses (though there was a moment of national mediated hysteria over Challenger which lasted about 2 metaphorical seconds), because the risk involved was almost exclusively to the astronauts and they, like test pilots, volunteered -- nay, would have fought tooth and nail for the chance to fly a mission if that was what it took (I have personally known some of the candidates). So it was a classic case of voluntary adventurist high risk: they knew the job was dangerous when they took it. They get the glory, and they take the risk. Innocent bystanders are not involved -- except in a more holistic sense of social opportunity cost, as Gil Scott-Heron bitterly pointed out in "Whitey on the Moon" -- the showpiece Space Program that was supposed to prove the American willy far bigger and stiffer than the Soviet willy diverted a lot of funding that could have been used to alleviate hunger and suffering in the US.
"This decision demands a major national commitment of scientific and technical manpower, material and facilities, and the possibility of their diversion from other important activities where they are already thinly spread. It means a degree of dedication, organization and discipline which have not always characterized our research and development efforts. It means we cannot afford undue work stoppages, inflated costs of material or talent, wasteful interagency rivalries, or a high turnover of key personnel.
"New objectives and new money cannot solve these problems. They could in fact, aggravate them further--unless every scientist, every engineer, every serviceman, every technician, contractor, and civil servant gives his personal pledge that this nation will move forward, with the full speed of freedom, in the exciting adventure of space." (Excerpt from "Special Message to the Congress on Urgent National Needs")
(wiki)
At any rate the public was not exposed to risk -- not in an obvious way, not in a visible way -- in the course of space missions. This changed with the awareness of the plutonium power packs. And to understand the depth of feeling they engendered we have to talk about risk perception and ethics.
In my experience there seem to be four main factors in the perception of risk. One is the individual's personal "risk thermostat" -- some people are extremely risk-averse and throw away a tuna sandwich if it has been sitting out at room temperature for five minutes; others will eat it day-old and take their chances. The risk thermostat is a function of idiosyncratic factors like upbringing and probably some genetic markers, but also of age group, gender, culture, social roles, etc. The next three factors all interact. They are: the probability involved (what is the statistical chance that X will happen); the severity or magnitude of negative event X; and the degree to which assumption of this risk is "necessary" and/or "voluntary." A fifth factor I should at least mention is "exoticism" or unfamiliarity; we become inured to risk by familiarity with it, which is one of the "moral hazards" of highly hazardous occupations and installations. I consider it secondary to the main four, though others may disagree.
In other words, we sensibly do not worry about very low magnitude events with moderate to high probability; I could stub my toe if I do not wear steel-toed shoes, but the magnitude of the event is minor so I will continue to wear sandals and not feel like much of a daredevil. We do not worry too much about high magnitude events with very low probability; if I were to be struck by a sizable meteorite, I would probably be killed or seriously injured, but the odds on this are so low that I do not skulk from one bit of shelter to another for fear of impact. But in each case I am making personal decisions about how much risk to assume -- there are protective measures I could take (though their effectiveness may not be what I imagine it to be, which is a whole separate topic!). The risk from second-hand smoke is a textbook case: the worst-case event is dreadful (lung or other cancer) and quite probably life-shortening, the odds are relatively low, but the risk is imposed, not voluntary. For this reason health-conscious nonsmokers tend to get angrier or more anxious about second-hand smoke than they do about other risks -- such as driving a car daily, perhaps -- which offer a higher statistical probability of a worst-case outcome. Moreover, the second-hand smoke exposure is not only involuntary, it is imposed by the wilful action of another, identifiable person or persons who are "exporting costs" onto the nonsmoker as they reap benefits which the nonsmoker not only does not share, but usually doesn't understand or approve of. [The perception of risk is highly coloured by moral and ethical stances which determine "justifiability."]
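To make that interaction concrete, here is a toy sketch -- entirely my own, with made-up numbers and an arbitrary "outrage" multiplier, not any standard actuarial model -- of how an imposed risk can feel much larger than a voluntary one with worse raw odds:

```python
# Toy sketch of the risk-perception factors described above: probability,
# magnitude, and voluntariness. All figures and the amplification factor
# are invented purely for illustration.

def expected_harm(probability, magnitude):
    """The raw actuarial view: chance of the bad event times how bad it is."""
    return probability * magnitude

def perceived_risk(probability, magnitude, voluntary, outrage_factor=50):
    """Same ingredients, but an imposed risk gets multiplied by an
    'outrage' factor standing in for the involuntary-exposure effect."""
    harm = expected_harm(probability, magnitude)
    return harm if voluntary else harm * outrage_factor

# Hypothetical figures only:
driving = perceived_risk(probability=1 / 7_000, magnitude=1.0, voluntary=True)
passive_smoke = perceived_risk(probability=1 / 20_000, magnitude=1.0, voluntary=False)

print(f"daily driving (voluntary):    {driving:.6f}")
print(f"second-hand smoke (imposed):  {passive_smoke:.6f}")
# Even with lower raw odds, the imposed risk 'feels' larger once the
# voluntariness factor is weighed in.
```

The point is not the particular numbers but the shape of the calculation: the voluntariness term can easily dominate the raw odds.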
We can now apply this model to the plutonium launch and consider the sources of public grievance. The public was not consulted when the decision was made to launch plutonium, either in previous missions or in the mission over which the controversy erupted. Thus the risk was non-democratically imposed. There were assurances that the risk was very slight statistically... but 1/2,500 is not all that reassuring. When questioned, average Americans think their risk of dying in a car crash this year is about 1/70,000, but the real statistic is more like 1/7,000. This from Larry Laudan (The Book of Risks, popularised but interesting actuarial stats), who generalises: "Most of us tend to be comfortable with activities that carry annual risks of a more or less unpleasant nature smaller than 1 in 100,000 or 1 in 50,000. The slightly less risk-averse find that their habits and hobbies come with about a 1-in-10,000 chance of serious misfortune." 1/10,000 is four times "safer" than the best risk estimate NASA claimed at the time, and 1/10,000 is the comfort level of the less risk-averse quartile or so. The annual average risk (US figures) of death by cancer is about 1/500, which is enough to justify enormous NGO and governmental efforts to reduce it, and to sustain high levels of personal fear and anxiety about the disease; this is slightly safer than the odds claimed by the protesters.
In other words, the risk factors involved, even at the optimistic NASA estimate, were too high for the comfort zone of all but the most risk-tolerant demographic.
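For the number-minded, a quick back-of-envelope comparison of the odds quoted above (the figures are as cited in the text; the sorting and ratios are just arithmetic, and the labels are my shorthand):

```python
# Compare the probabilities quoted above against Laudan's rough
# 1-in-10,000 comfort threshold for the less risk-averse.

risks = {
    "Laudan comfort zone (most people)":      1 / 100_000,
    "Laudan comfort zone (less risk-averse)": 1 / 10_000,
    "NASA launch-release estimate":           1 / 2_500,
    "annual US cancer-death risk":            1 / 500,
    "protesters' release estimate":           1 / 430,
}

baseline = risks["Laudan comfort zone (less risk-averse)"]
for label, p in sorted(risks.items(), key=lambda kv: kv[1]):
    print(f"{label:42s} 1 in {1 / p:>9,.0f}   ({p / baseline:5.1f}x the 1-in-10,000 comfort level)")
```

Even NASA's own figure sits roughly four times outside that comfort zone, and the protesters' figure lands beyond the annual cancer odds that already drive major public-health spending.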
Moreover, the potential negative event was fairly major (2,000 deaths is not so very much smaller than the toll of the 9/11 attacks, which people got upset enough about to justify two invasions, the invalidation of much of the Constitution, and the bankrupting of the treasury). And the consequences were not limited to "volunteers" who signed up knowingly for the risk.
So if we step into the memespace of an angry protester threatening to sit on the launch pad, this person might expostulate: You tell me that the risk of a plutonium release that would actually kill people is very, very small and you know what you are doing. But my family lives in the potential contamination area if something goes wrong. We know that things have gone wrong before. We know that your "fail safe" systems have failed and people have died as a result. Look, if you come into my home and point a gun at my child and then tell me 'Don't worry, I have personally inspected this gun and I am an experienced marksman and I am sober and have no nervous tremors or motor control issues, and the odds that I will sneeze or have a seizure and accidentally pull this trigger or that the gun will spontaneously fire, are a million to one against,' I am still going to get bloody angry about this because who gave you the right to point this gun at my child in the first place? I never consented to this! And you are doing this just out of idle scientific curiosity, not to cure cancer or prevent a war or something that would be worth risking lives? Fuhgeddaboudit!
Severe negative potentiality, insufficiently low risk (even a million to one is not enough to prevent anger if the risk is imposed and the outcome severe), imposed vs voluntary, insufficient justification. Speaking as a nonsmoker :-) and as one who has seen some of the seamy underside of NASA and other big science institutions, I understand the protesters' point of view. I think they had a valid ethical complaint.
I think scientists like Aftergood, quoted above, did not help any with remarks like "My own judgment is that the risk is small and the scientific payoff is large." The inevitable perception was of him (and since he was an Official Spokesman, by implication "Scientists" in general) as an arrogant so-and-so who thinks he has the right to make life-and-death risk assessments for other people without consulting them, in the pursuit of his own personal obsession with astronomy and space science (which many people don't think are worth spending tax dollars on at all, let alone lives). Inevitably I'm reminded of Albright's dismissal of half a million Iraqi child deaths, "We think the price is worth it." Is it ever ethical for Person A to decide that it is "worth it" for Person B to pay a price for some supposed desideratum, especially if the desideratum benefits A a whole lot more than B? Should this not always be B's decision? Who gave the NASA team (or the science establishment) the right to take this risk with other people's lives, and then to proclaim that it was "worth it"? Bad PR, and more importantly, bad process.
[An aside: the economists' fantasy of "externalised costs" works very well to obfuscate the calculus of risk and ethics by pretending that toxicity and other negative byproducts just "go away" instead of, more accurately, "going elsewhere and being inflicted on others."]
A very vexing ethical question emerges. How, if we aspire to democracy and/or fairness, justice, glasnost, whatever we would like to call an "open society," do we ensure that risks which we assume as a polity are assumed only after full disclosure and full democratic process? How do we avoid the problem of governmental, military, scientific or technocratic elites or corporate marketeers subjecting the population to undisclosed risks? Believers in the Growth/Progress meme (not all, but some) sometimes talk about "the price of Progress" and dismiss broken lives, shattered livelihoods, destroyed lifeways, lost cultures and languages, extinct species, and all the rest as an acceptable price for some allegedly general benefit. However, it's notable that most of them are not the people paying the price, nor can they reasonably imagine themselves or their kin in that position. They are, as we might say, all in favour of experimental aircraft but not keen on being test pilots :-) [Eisner, CEO of Disney, purveys corporate junk food to the public in vast quantity at his resorts and playlands and I'm sure insists that this is perfectly good stuff; but he keeps dedicated organic gardeners and nutritionists on his personal staff and himself eats very little that is not organically produced on his own land... just to be on the safe side.]
A dignitarian [I'm still chewing on that one] or equitable society, I think, needs to adopt a nuanced understanding of risk and
- (a) tolerate voluntary risk that afflicts only the risk-taker, i.e. not become a smothering Nanny State always criminalising harmless (to others) behaviour For Our Own Good, yet
- (b) work strongly to ensure that risk is not "externalised," that risks are well understood, and that the populace has sufficient information and sufficient political power to do its own cost/benefit thinking and to reject risks which do not promise sufficient benefit to be "worth it."
In my personal utopia, people can bungee-jump, go swimming alone after a heavy lunch, skateboard and ride (motor)bikes without being fined for not wearing styrofoam bonnets, smoke whatever weed lights their pipe -- small risks should not be unduly magnified, and even taking large risks should be a human right; adventurism and even recklessness should be accepted as a social cost of freedom, so long as no other person is involuntarily placed at risk. But the reverse is also true: whoever profits from risk should share it.
In my personal utopia, the families of the board of directors and senior management of the waste incinerator or power or chem plant would be required by law to live full-time in the toxicity footprint of their plant. Meat producers should have to eat their own products, auto industry CEOs and elites should have to drive their own cars, and railway officials should have to ride their own trains to work. They reap the profits; they should expect to share the risks.
This would probably lead to a sudden burst of enthusiasm for "green chemistry," not to mention wind and PV projects, among investors and upper management :-) Exposing others to risk without their knowledge or against their will should be a valid object of litigation and prohibition [i.e. though I fully support everyone's right to smoke or use whatever rec substances make them happy, I also support smoking bans in areas where people congregate out of necessity, like bus stations etc.]. This is not everyone's idea of utopia -- it is too socialist to be libertarian and too libertarian to be socialist -- but it would suit me fine and, I think, offer a lower untimely-mortality rate and more personal happiness than our present system :-)
Sheesh! So much for getting anything else done this evening. Migeru has a habit of raising questions/issues that throw my brain into Overdrive (not to mention the recent mug of Lapsang Souchong that just about made my teeth rattle -- surprised BATF hasn't put it on the CS list)... ah, caffeine and spirited debate, at least it's safer than vodka and handguns :-) Thanks to Migeru for rattling my cage on this issue, which I've been meaning to write about but not getting around to.