Computer Bombs

by W. Wayt Gibbs
Scientific American, March 1997

To those who handle nuclear weapons--and to anyone within several hundred kilometers of them--two questions are paramount. First, will a warhead, having been trucked around from one stockpile to another for 20 years, go off accidentally? Second, will it explode as intended when used in anger? The physicists at the U.S. Department of Energy's weapons laboratories responsible for certifying that hydrogen bombs are both safe and reliable have not been able, since 1992, to check their calculations by either damaging or detonating one underground. If the Senate ratifies, and India reverses its opposition to, the Comprehensive Test Ban Treaty signed by the U.S. last September, they may never be able to do so again. How will they know for certain?

The DOE's answer, a plan called science-based stockpile stewardship, is to use the fastest supercomputers yet devised to simulate nuclear explosions along with all the important changes that occur to weapons as they age. The plan has stirred vigorous debate among arms-control advocates, military strategists and, most recently, university researchers over whether the approach is cost-effective, feasible and wise.

The DOE expects that stockpile stewardship will cost about $4 billion a year, some $400 million more than the DOE's annual weapons budget during the cold war, according to Christopher E. Paine, a nuclear arms analyst with the Natural Resources Defense Council. The agency intends to spend more than $2 billion on new experimental instruments, including the National Ignition Facility. These devices will attempt, using lasers, x-rays and electrical pulses, to measure how bomb components (except for the radioactive pits) behave in conditions similar to those in a nuclear explosion. Another $1 billion or so will go to the Accelerated Strategic Computing Initiative (ASCI) to buy three supercomputers, each of a different design, and to develop computer models based on, and tested against, experimental data. "This level of simulation requires high-performance computing far beyond our current level," the ASCI program plan asserts, because "these applications will integrate 3-D capability, finer spatial resolution and more accurate and robust physics."
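As a rough illustration of why spatial resolution matters in such codes, consider a toy one-dimensional diffusion solver. This sketch was invented for this article and bears no relation to the classified weapons codes: it simply shows that the same physics, run on a grid too coarse to represent a narrow feature, gives a different answer than a fine-grid run of the identical model.

```python
import numpy as np

def diffuse(n_cells, t_final=2e-3, kappa=1.0):
    """Explicit finite-difference solver for 1-D diffusion on [0, 1].

    A toy stand-in for the resolution question: identical physics,
    run on grids of different fineness.
    """
    dx = 1.0 / n_cells
    dt_stable = 0.4 * dx * dx / kappa       # explicit-scheme stability limit
    steps = max(1, int(np.ceil(t_final / dt_stable)))
    dt = t_final / steps                    # land exactly on t_final
    x = np.linspace(0.0, 1.0, n_cells + 1)
    u = np.exp(-((x - 0.5) / 0.01) ** 2)    # a spike narrower than the coarse grid
    for _ in range(steps):
        u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

# The coarse grid cannot even represent the initial spike, so the two
# runs of the *same* model disagree from the first time step onward.
for n in (32, 512):
    print(f"{n:4d} cells: peak after diffusion = {diffuse(n).max():.3f}")
```

Real weapons simulations are three-dimensional and couple far more physics, which is what drives ASCI's demand for machines well beyond the current generation.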

Paine and others question that necessity. "Do we really need three machines?" he asks. "After all, the labs, using their existing computers and software, have certified that the nuclear stockpile is currently safe. ASCI presumes that we will detect problems never seen before that require much higher simulation capabilities to resolve. That is unsubstantiated. In fact, the data suggest that weapons become safer with age." They also grow less likely to detonate on command, however.

Robert B. Laughlin, a professor at Stanford University who has worked on bomb-related physics at Lawrence Livermore National Laboratory since 1981, worries that "computer programs can only simulate the stuff you know. Suppose you left a personal computer out in the rain for a year. Is there a program that can tell you whether it will still run? Of course not--it all depends on what happened to it." Likewise with nuclear warheads, he says: "Changes happen over time that you are not sure how to measure. Some matter, some don't. The problem is the things you didn't think to put in the simulation."

Indeed, skeptics note, some previous attempts to simulate very complex systems--such as the behavior of oil surfactants, the Ariane 5 rocket and plasma fusion reactors--failed to forecast the outcome of field tests, at great cost to those who relied on the simulations. The software codes developed since the 1950s to predict whether bombs will mushroom or fizzle "are full of adjustable parameters that have been fit to [underground test] data," Laughlin reports. "If the new codes don't match the old ones that correctly predicted experiment results"--and Laughlin bets that they won't--"the designers will simply throw them out."
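Laughlin's worry about fitted parameters can be sketched in miniature. The following is a hypothetical calibration exercise with an invented model and invented data, not anything drawn from the weapons codes: a one-knob model tuned by least squares reproduces the "archival" test points well, yet a rival fit that matches those points nearly as well diverges from it in a regime the data never covered.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a code with an adjustable parameter: y = exp(-k * x),
# with k tuned against noisy "archival test" data on 0 <= x <= 1.
x_test = np.linspace(0.0, 1.0, 8)
y_test = np.exp(-2.0 * x_test) + rng.normal(0.0, 0.01, x_test.size)

# Calibrate k by brute-force least squares over a grid of candidates.
candidates = np.linspace(0.5, 4.0, 351)
sq_err = [np.sum((np.exp(-k * x_test) - y_test) ** 2) for k in candidates]
k_fit = candidates[int(np.argmin(sq_err))]
print("fitted k:", round(k_fit, 2))

# A rival value of k fits the archival range almost as well
# (its error is comparable to the noise in the data)...
k_alt = k_fit + 0.1
for k in (k_fit, k_alt):
    rms = np.sqrt(np.mean((np.exp(-k * x_test) - y_test) ** 2))
    print(f"k = {k:.2f}: rms error on test range = {rms:.4f}")

# ...yet diverges sharply outside it, where no test data constrain it.
x_far = 5.0
print("predictions at x = 5:", np.exp(-k_fit * x_far), "vs", np.exp(-k_alt * x_far))
```

The data pin the model down only where tests were run; a test ban closes off exactly the regime where rival calibrations would be told apart.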

To minimize the uncertainty in its models, the DOE is looking to academic engineers for help. In December the agency offered to sponsor two to five university research centers with up to $5 million a year and supercomputer access for each. "The goal isn't to get them to do our job," says Richard W. Watson, who is managing the program at Lawrence Livermore, "but to establish in the scientific community confidence in simulation as a valid third arm of science alongside theory and experiment." Although researchers will be allowed to publish all their work--none will be classified--the DOE is asking specifically for projects that focus on areas, such as material stress and the interior of stars, that are not too distant from its weapons work. (Academic institutions generally forbid their staff from conducting weapons and other classified research on university time.)

Most schools have responded enthusiastically--of 10 contacted for this article, all planned to submit preliminary proposals. Some of the eagerness may reflect an imminent consolidation of National Science Foundation funding to the four federal computing centers. "If one center were cut off, ASCI would be there," concedes Malvin H. Kalos, director of the supercomputing center at Cornell University. But many scientists welcome the intellectual challenge as well.