01 May 2018 | Various Sources | Hawkins Bay Dispatch
The United States conducted its nuclear weapons tests between 1945 and 1992 as part of the nuclear arms race: around 1,054 tests by official count, including 216 atmospheric, underwater, and space tests. Most took place at the Nevada Test Site (NNSS/NTS) and the Pacific Proving Grounds in the Marshall Islands and off Kiritimati Island in the Pacific, plus three in the Atlantic Ocean. Ten other tests took place at various locations in the United States, including Alaska, Colorado, Mississippi, New Mexico, and parts of Nevada outside the NNSS/NTS.
30 April 2018 | Karina Toledo | Phys.org
The bombing of the Japanese cities Hiroshima and Nagasaki by the United States in 1945 was the first and only use of nuclear weapons against civilian targets. Researchers conducted a series of studies in its aftermath to measure the impact of the fallout, in terms of both the radiation dose to which the victims were exposed and the effects of this exposure on DNA and health in general.
Continuing research that started in the 1980s under the leadership of physicist Sérgio Mascarenhas at the University of São Paulo (USP) has resulted in an article in the journal PLOS ONE describing a method of precise measurement of the radiation dose absorbed by the bones of victims of the nuclear bombs dropped on Japan.
“We used a technique known as electron spin resonance spectroscopy to perform retrospective dosimetry. Currently, there’s renewed interest in this kind of methodology due to the risk of terrorist attacks in countries like the United States,” said Professor Oswaldo Baffa at the University of São Paulo.
“Imagine someone in New York planting an ordinary bomb with a small amount of radioactive material stuck to the explosive. Techniques like this can help identify who has been exposed to radioactive fallout and needs treatment.”
Angela Kinoshita, a professor at Universidade do Sagrado Coração in Bauru, São Paulo State, explained that the study is unique insofar as it used samples of human tissue from victims of the bomb dropped on Hiroshima.
“There were serious doubts about the feasibility of using this methodology to determine the radiation dose deposited in these samples, because of the processes involved in the episode,” she said. “The results confirm its feasibility and open up various possibilities for future research that may clarify details of the nuclear attack.”
In the 1970s, when he was teaching at the University of São Paulo’s São Carlos Physics Institute (IFSC-USP), Mascarenhas discovered that X-ray and gamma-ray irradiation made human bones weakly magnetic. The phenomenon, known as paramagnetism, occurs because the hydroxyapatite (crystalline calcium phosphate) in the mineral portion of bone tissue incorporates carbon dioxide, and when the sample is irradiated, the CO2 gains an electron and becomes the CO2- radical. This free radical serves as a stable marker of the radiation dose received by the material.
“I discovered that we could use this property to perform radiation dosimetry and began using the method in archeological dating,” Mascarenhas recalled.
His aim at the time was to calculate the age of bones found in sambaquis (middens created by Brazil’s original inhabitants as mounds of shellfish debris, skeletons of prehistoric animals, human bones, stone or bone utensils, and other refuse) based on the natural radiation absorbed over centuries via contact with elements such as thorium that are present in the sand on the seashore.
On the strength of this research, he was invited to teach at Harvard University in the United States. Before leaving for the US, however, he decided to go to Japan to try to obtain samples of bones from victims of the nuclear bombs and test his method on them.
“They gave me a jawbone, and I decided to measure the radiation right there, at Hiroshima University,” he said. “I needed to prove experimentally that my discovery was genuine.”
Mascarenhas succeeded in demonstrating that a dosimetric signal could be obtained from the sample even though the technology was still rudimentary and there were no computers to help process the results. The research was presented at the American Physical Society’s annual March Meeting, where it made a strong impression. Mascarenhas brought the samples to Brazil, where they remain.
“There have been major improvements in the instrumentation to make it more sensitive in the last 40 years,” Baffa said. “Now, you see digitally processed data in tables and graphs on the computer screen. Basic physics has also evolved to the extent that you can simulate and manipulate the signal from the sample using computational techniques.”
Thanks to these advances, he added, in the new study, it was possible to separate the signal corresponding to the radiation dose absorbed during the nuclear attack from the so-called background signal, a kind of noise scientists suspect may have resulted from superheating of the material during the explosion.
“The background signal is a broad line that may be produced by various different things and lacks a specific signature,” Baffa said. “The dosimetric signal is spectral. Each free radical resonates at a certain point on the spectrum when exposed to a magnetic field.”
To make the measurements, the researchers removed millimeter-scale pieces of the jawbone used in the previous study. The samples were again irradiated in the laboratory using a technique called the additive dose method.
“We added radiation to the material and measured the rise in the dosimetric signal,” Baffa explained. “We then constructed a curve and extrapolated from that the initial dose, when the signal was presumably zero. This calibration method enabled us to measure different samples, as each bone and each part of the same bone has a different sensitivity to radiation, depending on its composition.”
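The extrapolation step Baffa describes can be sketched as a simple linear fit: add known laboratory doses, fit signal against added dose, and read off the dose at which the line extrapolates back to zero signal. The numbers below are invented for illustration only; they are not the study’s data.

```python
import numpy as np

def additive_dose(added_doses_gy, esr_signals):
    """Additive dose method: fit ESR signal vs. added dose with a line,
    then extrapolate back to zero signal. The absorbed dose is the
    magnitude of the x-intercept of the regression line."""
    slope, intercept = np.polyfit(added_doses_gy, esr_signals, 1)
    return intercept / slope  # dose at which the signal was presumably zero

# Hypothetical calibration: natural signal plus three laboratory doses
added = np.array([0.0, 5.0, 10.0, 20.0])         # Gy added in the lab
signal = np.array([9.5, 14.5, 19.5, 29.5])       # arbitrary ESR units
print(round(additive_dose(added, signal), 2))    # → 9.5
```

Because each bone (and each part of the same bone) responds differently to radiation, the calibration line must be built sample by sample, which is exactly why the method works despite varying composition.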
Thanks to this combination of techniques, they were able to measure a dose of approximately 9.46 grays (Gy), which is high in Baffa’s view. “About half that dose, or 5 Gy, is fatal if the entire body is exposed to it,” he said.
The value was comparable with the doses obtained by other techniques applied to non-biological samples, such as measurement of the luminescence of quartz grains present in brick and roof tile fragments found at the bomb sites. According to the authors, it was also close to the results of biological measurement techniques applied in long-term studies using alterations in survivors’ DNA as a parameter.
“The measurement we obtained in this latest study is more reliable and up to date than the preliminary finding, but I’m currently evaluating a methodology that’s about a thousand times more sensitive than spin resonance. We’ll have news in a few months,” Mascarenhas predicted.
Between 1951 and 1963, the United States conducted over 100 above-ground nuclear tests at sites in Nevada. Meyers has found that as many as 690,000 casualties resulted after radioactive pollution made its way into the environment and the food supply.
Exposure to high levels of radiation can cause serious illness or death, and even low doses can cause cancer. The weapons developers were either unaware of these risks or simply careless, and people working at the test sites or living close by were not the only ones in harm’s way. Ultimately, millions of people were exposed to “tremendous” amounts of radiation.
“Counter-intuitively, the areas where fallout had the largest impact on the crude death rate was not in the region surrounding the test site, but rather in areas with moderate levels of radioactive fallout deposition in the interior of the country,” said Meyers.
Early attempts to calculate the extent of the fallout from nuclear tests were inaccurate because they did not account for harm that accumulated over time and spread far beyond the test sites.
“The largest health effects appear in areas far beyond the scope of previous scientific and medical studies,” explained Meyers. “The scientific and medical literature has studied the effects of atmospheric testing on populations residing in downwind counties in Arizona, Nevada, and Utah.”
After the nuclear tests, winds carried the pollution far afield. In addition, cows ate contaminated grass, which passed radioactive material into the milk supply.
To estimate the number of deaths that resulted from nuclear testing, Meyers used an alternative empirical approach that combined measures of radioactive fallout exposure from the National Cancer Institute with mortality data from across the continental United States.
Meyers found that radioactive fallout was to blame for between 340,000 and 690,000 deaths from 1951 to 1973, which is up to fourteen times more casualties than earlier research had demonstrated.
Meyers explained that putting a stop to nuclear testing inadvertently saved millions of Americans from additional exposure. He estimated that the Partial Nuclear Test Ban Treaty was responsible for saving between 11.7 and 24 million lives.
“The evidence presented in this paper reveals that the health cost of domestic nuclear testing is both larger and more expansive than previously thought,” said Meyers.
“The mortality estimates may understate the magnitude of the true number of deaths attributable to nuclear testing and the magnitude of the health costs of this polluting defense policy. It is plausible that these estimates are lower bounds of the true health effects.”
01 March 2002 | Rob Edwards | New Scientist
Radioactive fall-out from the world’s nuclear weapons tests during the Cold War has killed 11,000 Americans with cancer, according to a new report by US scientists. Experts say that many thousands more are likely to have died in other countries.
The report, prepared by the US Department of Health and Human Services (DHHS) for Congress, is the first attempt to estimate the total number of cancers caused by the atmospheric testing programme. Between 1951 and 1963, 390 nuclear bombs were exploded above the ground: 205 by the US, 160 by the former Soviet Union, 21 by Britain and four by France.
The fall-out from these explosions circulated around the globe and exposed the world’s population to radioactivity. Scientists have long assumed that this would result in extra cancers, but until now no government has tried to estimate how many.
The new report concludes that the number of fatal cancers attributable to global fall-out amongst Americans alive between 1951 and 2000 is 11,000. This includes deaths from leukaemia caused by exposure to strontium 90 and from a host of other cancers triggered by other isotopes.
“This is a useful estimate of the long term effects of global fall-out on the population of the US, but it is only part of the story,” says Dudley Goodhead, a leading radiation specialist with the Medical Research Council in Harwell, UK.
“Similar assumptions would lead to estimates of many thousands more cancers throughout the world because fall-out from the atmospheric tests was distributed globally,” he notes.
The sites where bombs were exploded included the Nevada desert in the US, Pacific islands and sites in Kazakhstan and Russia. Atmospheric testing was outlawed by the Partial Test Ban Treaty in 1963, although dozens of atmospheric tests have since been conducted by France and China.
The DHHS report, which was obtained by US senator Tom Harkin, does not take fall-out from explosions since 1963 into account. Nor does it include fall-out from the seven atmospheric explosions detonated by the US prior to 1951, such as the Hiroshima and Nagasaki bombs of 1945.
The estimate of 11,000 fatal cancers also does not include internal radiation exposure caused by the breathing in or swallowing of radioactive particles. Because of this, the Institute for Energy and Environmental Research, in Takoma Park, Maryland, argues that the actual number of fatal cancers could be 17,000.
The US evidence is likely to provoke demands for other countries to face up to the death toll from nuclear tests. “It’s a horrific legacy,” says Sue Roff, a radiation researcher from the University of Dundee medical school. “The complacency of governments about acceptable levels of environmental radioactivity has been punctured by this authoritative report.”
A growing branch of empirical health economics combines data and rigorous econometric methods to tease out the impact of diffuse but important environmental hazards on human health. In recent years, brilliant papers have examined the health impact of particulate pollution, nuclear accidents, and legal changes such as the Civil Rights Act of 1964. A terrific recent NBER working paper by Sandra Black, Aline Bütikofer, Paul J. Devereux, and Kjell G. Salvanes deserves to join that group.
These authors examined the serious harms inflicted on Norwegian pregnant women (and ultimately on Norwegian children) by Soviet nuclear testing in the late 1950s and early 1960s. As an added bonus, their paper provides further posthumous vindication — as if this were needed — of the great Russian physicist Andrei Sakharov.
Most of my students were born after the fall of the Berlin Wall. Many don’t know who Sakharov really was. That’s too bad. If one had to make a list of the greatest human beings of the last century, Sakharov would be close to the top of any reasonable version. Father of the Soviet hydrogen bomb, he became first among equals within the dissident movement courageously fighting for human rights and liberal values in the Soviet Union.
If one had to identify a single year when Sakharov moved from pampered, compliant technocrat and three-time Hero of Socialist Labor to active dissident, 1962 would probably be it. It was then that Sakharov waged a lonely, unsuccessful fight to prevent redundant tests of Soviet nuclear bombs. Four years earlier, Sakharov had published one of the first studies seeking to forecast the contribution of nuclear weapons testing to human cancers.
Analyzing the diffusion of radioactive carbon, Sakharov forecast that the explosion of a one-megaton hydrogen bomb — equivalent in destructive power to 1 million tons of TNT — would cost more than 6,000 lives. These lives would be lost over many generations. These deaths would be hard to distinguish from the much larger number of unrelated fatal cancers. They would be no less important in the lives of real people.
At the time, the Soviet Union sought to demonstrate its resolve by exploding huge, militarily irrelevant hydrogen bombs. The largest of these explosions exceeded 50 megatons. According to Sakharov’s calculations, each such explosion would ultimately take hundreds of thousands of lives. He was a loyal Soviet citizen who believed that his country needed powerful weapons to deter western powers. He wanted this work conducted in safety, with restraint. He subsequently used his scientific influence to support a nuclear test ban, too.
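Sakharov’s estimate scales linearly with yield and has no threshold, so the arithmetic behind the “hundreds of thousands” figure is easy to reproduce from the numbers quoted above:

```python
# Sakharov's 1958 lower-bound estimate: each megaton of atmospheric
# yield ultimately costs more than 6,000 lives, spread over generations.
deaths_per_megaton = 6000
test_yield_mt = 50  # the largest Soviet explosions exceeded 50 megatons

# Linear, no-threshold scaling of the long-term toll of one such test
print(deaths_per_megaton * test_yield_mt)  # → 300000
```

That is, a single 50-plus-megaton test implies upwards of 300,000 eventual deaths under Sakharov’s assumptions, invisible against the background rate of cancer but no less real.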
Like the United States, the Soviet Union established two main labs designing these weapons. Inevitably, the two labs developed rather similar prototypes of a new bomb. Bureaucratic rivalry produced the predictable result: Each lab sought to explode its own bomb, although the resulting redundancy would expose millions of people to additional toxic fallout in return for very little military or scientific gain.
Sakharov frantically tried to stop one of the blasts — either one, he didn’t ultimately care. He appealed all the way up the chain to the leader of the Soviet Union, Nikita Khrushchev. Khrushchev frankly rebuffed Sakharov, sending a clear message that scientists should not meddle in larger policy matters. After the second explosion, Sakharov exclaimed, “A terrible crime had been committed, and I couldn’t prevent it…. I dropped my face on the table and wept.” (quote from Gennady Gorelik, “The World of Andrei Sakharov: A Russian Physicist’s Path to Freedom,” p. 288.)
The physicist’s anger and disappointment changed him. He began the journey from compliant technocrat to something more skeptical, to internal dissenter, and, eventually, to a political adversary of the Soviet regime.
Sakharov’s 1958 paper examined cancer. Black and her colleagues examined different consequences of nuclear testing that are scarcely less profound: The impact on Norwegian children of prenatal exposure to the fallout created by the frequent Soviet nuclear tests in the 1950s and early 1960s. Their specific findings are of interest. So is their clever and intricate study design.
Snippy ethics committees don’t allow researchers to conduct randomized human trials of nuclear fallout exposure. Black and colleagues did the next best thing: they exploited variation over time and space that led different pregnant women to receive different radioactive exposures, depending rather randomly on where they lived, the happenstance timing of nuclear tests, and local weather patterns. As the authors note:
The western Norwegian coast line was particularly exposed to atomic fallout coming from nuclear testing taking place in Novaya Zemlya in the Russian arctic archipelago, one of the most intense test regions between 1955 and 1962.
The fallout varies significantly by municipality and also over time. There was an international moratorium on nuclear testing from November 1958 to September 1961 so Norway received almost no fallout in the second half of 1959, in 1960, and throughout most of 1961. The partial test ban treaty in October 1963 led to very little fallout in 1964 or in subsequent years. However, there is significant fallout in 1957 and 1958 and, even more so, in 1962 and 1963 because the explosions after the expiration of the moratorium were much larger than before.
The figures below (reproduced by permission) illustrate the pattern.
Norway also kept amazingly accurate records. Between 1956 and 1984, the Norwegian military carefully monitored radioactivity in the air, in rain and snow, and on the ground on a daily basis at 13 stations across the country.
The authors cite estimates that Norwegians in the hardest-hit areas were exposed to an annual radiation dose equivalent to about twice the dose one would receive from a whole-body computed tomography (CT) scan, and about 60 times the external dose from an X-ray mammogram. That’s not homicidal, but it is harmful. We don’t routinely subject pregnant women to such things.
Meanwhile, a kind of linked super-census called the Norwegian Registry included data on citizens’ educational attainment, personal and family characteristics, employment, and earnings. Focusing on people born between 1956 and 1966, the authors identified the period in which each individual was in months three and four in utero. Because Norway had universal military service for men, the authors could also obtain each boy’s subsequent IQ score, psychological data, and height at service entry.
Given the availability of such data, the statistical analysis is pretty simple. The authors try various approaches, including comparing outcomes among siblings, one of whom was likely exposed to fallout in utero while the other was not.
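A sibling comparison of this kind is a family fixed-effects design: demean outcome and exposure within each family, then regress the deviations on each other, so that anything shared by siblings (parents, municipality, era) drops out. The sketch below uses entirely hypothetical data and a deliberately bare-bones estimator; the authors’ actual specification is far richer.

```python
import numpy as np

def sibling_fe_slope(family_ids, exposure, outcome):
    """Within-family (sibling fixed-effects) estimate of the exposure
    effect: subtract each family's mean from both variables, then run
    OLS on the deviations. Data passed in are assumed illustrative."""
    fam = np.asarray(family_ids)
    x = np.asarray(exposure, dtype=float)
    y = np.asarray(outcome, dtype=float)
    for f in np.unique(fam):
        m = fam == f
        x[m] -= x[m].mean()  # remove family-level averages, so only
        y[m] -= y[m].mean()  # within-sibling differences remain
    return float(x @ y / (x @ x))

# Hypothetical two-sibling families: one sibling exposed in utero, one not
fams = [1, 1, 2, 2, 3, 3]
expo = [1, 0, 1, 0, 0, 1]                       # in-utero exposure indicator
outc = [11.0, 12.0, 9.5, 10.5, 13.0, 12.0]      # e.g. years of schooling
print(round(sibling_fe_slope(fams, expo, outc), 2))  # → -1.0
```

In this toy example the exposed sibling finishes about one year less schooling than the unexposed sibling, net of everything the family shares.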
Here’s what they found. Exposure to nuclear radiation during months three and four of pregnancy was associated with reduced educational attainment, lower high school completion, and lower adult earnings. Such exposure was also associated with reduced IQ scores among boys at age 18. A one standard deviation increase in ground exposure reduced high school completion by about 1 percentage point among men and by about 2 percentage points among women.
I should mention that American and European atmospheric, ground, and underwater tests brought their own harms and environmental damage. The United States’ 15-megaton Castle Bravo test was perhaps the most famous, inducing sometimes-fatal radiation sickness among Japanese fishermen and Pacific islanders.
The average effects found by Black and her colleagues don’t seem very large from the perspective of any single individual. They are more sobering when you consider that swathes of an entire country were exposed to toxic fallout over nearly a decade. Norway experienced 635,050 live births from 1956 to 1965. Radioactive fallout affected many of these children, and the human health consequences remain with us. And of course the health consequences of these explosions (and of everything in the production process leading up to them) were surely greater in the old Soviet Union.
Toxic plumes created 50 or 60 years ago still cast long shadows on human lives. We can’t change that. We can investigate what actually happened. We can also learn from the examples of Sakharov and others, who resisted normal bureaucratic pressures and professional incentives and remained willing to speak out. That wasn’t easy in 1962. It’s not so easy in 2013, either.