On the 26,000 tons of radioactive waste under Lake Powell

Jonathan P. Thompson

Beneath the murky green waters on the north end of Lake Powell, entombed within the tons of silt that have been carried down the Colorado River over the years, lies a 26,000-ton pile of un-remediated uranium mill tailings. It’s just one polonium-, bismuth-, thorium- and radium-tainted reminder of the way the uranium industry, enabled by the federal government, ravaged the West and its people for decades.

In 1949, the Vanadium Corporation of America built a small mill at the confluence of White Canyon and the Colorado River to process uranium ore from the nearby Happy Jack Mine, located upstream in the White Canyon drainage (and just within the Obama-drawn Bears Ears National Monument boundaries). For the next four years, the mill went through about 20 tons of ore per day, crushing and grinding it up, then treating it with sulfuric acid, tributyl phosphate and other nastiness. One ton of ore yielded about five or six pounds of uranium, meaning that each day some 39,900 pounds of tailings were piled up outside the mill on the banks of the river.

In 1953, the mill was closed down, and the tailings left where they sat — uncovered — as was the practice of the day. Ten years later, water began backing up behind the newly built Glen Canyon Dam. Federal officials decided just to let the reservoir’s waters inundate the tailings. There they remain today.

Those 26,000 tons of tailings likely make up just a fraction of the radioactive material contained in the silt of Lake Powell and Lake Mead. During the uranium days of the West, more than a dozen mills — all with processing capacities at least 10 times larger than the one at White Canyon — sat on the banks of the Colorado River and its tributaries. They did not dispose of their tailings in an environmentally responsible way.

At the Durango mill, the tailings were piled into a hill-sized mound just a stone’s throw from the Animas River. They weren’t covered or otherwise contained, so when it rained, tailings simply washed into the river. Worse, the mill’s liquid waste stream poured directly into the river at a rate of some 340 gallons per minute, or half a million gallons per day. It was laced not only with highly toxic chemicals used to leach uranium from the ore and iron-aluminum sludge, a milling byproduct, but also with radium-tainted ore solids.

Radium is highly radioactive and a “bone-seeker,” meaning that when it’s ingested it makes its way to the skeleton, where it decays into other radioactive daughter elements, including radon, and bombards the surrounding tissue with alpha, beta and gamma radiation. According to the Agency for Toxic Substances and Disease Registry, exposure leads to “anemia, cataracts, fractured teeth, cancer (especially bone cancer) and death.”

It wasn’t any better at any of the other mills. In the early 1950s, researchers from the U.S. Public Health Service sampled Western rivers and found that “the dissolved radium content of river water below uranium mills was increased considerably by waste discharges from the milling operations” and that “radium content of river muds below the uranium mills was 1,000 to 2,000 times natural background concentrations.”

That was just from daily operations. In 1960, one of the evaporation ponds at the Shiprock mill broke, sending at least 250,000 gallons of highly acidic raffinate (liquid waste), containing high levels of radium and thorium, into the river. None of the relevant officials were notified, and individual users continued to drink the water, put it on their crops, and give it to their sheep and cattle. It wasn’t until five days later, after hundreds of dead fish had washed up along sixty miles of the river’s shores downstream, that the public was alerted to the disaster.

Of course, what’s dumped into the river at any millsite doesn’t stay there. It slowly migrates downstream. In the early 1960s, while Glen Canyon Dam was still being constructed, the Public Health Service did extensive sediment sampling in the Colorado River Basin, with a special focus on Lake Mead’s growing bed of silt, which had been piling up at a rate of 175 million tons per year since Hoover Dam started impounding water in 1935. The Lake Mead samples had higher-than-background levels of radium-226.

The report thus concluded: “The data have shown, among other things, that Lake Mead has been essentially the final resting place for the radium-contaminated sediments of the Basin. With the closure of Glen Canyon Dam upstream, Lake Powell will then become the final resting place for future radium contaminated sediments. The data also show that a small fraction of the contaminated sediment has passed through Lake Mead to be trapped by Lakes Mohave and Havasu.”

And so, the billions of tons of silt that have accumulated in Lake Mead and Lake Powell serve as archives of sorts. They hold the sedimental records of an era during which people, health, land and water were all sacrificed in order to obtain the raw material for nuclear weapons that are capable of destroying all of humanity.

Jonathan P. Thompson is an award-winning freelance author, journalist and editor, who writes about the American West, with an emphasis on energy development, pollution, land-use politics, and economics. His article is edited from the Bulletin of the Atomic Scientists, January 2, 2018, and was reprinted in PeaceMeal, March/April 2018.

Biologists say half of all species could be extinct by end of century

One in five species on Earth now faces extinction, and that will rise to 50 percent by the end of the century unless urgent action is taken. That is the stark view of the world’s leading biologists, ecologists and economists who gathered in February to determine the social and economic changes needed to save the planet’s biosphere.

“The living fabric of the world is slipping through our fingers without our showing much sign of caring,” said the organizers of the Biological Extinction conference held at the Vatican. The meeting was one of a series set up by the Vatican on ecological issues, which Pope Francis has deemed an urgent issue for the Catholic church.

Threatened creatures such as the tiger or rhino may make occasional headlines, but little attention is paid to the eradication of most other life forms, the organizers argued. But as the conference heard, those other animals and plants provide us with our food and medicine. They purify our water and air while also absorbing carbon emissions from our cars and factories, regenerating soil, and providing us with aesthetic inspiration.

“Rich western countries are now siphoning up the planet’s resources and destroying its ecosystems at an unprecedented rate,” said biologist Paul Ehrlich of Stanford University. “We want to build highways across the Serengeti to get more rare earth minerals for our cellphones. We grab all the fish from the sea, wreck the coral reefs and put carbon dioxide into the atmosphere. We have triggered a major extinction event. The question is: how do we stop it?”

Ehrlich pointed to the world’s spiralling population as a factor in the problem. United Nations statistics suggest that the global population will increase from the current 7.4 billion to 11.2 billion by 2100. Most of the added billions will appear in Africa, where the fertility rate is still twice that of the rest of the world.

The crucial point is to put the problem of biological extinctions in a social context, said economist Sir Partha Dasgupta of Cambridge University. “That gives us a far better opportunity of working out what we need to do in the near future.”

Ehrlich agreed: “If you look at the figures, it is clear that to support today’s world population sustainably — and I emphasize the word sustainably — you would require another half a planet to provide us with those resources. However, if everyone consumed resources at the U.S. level — which is what the world aspires to — you will need another four or five Earths.

“We are wrecking our planet’s life support systems,” he said. “We have the capacity to stop that. The trouble is that the danger does not seem obvious to most people, and that is something we must put right.”

– edited from an article by Robin McKie in The Guardian (U.K.), February 25, 2017
PeaceMeal, March/April 2017

(In accordance with Title 17 U.S.C. Section 107, this material is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes.)

Scientists are frantically copying U.S. climate data, fearing it might vanish under Trump

Alarmed that decades of crucial climate measurements could vanish under a hostile Trump administration, scientists and database experts have begun a feverish attempt to copy reams of government data onto independent servers in hopes of safeguarding it from any political interference.

“Something that seemed a little paranoid to me before all of a sudden seems potentially realistic, or at least something you’d want to hedge against,” said Nick Santos, an environmental researcher at the University of California at Davis, who began copying government climate data onto a nongovernment server, where it will remain available to the public.

Trump has named Cabinet members who have questioned the overwhelming scientific consensus about global warming. His transition team had asked Department of Energy officials for names of employees and contractors who have participated in international climate talks and worked on the scientific basis for Obama-era regulations of carbon emissions. One Trump adviser suggested that NASA no longer should conduct climate research and instead should focus on space exploration.

Michael Halpern, deputy director of the Center for Science and Democracy at the Union of Concerned Scientists, argued that Trump had appointed a “band of climate conspiracy theorists” to run transition efforts at various agencies, along with nominees to lead them who share similar views. “They have been salivating at the possibility of dismantling federal climate research programs for years. It’s not unreasonable to think they would want to take down the very data that they dispute,” Halpern said in an email.

Those moves stoked fears among the scientific community that Trump, who called the notion of man-made climate change “a hoax” and vowed to reverse environmental policies put in place by President Obama, could try to alter or dismantle parts of the federal government’s repository of data on everything from rising sea levels to the number of wildfires in the country. Those fears were validated when Trump nominated Scott Pruitt, the former attorney general of Oklahoma, who now heads the Environmental Protection Agency. Pruitt has a record of repeatedly suing the EPA over its regulations to protect the environment.

To be clear, neither Trump nor his appointees have said that the new administration plans to manipulate or curtail publicly available data. But some scientists aren’t taking any chances.

Climate data from NASA and the National Oceanic and Atmospheric Administration already have been politically vulnerable. When Tom Karl, director of the National Centers for Environmental Information, and his colleagues published a study in 2015 seeking to challenge the idea that there had been a global warming “pause” during the 2000s, they relied, in significant part, on updates to NOAA’s ocean temperature data set, saying the data “do not support the notion of a global warming ‘hiatus.’”

Andrew Dessler, a professor of atmospheric sciences at Texas A&M University, said he doubts even the most hostile administration would try to do away with existing climate data. “I think it’s much more likely they’d try to end the collection of data, which would minimize its value. Having continuous data is crucial for understanding long-term trends. ... If you can just get rid of the data [about long-term changes], you’re in a stronger position to argue we should do nothing about climate change.”

– edited from an article by Brady Dennis in The Washington Post, December 13, 2016
PeaceMeal, March/April 2017

Sea-level rise ‘could last twice as long as human history’

Huge sea-level rises caused by climate change will last far longer than the entire history of human civilization to date, according to new research, unless the brief window of opportunity of the next few decades is used to cut carbon emissions drastically. Even if global warming is capped at governments’ target of 2C, which is already seen as difficult, 20 percent of the world’s population will eventually have to migrate away from coasts swamped by rising oceans. Cities including New York, London, Rio de Janeiro, Cairo, Calcutta, Jakarta and Shanghai would all be submerged.

“Much of the carbon we are putting in the air from burning fossil fuels will stay there for thousands of years,” said Prof. Peter Clark at Oregon State University, who led the new work. “People need to understand that the effects of climate change won’t go away, at least not for thousands of generations.”

“The long-term view sends the chilling message of what the real risks and consequences are of the fossil fuel era,” said Prof. Thomas Stocker at the University of Bern, Switzerland, and also part of the research team. “It will commit us to massive adaptation efforts so that, for many, dislocation and migration becomes the only option.”

The report, published in the journal Nature Climate Change, notes that most research looks at the impacts of global warming by 2100 and so misses one of the biggest consequences for civilization — the long-term melting of polar ice caps and sea-level rise. This is because the great ice sheets take thousands of years to react fully to higher temperatures. The researchers say this long-term view raises moral questions about the kind of environment being passed down to future generations.

The research shows that even with climate change limited to 2C by tough emissions cuts, sea level would rise by 25 meters (82 feet) over the next 2,000 years or so and remain there for at least 10,000 years — twice as long as human history. If today’s burning of coal, oil and gas is not curbed, the sea would rise by 50 meters (164 feet), completely changing the map of the world.

By far the greatest contributor to sea level rise — about 80 percent — would be the melting of the Antarctic ice sheet. Another study in Nature Climate Change published in February reveals that some large Antarctic ice sheets are dangerously close to losing the floating ice shelves that hold back their flow into the ocean.

Huge, floating ice shelves around Antarctica provide buttresses for the glaciers and ice sheets on the continent. But when they are lost to melting, as happened with the Larsen B shelf in 2002, the speed of flow into the ocean can increase eightfold.

Avoiding the long-term swamping of many of the world’s greatest cities is already difficult, given the amount of carbon dioxide already released into the atmosphere. “Sea-level rise is already baked into the system,” said Prof. Stocker, one of the world’s leading climate scientists.

However, the rise could be reduced and delayed if carbon is removed from the atmosphere in the future, he said. “If you are very optimistic and think we will be in the position by 2050 or 2070 to have a global-scale carbon removal scheme, which sounds very science fiction, you could pump down CO2 levels. But there is no indication that this is technically possible.” A further difficulty is the large amount of heat and CO2 already stored in the oceans.

Prof. Stocker stated, “The actions of the next 30 years are absolutely crucial for putting us on a path that avoids the [worst] outcomes and ensuring, at least in the next 200 years, the impacts are limited and give us time to adapt.”

“We are making choices that will affect our grandchildren’s grandchildren and beyond,” said Prof. Daniel Schrag at Harvard University. “We need to think carefully about the long time-scales of what we are unleashing.”

– edited from an article by Damian Carrington in The Guardian (U.K.), February 8, 2016
PeaceMeal, March/April 2016

(In accordance with Title 17 U.S.C. Section 107, this material is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes.)

Damage from sinking land costing California billions

DOS PALOS, Calif. — A canal that delivers vital water supplies from Northern California to Southern California is sinking in places. So are stretches of a riverbed undergoing historic restoration. On farms, well casings pop up like mushrooms as the ground around them drops.

Four years of drought and heavy reliance on pumping of groundwater have made the land sink faster than ever in the Central Valley, requiring repairs to infrastructure that experts say are costing billions of dollars. This slow-motion land subsidence — more than one foot a year in some places — is not expected to stop anytime soon, experts say, nor will the expensive repairs.

“It’s shocking how a huge area is affected, but how little you can tell with your eye,” said James Borchers, a hydrogeologist, who studies subsidence and says careful monitoring is necessary to detect and address sinking before it can do major damage to costly infrastructure such as bridges and pipelines.

Land subsidence is largely the result of pumping water from the ground. As aquifers are depleted, the ground sags. The most severe examples today are in the San Joaquin Valley, where the U.S. Geological Survey said in 1975 that half of the land is prone to sinking. USGS researchers later called it one of the “single largest alterations of the land surface attributed to humankind.” A sparse mountain snowpack in California’s driest four-year span on record has forced farmers in the Central Valley, the nation’s most productive agricultural region, to rely on groundwater to irrigate their crops. Drought has spawned a well-drilling boom, with some tapping ancient aquifers 3,000 feet down.

After decades of over-pumping, the associated land subsidence is accelerating. Last year near Corcoran, the land sank 13 inches in eight months, researchers at NASA’s Jet Propulsion Laboratory found by comparing satellite images collected over time. Parts of the California Aqueduct, a 400-mile canal that delivers water to Southern California, also sank by nearly 13 inches.

California became the last state in the West to regulate groundwater when Gov. Jerry Brown signed legislation last year, ending a Gold Rush-era policy that generally let property owners take as much as they wanted. But local agencies have until 2040 to put groundwater management plans into effect.

– edited from an article by Scott Smith of The Associated Press, December 28, 2015
PeaceMeal, March/April 2016