Utah F.O.R.G.E.



The Utah FORGE Project

The Frontier Observatory for Geothermal Research

There is something deceptively simple about geothermal energy. The crushing force of gravity compacts the earth until its molten metal core reaches 9,000 degrees Fahrenheit. Even thousands of miles from that core, near the surface, temperatures still reach hundreds of degrees.

In some places, that heat reaches the surface, either as lava flowing up through volcanic vents, or as steaming water bubbling up in hot springs. In those places, humans have been using geothermal energy since the dawn of time.

But what if we could drill down into the rock and, in essence, create our own hot spring? That is the idea behind “enhanced geothermal systems,” and the most promising such effort in the world is happening in Beaver County.

Called Utah FORGE (Frontier Observatory for Geothermal Research), the site 10 miles north of Milford is little more than a drill pad and a couple of buildings on Utah School and Institutional Trust Lands Administration land. But it is the U.S. Department of Energy’s foremost laboratory for enhanced geothermal research, and the University of Utah is the scientific overseer. Seven years ago, the U of U’s proposal won out in a national competition against three of the DOE’s own national laboratories.

“If you have to pick the best area in the country to build an EGS plant, you’re going to be driven to Milford. DOE recognized that in 2015,” said Joseph N. Moore, a University of Utah professor in the Department of Geology & Geophysics and the principal investigator for Utah FORGE.

Professor Joseph N. Moore

Among the advantages:

  • It’s in a known area of thermal activity. Roosevelt Hot Springs is nearby, and a small geothermal plant there has been producing electricity for about 30,000 homes for years.
  • It has hundreds of cubic miles of hot granite below the surface with no water flowing through it.
  • There is accessible water that can’t be used for drinking or agriculture because it contains too many naturally occurring minerals. But that water can be used for retrieving heat from underground.
  • It has access to transmission lines. Beaver County is home to a growing amount of wind and solar power generation, which helps move power to consumers.

DOE has invested $50 million in FORGE, and now it’s adding another $44 million in research money. The U of U is soliciting proposals from scientists.

“These new investments at FORGE, the flagship of our EGS research, can help us find the most innovative, cost-effective solutions and accelerate our work toward wide-scale geothermal deployment and support President Biden’s ambitious climate goals,” said Energy Secretary Jennifer Granholm.

The idea is to drill two deep wells more than a mile down into solid granite that registers around 400 degrees. Then cold water is pumped down one well so hot water can be pulled out through the second well. One of those wells has been drilled, and the second is planned for next year.
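As a rough, purely illustrative calculation (not a FORGE design figure), the heat such a well pair could deliver depends on the water flow rate and how much the hot rock warms the water on its way between the wells. The flow rate and temperatures below are assumptions chosen only to show the arithmetic:

```python
# Back-of-the-envelope thermal power from an EGS well doublet.
# All parameter values are illustrative assumptions, not FORGE specs.

SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def doublet_thermal_power_mw(flow_kg_s, t_out_c, t_in_c):
    """Thermal power (MW) carried by water heated from t_in_c to t_out_c."""
    delta_t = t_out_c - t_in_c          # temperature gain across the hot rock
    watts = flow_kg_s * SPECIFIC_HEAT_WATER * delta_t
    return watts / 1e6

# Example: 50 kg/s of water heated from 40 C to 200 C (~400 F rock)
print(round(doublet_thermal_power_mw(50, 200, 40), 1))  # prints 33.5
```

Even modest flow through the fractured rock yields tens of megawatts of thermal energy, which is why connecting the two wells with cracks is the heart of the project.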

But if it’s solid rock, how does the water get from one well to the other? The scientists have turned to a technology that transformed the oil and gas industry: hydraulic fracturing, also known as “fracking.” They are pumping water down under extremely high pressures to create or expand small cracks in the rock, and those cracks allow the cold water to flow across the hot rock to the second well. They have completed some hydraulic fracturing from the first well.

Moore is quick to point out that using a fracturing process for geothermal energy does not produce the environmental problems associated with oil and gas fracking, largely because it doesn’t generate dirty wastewater and gases. Further, in oil and gas operations, the oil released by fracturing can lubricate underground faults, and removing the oil and gas leaves gaps; both effects lead to more and larger earthquakes.

Energy Secretary Jennifer Granholm

The fracturing in enhanced geothermal does produce seismic activity that seismologists are monitoring closely, Moore said, but the circumstances are much different. In geothermal fracturing, there is only water, and it can be returned to the ground without contamination. And producing fractures in an isolated piece of granite is less likely to affect faults. The hope, he said, is that once there are enough cracks for sufficient flow from one pipe to the other, it can produce continuous hot water without further fracturing.

And it never runs out. Moore said that even 2% of the available geothermal energy in the United States would be enough to power the nation by itself.

This next round of $44 million in federal funding is about taking that oil and gas process and making it specific to enhanced geothermal. That includes further seismic study, and coming up with the best “proppant” — the material used to keep the fractures open. Oil and gas operators use fracking sand to keep the cracks open, but geothermal’s higher temperatures make that approach challenging.

“FORGE is a derisking laboratory,” said Moore, meaning the U of U scientists, funded by the federal government, are doing some heavy lifting to turn the theory of EGS into a practical clean-energy solution. He said drilling wells that deep costs $70,000 a day. They drill 10 to 13 feet per hour, and it takes six hours just to pull out a drill to change the bit, something they do every 50 hours. That early, expensive work makes it easier for private companies to move the technology into a commercially viable business. Moore said all of the research is in the public domain.
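The quoted figures lend themselves to a quick cost estimate. The $70,000-a-day rig rate, 10 to 13 feet per hour penetration, and a bit change every 50 drilling hours come from the article; the 8,000-foot target depth and a 12-hour round trip per bit change (six hours out, as quoted, plus an assumed six back) are illustrative assumptions:

```python
# Rough drilling-time and cost estimate from the figures Moore cites.
# Depth and round-trip time are assumptions for illustration only.

def drilling_estimate(depth_ft, rate_ft_hr, day_cost=70_000,
                      bit_life_hr=50, trip_hr=12):
    drill_hours = depth_ft / rate_ft_hr            # hours spent cutting rock
    bit_changes = int(drill_hours // bit_life_hr)  # trips to swap the bit
    total_hours = drill_hours + bit_changes * trip_hr
    days = total_hours / 24
    return days, days * day_cost

days, cost = drilling_estimate(8_000, 11.5)  # mid-range 11.5 ft/hr
print(f"~{days:.0f} days, ~${cost / 1e6:.1f} million")  # ~35 days, ~$2.5 million
```

Numbers in that range show why "derisking" matters: a single deep well is a multimillion-dollar bet that private companies are reluctant to make on unproven technology.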

Moore said FORGE doesn’t have many full-time employees in Beaver County at this point, but it has used local contractors for much of the work, and it has filled the county’s hotel rooms for occasional meetings. High school students have also been hired to help with managing core samples from the deep wells.

“They’ve collaborated really well with the town,” said Milford Mayor Nolan Davis. Moore and others have made regular presentations to his city council, and they’ve sponsored contests in the high school to teach students about geothermal energy. People in town, Davis said, are well aware that the world is watching Utah FORGE, and there is hope geothermal energy will become a larger presence if and when commercial development begins. “We hope they can come in and maybe build several small power plants.”

Davis also noted that the power from Beaver County’s solar and wind plants is already contracted to California. “We’d like to get some power we can keep in the county.”

 

by Tim Fitzpatrick, first published @ sltrib.com

Tim Fitzpatrick is The Salt Lake Tribune’s renewable energy reporter, a position funded by a grant from Rocky Mountain Power. The Tribune retains all control over editorial decisions independent of Rocky Mountain Power.

This story is part of The Salt Lake Tribune’s ongoing commitment to identify solutions to Utah’s biggest challenges through the work of the Innovation Lab.

 


 

Nuclear Recycling



Spent nuclear fuels pose a major environmental concern. Can they be recycled?

A significant problem with the use of nuclear reactors is what’s left behind — the nuclear waste from spent fuel rods. Where to dispose of this waste has been the source of much controversy.

But instead of just burying the spent fuel rods, what if you could somehow recycle them to be used again? University of Utah researchers will be working with a team from the Idaho National Laboratory (INL) to develop an innovative yet simple process of recycling metal fuels for future advanced nuclear reactors. These reactors are designed to be safer than existing reactors, more efficient at producing energy, and cheaper to operate. The team was awarded a three-year, $2.1 million grant from the U.S. Department of Energy’s ARPA-E program for the project.

Michael Simpson

“With current light water-cooled nuclear reactors, you use the fuel for only about five years, then what do you do with it? Where do you dispose of it? We currently have no place to put it other than on the site of the nuclear power plant that used it,” says University of Utah Materials Science and Engineering professor Michael Simpson, who will lead the U team supporting the project. “A better idea is to use a physical or chemical process to make the fuel usable in the reactor again.”

According to the Department of Energy, there is currently no permanent repository for spent radioactive fuel rods, so the more than 83,000 metric tons of nuclear waste are stored in more than 75 reactor sites around the U.S. in either steel-lined concrete pools of water or in steel and concrete containers. They will stay there until a consolidated interim storage facility or permanent site is established.

A key step to solving this problem is to demonstrate and commercialize advanced nuclear reactors such as the sodium cooled fast reactor (SFR) that features metallic uranium fuel designed with recycling in mind. Simpson will collaborate with the INL team that originally conceived of the method, which involves a dynamic heat treatment of the spent fuel rods from SFRs. In theory this will cause unrecyclable waste to be separated from the fuel materials that can be used again. Simpson says the remaining waste that needs to be disposed of in this process would be at least an “order of magnitude” less in volume than the original untreated amount. Furthermore, they will be able to utilize the large fraction of fissionable material to produce power that would otherwise be thrown away.

“We reduce the volume of nuclear waste that has to be disposed of, and we get more energy in the long run,” he says.

The U team will develop a computational model of how the different metals separate during the heating process, and will collect data to validate the model from a new furnace system designed and purchased with the grant funding.

Spent nuclear fuel at the Hanford nuclear site.

Simpson expects the first advanced nuclear reactors that could use this recycling process could go online by the 2030s. Currently, there are 94 commercial nuclear reactors in the U.S. based on light water reactor technology that all told generate nearly 20% of the nation’s electricity each year. Some advanced reactors such as SFRs could use a fuel that is more suitable for recycling, as will be demonstrated in this project.

“This process will help pave the way for sustainable nuclear energy with minimal environmental impact and allow the U.S. to produce more energy while better addressing the global warming issue,” Simpson says. “We want to transition away from coal and natural gas to renewable and nuclear energy for producing electricity. This allows us to continue to use nuclear energy without worrying about this unsolved nuclear waste problem. Instead of just directly disposing it, we can recycle most of it and produce much less nuclear waste.”

The INL/University of Utah project is one of 11 to receive a total of $36 million for research from ARPA-E to increase the deployment and use of nuclear power as a reliable source of clean energy while limiting the amount of waste produced from advanced nuclear reactors.

This project is just the newest collaboration between researchers from the U’s College of Engineering and College of Mines and Earth Sciences and INL scientists who are developing new technologies for nuclear energy, communications, power grids, and more.

Last month, the University of Utah and INL announced a new formal research partnership between both institutions that will explore deeper research collaborations and expand opportunities for students, faculty, and researchers.

 

 

First published @ mse.utah.edu

 


 

Toxic Dust Hot Spots



Kevin Perry

Where is Great Salt Lake's toxic dust most likely to originate?

Professor Kevin Perry believes there are many "trigger points" that indicate when there is something wrong with the Great Salt Lake.

For instance, anyone who has come to the lake for recreation has recently found it impossible to launch watercraft as lake levels continue to reach all-time lows. Struggles for the vital brine shrimp industry and a possible collapse of the lake's base food chain are other alarms on the horizon, says Perry, a professor of atmospheric science at the University of Utah.

Toxic dust from the drying lakebed ultimately became one of the first alarms to captivate researchers, though. The Great Salt Lake contains arsenic and other metals that are naturally occurring, though some researchers say a portion may be human-caused. And as the lake shrinks, it has exposed some 800 square miles of lakebed, equivalent to the entire surface area of Maui.

Researchers are starting to identify places around the dried-up lake that are most likely to produce dust that is ultimately carried into Utah communities, Perry says. He pinpoints Farmington Bay in Davis County, Bear River Bay near Brigham City and Ogden, and the lake's northwest boundary in a remote part of Box Elder County as the three largest dust "hot spots."

Fragile eroding surface crust - Kevin Perry

These three locations have the highest potential of sourcing dust all over northern Utah for years to come unless there's a dramatic turnaround in the lake levels, Perry explained Tuesday evening in a presentation about dust concerns to the Utah Legislature's bipartisan Clean Air Caucus.

But before rushing into a panic, Perry told lawmakers there is still so much more research needed to fully understand the dust carried out of the dried Great Salt Lake, including if and how much of a role it plays in long-term health concerns.

Dust Hot Spots
There are certain spots within the 800 square miles of exposed lakebed with a higher potential to produce dust that is carried into Utah communities during storms. While winds typically impact areas east of the lake, like Wasatch Front communities, weather patterns can blow the dust into areas all over northern Utah.

"Everybody along the Wasatch Front (and Tooele Valley) is impacted at certain times," Perry said after Tuesday's meeting.

Perry's research over the years has focused on identifying how often lakebed dust is lofted into the atmosphere, as well as the concentration of dust in the air Utahns breathe, to understand public health impacts. That work has helped him pinpoint the areas where dust is most likely to be picked up.

Soils with higher amounts of erodible material like silt and clay are more likely to be picked up into the air. Farmington Bay, Bear River Bay and the "extreme" northwest quadrant of the lake have the highest levels of silt and clay of any exposed lakebed, with those materials making up at least 10% of soil samples. Most of it arrives from the lake's tributaries: the Jordan, Bear and Weber rivers.

Map of Dust Hot Spots - Kevin Perry

They are the same areas where the lake's surface crust is vulnerable. Perry explains that only about 9% of the exposed lakebed is actively producing dust, because roughly three-fourths of it is protected by a crust, such as a natural salt pan, that keeps the surface from breaking.

The dust comes from the remaining quarter, which either has no crust or a crust that is considered erodible. Human activity, such as illegal motor vehicle riding on the exposed lakebed, is one reason the crust breaks, and once the surface erodes, dust can blow freely in the wind.

Again, Farmington and Bear River bays emerge as hot spots, as well as Gilbert and Gunnison bays on the western corners of the lake. And while most of the lakebed is protected now, the amount of protection decreases every year it is exposed because of how fragile the crust is, Perry adds.

The Air Quality Threat
This dust is a problem simply because it raises particulate matter levels, something Utahns are accustomed to hearing about from wildfires and during the winter inversions that threaten Utah's air quality. But Perry cautions it is too early to know the dust's true human impact.

The lakebed contains levels of arsenic, lanthanum, lithium, zirconium, copper and other metals above the Environmental Protection Agency's residential and industrial standards. Of those, arsenic, which can increase the risk of a few diseases when there is chronic exposure, has the highest levels compared to EPA standards, according to Perry.

However, it is not yet clear how much of it people actually breathe in during a wind event. Dose levels, a calculation combining concentration, frequency and bioavailability, are needed to fully understand the true human risk.
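A minimal sketch of the dose idea Perry describes, assuming a simple multiplicative form: inhaled dose scales with airborne concentration, how often dust events occur, and what fraction of the metal is bioavailable. Every number and the functional form below are illustrative assumptions, not measured Great Salt Lake values:

```python
# Illustrative dose sketch: concentration x exposure frequency x bioavailability.
# All values below are hypothetical, chosen only to show the structure of the
# calculation researchers say is still missing.

def annual_dose(concentration_ug_m3, event_days_per_year, hours_per_event,
                breathing_rate_m3_hr=0.6, bioavailable_fraction=0.1):
    """Rough inhaled bioavailable mass, in micrograms per year."""
    inhaled = (concentration_ug_m3 * breathing_rate_m3_hr
               * hours_per_event * event_days_per_year)
    return inhaled * bioavailable_fraction

# e.g. 5 ug/m3 of arsenic-bearing dust, 20 events a year, 4 hours each
print(round(annual_dose(5, 20, 4), 1))  # prints 24.0
```

The point of the sketch is that every factor in it, concentration, event frequency, duration, and bioavailability, is currently unmeasured for lake dust, which is why Perry calls the risk a "potential concern" rather than a known one.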

This data is collected by the Utah Division of Air Quality, but Perry says it hasn't been analyzed to this point because of the cost: $27,500 per site annually. Until that analysis is available, researchers don't know any of the components in the dose-level equation, including how many days a year dust reaches surrounding communities or whether some communities are affected more than others.

This is why Perry emphasizes that what is in the dust should be considered a "potential concern." He likens this uncertainty to driving on an unfamiliar mountain road in the dark. Motorists are more likely to slow down and focus on the road ahead of them when they perceive a risk of driving off the roadway.

The same idea applies to the science of the Great Salt Lake.

"What we've done here is identify a risk," Perry says. "The risk is exposure to (the) heavy metal arsenic, and so what we need to do is step back and try and understand the significance of that risk. ... We need to do more research, we need to take more measurements but we need to be vigilant because there is a threat out there. We need to determine if that threat will be realized or not."

Representative Ray Ward

This is not a problem that might happen in the future, the lake is three-fourths of the way gone today and we really, really need to have a sustained focus on it over a longer period of time to make sure we put enough water into it.  - Rep. Ray Ward, R-Bountiful

 

Rep. Ray Ward, R-Bountiful, a member of the Clean Air Caucus, said after the meeting that the presentation didn't immediately spark any new bill ideas for the future; however, he said, it emphasizes the need for new state appropriations, which may include the cost of analyzing the air quality data for Great Salt Lake dust.

The Easiest Solution
But how does Utah avoid this potential concern? The easiest solution is refilling the lake, though that remains a daunting task given all the upstream diversions that take water out of the lake and the two-decade-long megadrought gripping Utah. That this is the easiest option says everything about how challenging it is to mitigate dust once a lakebed is exposed.

There are dozens of global examples of what can go wrong when a lake dries out but Owens Lake in California is the one that Perry pointed lawmakers to on Tuesday. The lake began to dry up when Los Angeles officials began diverting the lake's water sources into the Los Angeles Aqueduct.


California leaders have since spent over $2 billion trying to mitigate the health concerns associated with the dried lake dust. They eventually determined the only feasible solution was to refill the lake, Perry explains.

Refilling could take a long time to solve Utah's problems, though. Of the Great Salt Lake's four major areas of concern, Perry considers Farmington Bay the easiest to mitigate simply because it requires the least water to cover its surface area. The lake needs to gain about 10 feet of water to mitigate dust concerns in the bay, and barring an unforeseen shift in trends, that could take decades.

"Which means that we're going to be plagued by dust coming off the Great Salt Lake not just for a few years but likely for decades," he said.

That said, he's more optimistic about this solution now than just three years ago. He's seen Utahns show more interest in reducing water waste and state leaders take larger steps toward water conservation compared to the past. Tuesday's meeting featured four experts explaining ways to improve water quantity and air quality around the lake.

Ward agrees that the state will need to do more than refill the lake once to resolve its issues. The Utah Legislature directed $40 million toward getting more water to the lake in this year's legislative session, and more money and projects are needed to ensure water keeps flowing to the Great Salt Lake, Ward acknowledges.

But it's time and money worth spending given the known and potential risks Utah faces as the lake dries up.

"The big picture is we're in trouble with the lake right now," he said. "This is not a problem that might happen in the future, the lake is three-fourths of the way gone today and we really, really need to have a sustained focus on it over a longer period of time to ... make sure we put enough water into it."

 

by Carter Williams, first published @ KSL.com.

At-Risk Forests



Global analysis identifies at-risk forests.

Forests are engaged in a delicate, deadly dance with climate change, hosting abundant biodiversity and sucking carbon dioxide out of the air with billions of leafy straws. They can be a part of the climate solution as long as global warming, with its droughts, wildfires and ecosystem shifts, doesn’t kill them first.

In a study published in Science, William Anderegg, the inaugural director of the University of Utah’s Wilkes Center for Climate Science and Policy, and colleagues quantify the risk to forests from climate change along three dimensions: carbon storage, biodiversity and forest loss from disturbance, such as fire or drought. The results show forests in some regions experiencing clear and consistent risks. In other regions, the risk profile is less clear, because different approaches that account for disparate aspects of climate risk yield diverging answers.

 

William Anderegg

“Large uncertainty in most regions highlights that there's a lot more scientific study that's urgently needed.”

 

An international team

Anderegg assembled a team including researchers from the United Kingdom, Germany, Portugal and Sweden.

“I had met some of these folks before,” he says, “and had read many of their papers. In undertaking a large, synthetic analysis like this, I contacted them to ask if they wanted to be involved in a global analysis and provide their expertise and data.”

Their task was formidable: assess climate risks to the world’s forests, which span continents and climes and host tremendous biodiversity while storing an immense amount of carbon. Researchers had previously attempted to quantify risks to forests using vegetation models, relationships between climate and forest attributes, and climate effects on forest loss.

“These approaches have different inherent strengths and weaknesses,” the team writes, “but a synthesis of approaches at a global scale is lacking.” Each of the previous approaches investigated one dimension of climate risk: carbon storage, biodiversity, and risk of forest loss. For their new analysis, the team went after all three.

Three dimensions of risk

“These dimensions of risk are all important and, in many cases, complementary. They capture different aspects of forests’ resilience or vulnerability,” Anderegg says.

  • Carbon storage: Forests absorb about a quarter of the carbon dioxide that’s emitted into the atmosphere, so they play a critically important role in buffering the planet from the effects of rising atmospheric carbon dioxide. The team leveraged output from dozens of different climate models and vegetation models simulating how different plant and tree types respond to different climates. They then compared the recent past climate (1995-2014) with the end of the 21st century (2081-2100) in scenarios of both high and low carbon emissions. On average, the models showed global gains in carbon storage by the end of the century, although with large disagreements and uncertainty across the different climate-vegetation models. But zooming in to regional forests and taking into account models that forecast carbon loss and changes in vegetation, the researchers found higher risk of carbon loss in southern boreal (just south of the Arctic) forests and the drier regions of the Amazon and African tropics.
  • Biodiversity: Unsurprisingly, the researchers found that the highest risk of ecosystems shifting from one “life zone” to another due to climate change could be found at the current boundaries of biomes – at the current transition between temperate and boreal forests, for example. The models the researchers worked from described changes in ecosystems as a whole and not species individually, but the results suggested that forests of the boreal regions and western North America faced the greatest risk of biodiversity loss.
  • Disturbance: Finally, the authors looked at the risk of “stand-replacing disturbances,” or events like drought, fire or insect damage that could wipe out swaths of forest. Using satellite data and observations of stand-replacing disturbances between 2002 and 2014, the researchers then forecast into the future using projected future temperatures and precipitation to see how much more frequent these events might become. The boreal forests, again, face high risk under these conditions, as well as the tropics.

“Forests store an immense amount of carbon and slow the pace of climate change,” Anderegg says. “They harbor the vast majority of Earth's biodiversity. And they can be quite vulnerable to disturbances like severe fire or drought. Thus, it's important to consider each of these aspects and dimensions when thinking about the future of Earth's forests in a rapidly changing climate.”

Future needs

Anderegg was surprised that the spatial patterns of high risk didn’t overlap more across the different dimensions.

“They capture different aspects of forests' responses,” he says, “so they wouldn't likely be identical, but I did expect some similar patterns and correlations.”

Models can only be as good as the scientific understanding and data on which they’re built, and this study, the researchers write, exposes significant gaps in both that may contribute to the inconsistent results. Global models of biodiversity, for example, don’t incorporate dynamics of growth and mortality or include the direct effects of rising CO2 on species. And models of forest disturbance don’t include regrowth or species turnover.

“If forests are tapped to play an important role in climate mitigation,” the authors write, “an enormous scientific effort is needed to better shed light on when and where forests will be resilient to climate change in the 21st century.”

Key next steps, Anderegg says, are improving models of forest disturbance, studying the resilience of forests after disturbance, and improving large-scale ecosystem models.

The recently launched Wilkes Center for Climate Science and Policy at the University of Utah aims to provide cutting-edge science and tools for decision-makers in the U.S. and across the globe. For this study, the authors built a visualization tool of the results for stakeholders and decision-makers.

Despite uncertainty in the results, forests in western North America appear to face consistently high risk. Preserving them, Anderegg says, requires action.

“First we have to realize that the quicker we tackle climate change, the lower the risks in the West will be,” Anderegg says. “Second, we can start to plan for increasing risk and manage forests to reduce risk, like fires.”

Find the full study here.

 

by Paul Gabrielsen, first published in @theU.