Ron Perla, 2024 Distinguished Alumnus

Avalanche Escape Artist


September 4, 2024
Above: Ron Perla in the 1960s at a creep gage, built by U Geophysics' Bob Smith and team, ready to be covered with snow on a test slope next to the Alta Avalanche Study Center.

“I out-swam a size three avalanche down a gulley that had been artillery blasted,” reports Ron Perla to Wildsnow, a ski and snow reporting site. “It was my introduction to the post-control release.”

Ron Perla working on slab above Alta village, 1968. Credit: Charles Bradley, Montana State University

Recipient of the 2024 Distinguished Alumni award from the Department of Atmospheric Sciences, Perla graduated in 1971 with his PhD in meteorology from the University of Utah. As a snow scientist, he conducted research into avalanches and is well known for identifying “the thirty-degree threshold”: slopes of thirty degrees or steeper are much more likely to produce avalanches.
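The threshold Perla describes amounts to a simple classification rule. As an illustration only (a rule-of-thumb reading of the thirty-degree figure, not Perla's published model), a slope-angle check might look like:

```python
def avalanche_prone(slope_deg: float) -> bool:
    """Illustrative rule of thumb: slopes of roughly thirty degrees
    or steeper are far more likely to produce avalanches."""
    return slope_deg >= 30.0

print(avalanche_prone(25.0))  # False: below the thirty-degree threshold
print(avalanche_prone(38.0))  # True: steep enough to warrant concern
```

In practice avalanche hazard depends on far more than slope angle (snowpack structure, weather, aspect), which is exactly what Perla's research program investigated.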

Perla worked at Alta Ski Resort as a member of the ski patrol and in 1966 became a part-time snow ranger and part-time research assistant at the U.S. Forest Service (USFS) Alta Avalanche Study Center. As a research assistant to Ed LaChapelle, Perla researched slab properties, factors that contribute to an avalanche and rescue methods, among other things.

Early in the morning and during intense storms, snow rangers blast the mountain to reduce the risk of avalanches. Between these times, Ed LaChapelle allowed Perla to take classes at the U. From 1967 to 1971 Perla commuted between Alta and the university, splitting his time between snow rangering and his PhD program, supervised by Professor Shih-Kung Kao, which included classes in meteorology and applied mechanics, fundamental disciplines for avalanche research.

Perla’s advisor, along with the Department of Meteorology's chair Don Dickson, supported this unique combination of university study and avalanche work. Kao was a world-class specialist in atmospheric dynamics, turbulence and diffusion, while Dickson was a highly decorated World War II pilot with hands-on meteorology experience. Dickson helped Perla obtain a research grant from the Rockefeller Foundation and arranged for the donation of an old Alta Ski Lifts building, which was turned into a mountain meteorology lab.

Models of moving avalanches

Perla has also extensively researched snow structure and models of moving avalanches. His current research involves quasi-three-dimensional modeling of the internal structure of a moving avalanche, from start to stop, and he has modeled moving snow in many different ways. His first model (1980) followed the mass center of the moving snow; a 1984 model treated the avalanche as a collection of starting particles; his current model assumes the avalanche consists of snow parcels moving turbulently in three layers.

Ron Perla, U.S. Forest Service, 1968.

Along with his research, Perla has spent a lifetime in the snow. An avid skier and mountaineer, he partnered with Tom Spencer (U alum in mathematics) in 1961 for the first ascent of Emperor Ridge on Mt. Robson, the highest point in the Canadian Rockies. He also established a new route on the north face of the Grand Teton in Wyoming and a first ascent of the popular “Open Book” route on Lone Peak in the Wasatch Mountains.

“In 1967, I was working as a USFS Snow Ranger near the top of Mt. Baldy,” Perla says. “The cornice broke off prematurely, and I fell into a Baldy chute. The cornice blocks triggered a large avalanche. I was tumbled around with no chance of 'swimming,' and somehow I missed all of the rocks. Just before I lost consciousness under the snow, I managed to thrust an arm up to the surface. I was found quickly.”

Collective consciousness

Perla is an honorary member of the American Avalanche Association and a member of multiple snow and ice committees, such as the Snow, Ice, and Permafrost committee of the American Geophysical Union.

After earning his PhD at the U, Perla moved to Fort Collins, Colorado, as a research meteorologist for the USFS. In 1974, he moved to Alberta, Canada to work for the National Hydrology Research Institute. He has remained in Alberta since.

Perla is a significant reason why we understand snow science and avalanches, and why backcountry education has improved to help keep those who recreate in areas with snowfall — skiers, mountaineers, snowshoers and ice climbers — safe.

“Despite the enormous increase in backcountry use, despite increasing behavior to ski and ride lines we could never imagine in the 1960s, avalanche fatalities are not increasing to match those trends,” Perla says in an interview with Wildsnow. "Surely, associations, centers, websites, and educators, in general, are responding to match those trends. Surely it’s also because today’s risk-takers are increasingly more skillful backcountry skiers, riders, and [as in Perla's harrowing experience on Mt. Baldy] escape artists."

He continues, adding that "[e]quipment is improving. ... But there’s something else: call it collective consciousness in the backcountry. An increasing number of backcountry users correlates with increasing observations and tests. Thus, safety can be enhanced by numbers if there is increased communication...."

You can read Ron Perla's interview with Wildsnow here.

by CJ Siebeneck

Urban ‘Cool Zones’



August 14, 2024
Above: A poster created by Salt Lake County to promote cool zones. Credit: KSLNewsRadio

Daniel Mendoza brings science (and change) to the people.

Daniel Mendoza

A research associate professor in the Department of Atmospheric Sciences at the University of Utah, Daniel Mendoza is not your typical academic scientist. With an impressive list of publications, averaging a new paper each month, academic scholarship is only one of his accomplishments. Mendoza has become an environmental social justice advocate, leveraging his research to get the attention of politicians and legislators. Mendoza works at the intersection of what’s happening in the atmosphere and what’s happening on the ground in people’s lives.

This summer, Salt Lake has fallen victim to heat waves that mirror those throughout the United States. According to the CDC, extreme heat kills around a thousand people in the U.S. each year, more than any other naturally occurring hazard. The effects of heat are easily felt, but more insidious are the effects of increased concentrations of air pollutants, namely ozone.

Mendoza explains in an interview with @theU’s Lisa Potter that “ozone is dangerous because it basically causes a sunburn in your lungs that impacts respiratory and cardiovascular health.”

In a recent study, Mendoza and his team asked the question, “can cool zones protect individuals from heat and poor air quality?” “Cool zones” are public buildings that serve as environmental refuges for vulnerable people during periods of extreme heat. Places like recreation centers or libraries are good examples of cool zones; Mendoza chose the Millcreek Library as the location for his case study. 

Obviously, cool zones protect individuals from heat through air conditioning, but the study found that the Millcreek Library also reduced exposure to atmospheric ozone by around 80%.

Given their demonstrated efficacy, Mendoza is now critical of the current scope of cool zones. “We should be thinking about how to make these centers more accessible, for example, keeping them open for longer hours to protect people during the hottest parts of the day.” Many heat refuges close around 2-3 p.m. and aren’t open on weekends.

What people believe

Daniel Mendoza in the 2021 documentary "AWAiRE" that explores the impacts of air quality along the Wasatch Front. Credit: AWAIRE.

Mendoza understands that data alone is not convincing enough to enact change outside of the scientific community. “About 50% of people in the U.S. believe in climate change, but 100% believe in lung cancer, which is why I wanted to pivot from more climate drivers and greenhouse gas emissions and products towards more health criteria,” he says. Furthermore, he continues, “...150% of people believe in the dollar. I mean that’s ultimately what drives policy, what drives a lot of decision making.” 

It was during his Pulmonary and Critical Care Medicine Fellowship program at the U that Mendoza learned more about how to tie the social and basic sciences in with the health sciences. He finished the program in 2020 after completing a capstone project looking at the impact of air pollution on school absences.

On “orange” or “red” air quality index (AQI) days, students are often still sent outside for recess, resulting in many children experiencing respiratory symptoms and needing to be sent home. Missing school every so often because the air quality is poor doesn’t sound like a huge issue, but it adds up to impact the student as well as the school, its district and the city where they live, he explains.

“When you have repeat absenteeism, then the potential to graduate is much lower, the potential to go to college is much lower, then your tax base is lower,” says Mendoza. Increased school absences cost the city around half a million dollars a year in terms of reduced workforce, education costs and healthcare costs. 

The solution to this pervasive issue of children being sent home because of the deleterious effects of bad air was surprisingly simple: emergency asthma inhalers in every classroom, right next to the epinephrine auto-injectors branded “EpiPens.” Says Mendoza, “I worked with Representative Mark Wheatley,” chair of the Utah Asthma Task Force, “and we passed a law…. Utah became the 14th (or 15th) state that has emergency asthma inhalers in every single school.”

Now on bad air days, instead of being sent home, students can use a rescue inhaler and remain at school, placing less of an economic burden on the city and gaining more time to learn. It’s a health-issue solution, based on atmospheric data, that changes policy and in turn saves taxpayer dollars.

Empowering the Community 

Mendoza soon discovered what others had already discovered or at least suspected: that certain populations in the city were more endangered than others. What distinguished those populations were lower income brackets and racial and ethnic inequities. When he first moved to Salt Lake City, Mendoza was excited about the buzz around air quality. “I thought, this is great. My research is going to be welcomed by the community,” he recalls. Instead, he discovered that these events were forgetting a key part of the problem: the people who are most impacted.

Mendoza started attending community-based informational gatherings about climate change and the environment. “All of these events are held east of State Street. They were all in English. No one looked like me. Then at the end of the talk, the conclusion was ‘buy electric vehicles and solar panels and we’ll save the world together.’ Well that doesn’t work for everyone.” 

Not only is there a disparity in the communities affected by poor air quality, there is an inequality in accessible solutions to the problem. “For most of them, air quality is not a top priority… they don’t have the luxury of learning like we do,” says Mendoza of those who are most likely to be impacted by bad air quality. 

The first step in empowering the community and addressing this imbalance was to bring science to them. Mendoza began organizing outreach events, this time on the west side of State Street, held in both Spanish and English. 

“We provide them with actionable solutions. For example, we partnered with Utah Clean Energy, and we did an LED exchange where people brought in their normal light bulbs,” he says. Another switch he facilitated was to low-flow showerheads.

And yet another initiative included a furnace filter exchange with 100 homes in Salt Lake County. When indoor air was tested for 43 potentially problematic elements, researchers found elevated levels of uranium, lanthanides, arsenic and lead, “all the nasties.”

Those “nasties” come from a variety of sources. “If you’re close to a highway, for example, you [breathe in] more aluminum, associated with brake wear,” says Mendoza of the indoor air quality study, the first study of its kind. “When was the last time you sat outside for eight hours? You spend 90% of your time indoors and 60% of your time in your home, roughly speaking.”

“The people that we really are very concerned about are, for example, the delivery drivers, who are constantly in that traffic, road construction workers as well. Those people are breathing [in] literally every single car’s tailpipe.” 

‘Run back inside’

Inequities in who breathes bad air require a close look at why and how bad air gets ingested. Those with more and better resources can think about these issues, involving bad air and what used to be only seasonal atmospheric inversions along the Wasatch Front, and then “just run back inside and we’re fine. But very few studies have been done on these concentrated pollution sources, again in conjunction with what they may be exposed to ‘naturally.’”

From the 2021 documentary "AWAiRE." Credit: AWAIRE.

Those studies are being done by Mendoza and others, then made actionable through on-the-ground initiatives that switch out devices that are less effective and cost more money, in the populations most threatened by breathing bad air.

These simple switches in affordable fixtures, for example, have tangible and meaningful impacts that inspire other actions, other policy decisions leading to better health outcomes. 

“Participants in these gatherings soon became community leaders to help others improve their situation,” says Mendoza, another favorable result of his work. And then there is the financial incentive, that tongue-in-cheek statistic that 150% of people do in fact “believe in the dollar.”

“These community members, they have to earn income to survive,” he reminds us. “They see their electric bills go down, they see their heating bills go down, they see their water bills go down, and they realize ‘Oh, okay, so it works. Let me tell all my friends about it.’”

Costs of inaction

Policy-makers and the public in general often look at the costs of solutions to problems that require action, but sometimes they forget about the costs of inaction.

Regardless of whether the focus of a study is cool zones, compounding wildfire emissions, or, most recently, the eBus project, a main tool for fine-scale carbon emissions measurements in urban environments, Mendoza approaches each new inquiry with the same goal: “I want to make sure that my science gets understood by the general public. I want to write in as plain English as possible, because ultimately, I want to enact change, I want my work to do change.”

Mendoza challenges the stereotypical ideal of a mad scientist locked away in a lab and detached from reality. Instead, he is present on campus, in the community, and at the state capitol building using science to advocate for justice.

Daniel Mendoza holds joint positions as research associate professor in atmospheric sciences; adjunct associate professor in internal medicine; and adjunct associate professor in City & Metropolitan Planning at the University of Utah.

by Lauren Wigod 

Read more on the 2021 documentary "AWAiRE," featuring Daniel Mendoza in @TheU

 

Solving the Puzzle of Utah’s Summer Ozone



July 29, 2024
Above: A view of Salt Lake City shot from NOAA’s research aircraft. Credit: NOAA.

The Salt Lake Valley’s summertime ozone pollution is a complicated puzzle because so many different kinds of emissions contribute to the problem, which in turn is affected by the time of day or year, the weather and many other factors.

Without knowing which emissions are most culpable or understanding the role of the region’s topography, solutions to Utah’s ozone mess will remain elusive. In collaboration with University of Utah faculty and funding from the state, the National Oceanic and Atmospheric Administration (NOAA) is helping find answers.

A team of NOAA scientists is in Salt Lake City for the next few weeks gathering masses of air quality data expected to yield new insights that could help bring relief. Building on a long record of air quality data compiled by U scientists and the Utah Division of Air Quality (DAQ) over several years, this new snapshot of data should help illuminate what is driving elevated ozone levels along the Wasatch Front, according to Steven Brown, one of the NOAA research chemists leading the Utah Summer Ozone Study.

John Lin, professor of atmospheric sciences, on the roof of the Browning building where a phalanx of air quality monitoring instruments are stationed. Photo credit: Brian Maffly.

“Every city in the United States has an ozone problem, but every city is also different in terms of the sources that contribute to that ozone. And Salt Lake is no exception in that regard,” Brown said. “We’re certainly trying to understand the influence of wildfires. But then you’ve got this mix of industrial and urban sources in a valley with very unusual meteorology. We’re trying to characterize all those sources. What does that meteorology look like, and how do those things combine to produce the unique ozone problem that affects Salt Lake City?”

NOAA’s multi-platform study is being coordinated with the U’s Utah Atmospheric Trace Gas & Air Quality (UATAQ) lab, headed by John Lin, a professor of atmospheric sciences. Also involved is Lin’s colleague Gannet Hallar, whose students are launching weather balloons and providing weather forecast briefings most days of the study to support NOAA’s regular overflights.

While Utah has made strides reducing the severity of its particulate-pollution-trapping winter inversions, summertime ozone has worsened to the point that Salt Lake City is out of attainment of the federal standard.

The primary ozone precursors are volatile organic compounds, or VOCs, which are emitted from countless sources—including oil refineries, gas stations, wildfire, paints, even personal care products, like deodorant—and nitrogen oxides, or NOx, a product of combustion.

“Photons are needed to break up certain molecules, so the reactions typically will not happen without sunlight,” said John Lin, the associate director of the Wilkes Center for Climate Science & Policy. “It essentially chops up those chemical bonds. Then ozone reacts with other things and levels get lower at night.”

Read the full article by Brian Maffly in @TheU.

Satellite measurements of carbon emissions

Monitoring Urban Carbon Emissions at the Global Scale


July 30, 2024
Above: A map of the 77 cities at which the urban emissions monitoring framework was applied.

“We’re starting to see a globally consistent system to track [carbon] emission changes take shape,” says atmospheric scientist John Lin.

A faculty member in the University of Utah's Department of Atmospheric Sciences, Lin is co-author of a paper in the journal Environmental Research Letters about a new satellite-based system for measuring CO2 emissions in support of global collective climate mitigation actions. As nations and cities continue to state their intentions to decarbonize and become carbon-neutral, “we want to be able to see it happen from space.”

Now we have a system to do so. 

That system is the culmination of standing on the shoulders of previous data scientists. It’s a story about how data is collected, interpreted and expanded through new technologies. It’s also about how this recursive process — now turbocharged with the advent of machine learning and AI — creates a space for potential application, innovation and policy that can change our world for the better, including mitigating carbon emissions that are warming our earth at a startling and deleterious rate.

But before any attempt can be made to save the planet, scientists have to secure a consistent measurement framework to better understand what’s happening as well as where it’s happening and how much.

The Backstory

John Lin

The backstory begins in the Pacific Ocean. Tracking carbon emissions dates back decades to a single site in Hawai’i where, on a largely inactive volcano on the Big Island, instruments measured carbon dioxide in the atmosphere. At a high elevation, the site was very good at characterizing broad-scale changes in carbon dioxide globally, a “poster child for climate change because over time,” explains Lin, who is also associate director of the Wilkes Center for Climate Science and Policy, “we know that from these Hawai’i measurements, CO2 has this distinct cycle, seasonally, but then this upward trend due to all of us burning fossil fuels.”

Human-caused carbon emissions are not only leading to CO2 buildup everywhere in the atmosphere but the issue is widespread in public discourse. Whether it’s on the micro level of mitigating one’s personal “carbon footprint” by taking the bus, or on the meta level of international initiatives like the Kyoto Accords or the United Nations-brokered Paris Agreement, the effects of carbon emissions are on everyone’s mind. A cascade of cities and whole nations have established goals for mitigating emissions, but their estimates of carbon emissions have been relying on data that are inconsistent and sometimes missing altogether in parts of the world. 

That cities have individually established and even accelerated their carbon-neutral goals is a good thing, considering that over 70 percent of human-emitted CO2 stems from cities around the globe.

Tracking progress toward city-scale emissions reduction targets is essential because it provides “actionable information for policy makers,” the paper states. At the same time, the authors acknowledge that earlier measurements and claims from municipal entities are based on “self-reported emissions inventories,” whose methodology and input data often differ from one another. These practices hamper “understanding of changes in both city-scale emissions and the global summation of urban emissions mitigation actions.”

Orbiting Carbon Observatory

This is where outer space comes into play and, in particular, the Orbiting Carbon Observatory (OCO). The NASA mission is designed to make space-based observations of carbon dioxide in Earth’s atmosphere to better understand the characteristics of climate change. After a literal “failure to launch” in 2009, NASA successfully placed a satellite (OCO-2) in orbit in 2014 with equipment measuring CO2 emissions from space. Satellite-transmitted data promised an independent way to calculate, globally, emissions from cities. Not surprisingly, it has taken a while to learn how to use the data. In 2020 a graduate student in Lin’s research group, Dien Wu, developed early methods and did exactly that, looking comprehensively at a total of twenty cities around the world.

Based on essentially the same data set used by Lin and Wilmot in their current paper, but with fewer years, Wu was able to estimate the amounts of human-emitted CO2 from OCO-2 satellite transmissions. Separating the carbon that human activity emits to the atmosphere from that exchanged by urban vegetation has since been accomplished by extending the analyses over additional years, work by Lin’s team of researchers that included a later graduate student, Kai Wilmot, co-author of the current study.

In this round, four times as many urban areas as Wu studied, distributed over six continents, have been assessed. The plant/human conundrum is further complicated by vegetation outside the city, which has very different characteristics from vegetation inside the city. The difference creates patterns of CO2 that have to be removed to distill the human component.

Strangely beautiful animations

Kai Wilmot

In short, the findings by Lin and company, published in Environmental Research Letters, represent a new capacity based on recent developments in modeling. And the animations of the assembled and interpreted satellite CO2 data delivered by the team are startling, even strangely beautiful. In one chart, the left side displays latitude vs. CO2. “This narrow swath,” explains Lin, indicates “each time … [the satellite] orbits. There's this narrow slice of data that becomes available.”

Using that data, he continues, “the NASA scientists can construct this nice animation of CO2 change in each latitude band over time.” Lin points to what he calls “ridges and valleys” on the chart that represent the seasonal cycle, and he personifies the entire Earth as if it is “breathing in the carbon dioxide through photosynthesis during the summer growing season and then releasing it in the winter. They have these very sharp ridges — high CO2, low CO2, higher CO2 [the breaths] — but overall, the rug is going up, because we're emitting carbon dioxide into the atmosphere.”

Here, researchers are only looking at a small fraction of data points, the ones that intersect the targeted cities. They then do a more detailed look at whether they’re seeing a signal or not and whether they’re getting enough data.

“Personally,” says Wilmot, “I think the particularly neat aspect of this work is the capacity for global application. Leveraging satellite data and atmospheric modeling, we are able to gain some insight into urban emissions at cities around the world. We can see interactions between these emissions and socioeconomic factors, and we can identify large changes in emissions over time.”

 

The possibilities of creating more rigorous models, and more revealing data about how much carbon cities emit to the atmosphere, are tantalizing. And so are the findings of the research. “This kind of information can be used by cities and the UN process,” Lin says. “But I’m pretty sure what they want is something more dynamic through time, how these emissions evolve. And also, probably more frequent updates.” As it was, in this study researchers had to aggregate multiple years of data to get enough points for each city. “So the challenge, I think, is to be able to track more dynamically these emissions over time.”

More to come

NASA’s next iteration of the Orbiting Carbon Observatory — OCO-3 — has already been successfully docked on the International Space Station, although it was de-installed for a period of time recently to allow another instrument to carry out measurements. (It turns out that prime real estate on the crowded station is, well, at a premium.) But new data is forthcoming.

Meantime, researchers have their work cut out for them in the data crunching/parsing/interpreting part of this saga. Scientists typically accrue data far faster than they are able to use and interpret them … and create cool animations for general consumption.

A log-log plot of the scaling relationship between direct emissions per capita and effective population density for all 77 cities.

“Naturally,” concludes Lin, “to bend the curve in terms of trying to reduce carbon emissions in cities is a primary focus. And there's a lot of excitement and social energy around reducing carbon emissions in cities, including here in Salt Lake. Many mayors have pledged carbon reduction plans, and the University of Utah has their own [pledge]. Lots of cities have very ambitious goals to reduce carbon.”

For Wilmot, this project will only add to the increased “social energy” around the issue of carbon emission mitigation. Satellite measurement will help chart a path toward monitoring urban emissions at the global scale and identifying effective policy levers for emissions reductions. “Of course, realizing this monitoring ability is contingent on further development of the modeling, satellite observations, and a number of necessary input datasets,” he says. “So by no means am I saying that we are there already.”

Clearly, this research has shown that the co-authors’ multi-component satellite framework is capable of monitoring CO2 emissions across urban systems and identifying relevant driving factors. Their analysis not only pulled out emissions data for individual cities but, because it is global, allowed pattern analyses. In fact, using an established relationship between emissions per capita and population density, the researchers were able to plot from the data what happened, emissions-wise, during the COVID shutdown.
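The relationship behind that pattern analysis, plotted log-log in the accompanying figure, is a power law: a straight line in log-log space whose slope is the scaling exponent. As a minimal sketch only, with invented numbers rather than the study's data, such an exponent could be fit like this:

```python
import numpy as np

# Hypothetical per-city values, NOT the study's data:
# effective population density (people/km^2) and direct emissions per capita (tCO2/yr).
density = np.array([500.0, 1200.0, 3000.0, 7000.0, 15000.0])
emissions_pc = np.array([9.0, 7.2, 5.5, 4.1, 3.2])

# A power law E = a * D^b is linear in log-log space: log E = log a + b * log D,
# so a degree-1 polynomial fit of the logged values recovers the exponent b.
b, log_a = np.polyfit(np.log(density), np.log(emissions_pc), 1)
print(f"scaling exponent b = {b:.2f}")  # negative: denser cities emit less per person
```

A sub-zero exponent in a fit like this is what makes the log-log plot slope downward; any real estimate would of course come from the paper's 77-city dataset.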

But, as co-author Kai Wilmot implies about the work yet to be done, the ending to this story — from the Hawaiian Islands to outer space — is not quite “mission accomplished.”

“It’s more like mission half-accomplished,” John Lin concedes, “which is often the case in research.”

By David Pace

Read the complete paper in Environmental Research Letters.  

 

Scientists use AI to predict a wildfire’s next move



July 29, 2024

University of Utah atmospheric scientist Derek Mallia joins seven other researchers at the University of Southern California and elsewhere in developing a new method to accurately predict wildfire spread.

By combining satellite imagery and artificial intelligence, their model offers a potential breakthrough in wildfire management and emergency response.

Detailed in an early-release study published in Artificial Intelligence for the Earth Systems, the USC model uses satellite data to track a wildfire's progression in real time, then feeds this information into a sophisticated computer algorithm that can accurately forecast the fire's likely path, intensity and growth rate.

Above: Derek Vincent Mallia, Department of Atmospheric Sciences.

The study comes as California and much of the western United States continue to grapple with an increasingly severe wildfire season. Multiple blazes, fueled by a dangerous combination of wind, drought and extreme heat, are raging across the state. Among them, the Lake Fire, the largest wildfire in the state this year, has already scorched over 38,000 acres in Santa Barbara County.

Reverse-engineering wildfire behavior with AI

The researchers began by gathering historical wildfire data from high-resolution satellite images. By carefully studying the behavior of past wildfires, the researchers were able to track how each fire started, spread and was eventually contained. Their comprehensive analysis revealed patterns influenced by different factors like weather, fuel (for example, trees, brush, etc.) and terrain.

They then trained a generative AI-powered computer model known as a conditional Wasserstein Generative Adversarial Network, or cWGAN, to simulate how these factors influence how wildfires evolve over time. They taught the model to recognize patterns in the satellite images that match up with how wildfires spread in their model.

They then tested the cWGAN model on real wildfires that occurred in California between 2020 and 2022 to see how well it predicted where the fire would spread.

Read the rest of the story in ScienceDaily.

Rethinking Carbon Offsets

Rethinking the Carbon Offsets Market


July 18, 2024

 

Around 1989, an energy company explored whether it could plant trees in Guatemala and then use the carbon absorbed by those trees to offset the emissions of a new coal-fired power plant in the United States.

Libby Blanchard

It was the dawn of carbon offsetting: emitting in one place, then reducing or removing emissions elsewhere and calling that climate-neutral.

Following the Kyoto Protocol negotiations in 1996-97, industrialized countries, including the U.S., picked up on the idea of carbon crediting and carbon offsetting and explored flexible market mechanisms that, according to Libby Blanchard, would potentially make it more economically feasible for industrialized countries to meet emissions goals, later including the carbon-reduction metrics of the 2015 United Nations-brokered Paris Agreement.

Three and a half decades after that first experiment in Guatemala with carbon offsets, the idea seems to have hit an inflection point. “A carbon credit becomes an offset when it’s used to trade against emissions somewhere else,” explains Blanchard, a postdoctoral research associate at the Wilkes Center for Climate Science & Policy and the School of Biological Sciences here at the University of Utah. “And a carbon credit is supposed to be one ton of carbon dioxide equivalent reduced or removed from the atmosphere over a predetermined period of time. The big problem with carbon credits is that a large majority are not real, or are what we call over-credited, or both, meaning that they’re not representing or are over-representing the amount of carbon dioxide equivalent actually reduced or removed from the atmosphere."

In this episode of the Talking Climate podcast, produced by the Wilkes Center for Climate Science & Policy, Ross Chambless, the Wilkes Center’s community engagement manager, interviews Blanchard about a new “Contribution Approach” that could replace the struggling carbon offsets market.

Read more about nature-based climate solutions in an article published in One Earth.

Listen to the full podcast and view the transcript.

Watch a video with Libby Blanchard below.

Restoring the GSL & Environmental Justice

THE SOCIAL & ECOLOGICAL IMPACTS OF GSL RESTORATION


June 24, 2024
Above: Satellite image of the Great Salt Lake

Inland seas around the world are drying up due to increasing human water use and accelerating climate change, and their desiccation is releasing harmful dust that pollutes the surrounding areas during acute dust storms.

Using the Great Salt Lake in Utah as a case study, researchers show that dust exposure was highest among Pacific Islanders and Hispanic people, lowest among white people, and higher for individuals without a high school diploma. Restoring the lake would benefit everyone in the vicinity by reducing dust exposure, and it would also decrease the disparities in exposure between racial/ethnic and socioeconomic groups. These results were reported June 21 in the journal One Earth, co-authored by University of Utah researchers in the College of Science and the College of Social & Behavioral Sciences.

"People here in Utah are concerned about the lake for a variety of reasons -- the ski industry, the brine shrimp, the migratory birds, recreation -- and this study adds environmental justice and the equity implications of the drying lake to the conversation," says first author and sociologist Sara Grineski of the University of Utah. "If we can raise the levels of the lake via some coordinated policy responses, we can reduce our exposure to dust, which is good for everyone's health, and we can also reduce the disparity between groups."

The Great Salt Lake has been steadily drying since the mid-1980s, exposing its dry lakebed to atmospheric weathering and wind. Previous studies have shown that dust emissions from drying salt lakes produce fine particulate matter (PM2.5), which is associated with numerous health effects and is the leading environmental cause of human mortality worldwide.

"We know that the dust from these drying lakes is very unhealthy for us, so the question becomes, what does that mean in terms of people's exposure to the dust, and what does it mean in terms of inequalities in exposure to that dust," says Grineski. "Are some people more likely to have to suffer the consequences to a greater degree?"

To answer this question, Grineski teamed up with a multidisciplinary group that included U atmospheric scientists, geographers and biologists, among them Derek V. Mallia, Timothy W. Collins, Malcolm Araos, John C. Lin, William R.L. Anderegg and Kevin Perry.

You can read the full story in ScienceDaily.
Read more about this research in an article by Brian Maffly in @TheU, and stories in The Standard Examiner and at Fox 13.

L.S. Skaggs Applied Science Building Named at the U

L.S. SKAGGS APPLIED SCIENCE BUILDING NAMED AT THE U


May 28, 2024
Above: Rendering of the new L.S. Skaggs Applied Science Building

The ALSAM Foundation has made a substantial gift toward the latest addition to the science campus at the University of Utah: the L.S. Skaggs Applied Science Building.

The 100,000-square-foot building will include modern classrooms and instruction spaces, cutting-edge physics and atmospheric science research laboratories, and faculty and student spaces. Scientists in the new building will address urgent issues, including energy, air quality, climate change, and drought. The building’s naming honors L.S. “Sam” Skaggs, the philanthropist and businessman whose retail footprint spread across the Mountain West and the U.S.

Building Construction - April 30, 2024

Expressing profound gratitude for the transformative gift, Peter Trapa, Dean of the College of Science, shared, “We deeply appreciate The ALSAM Foundation’s extraordinary generosity. This gift is a testament to the value the organization places on higher education and its transformational impact on students and communities. It continues the Skaggs family's legacy in Utah and at our state’s flagship university. The new L.S. Skaggs Applied Science Building, a beacon of scientific innovation, will play an essential role in educating students in STEM programs throughout the University of Utah. This much-needed building allows the U to expand its STEM capacity and continue to serve our region’s expanding workforce needs.”

The construction of the L.S. Skaggs Applied Science Building is part of the Applied Science Project, which also includes the renovation of the historic William Stewart Building. The overall project is scheduled to be completed by next summer. Combined with the Crocker Science Center and a new outdoor plaza abutting the historic Cottam’s Gulch, the three buildings and outdoor space will comprise the Crocker Science Complex, named for Gary and Ann Crocker.

The Skaggs family has a long history of supporting universities through The ALSAM Foundation, including the University of Utah. Other ALSAM Foundation-supported projects at the U include the L.S. Skaggs Pharmacy Research Institute, housed in the Skaggs Pharmacy Building, and the Aline S. Skaggs Biology Building, named after Mr. Skaggs’s wife.

The ALSAM Foundation issued the following statement: “The ALSAM Foundation and the members of the Skaggs family are pleased to continue the legacy of Mr. Skaggs at the University of Utah. The Applied Science Project will benefit STEM education, which was one of the goals of Mr. Skaggs.”

Researchers Look to Origins of New Particle Formation

RESEARCHERS LOOK TO ORIGINS OF NEW PARTICLE FORMATION


May 24, 2024
Above: ARM’s ArcticShark soars overhead, capturing measurements to document new particle formation and turbulence in the atmospheric boundary layer. Photo is by Tomlinson.

FIRST USER-DRIVEN ARCTICSHARK CAMPAIGN TAKES FLIGHT IN OKLAHOMA

In the complex dance of atmospheric processes affecting Earth’s energy balance, new particle formation (NPF) is emerging as a center-stage performer—one that helps determine, on a global scale, how clouds absorb and reflect solar radiation. While some aerosols found in the atmosphere are emitted directly as particles from natural or human sources, other aerosols form in the atmosphere from condensation of gases, such as sulfuric acid, that were themselves emitted by various sources. Scientists are studying how often NPF occurs in the atmosphere, and how it contributes to the formation of cloud condensation nuclei. These seed-like particles are where water vapor condenses to make clouds and precipitation.

Gerardo Carrillo-Cardenas (left) and Gannet Hallar, posing together on the University of Utah campus, are co-leading a field campaign that uses ARM’s ArcticShark uncrewed aerial system (UAS) in Oklahoma. Photo is courtesy of Hallar.

On May 6, 2024, a small research team from the University of Utah launched Turbulent Layers Promoting New Particle Formation, an Atmospheric Radiation Measurement (ARM) user facility field campaign designed to help scientists better understand the relationship between turbulence and NPF.

“This campaign is unique,” says co-principal investigator Gannet Hallar, a fan of the low- and slow-flying measurement platform. “We will be able to observe these atmospheric processes on the ground and in the air.” Working with Hallar, an ARM data veteran, is her PhD student and co-principal investigator Gerardo Carrillo-Cardenas. They are starting with an established fact: that within the lower troposphere, commonly called the atmospheric boundary layer, turbulent mixing can help initiate NPF.

Hallar and Carrillo-Cardenas are building upon previous work (Siebert et al. 2004, Wehner et al. 2010, and Wu et al. 2021) that considered the possibility of particle formation from intense mixing between the residual layer and the growing atmospheric boundary layer. “We are really seeking to understand how the movement of the atmosphere itself, at a small scale, impacts the formation of aerosols,” says Hallar, “and what chemical components are needed to spark that formation.”

The ArcticShark is equipped with an aerosol instrument package to collect the data needed to address the campaign’s science questions. This package includes a portable optical particle spectrometer and a miniaturized scanning electrical mobility sizer. The Utah team is also taking advantage of ground-based instruments at ARM’s Southern Great Plains (SGP) observatory, including the Aerosol Observing System, basic meteorological measurements, regular radiosonde launches, and remote sensing instruments such as Raman lidars and ceilometers.

The U.S. Department of Energy’s Atmospheric System Research (ASR) program is funding the project. The objective of the ASR project is to examine ARM data globally and better understand NPF’s contribution to cloud condensation nuclei.

Read the full article by Mike Wasem, staff writer at Pacific Northwest National Laboratory, in ARM: Dept. of Energy.

U of U Part of $6.6M National Weather Forecasting Initiative

U of U Included in $6.6M National Weather Forecasting Initiative


The partnership with NOAA and other universities aims to improve predictive weather models

The University of Utah is part of a six-institution consortium recommended to receive up to $6.6 million from the National Oceanic and Atmospheric Administration (NOAA) to improve weather forecasting through enhanced data assimilation methods.

The six institutions recommended for funding will work together under the new Consortium for Advanced Data Assimilation Research and Education (CADRE). CADRE is led by the University of Oklahoma and includes Colorado State University, Howard University, the University of Maryland, Pennsylvania State University and the University of Utah.

Dr. Zhaoxia Pu

"This NOAA funding allows our researchers to collaborate with leading experts across the country to tackle a key challenge in data assimilation methodology," said Atmospheric Sciences Professor Zhaoxia Pu, the Principal Investigator of the University of Utah for CADRE. "By improving data assimilation techniques, we can help make more accurate weather forecasting."

Data assimilation combines observational data sources like satellite, surface, air and ocean measurements with numerical weather prediction models to generate comprehensive analyses of evolving weather systems. This blending of information yields better estimates of the atmospheric state and corrects forecast models in real time, thus enhancing projections of weather extremes such as storm paths, intensities and precipitation.
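The core idea of that blending can be illustrated with a one-variable sketch (hypothetical numbers, not NOAA's UFS code): the forecast and the observation are combined with weights set by their error variances, so the more certain source counts for more.

```python
# Minimal one-variable data assimilation sketch (invented numbers):
# blend a model forecast with an observation, trusting each in inverse
# proportion to its error variance.
forecast, var_forecast = 20.0, 4.0   # model temperature (deg C) and error variance
obs, var_obs = 22.0, 1.0             # observed temperature and error variance

# Kalman-style gain: how far to move the forecast toward the observation
gain = var_forecast / (var_forecast + var_obs)
analysis = forecast + gain * (obs - forecast)
print(analysis)  # lands between forecast and obs, closer to the more certain one
```

Operational systems apply the same principle to millions of model variables at once, with the gain generalized to a matrix that spreads each observation's influence across the full model state.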

Despite major forecasting accuracy improvements in recent decades, upgraded data assimilation methods are needed to leverage new technological capabilities like artificial intelligence. The CADRE consortium will focus its efforts on advancing the data assimilation components of NOAA's Unified Forecast System (UFS), a community-based, coupled, comprehensive Earth-modeling system.

Pu’s team will focus its research on coupled data assimilation to improve weather forecasting from short-range to subseasonal-to-seasonal time scales. Atmospheric processes are significantly influenced by interactions with the land and ocean, so the team will develop effective coupled data assimilation methods to better represent land-atmosphere-ocean interactions within NOAA's UFS. Pu will also dedicate time to training graduate students through research projects, outreach activities with NOAA laboratories and the University of Reading, UK, and on-campus lectures on data assimilation methods. Students from the City College of New York will also participate in training activities.

"Data assimilation is a comprehensive scientific topic involving various types of data, data science and numerical modeling strategies. I welcome interactions and collaborations in atmospheric sciences, mathematics, physics and AI data science disciplines both on campus and beyond," Pu stated.

The $6.6 million will be funded through the Inflation Reduction Act and is part of the Biden Administration's Investing in America initiative. To learn more about this announcement, read the official NOAA release.

By Bianca Lyon