Urban ‘Cool Zones’



August 14, 2024
Above: A poster created by Salt Lake County to promote cool zones. Credit: KSLNewsRadio

Daniel Mendoza brings science (and change) to the people.

Daniel Mendoza

A research associate professor in the Department of Atmospheric Sciences at the University of Utah, Daniel Mendoza is not your typical academic scientist. With an impressive list of publications, averaging a new paper each month, academic scholarship is only one of his accomplishments. Mendoza has become an environmental social justice advocate, leveraging his research to get the attention of politicians and legislators. He works at the intersection between what’s happening in the atmosphere and what’s happening on the ground in people’s lives.

This summer, Salt Lake City has fallen victim to heat waves that mirror those throughout the United States. According to the CDC, extreme heat kills around a thousand people in the U.S. each year, more than any other naturally occurring hazard. The effects of heat are easily felt, but more insidious are the effects of increased concentrations of air pollutants, namely ozone.

Mendoza explains in an interview with @theU’s Lisa Potter that “ozone is dangerous because it basically causes a sunburn in your lungs that impacts respiratory and cardiovascular health.”

In a recent study, Mendoza and his team asked the question, “Can cool zones protect individuals from heat and poor air quality?” “Cool zones” are public buildings that serve as environmental refuges for vulnerable people during periods of extreme heat. Places like recreation centers or libraries are good examples; Mendoza chose the Millcreek Library as the location for his case study.

Obviously, cool zones protect individuals from heat through air conditioning, but the study found that the Millcreek Library also reduced exposure to atmospheric ozone by around 80%.

Given their demonstrated efficacy, Mendoza is now critical of the current scope of cool zones. “We should be thinking about how to make these centers more accessible, for example, keeping them open for longer hours to protect people during the hottest parts of the day.” Many heat refuges close around 2-3 p.m. and aren’t open on weekends.

What people believe

Daniel Mendoza in the 2021 documentary "AWAiRE" that explores the impacts of air quality along the Wasatch Front. Credit: AWAIRE.

Mendoza understands that data alone is not convincing enough to enact change outside of the scientific community. “About 50% of people in the U.S. believe in climate change, but 100% believe in lung cancer, which is why I wanted to pivot from more climate drivers and greenhouse gas emissions and products towards more health criteria,” he says. Furthermore, he continues, “...150% of people believe in the dollar. I mean that’s ultimately what drives policy, what drives a lot of decision making.” 

It was during his Pulmonary and Critical Care Medicine Fellowship program at the U that Mendoza learned how to tie the social and basic sciences to the health sciences. He finished the program in 2020 after completing a capstone project on the impact of air pollution on school absences.

On “orange” or “red” air quality index (AQI) days, students are often still sent outside for recess, resulting in many children experiencing respiratory symptoms and needing to be sent home. Missing school every so often because the air quality is poor doesn’t sound like a huge issue, but it adds up to impact the student as well as the school, its district and the city where they live, he explains.

“When you have repeat absenteeism, then the potential to graduate is much lower, the potential to go to college is much lower, then your tax base is lower,” says Mendoza. Increased school absences cost the city around half a million dollars a year in terms of reduced workforce, education costs and healthcare costs. 

The solution to this pervasive issue of children being sent home because of bad air was surprisingly simple: emergency asthma inhalers in every classroom, right next to the epinephrine auto-injectors branded as “EpiPens.” Says Mendoza, “I worked with Representative Mark Wheatley,” chair of the Utah Asthma Task Force, “and we passed a law…. Utah became the 14th (or 15th) state that has emergency asthma inhalers in every single school.”

Now on bad air days, instead of being sent home, students can use a rescue inhaler and remain at school, placing less of an economic burden on the city and gaining more time to learn. It’s a health solution based on atmospheric data that changes policy and in turn saves taxpayer dollars.

Empowering the Community 

Mendoza soon discovered what others had already discovered, or at least suspected: certain populations in the city were more endangered than others, set apart by lower incomes and racial and ethnic inequities. When he first moved to Salt Lake City, Mendoza was excited about the buzz around air quality. “I thought, this is great. My research is going to be welcomed by the community,” he recalls. Instead, he discovered that community events on the issue were overlooking a key part of the problem: the people most impacted.

Mendoza started attending community-based informational gatherings about climate change and the environment. “All of these events are held east of State Street. They were all in English. No one looked like me. Then at the end of the talk, the conclusion was ‘buy electric vehicles and solar panels and we’ll save the world together.’ Well that doesn’t work for everyone.” 

Not only is there a disparity in the communities affected by poor air quality, there is an inequality in accessible solutions to the problem. “For most of them, air quality is not a top priority… they don’t have the luxury of learning like we do,” says Mendoza of those who are most likely to be impacted by bad air quality. 

The first step in empowering the community and addressing this imbalance was to bring science to them. Mendoza began organizing outreach events, this time on the west side of State Street, held in both Spanish and English. 

“We provide them with actionable solutions. For example, we partnered with Utah Clean Energy, and we did an LED exchange where people brought in their normal light bulbs,” he says. Another switch he facilitated was to low-flow showerheads.

Yet another initiative was a furnace filter exchange with 100 homes in Salt Lake County. When indoor air was tested for 43 potentially problematic elements, researchers found elevated levels of uranium, lanthanides, arsenic and lead, “all the nasties.”

Those “nasties” come from a variety of sources. “If you’re close to a highway, for example, you [breathe in] more of aluminum, associated with brake wear,” says Mendoza of the indoor air quality study, the first study of its kind. “When was the last time you sat outside for eight hours? You spend 90% of your time indoors and 60% of your time in your home, roughly speaking.” 

“The people that we really are very concerned about are, for example, the delivery drivers, who are constantly in that traffic, road construction workers as well. Those people are breathing [in] literally every single car’s tailpipe.” 

‘Run back inside’

Inequities in who breathes bad air require a close look at why and how bad air gets ingested. Those with more and better resources can think about these issues and about what used to be only seasonal atmospheric inversions along the Wasatch Front, then “just run back inside and we’re fine. But very few studies have been done on these concentrated pollution sources, again in conjunction with what they may be exposed to ‘naturally.’”

From the 2021 documentary "AWAiRE." Credit: AWAIRE.

Those studies are being done by Mendoza and others, then made into actionable, on-the-ground initiatives: switching out devices that are less effective and cost more money in the populations most threatened by breathing bad air.

These simple switches to affordable fixtures have tangible and meaningful impacts that inspire other actions and other policy decisions, leading to better health outcomes.

“Participants in these gatherings soon became community leaders to help others improve their situation,” says Mendoza, another favorable result of his work. And then there is the financial incentive, that tongue-in-cheek statistic that 150% of people do in fact “believe in the dollar.”

“These community members, they have to earn income to survive,” he reminds us. “They see their electric bills go down, they see their heating bills go down, they see their water bills go down, and they realize ‘Oh, okay, so it works. Let me tell all my friends about it.’”

Costs of inaction

Policy-makers and the public in general often look at the costs of solutions to problems that require action, but they sometimes forget about the costs of inaction.

Regardless of whether the focus of a study is cool zones, compounding wildfire emissions or, most recently, the eBus project, a main tool for fine-scale carbon emissions measurements in urban environments, Mendoza approaches each new inquiry with the same goal: “I want to make sure that my science gets understood by the general public. I want to write in as plain English as possible, because ultimately, I want to enact change. I want my work to do change.”

Mendoza challenges the stereotype of the mad scientist locked away in a lab, detached from reality. Instead, he is present on campus, in the community and at the state capitol, using science to advocate for justice.

Daniel Mendoza holds joint positions as research associate professor in atmospheric sciences; adjunct associate professor in internal medicine; and adjunct associate professor in City & Metropolitan Planning at the University of Utah.

by Lauren Wigod 

Read more on the 2021 documentary "AWAiRE," featuring Daniel Mendoza in @TheU

 

Don’t Let This Blow You Away: Yellowstone’s Steam Threat



July 29, 2024
Above: Yellowstone National Park officials survey damage near Biscuit Basin from a hydrothermal explosion that occurred Tuesday morning, July 23. Photo courtesy NPS/Jacob W. Frank

A hydrothermal explosion on July 23 at Yellowstone National Park sent visitors running for cover as steam shot into the air and rocks rained down on a popular viewing area.

The blast occurred about 10 a.m. local time near the Black Diamond Pool in Biscuit Basin, about two miles northwest of Old Faithful. No injuries were reported.

“Steam explosions like Tuesday’s incident have long been considered one of the most significant hazards posed to Yellowstone visitors,” says Tony Lowry, associate professor in Utah State University’s Department of Geosciences. “Biscuit Basin has had smaller, but still dangerous, events in the recent past.”

USU alum Jamie Farrell, research associate professor in the University of Utah’s Department of Geology and Geophysics and chief seismologist of the U.S. Geological Survey’s Yellowstone Volcano Observatory, says it was “very lucky” no one was hurt in the July 23 blast.

“Hydrothermal explosions happen quite frequently in the park, though they often occur in the uninhabited back country,” says Farrell, who earned a bachelor’s degree in geology from Utah State in 2001. Farrell says the blasts aren’t volcanic eruptions and no magma is involved. “These incidents occur when very hot, mineral-laden water builds up and clogs the plumbing, so to speak; pressure builds up and is forced upward through pre-existing fractures to erupt at the surface,” he says.

Read the full article by Mary-Ann Muffoletto, Utah State University. 

Solving the Puzzle of Utah’s Summer Ozone



July 29, 2024
Above: A view of Salt Lake City shot from NOAA’s research aircraft. Credit: NOAA.

The Salt Lake Valley’s summertime ozone pollution is a complicated puzzle because so many different kinds of emissions contribute to the problem, which in turn is affected by the time of day or year, the weather and many other factors.

Without knowing which emissions are most culpable or understanding the role of the region’s topography, solutions to Utah’s ozone mess will remain elusive. In collaboration with University of Utah faculty and funding from the state, the National Oceanic and Atmospheric Administration (NOAA) is helping find answers.

A team of NOAA scientists is in Salt Lake City for the next few weeks gathering masses of air quality data expected to yield new insights that could help bring relief. Building on a long record of air quality data compiled by U scientists and the Utah Division of Air Quality (DAQ) over several years, this new snapshot should help illuminate what is driving elevated ozone levels along the Wasatch Front, according to Steven Brown, one of the NOAA research chemists leading the Utah Summer Ozone Study.

John Lin, professor of atmospheric sciences, on the roof of the Browning building, where a phalanx of air quality monitoring instruments is stationed. Photo credit: Brian Maffly.

“Every city in the United States has an ozone problem, but every city is also different in terms of the sources that contribute to that ozone. And Salt Lake is no exception in that regard,” Brown said. “We’re certainly trying to understand the influence of wildfires. But then you’ve got this mix of industrial and urban sources in a valley with very unusual meteorology. We’re trying to characterize all those sources. What does that meteorology look like, and how do those things combine to produce the unique ozone problem that affects Salt Lake City?”

NOAA’s multi-platform study is being coordinated with the U’s Utah Atmospheric Trace Gas & Air Quality (UATAQ) lab, headed by John Lin, a professor of atmospheric sciences. Also involved is Lin’s colleague Gannet Hallar, whose students are launching weather balloons and providing weather forecast briefings most days of the study to support NOAA’s regular overflights.

While Utah has made strides reducing the severity of its particulate pollution-trapping winter inversions, summertime ozone has worsened to the point that Salt Lake City is out of attainment of the federal standard.

The primary ozone precursors are volatile organic compounds, or VOCs, which are emitted from countless sources—including oil refineries, gas stations, wildfire, paints, even personal care products, like deodorant—and nitrogen oxides, or NOx, a product of combustion.

“Photons are needed to break up certain molecules, so the reactions typically will not happen without sunlight,” said John Lin, the associate director of the Wilkes Center for Climate Science & Policy. “It essentially chops up those chemical bonds. Then ozone reacts with other things and levels get lower at night.”
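The chemistry Lin describes is the standard tropospheric ozone cycle. As a reference point (this summary is textbook atmospheric chemistry, not detail drawn from the article), the core reactions can be written as:

```latex
\begin{align*}
\mathrm{NO_2} + h\nu &\longrightarrow \mathrm{NO} + \mathrm{O}(^3P)
  && \text{(sunlight splits NO}_2\text{)} \\
\mathrm{O}(^3P) + \mathrm{O_2} + M &\longrightarrow \mathrm{O_3} + M
  && \text{(ozone forms)} \\
\mathrm{O_3} + \mathrm{NO} &\longrightarrow \mathrm{NO_2} + \mathrm{O_2}
  && \text{(ozone is destroyed)}
\end{align*}
```

During the day, radicals produced by VOC oxidation convert NO back to NO2 without consuming ozone, so ozone accumulates; after dark the first reaction shuts off and the third one draws ozone back down, which is the nighttime decline Lin mentions.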

Read the full article by Brian Maffly in @TheU.

Satellite measurements of carbon emissions

Monitoring urban carbon emissions at the global scale


July 30, 2024
Above: A map of the 77 cities at which the urban emissions monitoring framework was applied.

“We’re starting to see a globally consistent system to track [carbon] emission changes take shape,” says atmospheric scientist John Lin.

A faculty member in the University of Utah's Department of Atmospheric Sciences, Lin is co-author of a paper in the journal Environmental Research Letters about a new satellite-based system for measuring CO2 emissions in support of global collective climate mitigation actions. As nations and cities continue to declare their intentions to decarbonize and become carbon-neutral, “we want to be able to see it happen from space.”

Now we have a system to do so. 

That system is the culmination of the work of previous data scientists. It’s a story about how data is collected, interpreted and expanded through new technologies. It’s also about how this recursive process — now turbocharged with the advent of machine learning and AI — creates a space for potential application, innovation and policy that can change our world for the better, including mitigating the carbon emissions that are warming our earth at a startling and deleterious rate.

But before any attempt can be made to save the planet, scientists have to secure a consistent measurement framework to better understand what’s happening as well as where it’s happening and how much.

The Backstory

John Lin

The backstory begins in the Pacific Ocean. Tracking carbon emissions dates back decades to a single site in Hawai’i where, on a largely inactive volcano on the Big Island, instruments measured carbon dioxide in the atmosphere. At high elevation, the site was very good at characterizing broad-scale changes in carbon dioxide globally, a “poster child for climate change because over time,” explains Lin, who is also associate director of the Wilkes Center for Climate Science and Policy, “we know that from these Hawai’i measurements, CO2 has this distinct cycle, seasonally, but then this upward trend due to all of us burning fossil fuels.”

Human-caused carbon emissions are not only leading to CO2 buildup everywhere in the atmosphere, but the issue is widespread in public discourse. Whether on the micro level of mitigating one’s personal “carbon footprint” by taking the bus, or on the macro level of international initiatives like the Kyoto Protocol or the United Nations-brokered Paris Agreement, the effects of carbon emissions are on everyone’s mind. A cascade of cities and whole nations have established goals for mitigating emissions, but their estimates of carbon emissions rely on data that are inconsistent and sometimes missing altogether in parts of the world.

That cities have individually established and even accelerated their carbon-neutral goals is a good thing, considering that over 70 percent of the CO2 humans emit into the atmosphere stems from cities around the globe.

Tracking progress toward city-scale emissions reduction targets is essential because it provides “actionable information for policy makers,” the paper states. At the same time, the authors acknowledge that earlier measurements and claims from municipal entities are based on “self-reported emissions inventories,” whose methodology and input data often differ from one another. These practices hamper “understanding of changes in both city-scale emissions and the global summation of urban emissions mitigation actions.”

Orbiting Carbon Observatory

This is where outer space in general comes into play and, in particular, the Orbiting Carbon Observatory (OCO). The NASA mission is designed to make space-based observations of carbon dioxide in Earth’s atmosphere to better understand the characteristics of climate change. After a literal “failure to launch” in 2009, NASA successfully placed a satellite (OCO2) in orbit in 2014 with equipment measuring CO2 emissions from space. Satellite-transmitted data promised to be an independent way to calculate, globally, emissions from cities. Not surprisingly, it has taken a while to learn how to use the data. In 2020, Dien Wu, then a graduate student in Lin’s research group developing early methods, did exactly that, looking comprehensively at a total of twenty cities around the world.

Based on essentially the same data set used by Lin and Wilmot in their current paper, but with fewer years, Wu was able to estimate the amounts of human-emitted CO2 from OCO2 satellite transmissions. Separating the carbon that human activity emits into the atmosphere from that emitted by urban vegetation has since been achieved by expanding the analyses over additional years, work carried out by Lin’s team of researchers, including a later graduate student, Kai Wilmot, co-author of the current study.

In this round, four times as many urban areas as Wu studied, distributed over six continents, have been assessed. The plant/human conundrum is further complicated by vegetation outside the city, which has very different characteristics from vegetation inside the city. The difference creates patterns of CO2 that have to be removed to distill the human component.

Strangely beautiful animations

Kai Wilmot

In short, Lin and company’s findings, published in Environmental Research Letters, represent a new capacity based on recent developments in modeling. And the animations of the assembled and interpreted satellite CO2 data delivered by the team are startling, even strangely beautiful. In one chart, the left side displays latitude vs. CO2. “This narrow swath,” explains Lin, indicates “each time … [the satellite] orbits. There's this narrow slice of data that becomes available.”

Using that data, he continues, “the NASA scientists can construct this nice animation of CO2 change in each latitude band over time.” Lin points to what he calls “ridges and valleys” on the chart that represent the seasonal cycle, and he personifies the entire Earth as if it is “breathing in the carbon dioxide through photosynthesis during the summer growing season and then releasing it in the winter. They have these very sharp ridges — high CO2, low CO2, higher CO2 [the breaths] — but overall, the rug is going up, because we're emitting carbon dioxide into the atmosphere.”

Here, researchers are only looking at a small fraction of data points, the ones that intersect the targeted cities. They then do a more detailed look at whether they’re seeing a signal or not and whether they’re getting enough data.

“Personally,” says Wilmot, “I think the particularly neat aspect of this work is the capacity for global application. Leveraging satellite data and atmospheric modeling, we are able to gain some insight into urban emissions at cities around the world. We can see interactions between these emissions and socioeconomic factors, and we can identify large changes in emissions over time.”

 

The possibilities of creating more rigorous models and more revealing data about how much carbon cities emit to the atmosphere are tantalizing. And so are the findings of the research. “This kind of information can be used by cities and the UN process,” Lin says. “But I’m pretty sure what they want is something more dynamic through time, how these emissions evolve. And also, probably more frequent updates.” As it was, in this study researchers had to aggregate multiple years of data to get enough points for each city. “So the challenge, I think, is to be able to track more dynamically these emissions over time.”

More to come

NASA’s next iteration of the Orbiting Carbon Observatory — OCO3 — has already been successfully docked on the International Space Station, although it was de-installed for a period of time recently to allow another instrument to carry out measurements. (It turns out that prime real estate on the crowded station is, well, at a premium.) But new data is forthcoming. 

Meantime, researchers have their work cut out for them in the data crunching, parsing and interpreting part of this saga. Scientists typically accrue data far faster than they are able to use and interpret them . . . and create cool animations for general consumption.

A log-log plot of the scaling relationship between direct emissions per capita and effective population density for all 77 cities.

“Naturally,” concludes Lin, “to bend the curve in terms of trying to reduce carbon emissions in cities is a primary focus. And there's a lot of excitement and social energy around reducing carbon emissions in cities, including here in Salt Lake. Many mayors have pledged carbon reduction plans, and the University of Utah has their own [pledge]. Lots of cities have very ambitious goals to reduce carbon.”

For Wilmot, this project will only add to the increased “social energy” around the issue of carbon emission mitigation. Satellite measurement will help chart a path toward monitoring urban emissions at the global scale and toward finding effective policy levers for emissions reductions. “Of course, realizing this monitoring ability is contingent on further development of the modeling, satellite observations, and a number of necessary input datasets,” he says. “So by no means am I saying that we are there already.”

Clearly, this research has shown that the co-authors’ multi-component satellite framework is capable of monitoring CO2 emissions across urban systems and identifying relevant driving factors. Their analysis not only pulled out emissions data for individual cities but, because it is global, also enabled pattern analyses. In fact, using an established relationship between per-capita emissions and population density, the researchers were able to plot from the data what happened, emissions-wise, during the COVID shutdown.
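The per-capita-emissions-versus-density relationship is a power-law scaling, which appears as a straight line on log-log axes (as in the study's 77-city plot). A minimal sketch of how such a scaling exponent is recovered from data; the cities, exponent and constants below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: 77 synthetic "cities" whose per-capita
# emissions follow a power law in effective population density, with
# some scatter. The exponent and constants are assumptions.
true_exponent = -0.5
density = rng.uniform(200.0, 20000.0, size=77)          # people per km^2
scatter = rng.normal(0.0, 0.05, size=77)
per_capita = 10.0 * density**true_exponent * np.exp(scatter)  # tCO2/person

# A power law y = c * x^b becomes a straight line on log-log axes:
# log y = log c + b * log x, so a linear fit recovers the exponent b.
b, log_c = np.polyfit(np.log(density), np.log(per_capita), 1)
print(f"fitted scaling exponent: {b:.2f}")
```

The same log-log regression idea underlies scaling analyses of urban emissions generally: the fitted slope summarizes how per-capita emissions change as cities densify.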

But, as co-author Kai Wilmot implies about the work yet to be done, the ending to this story — from the Hawaiian Islands to outer space — is one of not-quite-yet “mission accomplished.”

“It’s more like mission half-accomplished,” John Lin concedes, “which is often the case in research.”

By David Pace

Read the complete paper in Environmental Research Letters.  

 

Scientists use AI to predict a wildfire’s next move



July 29, 2024

University of Utah Atmospheric Scientist Derek Mallia joins seven other researchers at University of Southern California and elsewhere in developing a new method to accurately predict wildfire spread.

By combining satellite imagery and artificial intelligence, their model offers a potential breakthrough in wildfire management and emergency response.

Detailed in an early-release study published in Artificial Intelligence for the Earth Systems, the USC model uses satellite data to track a wildfire's progression in real time, then feeds this information into a sophisticated computer algorithm that can accurately forecast the fire's likely path, intensity and growth rate.

Above: Derek Vincent Mallia, Department of Atmospheric Sciences.

The study comes as California and much of the western United States continues to grapple with an increasingly severe wildfire season. Multiple blazes, fueled by a dangerous combination of wind, drought and extreme heat, are raging across the state. Among them, the Lake Fire, the largest wildfire in the state this year, has already scorched over 38,000 acres in Santa Barbara County.

Reverse-engineering wildfire behavior with AI

The researchers began by gathering historical wildfire data from high-resolution satellite images. By carefully studying the behavior of past wildfires, the researchers were able to track how each fire started, spread and was eventually contained. Their comprehensive analysis revealed patterns influenced by different factors like weather, fuel (for example, trees, brush, etc.) and terrain.

They then trained a generative AI-powered computer model known as a conditional Wasserstein Generative Adversarial Network, or cWGAN, to simulate how these factors influence how wildfires evolve over time. They taught the model to recognize patterns in the satellite images that match up with how wildfires spread in their model.

They then tested the cWGAN model on real wildfires that occurred in California between 2020 and 2022 to see how well it predicted where the fire would spread.

Read the rest of the story in ScienceDaily.

The Hidden Space Race and Vardeny’s Spintronic Revolution



July 19, 2024
Above: Valy Vardeny, Distinguished Professor of Physics & Astronomy, Photo Credit: Dung Hoang

Vardeny is a pioneer of organic spintronics, which uses spin waves to transfer information much faster and with far less heat.

When Neil Armstrong and Buzz Aldrin landed on the moon more than fifty years ago, Zeev Valentine Vardeny was a young man living in Israel. The “space race” was palpable at the time, and the race for ever-increasing technological innovation is still profoundly felt in Israel, where putting brain power to work to maintain the country’s safety is nothing short of a national mission.

Distinguished Professor of Physics & Astronomy Zeev Valentine Vardeny of the University of Utah is certainly an all-star of physics. While most Utahns have never heard of him, Vardeny opened up an entirely new branch of physics, helping to drive significant advances leading to OLEDs (organic LEDs) and organic spin-wave technology. If these aren’t familiar, then the next time you look at an organic LED flat-screen TV or put a 96-gig flash memory card in your computer, know that Vardeny and his work are a key part of that technology.

His field, solid-state physics, concerns how electrons behave when traveling through materials. Electrons flow through all of our electrical devices to power them; as computers transmit information and energy, they also produce heat.

Vardeny says, “Using spintronic technology will help pave the way for vast changes in computer abilities, known as quantum computers. First off, in regular computers the bits are either a one or a zero. But if you have a quantum computer, the bits can have infinite possibilities. There is an infinite number of numbers between zero and one.”

The Department of Defense is spending a lot of money on using quantum computers and spin waves to create an entirely new form of communication.

You can read more about Vardeny and his research at the U in Utah Stories, Science Direct and Mirage News.

Ants and Trees: A Tale of Evolutionary Déjà Vu in the Rainforest



July 19, 2024
Above: Rodolfo Probst leads field research with U undergraduates in Costa Rica in March.

U biologist Rodolfo Probst finds multiple ant species that have independently evolved the same specialized relationship with understory trees

Ants are famous for their regimented and complex social behaviors. In the tropics, they are also famous for forming mutualisms with plants. Certain species of trees have conspicuous hollow swellings that house ants, often feeding the ants with specialized ant food. In return, the ants are pugnacious bodyguards, swarming out to aggressively defend the plant against enemies. Scientists have observed these mutualisms for centuries, but an enduring question is how these intriguing interactions evolved in the first place.

That remains a mystery, but new research led by University of Utah field biologist Rodolfo Probst offers insights that could broaden our understanding of ant-plant symbioses.

Published last week in the Proceedings of the Royal Society B, his research focused on an ant genus called Myrmelachista. Most Myrmelachista species nest in dead or live stems of plants, without any specialized mutualistic association. But one group of species in Central America was known to nest only in the live stems of certain species of small understory trees, in a specialized symbiosis similar to other ant-plant mutualisms. These tiny yellow ants hollow out the stems without harming the host plants, and can be found throughout Central America.

Jack Longino. Credit: Rodolfo Probst

Probst made a remarkable discovery. Using DNA sequence data to unravel their evolutionary history, he found that these nine species occurred as two clusters in different parts of the evolutionary tree. That means that this complex relationship, with all its distinctive characteristics, evolved twice from non-specialist ancestors.

His two coauthors are renowned entomologist Jack Longino, better known among U students as The Astonishing Ant Man for his expertise and vast personal collection of ant specimens kept on campus, and former U School of Biological Sciences’ postdoctoral researcher Michael Branstetter, now with U.S. Department of Agriculture’s Pollinating Insect Research Unit at Utah State University.

Probst is a postdoctoral researcher in the School of Biological Sciences and the university’s Science Research Initiative, or SRI, and was recently recognized with the Outstanding Postdoctoral Researcher Award by the College of Science. Through the SRI, Probst has involved U undergraduates in his research. For example, students accompanied Probst and Longino to Costa Rica with funding support from the U’s Wilkes Center for Climate Science & Policy.

With continuing help from SRI undergraduates, Probst is looking to conduct whole genomic sequencing to tease out the genes involved in ant-plant associations, looking “under the hood” of a phenomenon that has intrigued naturalists for centuries.

Read more about the story on ants and trees by Brian Maffly @TheU.

A Framework for Cancer Ecology and Evolution



July 17, 2024

Why do the vast majority of cancers arise late in life?

Fred Adler. Credit: Mathew Crawley

A traditional explanation of cancer development, known as the somatic theory, is a paradigm focused on mutations in individual cells: a cascade of approximately six mutational changes in a single cell triggers cancer. The theory explains the rapidly increasing “power function” that describes how cancer incidence rises with age.
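That “power function” follows from the classic multistage (Armitage–Doll) reading of somatic theory: if cancer requires n rate-limiting mutations, incidence rises roughly as age to the power n − 1. The sketch below is only an illustration of that scaling, not Adler's model, and the mutation rate is an invented placeholder:

```python
# Illustrative Armitage-Doll power law: with n rate-limiting mutations,
# each occurring at rate u per year, the hazard of cancer at age t
# scales as t**(n - 1). The rate u is a made-up placeholder value.
from math import factorial

def incidence(t, n=6, u=1e-3):
    """Approximate cancer hazard at age t under n mutational stages."""
    return (u ** n) * t ** (n - 1) / factorial(n - 1)

# With n = 6, doubling age multiplies incidence by 2**5 = 32,
# which is why incidence climbs so steeply late in life.
print(incidence(80) / incidence(40))  # ratio of 2**5, i.e. about 32
```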

But this power function, which lines up with cancer’s six classic hallmarks, is now being challenged by a different paradigm that casts doubt on the primacy of individual cells in cancer development. It also challenges the notion that cancer marks a strict change between “normal” and aberrant tissues, particularly as the body ages.

In a paper published today in the UK-based Journal of the Royal Society Interface, "A modeling framework for cancer ecology and evolution," University of Utah mathematics professor Frederick Adler, who holds a joint appointment in biology, lays out this alternative paradigm.

Cancer's complexity


Adler says he struggled for a long time to come up with an alternative modeling approach flexible enough to capture cancer’s complexity, while standing by the dictum that cancer cells are still cells. “It involved a plane trip where I worked out an extremely complicated approximate version of the method before figuring out, on solid ground, that the exact version was thoroughly simple.”

Simple didn’t just mean elegant, but also getting results in a reasonable amount of time by optimizing code, something he can appreciate as the current Director of the busy School of Biological Sciences, one of the largest academic units at the University of Utah. 

The dynamics of escape in a person with imperfect initial control. We see replacement by increasingly dark shades of gray that indicate cells that are growing faster and faster, leading to an increase in the total cell population (black line at top) above the healthy level (horizontal orange line).

Adler’s findings build on those of others that countermand the primacy of individual cells. These include observations of mutations common in non-cancerous tissues, and sometimes more common than in nearby cancers. “This implies,” the paper states, “that cancers depend on interactions with the surrounding tissue.” A second emphasis, on cancer ecology and evolution, now highlights “the ecology of nutrients, acids and physical factors and the role of cell interactions.”

“Detailed study of adults shows that few if any of their cells are ‘normal,’” says Adler. “Tissues are instead made up of lineages with ever-increasing numbers of aberrant traits, many of which promote excess growth. The vast majority of these incipient growths are contained by controls within those cells and by other cells.”

In Adler’s parsing of the ecological paradigm, senescence theory plays a critical role, focusing on the breakdown of the system of controls within and around individual cells. “[M]any cancers,” for example, “develop much later than their originating oncogenic mutations.” Furthermore, mutant cells in his models are restrained “by systems that remove their growth advantage, but which can weaken with age due to changes such as impaired intercellular communication. Remarkable data on genetic diversity in healthy tissues show that cancer-related mutations are ubiquitous, and often under positive selection despite not being associated with progression to cancer.”

Overview of the CAGRM framework. Cells include an arbitrary number of potential lineages, beginning with all cells in the unmutated lineage C0 and evolving first into C1 and eventually a branching evolutionary tree of lineages here indicated collectively by Ci. There are four forms of regulation (indicated by flat-headed arrows): contact inhibition by other cells (C), inhibition by antigrowth factor (A), depletion of growth factor (G) and depletion of resources (R). Mutualist cells can aid cell replication by suppressing antigrowth factor or by supplementing growth factor or resources.

Tracking the dual nature of cells


In the paper, Adler first presents a modeling framework that incorporates evolution, stochasticity (randomness), and the control of cell growth and its breakdown. The model then uses differential equations to track the dual nature of individual cells as ecological competitors for resources and space.
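As a toy illustration of this kind of model (invented for this article; these are not the equations from Adler’s paper), one can write a pair of logistic-style differential equations in which two cell lineages compete for shared space while a control term, which decays with age, suppresses the mutant lineage’s growth advantage:

```python
# Toy sketch of an ecological cancer model, invented for illustration
# (NOT the CAGRM equations from Adler's paper). Two cell lineages
# compete for shared space; a control term suppresses the mutant
# lineage's growth advantage but decays slowly with age.
import math

def simulate(years, dt=0.01):
    """Euler-integrate the toy model; return (normal, mutant) densities."""
    K = 1.0                     # carrying capacity (shared space/resources)
    r_norm, r_mut = 1.0, 1.5    # intrinsic growth rates (made-up values)
    d = 0.2                     # cell turnover (death) rate
    c0, decay = 0.6, 0.02       # control strength and its decay with age
    n, m = 0.8, 1e-3            # start near the healthy equilibrium
    for i in range(int(years / dt)):
        control = c0 * math.exp(-decay * i * dt)  # regulation weakens with age
        crowd = 1.0 - (n + m) / K                 # competition for space
        n += n * (r_norm * crowd - d) * dt
        m += m * (r_mut * (1.0 - control) * crowd - d) * dt
    return n, m

_, m40 = simulate(40)   # mutant density at "age" 40
_, m80 = simulate(80)   # mutant density at "age" 80
print(m40, m80)
```

With these made-up parameters, the mutant lineage shrinks while control is strong and only begins expanding once control has decayed enough for its growth advantage to outweigh turnover, echoing the observation that cancers can develop long after their originating mutations.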

Using this framework, Adler then tested whether the ecological model of cancer initiation generates realistic age-incidence patterns like those predicted by somatic mutation theory, and how initial defects in the control systems accelerate the process.

In this comprehensive systems view, cancer, and an incipient cancer in particular, is not an invader. “It is a set of cells,” the paper reads, “that escape the many layers of internal and tissue level regulation, and then grow to damage the host. The success of a cancer, or equivalently the failure of the regulatory system, requires that the cancer co-opts or evades the systems of control and repair.”

This framework, according to the author, assumes a particular structure of the control system but has capacity for “several other extensions” to make it more “realistic.” Those extensions would address, for example, cell differentiation and a clearer class of driver mutations for the quantitative-trait genetic model. Another might address why the mutualist cells in the tests maintain a constant phenotype despite what we know about how cells alter behavior in cancer’s presence.

Statistically, we understand that cancer emerges more frequently in older individuals. But how and why is what Adler is attempting to determine. His model, says Adler, “reproduces the rapid increase of cancer incidence with age, identifies the key aspects of control, and provides a complement to the focus on mutations that could lead to new treatment strategies.”

Adler points out that the control system in the model must differ greatly across species in concert with their body size and lifespan, which speaks to Peto’s paradox: cancer rates are similar across organisms with a wide range of sizes and lifespans. “This robustness,” concludes the paper, “is a special case of the principle that all biological systems must be overbuilt to deal with uncertainty.” Referencing Shakespeare’s Hamlet, Adler states that this development in excess of demand exists “to survive ‘the thousand natural shocks that flesh is heir to’… This model seeks to place those shocks in the ecological and evolutionary context that makes long life possible.”


by David Pace

Read about Fred Adler's related work in modeling cancer development, specifically with breast cancer.


What do cycling and rocks have in common?



July 15, 2024

University of Utah geologists Peter Lippert and Sean Hutchings are helping bring attention to the hidden star of a major sporting event this summer.

I’m not talking about the Olympics, but the Tour de France, which kicked off on June 29 in Florence, Italy, and will finish July 21 in Nice, France. This is the first time the iconic bicycle race won’t finish in Paris, due to the city hosting the Summer Olympics.

The star they’re highlighting rises above the competition, literally. It’s also below and all around. 

Peter Lippert and Sean Hutchings

The Geo Tour de France project (Geo TdF) is a blog exploring the geology of the various stages of the bike race. Lippert and Hutchings are two of the five North American contributors to the blog this year. They covered Stage 14, a 152-kilometer ride through the Pyrenees held Saturday and won by overall race leader Tadej Pogacar of Slovenia in just over four hours.

“The centerpiece of the stage is the Col du Tourmalet, a very famous fabled climb in the Tour de France that has lots of amazing history,” said Lippert, an associate professor in the Department of Geology & Geophysics and director of the Utah Paleomagnetic Center. “This is going to be one of the really decisive stages of the Tour this year.”

The entire race covers 3,500 kilometers (2,175 miles) in 21 stages.

“I’ve always loved this project, because it’s just such a fun way to share our science and share how we see the world with the public and particularly a public that’s probably not often thinking about the geology,” Lippert said. 

For Lippert and Hutchings, as well as many of their peers across the world, geology and cycling go hand in hand.

“Riding a bike up and down a mountain gives you a lot of time to see how the mountains put together the rocks you’re riding over in the landscapes that you’re on,” Lippert said. “We’re both trained geologists for most of our lives so it’s hard not to always be thinking about [geology].”

Utah in particular boasts captivating and diverse geological features.

“It’s mountain biking Candyland around here,” said Hutchings, a graduate research assistant in the U of U Seismograph Stations. “It’s fun to be able to climb up to the top of the hill and it’s hard to not interact with rocks on the way as well.”

“You have this new identity with the landscape you’re on if you’re able to understand what’s going on beneath your feet and what made the landscape,” Lippert said. “I think cycling is a really great high impact sense of place type of experience. You’re going a little bit slower. You get to look around.”

Geo Tour de France project 

This same sentiment was the original inspiration for Geo TdF project creator Douwe van Hinsbergen, professor of geology at the Netherlands’ Utrecht University.

“He wanted to explore a different way of sharing geology with the public,” Lippert said. “This is a total goldmine.”

Fans who watch the livestream of the race are inadvertently watching hours of spectacular geological features. The Geo TdF project enhances the viewing experience by telling geological stories that ground the competition in the larger history of the landscape. 

Lippert first contributed to the blog two years ago, and this time around included Hutchings. The pair worked together during Hutchings’ bachelor’s degree at the U and often bike together.

“I know nothing about Pyrenean geology, so this was a great learning opportunity for me,” Hutchings said. “For graduate school, I’ve dipped more into the seismology realm, so getting back to my geology roots was a fun exercise.”

Col du Tourmalet. Photo credit: Gilles Guillamot, Wikimedia Commons

Tectonic training camp 

Stage 14 passed through the Pyrenees, the mountains on France’s border with Spain, with an average grade of 7.9% on its climbs. The 152-kilometer route is just under 95 miles, and that grade is more than twice as steep as the incline from President’s Circle to the Natural History Museum of Utah.

“Let’s think big” is what Lippert and Hutchings thought when they were presented with the opportunity to cover this pivotal stage of the race.

“I mean the Tour de France is big, the Pyrenees are big, tectonics are big. Sean is more of a geophysicist working with earthquakes and things like that,” Lippert said. “My expertise is in collisional mountain builds, like what happens when oceans close and mountains form. So we thought let’s just go back to basics and keep it big.” 

What could be bigger than beginning with the ancient supercontinent Pangea? For their portion of the project, Lippert and Hutchings focused on the creation of the Pyrenees mountain range, which began with the breakup of Pangea and subsequent plate collisions, a process they describe as a “tectonic training camp.”

A wealth of information

Some readers might be wondering if these passionate geologists will eventually run out of topics to discuss, even though the Tour course changes each year. Lippert and Hutchings aren’t concerned about that at all. 

“One nice thing about geology is that rocks usually stay put and you can go back to check them out year after year. So the rocks don’t change, but the way that we can talk about them does. The limit is our creativity now, what the rocks can provide, because they’re full of really good stories,” Lippert said. “There’s a wealth of information that a single rock can tell you. Where it came from, and the time it took to get there, and what it looked like at the time.” 

By Lauren Wigod


A once-in-a-career discovery: the black hole at Omega Centauri’s core



July 11, 2024
Above: The likely position of the intermediate-mass black hole at the core of the Omega Centauri star cluster. The closest panel zooms in on the system.
PHOTO CREDIT: ESA/HUBBLE & NASA, M. HÄBERLE (MPIA)

Omega Centauri is a spectacular collection of 10 million stars, visible as a smudge in the night sky from southern latitudes.

Through a small telescope, it looks no different from other so-called globular clusters: a spherical stellar collection so dense towards the center that it becomes impossible to distinguish individual stars. But a new study, led by researchers from the University of Utah and the Max Planck Institute for Astronomy, confirms what astronomers had argued about for over a decade: Omega Centauri contains a central black hole.

The black hole appears to be the missing link between its stellar and supermassive kin. Stuck in an intermediate stage of evolution, it is considerably less massive than typical black holes in the centers of galaxies. Omega Centauri seems to be the core of a small, separate galaxy whose evolution was cut short when it was swallowed by the Milky Way.

“This is a once-in-a-career kind of finding. I’ve been excited about it for nine straight months. Every time I think about it, I have a hard time sleeping,” said Anil Seth, associate professor of astronomy at the U and co-principal investigator (PI) of the study. “I think that extraordinary claims require extraordinary evidence. This is really, truly extraordinary evidence.”

A clear detection of this black hole had eluded astronomers until now. The overall motions of the stars in the cluster showed that there was likely some unseen mass near its center, but it was unclear whether this was an intermediate-mass black hole or just a collection of stellar-mass black holes. Maybe there was no central black hole at all.

A medium-level zoom on the likely position of the Omega Centauri star cluster’s intermediate-mass black hole. PHOTO CREDIT: ESA/HUBBLE & NASA, M. HÄBERLE (MPIA)

“Previous studies had prompted critical questions of ‘So where are the high-speed stars?’ We now have an answer to that, and the confirmation that Omega Centauri contains an intermediate-mass black hole. At about 18,000 light-years, this is the closest known example for a massive black hole,” said Nadine Neumayer, a group leader at the Max Planck Institute and PI of the study. For comparison, the supermassive black hole in the center of the Milky Way is about 27,000 light-years away.

A range of black hole masses

In astronomy, black holes come in different mass ranges. Stellar black holes, between one and a few dozen solar masses, are well known, as are the supermassive black holes with masses of millions or even billions of suns. Our current picture of galaxy evolution suggests that the earliest galaxies should have had intermediate-sized central black holes that would have grown over time, gobbling up smaller galaxies or merging with larger galaxies.

Such medium-sized black holes are notoriously hard to find. Although there are promising candidates, there has been no definite detection of such an intermediate-mass black hole—until now.

“There are black holes a little heavier than our sun that are like ants or spiders—they’re hard to spot, but kind of everywhere throughout the universe. Then you’ve got supermassive black holes that are like Godzilla in the centers of galaxies tearing things up, and we can see them easily,” said Matthew Whittaker, an undergraduate student at the U and co-author of the study. “Then these intermediate-mass black holes are kind of on the level of Bigfoot. Spotting them is like finding the first evidence for Bigfoot—people are going to freak out.”

Read more about the discovery @TheU.

Read more about the story at NASA, Deseret News, ABC4 Utah and ESA/Hubble releases.