New strategy could lower blood pressure in low-income patients

A new strategy that trains health care providers to deliver more comprehensive, team-based care significantly lowers blood pressure in low-income patients compared to a “usual care” approach.

Uncontrolled hypertension, the leading preventable risk factor for cardiovascular disease and premature deaths worldwide, disproportionately affects low-income populations.

The researchers conducted an 18-month clinical trial with 1,272 hypertension patients at 36 federally qualified health centers in Louisiana and Mississippi. Most of the patients (73%) had an annual family income below $25,000. They presented their findings at the American Heart Association Scientific Sessions in Philadelphia.

Half of the clinics were assigned to provide enhanced usual care, in which health care providers received training on recent American Heart Association hypertension guidelines. The other half were assigned to a multi-faceted intervention in which nurses, pharmacists, or medical assistants were trained to coach patients about lifestyle modifications and medication adherence. Patients were also taught to conduct at-home blood pressure monitoring.

Those treated with the multi-faceted approach saw a 16 mmHg drop in systolic blood pressure, compared with a 9 mmHg drop in the usual care group.

“This study shows that multi-level intervention is necessary to get blood pressure under control in low-income populations. One doctor alone is not enough. It requires teamwork,” says principal investigator Jiang He, professor of epidemiology at Tulane University School of Public Health and Tropical Medicine.

Importantly, the multi-faceted intervention clinics were trained to follow SPRINT (Systolic Blood Pressure Intervention Trial) treatment protocols, which set a goal of reducing hypertension patients’ systolic blood pressure to below 120 mmHg, compared to the AHA’s goal of less than 130 mmHg. Hypertension is defined as a systolic blood pressure of 130 mmHg or higher.

The consistent at-home blood pressure monitoring empowered patients to engage in blood pressure management, He says. Those treated by the multi-faceted approach also saw a higher adherence to medication, at-home monitoring, and health education.

In addition to being more prevalent in low-income groups, hypertension is more common in Black Americans than in any other group.

More effective treatment of hypertension can reduce this health disparity, He says. The success of the multi-faceted approach in Mississippi and Louisiana, where hypertension is especially high in low-income areas, also shows the approach is scalable and can be effectively implemented in other low-income communities.

“Our study is one of the first clinical trials to show the multi-faceted implementation strategy can improve hypertension control in federally qualified health centers among low-income patients and shows this program can be easily applied at similar health centers nationwide,” He says.

Source: Tulane University

Air pollution can take a toll on kids’ test scores

Air pollution may negatively affect children’s standardized test scores, a new study shows.

For the study, researchers used data from the North Carolina Education Research Data Center to track 2.8 million public school students in North Carolina from 2001 to 2018 and to measure their exposure to PM2.5, also known as fine particulate matter, found in polluted air.

While previous research has shown an association between air pollution exposure and adverse academic outcomes in children, it has relied on relatively small or less representative samples and faced challenges in accounting for unobserved confounders.

“The biggest strength of this study is that we [tracked] every student in North Carolina in those years, for the whole time period that they were in the public schools,” says Emma Zang, an assistant professor of sociology, biostatistics, and global affairs at Yale University and coauthor of the study published in JAMA Network Open.

“I think it is really incredible because it’s actually the first study that uses this kind of population data, covering everybody. Air pollution has been shown to affect a lot of things, but the effect of air pollution on students’ academic performance is still relatively new.”

The study also found that PM2.5 levels disproportionately affected test scores of ethnic minorities and girls.

“Females and racial ethnic minorities face structural sexism and structural racism,” says Zang. “There are a lot of policies that are not friendly towards females and ethnic minorities. So, when they’re exposed to the same level of air pollution, they don’t have the resources to buffer the negative influences.”

More privileged populations, Zang says, may have more resources that allow them to live in a better environment, such as in houses with air purifiers.

“I think this is another point which has been found in previous studies, but I’m not sure it is well-known to the general public,” Zang says.

Future research in this area, the researchers say, would involve looking at whether the findings apply to other regions, as well as the reasons behind the racial, ethnic, and gender disparities.

“What we want to emphasize here is that despite the relatively low level of ambient PM2.5 pollution in the US compared to other countries, there are still significant adverse health impacts,” says senior author Kai Chen, an assistant professor of epidemiology (environmental health) at the Yale School of Public Health.

“Students living in areas below the current air quality standard of annual PM2.5 concentration (12 micrograms per cubic meter) are still adversely impacted by air pollution when it comes to their test scores,” says Chen, who is also director of research at the Yale Center on Climate Change and Health. “We should aim to strengthen the annual PM2.5 standard to better protect our children.”

Pam Hung Lam of Duke University is the study’s lead author. Additional coauthors are from Penn State and Nanjing University.

The Institution for Social and Policy Studies at Yale supported the work.

Source: Christina Frank for Yale University

Sick house finches skip social distancing and get close

Unlike other social animals that passively or actively isolate themselves when sick, house finches gravitate toward healthy flock mates when they are sick, even more so than healthy birds do.

Social distancing when sick has become second nature to many of us in the past few years, but the new research shows some animals take a different approach.

In particular, the study, published in the journal Ecology and Evolution, found sick finches want to eat together with their flock.

“The recent pandemic years of isolating and quarantining have shown us that social distancing to avoid getting sick can also have detrimental aspects for group living animals,” says Marissa Langager, a PhD student in the biological sciences department in the College of Science at Virginia Tech, whose research interests are social behavior and disease ecology.

“The costs of going solo may be particularly high for sick animals especially if they rely on their healthy groupmates to help them find food or avoid predators. Ultimately, this might be the reason that finches become even more social when sick, inadvertently putting their healthy flock mates at risk because bird feeders, where house finches like to gather to feed, are a major means of spreading disease.”

Few studies prior to this one directly examined how acute infections caused by contagious pathogens influence social preferences; most instead focused more generally on why some animals evolve to become social in the first place and how social living benefits them.

This research, Langager says, sheds light on how social animals behave when sick and can inform other studies in the field.

“Since all social animals—including humans—get sick, it is important to understand the costs and benefits of group living more broadly,” she says. “We may be able to use this information to predict disease spread in social animals. And it can also help us understand when and where we might expect healthy animals to evolve the ability to avoid sick groupmates who remain in the group at risk to their healthy groupmates.”

Because of the study’s unexpected results, Langager wanted to know more about what exactly might be driving sick finches’ increased preference for eating with a social group, and she is exploring this further in her doctoral dissertation.

“Maintaining social relationships can take a lot of energy for the birds I study. So if these birds are putting forth the energy to keep hanging around their social groups even when they are sick, it is most likely because of the benefit to them,” she says.

Langager has devised several experiments that will test whether group membership affects a sick bird’s behavior by changing how it responds to a predator and affecting its ability to successfully forage for food.

Langager’s advisor, Dana Hawley, and James S. Adelman of the University of Memphis are coauthors of the study.

Source: Virginia Tech

How a favorite dinosaur food survived mass extinction

An ancient lineage of plants called cycads, a favored food of grazing dinosaurs, survived extinction by grabbing nitrogen from the air.

The palm-like plants helped sustain dinosaurs and other prehistoric animals during the Mesozoic Era, starting 252 million years ago, by being plentiful in the forest understory.

Like their lumbering grazers, most cycads have gone extinct. Their disappearance from their prior habitats began during the late Mesozoic and continued into the early Cenozoic Era, punctuated by the cataclysmic asteroid impact and volcanic activity that mark the K-Pg boundary 66 million years ago. However, unlike the dinosaurs, somehow a few groups of cycads survived to the present.

A new study, published in the journal Nature Ecology & Evolution, concludes that the cycad species that survived relied on symbiotic bacteria in their roots, which provide them with nitrogen to grow. Just like modern legumes and other plants that use nitrogen fixation, these cycads trade their sugars with bacteria in their roots in exchange for nitrogen plucked from the atmosphere.

What originally interested lead author Michael Kipp is that the tissues of nitrogen-fixing plants can provide a record of the composition of the atmosphere they grew up in. He combines geochemistry with the fossil record to try to understand the Earth’s climate history.

Knowing already that modern cycads are nitrogen-fixers, Kipp began analyzing some very old plant fossils during his PhD work at the University of Washington to see if he could get a different look at ancient atmospheres. Most of the old cycads revealed that they weren’t nitrogen-fixers, but these also turned out to be the extinct lineages.

“Instead of being a story about the atmosphere, we realized this was a story about the ecology of these plants that changed through time,” says Kipp, who spent nearly a decade on this finding, first at the University of Washington and then as a postdoctoral researcher at Caltech.

Kipp will join Duke University this year as an assistant professor of earth and climate sciences in the Nicholas School of the Environment to continue using the fossil record to understand Earth’s climate history so that we can understand its possible future.

Much of what we know about ancient atmospheres comes from chemical studies of ancient sea life and sediments, Kipp says. Applying some of those methods to terrestrial plants is a new wrinkle.

“Going into the project, there were no published nitrogen isotope data from fossilized plant foliage,” Kipp says. It took a while for him to fine-tune the method and to secure samples of precious plant fossils that museum curators were reluctant to see vaporized to get the data.

“In the few fossil samples that are of surviving (cycad) lineages, and that are not so old—20, 30 million years—we see the same nitrogen signature as we see today,” Kipp says. That means their nitrogen came from symbiotic bacteria. But in the older and extinct cycad fossils, that nitrogen signature was absent.

What is less clear is how nitrogen fixation helped the surviving cycads. It may have helped them weather the dramatic shift in climate or it may have allowed them to compete better with the faster-growing angiosperm plants that flourished after the extinction, “or it could be both.”

“This is a new technique that we can do a lot more with,” Kipp says.

The Paleontological Society, the University of Washington Royalty Research Fund, and NASA Exobiology funded the work.

Source: Duke University

Grizzly bear internal clocks keep ticking in hibernation

The internal clocks of grizzly bears appear to keep ticking through hibernation, according to a genetic study.

This persistence highlights the strong role of circadian rhythms in the metabolism of many organisms, including humans.

The findings confirm observational evidence that bears’ energy production still waxes and wanes in a daily pattern even as they slumber for several months without eating.

The researchers also found that during hibernation the amplitude of the energy production was blunted, meaning the range of highs and lows was reduced. The peak also occurred later in the day under hibernation than during the active season, but the daily fluctuation was still there.

“This underscores the importance of the circadian rhythms themselves—that they give organisms the flexibility to still function in a state as extreme as a hibernating bear,” says Heiko Jansen, a professor in the integrative physiology and neuroscience department at Washington State University and senior author of the study in the Journal of Comparative Physiology B.

Other research has shown that circadian rhythms, the 24-hour physical cycles common to most living animals on Earth, have ties to metabolic health. In humans, major disruptions to these patterns, such as occur in night shift work, have been linked to metabolic problems like weight gain and higher prevalence of diabetes.

In some sense, bears are extreme shift workers, taking as much as six months off when they hibernate. Researchers like Jansen’s team are attempting to figure out how bears can engage in seemingly unhealthy habits, gaining excessive weight and then going without food and barely moving for several months, all without detrimental effects like loss of bone mass or diseases like diabetes.

Unlike hibernating rodents who are almost comatose, bears do move around occasionally during this dormant period. Through observation studies of grizzly bears at the Washington State Bear Center, researchers found these movements tend to follow a circadian rhythm with more activity during the day than at night.

In the current study, the researchers looked to see if that circadian rhythm was expressed on the cellular level. They took cell samples from six bears during active and hibernating seasons, then cultured those cells to conduct an array of genetic analyses.

To mimic hibernation, the researchers examined the cells at about 34 degrees Celsius (93.2 degrees Fahrenheit), the bears’ typical lowered body temperature during hibernation, and compared that to 37 C (98.6 F), their body temperature during the active season.

They found thousands of genes were expressed rhythmically in hibernating bear cells. This translated into daily rhythms in energy, seen in the rise and fall of production of adenosine triphosphate, or ATP, the body’s cellular source of energy. ATP was still produced in a daily pattern under hibernation, but with a blunted amplitude: lower peaks and shallower valleys. The highest production point also shifted to later in the day under hibernation than under active-season conditions.

Maintaining a circadian rhythm requires some energy itself. The researchers believe that altering this rhythm during hibernation may allow bears to still get some of the energetic benefit of the daily cycle without as much of the cost, which likely helps them survive going without food for months.

“It’s like setting a thermostat. If you want to conserve some energy, you turn down the thermostat, and this is essentially what the bears are doing,” Jansen says. “They’re using the ability to suppress the circadian rhythm, but they don’t stop the clock from running. It’s a really novel way of fine-tuning a metabolic process and energy expenditure in an animal.”

Additional coauthors are from Washington State and the University of California, Santa Cruz.

The National Science Foundation and the Bear Research and Conservation Endowment at Washington State funded the work.

Source: Washington State

Team discovers new sulfur cycle in Lake Superior

Researchers have discovered a new kind of sulfur cycle in Lake Superior.

Their findings, published in Limnology and Oceanography, focus attention on the role organic sulfur compounds play in this biogeochemical cycle.

The yellow element sulfur is a vital macronutrient, and geochemist Alexandra Phillips is trying to understand how it cycles through the environment. Specifically, she’s curious about the sulfur cycle in Earth’s ancient ocean, some 3 billion years ago.

Fortunately, the nutrient-poor waters of Lake Superior offer a welcome glimpse into the past. “It’s really hard to look back billions of years,” says Phillips, a former postdoctoral researcher at UC Santa Barbara and University of Minnesota, Duluth. “So this is a great window.”

From sulfate to hydrogen sulfide

The sulfate ion (SO4) is the most common form of sulfur in the environment, and a major component of seawater. In the bottoms of oceans and lakes, where oxygen becomes unavailable, some microbes make their living by turning sulfate into hydrogen sulfide (H2S). The fate of this hydrogen sulfide is complex; it can be consumed quickly by microorganisms during respiration, or it can be retained in sediments for millions of years. Converting sulfate into hydrogen sulfide is a time-honored profession; genomic evidence suggests microbes have been doing it for at least 3 billion years.
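
As a rough point of reference (a textbook simplification, not an equation from this study), the overall microbial reduction of sulfate with generic organic matter, written here as CH2O, can be summarized as:

```latex
\mathrm{SO_4^{2-} + 2\,CH_2O \;\longrightarrow\; H_2S + 2\,HCO_3^{-}}
```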

But scientists believe sulfate didn’t become abundant until around 2.7 to 2.4 billion years ago, when photosynthetic activity of newly evolved cyanobacteria began pumping massive amounts of oxygen into the ocean and atmosphere. So where were these ancient microbes getting their sulfate?

Mulling over this quandary, Phillips turned her attention toward organic sulfur, molecules in which sulfur is bound to a carbon compound. These include sulfo-lipids and sulfur amino acids. In the modern ocean, sulfate is almost a million times more abundant than organic sulfur. “But in a system where there’s not very much sulfate, all of a sudden organic sulfur matters a lot more,” she says.

“For a long time, our thinking was dominated by what we could learn from modern oceans, which are sulfate-rich,” says senior author Sergei Katsev, a professor at University of Minnesota’s Large Lakes Observatory. Katsev served as the senior scientist of the National Science Foundation-funded project. “Understanding early Earth, however, requires looking at processes that emerge when sulfate is scarce, and this is where organic sulfur can change the whole paradigm.”

The handy thing about Lake Superior

It just so happens that Lake Superior has very little sulfate, nearly a thousand times less than the modern ocean. “In terms of sulfate, Lake Superior looks a lot closer to the ocean billions of years ago and may help us understand processes we can’t go back in time to observe directly,” Phillips says. The early oceans had very little sulfate because there was much less free oxygen available to form SO4.

The great lake serves as an analog for the ancient ocean, enabling Phillips to see how the sulfur cycle may have been playing out back then under similar chemistries. She had three questions in mind:

  • If sulfate reduction is happening, which microbes are responsible?
  • If organic sulfur is fueling this process, what types of compounds do microbes prefer?
  • And, what happens to the hydrogen sulfide that’s produced?

Phillips and her collaborators headed out to Lake Superior to trace organic sulfur from source to sink. The team took water and sediment samples from two sites, one with plentiful oxygen in the sediment and one without, back to the lab for analysis. Sulfate reduction usually occurs in anoxic parts of the environment: oxygen is a great resource, so organisms prefer to use oxygen instead of sulfate when they can. The team used shotgun metagenomics to look for microbes with genes involved in sulfate reduction. And they found plenty, precisely in the layer where sulfate levels peaked in the sediment. In all, they identified eight sulfate-reducing taxa.

The researchers then set off to determine what variety of organic sulfur the microbes preferred. They gave different forms of organic sulfur to separate microbial communities and observed the results. The authors found the microbes produced most of their sulfate from sulfo-lipids, rather than the sulfur amino acids. Although this process takes some energy, it’s much less than the microbes can get from the subsequent reduction of sulfate to hydrogen sulfide.

Not only were the sulfo-lipids preferred for this process, but they were also more abundant in the sediment. Sulfo-lipids are produced by other microbial communities and drift to the lake bottom when those organisms die.

With the “who” and the “how” answered, Phillips turned her attention to the fate of the hydrogen sulfide. In the modern ocean, hydrogen sulfide can react with iron to form pyrite. But it can also react with organic molecules, producing organic sulfur compounds. “And we found that there is a ton of organic matter sulfurization in the lake, which is really surprising to us,” she says. “Not only is organic sulfur fueling the sulfur cycle as a source, but it’s also an eventual sink for the hydrogen sulfide.”

A new cycle

This cycle—from organic sulfur to sulfate to hydrogen sulfide and back—is completely new to researchers. “Scientists studying aquatic systems need to start thinking about organic sulfur as a central player,” Phillips says. These compounds can drive the sulfur cycle in nutrient-poor environments like Lake Superior, or even the ancient ocean.

This process may also be important in systems with high sulfate. “Organic sulfur cycling, like what we see in Lake Superior, is probably ubiquitous in marine and freshwater sediments. But in the ocean sulfate is so abundant that its behavior swamps out most of our signals,” says senior author Morgan Raven, a biogeochemist at UC Santa Barbara. “Working in low-sulfate Lake Superior lets us see how dynamic the sedimentary organic sulfur cycle really is.” Organic sulfur seems to serve as an energy source for microbial communities as well as preserve organic carbon and molecular fossils. Combined, these factors could help scientists understand the evolution of early sulfur-cycling microorganisms and their impact on Earth’s chemistry.

Some of the earliest biochemical reactions likely involved sulfur, Phillips adds. “We’re pretty sure that sulfur played an important role in really early metabolisms.” A better understanding of the sulfur cycle could provide insights on how early lifeforms harnessed this type of redox chemistry.

Source: UC Santa Barbara

Do you crunch, chew, suck, or smoosh food? It affects what you like

Mouthfeel of food determines whether people go back for seconds, according to a new study.

Texture has been one of the trends in food product messaging for several years, says Rhonda Miller, Texas A&M AgriLife research faculty fellow and meat science professor in the Texas A&M College of Agriculture and Life Sciences Department of Animal Science in Bryan-College Station.

Miller is applying her mouthfeel research to products in the beef industry to determine how to improve consumption. She conducted a three-phase mouth behavior study involving how four types of eaters consume beef and steak.

People manipulate food in their mouths differently—some use their molars and chew; some people manipulate the food with their tongue. Chewers and crunchers like to use their teeth to break down foods. Suckers and smooshers manipulate food between their tongue and the roof of their mouth.

But these texture terms are not universally understood—a “good crunch” to a cruncher is much different from a “good crunch” to a chewer.

“Most people don’t even realize they are manipulating their food in their mouth,” Miller says.

But Miller does, as she operates the Sensory Science Evaluation Laboratory, conducting research on a variety of meat and food products, evaluating them for flavor and palatability.

Mouthfeel and food preferences

Little is known about what drives people’s preferences, but everyone is born with a preference for texture, Miller says. Texture is a strong driver of rejection of a food item. Researchers are interested in whether texture affects purchasing habits regarding food products.

“In general, people have a very low texture awareness,” she says. “They talk about flavor, but not texture, because we have a low awareness of how to verbalize that.”

Miller breaks down the mouth behaviors a little more: chewers and crunchers have the same mouth motions, but chewers are less vigorous in their chew and eat food more slowly, while crunchers eat food forcefully. Crunchers are often accused by others of being too loud. They crunch until the food is gone. Smooshers use their tongue and the roof of their mouth; suckers, as the name suggests, suck the flavor out before chewing.

She says the US population consists of about 8% suckers, 43% chewers, 33% crunchers, and 16% smooshers. The study also showed that suckers reject products at a 45% level, while smooshers reject at 29%, crunchers at 16%, and chewers at 10%.

Her study revealed that many times, products are made without considering consumers’ sensory behaviors.

“But we know there are some that are: for instance, granola bars—do you want them crunchy or chewy? You can look at the package merchandising and see they know there is a difference in what their consumer wants,” she says.

“So as meat scientists, our concern is, especially when beef prices are high, retailers want to know how they can get consumers to buy beef one more time a month,” Miller says.

Beefy findings

Miller found interesting differences in the way chewers, crunchers, smooshers, and suckers experience hamburgers and steaks based on the way the meat was processed prior to cooking.

Ground beef burger patties were rated on descriptive textures such as surface roughness, firmness, connective tissue amount, cohesive mass, particle size and chewiness. Consumers identified factors that influenced their evaluation.

  • Chewers must have flavorful burgers, no soggy buns, no rubbery feel or gristle, and the patty can’t be dry or too greasy.
  • Crunchers want a burger that is not too dry or raw, not chewy, crumbly, or chunky, no soggy bun and the meat can’t stick to their teeth.
  • Smooshers want a juicy, well-seasoned patty, no gristle, not congealed or sludgy, and no residue feel in their mouth.
  • Suckers defined their ideal burger as juicy, not too chewy, but not crumbly, and the seasoning should come before cooking.

The goal of this study was to determine how fat content affected consumers’ perceptions. Chewers and smooshers found higher-fat patties less tough and chewy, with crunchers saying 93% lean beef was too dry. Higher fat was associated with higher tenderness. For the suckers, it wasn’t about fat content, but rather whether the meat was chopped or ground.

The verdict on chopped beef patties—chewers said lean chopped patties were tougher, crunchers said they were less juicy, smooshers said they were greasy, and suckers said they were dry. The final outcome was that ground beef patties from the chuck are less polarizing across the mouth behavior groups compared to ground beef patties made from other lean sources.

“We learned a lot, and I walked away with an ‘aha’ moment,” Miller says. “The ideal patty is easy to bite and stays together well. Also, we learned that chewers do not like McDonald’s.”

When it came to steaks, the higher-marbled steaks were liked by consumers across each mouth behavior group, but for different reasons. The aging process produces big gaps among mouth behaviors.

“We’ve been a little stale in how we as meat scientists think,” Miller says. “This study has helped me think outside of the box—but I don’t have any definitive answers yet.”

The research appears in Meat Science.

Source: Texas A&M University

Atomic ‘hula’ turns rare-earth crystal into magnet

A study finds that when the atomic lattice in a rare-earth crystal becomes animated with a corkscrew-shaped vibration known as a chiral phonon, the crystal is transformed into a magnet.

Quantum materials hold the key to a future of lightning-speed, energy-efficient information systems. The problem with tapping their transformative potential is that, in solids, the vast number of atoms often drowns out the exotic quantum properties electrons carry.

Rice University researchers in the lab of quantum materials scientist Hanyu Zhu found that when they move in circles, atoms can also work wonders.

According to a study published in Science, exposing cerium fluoride to ultrafast pulses of light sends its atoms into a dance that momentarily enlists the spins of electrons, causing them to align with the atomic rotation. This alignment would otherwise require a powerful magnetic field to activate, since cerium fluoride is naturally paramagnetic with randomly oriented spins even at zero temperature.

“Each electron possesses a magnetic spin that acts like a tiny compass needle embedded in the material, reacting to the local magnetic field,” says materials scientist and coauthor Boris Yakobson. “Chirality—also called handedness because of the way in which left and right hands mirror each other without being superimposable—should not affect the energies of the electrons’ spin. But in this instance, the chiral movement of the atomic lattice polarizes the spins inside the material as if a large magnetic field were applied.”

Though short-lived, the force that aligns the spins outlasts the duration of the light pulse by a significant margin. Since atoms rotate only at particular frequencies and keep moving longer at lower temperatures, additional frequency- and temperature-dependent measurements further confirmed that the magnetization occurs as a result of the atoms’ collective chiral dance.

“The effect of atomic motion on electrons is surprising because electrons are so much lighter and faster than atoms,” says Zhu, chair and an assistant professor of materials science and nanoengineering. “Electrons can usually adapt to a new atomic position immediately, forgetting their prior trajectory. Material properties would remain unchanged if atoms went clockwise or counterclockwise, i.e., traveled forward or backward in time—a phenomenon that physicists refer to as time-reversal symmetry.”

The idea that the collective motion of atoms breaks time-reversal symmetry is relatively recent. Chiral phonons have now been experimentally demonstrated in a few different materials, but exactly how they affect material properties is not well understood.

“We wanted to quantitatively measure the effect of chiral phonons on a material’s electrical, optical and magnetic properties,” Zhu says. “Because spin refers to electrons’ rotation while phonons describe atomic rotation, there is a naive expectation that the two might talk with each other. So we decided to focus on a fascinating phenomenon called spin-phonon coupling.”

Spin-phonon coupling plays an important part in real-world applications like writing data on a hard disk. Earlier this year, Zhu’s group demonstrated a new instance of spin-phonon coupling in single molecular layers with atoms moving linearly and shaking spins.

In their new experiments, Zhu and the team members had to find a way to drive a lattice of atoms to move in a chiral fashion. This required both picking the right material and, with the help of theoretical computation from their collaborators, creating light at the right frequency to send its atomic lattice aswirl.

“There is no off-the-shelf light source for our phonon frequencies at about 10 terahertz,” explains Jiaming Luo, an applied physics graduate student and the lead author of the study. “We created our light pulses by mixing intense infrared lights and twisting the electric field to ‘talk’ to the chiral phonons. Furthermore, we took another two infrared light pulses to monitor the spin and atomic motion, respectively.”

In addition to the insights into spin-phonon coupling derived from the research findings, the experimental design and setup will help inform future research on magnetic and quantum materials.

“We hope that quantitatively measuring the magnetic field from chiral phonons can help us develop experiment protocols to study novel physics in dynamic materials,” Zhu says. “Our goal is to engineer materials that do not exist in nature through external fields—such as light or quantum fluctuations.”

The research had support from the National Science Foundation, the Welch Foundation, and the Army Research Office.

Source: Rice University

Context matters for brain parts and motor tasks

The same motor task can make different groups of brain cells light up, a study shows.

You’re standing at a crosswalk when the signal changes from “don’t walk” to “walk.” You might step out into the street straight away, or you might look both ways before you cross.

In either scenario, you see the light change, and you cross the street. But the context is different; in one case, you didn’t think twice. In the other, you waited, looked to the left and right, and then stepped into the street.

New research shows that even though those two scenarios involve the same action, they light up parts of the brain in different ways. The discovery may help researchers understand how other structures of the brain work and even develop new algorithms for self-driving cars.

Researchers have known that certain brain activity, both when you see the light change and when you step out into the street, is the same no matter the context—there’s a known “pathway” that neural activity travels.

Neeraj Gandhi, a bioengineering professor in the University of Pittsburgh’s Swanson School of Engineering, wanted to know if anything happens along that pathway between the time you see the light change—a stimulus—and the moment you step into the street—an action. Or does the pathway for “crossing the street” look the same, no matter the context?

When measuring the activity of neurons in a part of the brain called the superior colliculus, which governs reactions to visual stimuli, the team found bursts in different groups of brain cells when a task was immediate and when it was delayed.

“If there are two different contexts, even though you’re making exactly the same movement, the neural activity in the brain is different,” Gandhi says. “In addition to the motor/action command, there is other activity there that tells you something about what’s going on cognitively in a given structure.”

Gandhi and his team report their findings in the Proceedings of the National Academy of Sciences.

From an engineering standpoint, Gandhi says, the finding may have implications for algorithm design. For instance, a similarly designed system could serve as a framework for an autonomous vehicle system that could accelerate when a light turns green, but also delay that action if it senses something in the crosswalk. The system could analyze the object and, if the coast is clear, it could then begin to drive.
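
As a loose illustration of that idea (a hypothetical sketch, not code from the study; the function and parameter names are invented), such context-gated logic might look like this:

```python
# Hypothetical sketch of context-gated action selection, loosely inspired by the
# crosswalk analogy above. Names and logic are illustrative, not from the study.

def choose_action(light_is_green: bool, object_in_crosswalk: bool) -> str:
    """Return an action for the same stimulus (a green light) under two contexts."""
    if not light_is_green:
        return "wait"  # no go-stimulus yet
    if object_in_crosswalk:
        # Delayed context: pause, assess the object, then act if the path is clear.
        return "assess_then_drive"
    # Immediate context: the same action, taken right away.
    return "drive"

if __name__ == "__main__":
    print(choose_action(light_is_green=True, object_in_crosswalk=False))  # drive
    print(choose_action(light_is_green=True, object_in_crosswalk=True))   # assess_then_drive
```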

Lead author Eve Ayar, a PhD student at Carnegie Mellon University and a member of Gandhi’s lab, says their results may have implications for better understanding the mechanisms underlying executive function—and ways in which it may be impaired.

“There are a lot of disorders out there where people are unable to take in that sensory stimulus in [their] environment and make some kind of movement or action in response to that,” Ayar says. Soon researchers may be able to build models to understand how these systems work and the ways in which they can be disrupted.

“I think this is valuable not only for better understanding this structure of the brain, but potentially it will help us understand how other regions in the brain are operating as well,” Ayar says.

Source: University of Pittsburgh

Cleaning up water pollution can nudge housing prices

Federal grants aimed at water pollution remediation in Great Lakes Areas of Concern, or AOCs, had a notable and statistically significant effect on housing prices within about a 12-mile radius of specific regions on all five lakes.

Water pollution is a major issue throughout the United States. The US government has spent more than $1.23 billion since 2004 on the cleanup of toxic pollutants in waterways around the Great Lakes region alone.

Published in the Journal of Public Economics, the research shows that the initial designation of AOCs lowered property values by an average of $25,700 per house. However, the subsequent awarding of federal grants to clean up these areas raised property values by an average of $27,295 per house—resulting in a net-positive benefit of the AOC program of $1,595 per house.
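
The arithmetic behind that net figure is simply the grant effect minus the initial designation effect:

```latex
\$27{,}295 - \$25{,}700 = \$1{,}595 \ \text{per house}
```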

“Our research finding that the benefits of the cleanups outweigh the costs is important,” says study coauthor Robyn Meeks, assistant professor at the Sanford School of Public Policy at Duke University.

Meeks points out this research has implications for other states and regions as well.

“People in every state are thinking about the quality of water where they live. These might be concerns about pollution of drinking water or the surface waters that are crucial for people’s fishing, swimming, boating, and livelihoods more broadly. This paper demonstrates that people value clean water, as evidenced in the changes in housing prices that we see in response to the cleanup,” she says.

The paper details key findings around the impact of the AOC listing, the effect of the federal grants on housing prices, and the impact of the AOC grants on the local economy.

“Despite concerns about various sources and types of pollution impacting water quality in the US, and a substantial amount of federal funding allocated towards mitigating the pollution, there is relatively little research on the economic benefits of efforts to clean up water pollution. This paper provides such evidence, and several key findings: how people value clean water and that the benefits of cleaning up the pollution exceed the cost,” Meeks says.

Study coauthors are from the University of Alabama and the University of Michigan.

Source: Duke University
