Record-breaking heat ups awareness of weather trends

Experiencing days in which the temperature exceeds previous highs for that time of year affects people’s perception of weather trends, a study finds.

Published in Scientific Reports, the study finds that living in an area with record-breaking heat effectively increases perceptions that the weather is getting hotter.

In December 2022, the authors surveyed a nationally representative sample of 1,605 United States adults to determine whether more frequent record-breaking weather events affect weather change perceptions. The participants were asked, “To the best of your knowledge, how did excessive daytime heat across the United States in 2022 compare with previous years?”

“What matters to people is if one of those days gets the big red stamp that says ‘record-breaking.’”

Timothy Hyde, a postdoctoral fellow in the University of Pennsylvania’s Annenberg School of Public Policy, and Dolores Albarracín, a professor at Penn and director of the Science of Science Communication Division, linked answers to this question with meteorological data collected by the National Climatic Data Center from 1949, when meteorologists first implemented a reliable record of climatic data, to 2022. Doing so allowed the researchers to determine which days in 2022 constituted a heat record in a particular area before correlating record-heat exposure with perceptions that temperatures were higher relative to previous years.

The study found that while record-breaking heat days have little or no effect on beliefs in the existence of climate change, they do affect evaluations of how much hotter the weather has become compared to previous years. This effect of record-heat days is such that the difference in answers between a respondent who experienced no record-breaking heat days and another who experienced 16 record-breaking heat days is as large as the average difference in responses between independent and Democratic respondents.

Whether a day has reached a record-breaking temperature in a locale can only be known after the fact. The average person cannot determine whether any particular day is a record heat day by simply walking outside. Instead, individuals must learn that a day broke the existing heat record by checking the news. Nor is the absolute temperature itself the key metric; what is important is whether the temperature surpasses records set in the same place over the last 72 years. “It doesn’t matter whether one day is 100 degrees and the next is 101. What matters to people is if one of those days gets the big red stamp that says ‘record-breaking,’” Albarracín says.
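To make the determination concrete, the sketch below shows one way a record-heat flag could be computed from a daily temperature series. It is an illustration only: the column names and the date-specific definition of a record are assumptions for this example, not the authors’ code.

```python
# Illustrative sketch (not the study's code): flag 2022 days whose daily high
# exceeds every high previously recorded at the same station on the same
# calendar date since 1949. Column names and the exact record definition
# (per calendar date) are assumptions made for this example.
import pandas as pd

def record_heat_days(daily: pd.DataFrame) -> pd.DataFrame:
    """daily has columns: station, date (datetime64[ns]), tmax (daily high, degrees F)."""
    daily = daily.sort_values("date").copy()
    daily["month_day"] = daily["date"].dt.strftime("%m-%d")
    # Highest tmax seen at this station on this calendar date before each row
    prior_record = (
        daily.groupby(["station", "month_day"])["tmax"]
             .transform(lambda s: s.shift(1).cummax())
    )
    daily["is_record"] = daily["tmax"] > prior_record
    return daily[(daily["date"].dt.year == 2022) & daily["is_record"]]
```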

“Just because a record heat day can change a person’s opinions on the weather doesn’t mean it holds the key to changing opinions on climate change.”

The study also found that other indicators of climate change, including average heat levels, non-record-breaking extreme heat days, and severe weather, did not significantly affect the respondents’ views.

The season in which record heat days occur also shapes their effect. Since the hottest days of the year typically occur in the summer, the scholars hypothesized that heat records during that time of year would have the most significant impact, as those days would have the highest temperatures a person would see in a year. Instead, they found that record-heat days in winter affect people’s perceptions of worsening heat most strongly. “This may be due to the difference in media coverage of temperature during different times of the year,” Hyde says. “That is likely because it sticks out to people when media discusses how warm a day is during what is supposed to be the coldest time of the year.”

While exposure to record-heat events has been found to cause people to believe that the temperature is hotter than in previous years, it has little effect on perceptions of the existence of climate change. These beliefs are less likely to change, the researchers say, because they are associated with political stances. “Since discussing changes in weather patterns is not necessarily political, it is reasonable to assume that people are more willing to update their opinions on weather change than climate change,” Hyde notes.

“Just because a record heat day can change a person’s opinions on the weather doesn’t mean it holds the key to changing opinions on climate change,” adds Albarracín. “Even if it did, we could not expose populations to extreme heat as we see fit to generate change. The most critical element is understanding how record-breaking heat days change opinions and then seeing whether communications about these events can change people’s beliefs.”

Exposure to a record-breaking heat day in a local area significantly affects individuals’ perceptions of increasing temperature, supporting the hypothesis that record-heat days can drive changes in the general perception that the weather is worsening.

“Since an individual can only recognize record-heat days through media reports and only understand their severity through media coverage, we know that continued research into how record-heat days change perceptions can’t only be about record-heat days. Understanding media reports on them will also be important for learning how to affect climate change perceptions,” Hyde says.

Source: Penn


ESA’s Euclid space telescope obtaining “magnificent” test images despite a few fine-tuning hiccups

“After a dozen years designing and developing Euclid, it is exhilarating and very moving to see these first images. But you can’t just release them to the public; it takes time to get scientific validation. We expect the first science publication in January.”

Giuseppe Racca, Euclid project manager at ESA

PARIS — Launched atop a SpaceX Falcon 9 rocket from Cape Canaveral on July 1, the European Space Agency’s 2-ton Euclid space observatory is intended to scrutinize the universe in search of answers to the question of how undetectable dark matter and dark energy have been shaping the universe for billions of years. 

It took a month for Euclid to arrive at the Earth-sun L-2 Lagrange point, a gravitationally stable spot 1.5 million kilometers from the Earth in the direction opposite the sun. Once there, the Thales Alenia Space-built spacecraft was expected to undergo a two-month commissioning phase before beginning science operations. 

However, problems were detected during instrument-performance verification that, if unresolved, could prevent the telescope from providing the highest-resolution images of the deep universe in all conditions.

The hiccups were serious enough for ESA to suspend the commissioning process while engineers sought remedies to the issues confronting the mission.

Over the last few weeks, solutions have been put in place, and the situation has significantly improved, permitting Euclid to capture hundreds of mesmerizing initial test images of galaxies.

ESA’s Euclid project manager Giuseppe Racca spoke with SpaceNews reporter Frederic Castel about the initial problems that have affected the flagship astrophysics mission and how he expects it to perform over the next six years.

ESA’s Euclid project manager Giuseppe Racca stands in front of the Euclid space telescope. Credit: ESA

How serious were the problems that impacted the $1.5 billion Euclid telescope during the first few months of the mission?

Once Euclid was in orbit, we discovered that a very small amount of light was being reflected again and again — like a ricochet — on the spacecraft’s surfaces. This was producing stray light that was impacting the visible light detector and disturbing Euclid’s capability to observe very faint galaxies. This was a truly big issue that could end up compromising the mission. To resolve it, we turned the spacecraft two and a half degrees around the axis of the telescope. That was enough to get rid of this stray light.

In August, another issue arose that prevented you from proceeding with the verification process, and ESA decided to backtrack and return to the previous phase. Why such a decision? 

Early in the mission, as I said, we had some serious concerns over stray light that was interfering with Euclid’s observing instruments. To take multiple pictures or perform spectral and photometric measurements in the near-infrared, Euclid’s fine guiding sensor needs to be capable of maintaining the telescope in a very precise direction for 75 minutes using guide stars. But in some positions in the sky, high-energy cosmic rays and solar protons were striking the sensor intermittently, creating signals that could be mistakenly interpreted as real stars. We had anticipated such interference in our ground simulations, but in the real space environment, the effect was stronger than expected.

We discovered the problem in early August, and on Aug. 18, we interrupted some of the test measurements affected by stray light, carrying out other observations in the meantime. It took about two months for industry to develop and test new software to get around this issue.

Are these problems now resolved, and do you expect science observations to start soon? Would you compare your work with the break-in phase of a new vehicle? 

This software patch has been working well for the last two weeks, but we want to be sure it will continue functioning correctly for the six years of the mission. Every morning, I carefully check the downlink data coming from the ESA deep space antenna in Malargüe, Chile, and so far, so good. Yes, you can compare what has happened during the initial phase of the mission to the initial running-in period of your car. Once the break-in phase is over, you can run the engine at any speed and begin testing other capabilities such as accelerations, turns and so on. That’s what we’re doing now.

The 1.2-meter diameter main mirror of ESA’s Euclid mission to unveil the dark Universe, seen during assembly, integration and testing. Credit: Airbus

We hear that some scientists have already been able to access Euclid images and have found them outstanding. How do you rate the mission’s performance so far, and when will you share these images with the public?

Things are looking extremely good in terms of image quality and wide field views. Quality is comparable with that of the NASA Hubble mission, but Euclid is able to cover in a single week what Hubble could do in five years. We have already collected over 1,000 pictures with an amazing level of quality.

However, obtaining nice pictures and conducting a good science observation campaign are two different things. After a dozen years designing and developing Euclid, it is exhilarating and very moving to see these first images. But you can’t just release them to the public; it takes time to get scientific validation. We expect the first science publication in January. Over the six-year mission, Euclid will observe billions of galaxies and create the largest 3D map of the sky ever made. Some 2,000 scientists around the world are already working with Euclid’s initial images and data.

Will understanding the nature of dark matter and dark energy be the holy grail of the Euclid mission? Do you think results could bring the mission a Nobel Prize?

It’s true that calling both phenomena “dark” means that we don’t know their specific nature. We just assume that dark matter ensures the cohesion of galaxies and galactic clusters while dark energy is responsible for the accelerated expansion of the universe. Together, they represent 95% of the invisible content of the universe.

Euclid seeks to address very fundamental questions concerning the structure of the universe and how it has evolved over the past 10 billion years, when most stars and galaxies were formed. The mission could indeed lead to a Nobel Prize, especially if the data shows that our understanding of gravity needs to be changed in some fundamental way. And even if it doesn’t, simply confirming the theories with six years of measurements would prove quite significant.

What is NASA’s role in Euclid, and will NASA’s future Roman Space Telescope take over much of the same research?

NASA contributed to Euclid by providing the infrared spectrometer flight detectors and their readout electronics, and Caltech will soon have an important role in processing data as part of the nine Euclid Science Data Centers.

In May 2027, NASA’s Roman Space Telescope will join Euclid in exploring this cosmic puzzle with even more powerful instruments. Euclid and Roman have complementary strategies. Euclid’s earlier look over broad regions of the sky will allow it to serve as a scouting mission, allowing Roman to concentrate over a smaller area, probing the universe to a greater depth and precision.

In this 2023 ESA video, an animation of Euclid space telescope is shown scanning the night sky using a ‘step-and-stare’ method that combines separate measurements to form what ESA says will be “the largest cosmological survey ever conducted in the visible and near-infrared.”


Canadian wildfire smoke linked to boost in NYC asthma cases

New research finds a stark association between Canadian wildfire smoke and increases in the number of people being seen for asthma-related symptoms in New York City emergency departments.

The findings confirm that harmful smoke from wildfires is capable of traveling great distances and can impact the health of people hundreds of miles away.

With smoke from Canadian wildfires once again descending on the Northeastern United States, residents of New England and New York are being urged to take precautions to protect their health.

While the wildfire smoke is not expected to be as bad as the thick orange haze that permeated the skies over New York City and the New England states last June, a heightened risk for health problems remains—especially among those with certain respiratory conditions.

The new study, which appears in the Journal of the American Medical Association, is unique because previous studies on the health impacts of wildfire smoke have focused on areas near the wildfires, but not the broader global regions affected by drifting smoke.

Looking at data from June 2023, the researchers found there was a significant spike in ambient fine particulate matter (PM2.5) in the air over New York City during a smoke wave that dropped down from Canada over a three-day period from June 6 to June 8.

On the same days that the spike in PM2.5 pollution occurred over New York, emergency department visits for asthma-related symptoms jumped to 261 per day across the city, compared with an average of about 182 visits per day during reference periods identified by the researchers before and after the smoke wave. Both the fine particulate matter and the emergency room visits peaked on June 7.

PM2.5 pollution has been shown to affect respiratory health, cardiovascular health, birth outcomes, and mental health, says lead author Kai Chen, an assistant professor of epidemiology (environmental health sciences) at the Yale University School of Public Health.

The findings should be taken as a word of caution for people living in regions immediately affected by wildfires as well as those downwind from the fires, the scientists say.

“People should take precautions for wildfire-induced air quality alerts seriously,” Chen says. “This means altering your daily routines as you would do for other extreme events like hurricanes.”

In the study, Chen and his team defined a wildfire smoke wave as a period of at least two consecutive days where the daily mean level of PM2.5 particulate matter exceeded the maximum level—56.8 micrograms per cubic meter (µg/m3)—during a baseline period between January 2021 and May 2023. The daily mean level of PM2.5 pollution from June 6 to June 8 was 100.9 µg/m3, compared with only 9 µg/m3 during the reference period.
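As a rough sketch of how that definition could be applied to a daily PM2.5 series (the variable names and data layout are assumptions for illustration, not the authors’ code):

```python
# Minimal sketch of the smoke-wave definition described above: at least two
# consecutive days whose daily mean PM2.5 exceeds the maximum daily mean seen
# during the January 2021 - May 2023 baseline (56.8 µg/m3 in the study).
from typing import List, Tuple

def find_smoke_waves(daily_pm25: List[float],
                     baseline_pm25: List[float],
                     min_days: int = 2) -> List[Tuple[int, int]]:
    """Return (start_index, end_index) pairs of qualifying runs of days."""
    threshold = max(baseline_pm25)
    waves, run_start = [], None
    for i, value in enumerate(daily_pm25 + [float("-inf")]):  # sentinel closes a final run
        if value > threshold:
            if run_start is None:
                run_start = i
        elif run_start is not None:
            if i - run_start >= min_days:
                waves.append((run_start, i - 1))
            run_start = None
    return waves
```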

The researchers also looked at how the wildfire smoke impacted subgroups in New York City, separating and analyzing data by borough (Bronx, Brooklyn, Manhattan, Queens, and Staten Island) and by age (0-4, 5-17, 18-64, and 65 and over).

They found that all boroughs of New York City were affected by the June smoke wave. All age groups were also affected by the wildfire smoke, but data revealed that people between the ages of 18 and 64 were most likely to visit the emergency department for asthma-related conditions during that time.

Study limitations include data being limited to New York City, the examination of only one acute outcome, and a lack of accounting for changes in population activity patterns or movement.

Amidst the more frequent and larger wildfires in recent years due to a warming climate, the researchers emphasize the need for timely communication about limiting wildfire smoke exposure to protect vulnerable populations.

Additional coauthors are from Yale and Columbia University Mailman School of Public Health.

Source: Elizabeth Lin for Yale University


Commercial Interruption: Space wargame exposes risk of dangerous escalation

Ukraine’s successful military use of Starlink in its defense against Russia has become a poster child for the increased military reliance on commercial space systems among nations. Yet, military use of commercial space services could create as many problems as it solves. Recent media coverage of Ukraine’s reliance on Starlink since Russia invaded has raised a disturbing question: Might private space barons and state-controlled commercial space firms — e.g., Elon Musk and Jeff Bezos, OneWeb and Guo Wang/StarNet — switch off critical systems, jeopardizing the security of major and lesser states?

This question is but one of many. Could an attack against a commercial space system constitute an act of war? Could it escalate to all-out nuclear war? What of space being a truly separate warfighting domain? Some argue that space isn’t a separate warfighting domain but merely an arena for conducting support operations for ground, air, and sea warfare. Supporters of this view insist that whatever military actions are taken in space stay in space. Are they right? Or is space a domain in which on-orbit hostile engagement can further catalyze ground, sea, or air engagements, as well as spark further in-space hostilities?

Should governments assume responsibility for defending commercial space systems? When, why, and how should they do so? Should governments remunerate commercial space system companies if an adversary damages their system? Or should wartime damage of space-based systems simply be considered a part of doing business, handled as other risks are — with light national regulations and private insurance? How, if at all, should governments facilitate such regulation? Is there a clear line between peacetime and war operations in space? What if a non-state actor uses commercial space systems to conduct attacks? Who, if anyone, should be held accountable?

If the answers to these questions are less than crystal clear, should Washington lay down minimum hardening or reconstitution requirements for all commercial satellite systems for companies contracting with the U.S. government? What role, if any, should international governance play in reducing the risks of military conflict and escalation? Also, what, if anything, should the U.S. and allied governments do to protect satellites (both commercial and military) or to enforce any rules that states might want? Are additional passive and active defenses needed? What about space bodyguard systems that could gently push potentially hostile robot spacecraft a safe distance away from critical satellites?

So far, the United States and other spacefaring nations have mostly deferred these questions or taken a hands-off approach to them. Instead, Washington has treated commercial space as it has the internet — as a special area where protecting First Amendment rights and free-trade entrepreneurialism trumps instituting new forms of regulation. With space, states are roughly where they were with sea power in the 1600s and air power before World War I. In those periods, dramatic acts of naval piracy and air war were about to ensue, but instituting national or international regulation hardly seemed urgent…until it was.

All of this raises a fundamental question: Just how sustainable is it for us to continue our current relaxed approach to the military exploitation of commercial space?

Game play

To find out, the Nonproliferation Policy Education Center (NPEC) designed and conducted a wargame this summer tailored to purpose.

The game’s play begins in 2027. India contracts with U.S. commercial satellite imagery firm Maxar to buy a controlling 51 percent interest in a three-satellite, sun-synchronous system. Washington backs the sale, buys New Delhi mobile ground stations, blesses India’s controlling share, and places U.S. military payloads on the satellites to demonstrate the significance of this U.S.-Indian space collaboration.

Meanwhile, China develops a similar system and sells its imagery and communications links to a Pakistani government-blessed private entity. The entity is run by a retired Pakistani general who has close ties to Pakistani terrorist groups. Without asking for explicit Pakistani permission, the general gives one such group access to the Chinese satellite system’s “peaceful” commercial imagery and communications links.

The terrorist group wants Islamabad to take a stronger stand against India’s “occupation” of Kashmir. Towards this end (i.e., to force the hand of Pakistan and draw it into a major war), the terrorist group mates long-range drones with the Chinese satellite system’s imagery and secure communications links to strike India’s strategic nuclear air base at Ambala twice. The terrorists’ drones destroy several Indian nuclear-capable fighters and strategic nuclear forces command planes and kill dozens of Indian airmen.

U.S. intelligence and other open sources confirm that China’s satellite system supported the strike. A flurry of diplomatic and international commercial legal initiatives ensues. None, however, are ultimately acted upon. Under mounting domestic pressure to act, India uses a newly constructed ground-based space-tracking laser to dazzle the Chinese-Pakistani satellite system the terrorists used. This dazzling unintentionally damages the Chinese satellite’s optics.

China, in the game’s second move, retaliates by using its own ground-based laser systems to damage the optics of one of the U.S.-India Maxar satellites. Shortly after, a Chinese rendezvous satellite closes in on a second U.S.-India Maxar satellite as it comes within range of China’s laser system. This second satellite, including its American payload, goes dead. It is unclear what caused the satellite to stop functioning.

Meanwhile, India informs Washington that another Chinese rendezvous satellite is closing in on the third and last U.S.-India Maxar satellite. Shortly thereafter, this Maxar satellite also goes dead. As with the previous Chinese attack, the United States has no bodyguard satellites to deflect a possible Chinese rendezvous satellite assault. At this point, U.S. intelligence briefs the President, who authorizes a covert U.S. cyberattack against the offending Chinese rendezvous satellite, which disables it. China immediately blames Washington for killing its satellite.

Meanwhile, India launches a cruise missile attack against suspected terrorist sites in Pakistan. As the Pakistani government ponders what retaliatory military action it will take on the ground against India, the U.S. Space Command readies itself for a Chinese space counterattack.

Read the full after-action report, Commercial Satellite Use Catalyzes Nuclear-Armed States to Combat, at tinyurl.com/yc7nxzm3

Findings

The NPEC wargame’s final hot-wash discussion session supported four key findings:

1. Space combat can catalyze combat both on Earth and in space. Many experts like to believe that whatever happens in space stays in space. This game strongly suggested otherwise. India and Pakistan’s exploitation of commercial space satellite systems not only encouraged the United States and China to attack each other’s commercial satellites but intensified land warfare between India and Pakistan. What makes this finding worrisome is the continued lack of clarity as to what an act of war might be regarding commercial satellite systems that get “damaged” or are exploited for military purposes. Nor does it help that such military space operations can be conducted quickly and send complex, ambiguous signals to military and civilian space operators. In the game, no fewer than four nuclear-armed states struggled to determine who was doing what, and even what was happening, as India’s nuclear strategic forces were seriously degraded and a U.S. military space payload was destroyed. All of this could be quite escalatory. At a minimum, it recommends engaging all spacefaring nations in further talks to clarify what acts of war in space might be and determine how their detection might best be enhanced and verified. The latter would likely entail some combination of private national and public international efforts. Ideally, one might create a dedicated international body to verify illicit space activities that could credibly assign attribution. This international body’s surveillance requirements, in turn, could be used to help justify additional national funding of private space support contracts to increase the quality and availability of space situational awareness information. It is unclear what, if any, thinking is being done within the U.S. government or elsewhere to determine the optimal mix of private and international space surveillance and verification efforts. Beyond this, the United States and its spacefaring partners need to consider what, if any, new kinds of space capabilities, such as bodyguard satellites and active satellite defenses, might be needed to enforce desirable offensive space operations red lines.

2. As commercial satellite systems spread, rogue states and terrorists will try to exploit them, increasing the likelihood that major, nuclear-armed states could be dragged into wars. In the game, a terrorist group based in Pakistan uses commercial satellite systems to target strategic Indian nuclear assets with the aim of forcing the Pakistani government to side with them and wage a major war over Kashmir. What made this gambit relatively easy was the largely unregulated provision of “private” commercial satellite services and links to a wide variety of states, firms, and nonstate entities. At a minimum, this suggests the United States and other like-minded governments should encourage private firms operating under national jurisdiction to assume greater responsibility for their possible misuse. Specifically, spacefaring nations should explore creating commercial rules analogous to “know-your-customer” rules used in the banking industry. Governments should apply these new rules to private space service providers, holding them responsible for the harm their customers inflict using their services in wars or through acts of terrorism. The challenge here will be to prevent smaller nations from offering licenses under little or no such regulation. In these cases, insurance premiums from reputable insurers should be set much higher than for properly regulated licensing or not be made available at all. Again, it is unclear to what extent governments and private firms are yet thinking through these matters.

3. With the increased use of commercial satellite systems, America’s credibility in mediating conflicts between nuclear-armed combatants will be questioned in new, demanding ways. Historically, Washington has acted as an honest broker in numerous India-Pakistan military crises. In this space scenario, however, the United States deferred to India so much that Islamabad and Washington mistakenly concluded there was little point in engaging with one another and that India was free to take considerable action on its own. This resulted in Pakistan reluctantly conceding to heavy-handed Chinese bullying for Beijing’s pre-clearance of any Pakistani diplomatic crisis messaging. If Washington had done more to engage not just India, but Pakistan, on security issues prior to the conflict, this might have been avoided. Once conflict began, though, both India and Pakistan sought Washington’s information on what was occurring in space and on the ground. Unfortunately, neither Pakistan nor India knew how much Washington knew, and Washington’s lack of transparency created distrust. Because space warfare and its diplomatic management demand more space situational awareness than will ever be available, building trust is vital. One way to increase confidence and transparency is to encourage the private sector to provide more access to the space situational awareness information they might have (something the American team facilitated in the game). Governments, including the United States, might facilitate this by paying private firms to share what they know with international or multilateral space organizations that, in turn, would make it generally available both in peacetime and during crises. Another way to increase trust, one the game players discussed, would be to create new multilateral space security working groups starting with states in war zones (including those that have nuclear weapons on their soil) that Washington has working relations with (e.g., three-way talks among the United States, Pakistan, and India; with Turkey and Greece; among Middle Eastern states and Israel, etc.).

4. Leaving space activities as unmanaged as the internet is a prescription for military mischief. One of the major game discoveries is that satellite systems’ increasing duality makes it difficult to attribute the cause of destructive and disruptive space incidents or to determine appropriate responses. In the game, the United States puts a major U.S. military payload on a commercial spacecraft that India controls. China attacks it; the United States conducts a space counterattack, escalating the conflict. This play raised several questions. What military missions should only be conducted on government-owned dedicated military spacecraft? If the United States needs or wants to place military payloads on foreign-owned spacecraft, should it only do so if the United States has a military security agreement with the state from which the spacecraft is launched? Should governments or international organizations require a minimum amount of survivability features for all commercial satellites (especially those that are dual-use)? Should states condition any government protection or indemnification of private space assets against foreign military assaults? What should these conditions be — hardening, adequate insurance, being able to quickly reconstitute the targeted space system, supporting the deployment of bodyguard satellite systems to deflect possible hostile rendezvous satellite assaults, etc.? All of these questions were raised before or during the game; none were answered.


Henry Sokolski is executive director of the Nonproliferation Policy Education Center. Before founding NPEC, he was the Pentagon’s Deputy for Nonproliferation Policy from 1989 to 1993.

This article originally appeared in the October 2023 issue of SpaceNews magazine.


Material can reconnect severed nerves

A new material can reconnect severed nerves, a study in rodents shows.

Researchers have long recognized the therapeutic potential of using magnetoelectrics—materials that can turn magnetic fields into electric fields—to stimulate neural tissue in a minimally invasive way and help treat neurological disorders or nerve damage. The problem, however, is that neurons have a hard time responding to the shape and frequency of the electric signal resulting from this conversion.

Rice University neuroengineer Jacob Robinson and his team designed the first magnetoelectric material that not only solves this issue but also performs the magnetic-to-electric conversion 120 times faster than similar materials. According to the study in Nature Materials, the researchers showed that the material can be used to precisely stimulate neurons remotely and to bridge the gap in a broken sciatic nerve in a rat model.

The material’s qualities and performance could have a profound impact on neurostimulation treatments, making for significantly less invasive procedures, Robinson says. Instead of implanting a neurostimulation device, tiny amounts of the material could simply be injected at the desired site. Moreover, given magnetoelectrics’ range of application in computing, sensing, electronics, and other fields, the research provides a framework for advanced materials design that could drive innovation more broadly.

“We asked, ‘Can we create a material that can be like dust or is so small that by placing just a sprinkle of it inside the body you’d be able to stimulate the brain or nervous system?’” says Joshua Chen, a Rice doctoral alumnus and a lead author of the study. “With that question in mind, we thought that magnetoelectric materials were ideal candidates for use in neurostimulation. They respond to magnetic fields, which easily penetrate into the body, and convert them into electric fields—a language our nervous system already uses to relay information.”

The researchers started with a magnetoelectric material made up of a piezoelectric layer of lead zirconium titanate sandwiched between two magnetostrictive layers of metallic glass alloys, or Metglas, which can be rapidly magnetized and demagnetized.

Gauri Bhave, a former researcher in the Robinson lab who now works in technology transfer for Baylor College of Medicine, explains that the magnetostrictive element vibrates with the application of a magnetic field.

“This vibration means it basically changes its shape,” Bhave says. “The piezoelectric material is something that, when it changes its shape, creates electricity. So when those two are combined, the conversion that you’re getting is that the magnetic field you’re applying from the outside of the body turns into an electric field.”

However, the electric signals magnetoelectrics generate are too fast and uniform for neurons to detect. The challenge was to engineer a new material that could generate an electric signal that would actually get cells to respond.

“For all other magnetoelectric materials, the relationship between the electric field and the magnetic field is linear, and what we needed was a material where that relationship was nonlinear,” Robinson says. “We had to think about the kinds of materials we could deposit on this film that would create that nonlinear response.”

The researchers layered platinum, hafnium oxide, and zinc oxide and added the stacked materials on top of the original magnetoelectric film. One of the challenges they faced was finding fabrication techniques compatible with the materials.

“A lot of work went into making this very thin layer of less than 200 nanometers that gives us the really special properties,” Robinson says.

“This reduced the size of the entire device so that in the future it could be injectable,” Bhave adds.

As proof of concept, the researchers used the material to stimulate peripheral nerves in rats and demonstrated the material’s potential for use in neuroprosthetics by showing it could restore function in a severed nerve.

“We can use this metamaterial to bridge the gap in a broken nerve and restore fast electric signal speeds,” Chen says. “Overall, we were able to rationally design a new metamaterial that overcomes many challenges in neurotechnology. And more importantly, this framework for advanced material design can be applied toward other applications like sensing and memory in electronics.”

Robinson, who drew on his doctoral work in photonics for inspiration in engineering the new material, says he finds it “really exciting that we can now design devices or systems using materials that have never existed before rather than being confined to ones in nature.”

“Once you discover a new material or class of materials, I think it’s really hard to anticipate all the potential uses for them,” says Robinson, a professor of electrical and computer engineering and bioengineering. “We’ve focused on bioelectronics, but I expect there may be many applications beyond this field.”

The research had support from the National Science Foundation and the National Institutes of Health.

Source: Rice University


Hair relaxers may boost Black women’s uterine cancer risk

Long-term use of chemical hair relaxers by postmenopausal Black women is associated with an increased risk of uterine cancer, a new study shows.

Chemical hair relaxers are heavily marketed to, and commonly used by, Black women to straighten curly or tightly coiled hair. These products are only loosely regulated and are known to contain potentially harmful ingredients, including chemicals known as endocrine disruptors, which can be absorbed via inhalation or through the skin.

Prior studies have linked these chemicals to a wide range of women’s reproductive health outcomes.

Compared to women who never or rarely used hair relaxers, Black women who reported using hair relaxers more than twice a year or for more than five years had a greater than 50% increased risk of uterine cancer.

“Our study suggests that moderate and heavy use of chemical hair relaxers may be associated with higher risk of uterine cancer among postmenopausal Black women. In addition, there are major racial disparities in uterine cancer. Compared to non-Hispanic white women, Black women have higher rates of aggressive subtypes of uterine cancer and are nearly twice as likely to die from their disease,” says corresponding author Kimberly Bertrand, associate professor of medicine at Boston University Chobanian & Avedisian School of Medicine.

The researchers asked nearly 45,000 women in the Black Women’s Health Study (BWHS) who had no prior history of cancer and an intact uterus about their past use of chemical hair relaxers. They then followed the women for up to 22 years and compared rates of uterine cancer among women who reported frequent or long-term use of hair relaxers to rates among women who never or rarely used hair relaxers.

They found that, among postmenopausal women, rates of uterine cancer were statistically significantly higher for those who commonly used hair relaxers even after adjustment for other potential risk factors.

These findings highlight the importance of continued research regarding the potential adverse health effects of exposure to chemical hair relaxers and their constituents, the researchers say.

“Black women are often underrepresented in health research and may have unique exposures that contribute to disparities in disease. This study fills an important gap in knowledge about the potential health effects of hair relaxer use, which is very common in Black women,” says Bertrand, who is also an epidemiologist at Boston University’s Slone Epidemiology Center.

The researchers hope these results will raise awareness of the potential toxic effects of these products and promote efforts to reduce exposure.

“Importantly, identification of safer alternatives to straightening hair, stricter regulation of cosmetic products, and policies to prohibit discrimination against natural hair such as the CROWN Act could represent important steps toward reducing racial disparities in uterine cancer.”

The findings appear in the journal Environmental Research.

The National Institutes of Health and the Cancer Research Foundation supported the work.

Source: Boston University


Insects evolved ‘instantly’ due to climate havoc

An experiment in the wake of 2017’s Hurricane Harvey shows how species can evolve instantly when they move in response to a climate catastrophe.

“With the profound and rapid changes we’re seeing with the environment, movement is becoming critical for species’ survival,” says Rice University evolutionary biologist Scott Egan, senior author of a study in Nature Ecology and Evolution. “The takeaway from this study is that while natural selection is still incredibly important, there’s another form of evolutionary change that’s directly related to movement, and it could make a huge difference in the evolution of organisms.”

Harvey, the most intense rainfall event in United States history, stalled over southeast Texas and dropped more than three feet of rain over thousands of square miles in a matter of days. Record flooding in and around Houston produced “mini extinctions” of insects and other species in areas that remained inundated for 10 or more days.

Study lead author Mattheau Comerford, Egan, and coauthors Scott Carroll and Tatum La discovered evolution played out across space rather than time when the insect species Jadera haematoloma recolonized flooded areas. Comerford, a former graduate student in Egan’s lab, is now a postdoctoral research fellow at the University of Massachusetts Boston. Carroll is a biologist at the University of California, Davis, and La is a Stanford University undergraduate.

The researchers showed that in the wake of the storm, a form of evolution called spatial sorting exerted a greater influence on the evolution of J. haematoloma, commonly known as the red-shouldered soapberry bug, than did natural selection.

“This is an understudied and underappreciated form of evolutionary change that has emerged from the invasive species literature,” Egan says of spatial sorting. “For example, cane toads in Australia have exhibited these patterns. And up until the last decade or two, spatial sorting has not really been considered a significant form of evolutionary change.”

Natural selection and spatial sorting each change the likelihood that specific traits will be passed to future generations. With natural selection, change plays out over decades or even millennia as traits become more common or less common depending on which individuals survive and reproduce. With spatial sorting, change occurs when species move into new territory, and traits become more common or less common depending on which individuals are at the leading edge of range expansion.

“If everybody is moving at different rates, the ones that move the fastest will always tend to end up at a higher density at the range edge,” Egan says. “So right at that range edge, you have just a subpopulation of the fastest dispersers. And when they mate, there’s a tendency for them to have even faster dispersing offspring. So there’s this kind of accumulative, increasing rate of dispersal as the range front moves outward.”
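The dynamic Egan describes can be caricatured in a few lines of simulation. The toy sketch below is not the study’s model; it simply assumes dispersal speed is heritable and that the fastest individuals form, and breed within, the advancing range edge.

```python
# Toy illustration of spatial sorting (not the study's model): track the
# subpopulation at the advancing range edge, whose members are by definition
# the fastest dispersers, and let them breed the next edge generation.
import random

def edge_front_speeds(generations: int = 10, pop: int = 1000, edge_frac: float = 0.1):
    """Mean dispersal speed at the range edge, generation by generation."""
    speeds = [random.gauss(1.0, 0.2) for _ in range(pop)]
    history = []
    for _ in range(generations):
        # The fastest dispersers are the ones found at the expanding edge...
        edge = sorted(speeds)[-int(pop * edge_frac):]
        history.append(sum(edge) / len(edge))
        # ...and their offspring inherit (noisy copies of) that high dispersal speed
        speeds = [random.gauss(random.choice(edge), 0.1) for _ in range(pop)]
    return history  # the mean climbs each generation: the accumulating effect Egan describes

print(edge_front_speeds())
```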

Soapberry bug beaks

Prior to Harvey, Comerford spent about 10 months gathering field data about soapberry bugs at 15 sites around the Houston metropolitan area. Native to Texas and the southwestern United States, the bugs are a textbook example of rapid evolution. They feed on seeds by plunging needle-like beaks deep into the fruit of their host plants.

When the goldenrain tree was widely introduced in the United States as an ornamental plant after World War II, it proved an opportune host for soapberry bugs. The trees’ seeds are not as deep as those of the soapberry’s native host plants, and the bugs rapidly evolved shorter beaks to access them. Through decades of study, Carroll and others have shown this is one of the fastest recorded examples of ecological adaptation.

Southeast Texas is one of the few places in the United States where short-beaked soapberry bugs coexist alongside two native varieties with longer beaks that are adapted to the seed depths of their host plants.

At field sites, Comerford sampled soapberry bugs from the three host plants every two weeks. Prior to the storm, the bugs found on each host were the varieties whose beaks were adapted to match the host’s seeds. This was expected, given the “extremely strong selective pressure to rapidly adapt beak length to suit the seed pods of whatever host plant they are living on,” he says.

Harvey flooded 11 of the research team’s 15 sites. At non-flooded sites, the three varieties of soapberry bugs continued to thrive. At flooded sites, Comerford found soapberry bugs began returning within three months, but only bugs with longer wings reached these sites. It took more than two years for all the sites to be recolonized by the long-winged bugs.

“Looking at our data and past publications, we saw that beak length and wing length were genetically linked, meaning insects with longer wings also had longer beaks,” Comerford says. “So now that we had better dispersers driving spatial sorting, they were dragging along these other traits, the longer beaks, that had nothing to do with dispersal.”

The persistence of long-winged, long-beaked individuals on all of the host plants at the recolonized sites showed “spatial sorting was having an effect that was much stronger than the effect of natural selection in that moment,” Comerford says. “That is completely novel in this experiment.”

Populations on the move

Egan says the study demonstrates the significance of spatial sorting, because similar scenarios could play out after many types of climate catastrophes, including wildfires, droughts, heat waves and winter storms.

“All these episodic types of events are predicted to increase in frequency and magnitude in the coming years,” Egan says. “As populations move in response to these events, and as they are moving poleward to get into the envelope to which they are adapted, they’re going to be subject to spatial sorting.”

Comerford says, “The hallmark of the Anthropocene, and all of the changes that we’re seeing with the environment, is that the ability to move and migrate to new locations that are more hospitable is critical for the survival of organisms. This study shows there is a force out there that we haven’t been accounting for that could make a huge difference in the survival and evolutionary change of organisms on the move. We need to be cognizant of its potential, because this is a phenomenon that is probably going to come into play more as events unfold.”

The research had support from the National Science Foundation, the American Museum of Natural History’s Theodore Roosevelt Memorial Fund, the American Philosophical Society’s Lewis and Clark Fund, and the Society for the Study of Evolution’s Small Grants for Local and Regional Outreach program.

Source: Rice University


Evolution Space to develop solid rocket motors at NASA Stennis

WASHINGTON — Evolution Space, a startup developing solid rocket motors, has signed an agreement to establish production and testing operations at NASA’s Stennis Space Center.

The company announced Oct. 10 that it reached an agreement with Stennis to set up a production facility at the former Mississippi Army Ammunition Plant, which Stennis acquired in 2011 after its deactivation by the U.S. Army.

Evolution Space expects to start production of solid rocket motors there in the second quarter of 2024 at what will be called the Minor Scale Propulsion Center. The company will also test those motors at Stennis’s E-3 Test Complex. The agreement includes support for future expansion.

“By partnering with NASA, we are able to rapidly stand up a facility which will add considerable capability to the U.S. solid rocket motor industrial base,” said Manny Ballestero, vice president of production and development at Evolution Space, in a statement.

Evolution Space has been developing solid rocket motors for defense and commercial applications. The company successfully flew those motors on suborbital launches in the spring from the Mojave Desert and from a floating platform in the Gulf of Mexico that served as a demonstration of sea-based launch sites being developed by The Spaceport Company.

Evolution Space also announced it closed a bridge funding round to allow it to proceed with its work at Stennis while it works on a separate Series A round. A company spokesperson said the size of the round was $1.2 million, but that the company could not disclose the investors who participated in the round.

While Stennis is a center for liquid-propellant rocket engine testing for NASA and several companies, Evolution Space is the first company working on solid rocket motors to establish a presence at Stennis.

“Evolution Space gains access to critical NASA Stennis infrastructure and expertise as it continues to build its propulsion capabilities. In turn, we continue frontline work with commercial companies as we support NASA’s commitment to increase access to space and grow our federal city,” Rick Gilbrech, director of the Stennis Space Center, said in a NASA statement.


DoD-funded space project advances non-GPS navigation

WASHINGTON — Vector Atomic, a California-based startup, worked with Honeywell Aerospace to produce a cutting-edge navigation sensor that uses an atomic clock to take precise measurements without relying on GPS.

The atomic sensor, funded by the Pentagon’s Defense Innovation Unit, was delivered in August and is awaiting a ride to space, Vector Atomic’s CEO Jamil Abo-Shaeer told SpaceNews.

Abo-Shaeer, a former project manager at the Defense Advanced Research Projects Agency, co-founded Vector Atomic in 2018 with the goal of fielding and commercializing atomic instruments. 

The company in 2020 was selected by DIU to build an atomic sensor — a device that exploits the quantum properties of atoms to make very precise measurements — that could survive the rigors of space. 

Abo-Shaeer said Vector Atomic has no venture capital funding. After winning the DIU contract that provided about $10 million in government funds, the company partnered with Honeywell to build an atomic inertial navigation sensor, qualify it for space flight and integrate it with a satellite bus.

Inertial navigation devices have been around for decades, but the atomic variant is a more complex technology that has only existed in labs, said Abo-Shaeer.

Inertial navigation systems use motion sensors, or accelerometers, and rotation sensors known as gyroscopes to continuously calculate the position, orientation, and velocity of a moving object without the need for external references like GPS. 

Atomic sensors that use atomic clocks are more precise but they’ve only been tested in laboratories and are very fragile, he said. DIU’s project is about figuring out if these devices can be made robust enough for deployment in real-world systems. 

And the best way to answer that, said Abo-Shaeer, is to send one of these sensors into the harshest environment, which is outer space, after putting it through the rigors of space launch. 

The problem with current technology

Inertial navigation devices used today in air, maritime, ground and space platforms are “incredibly good and incredibly robust” at providing short-term position, velocity and attitude data, he said. Over time, however, small errors in measurement accumulate, causing inertial navigation systems to drift. This means they cannot provide accurate position data for extended periods without external updates from GPS or other systems.
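A simple worked example shows why that drift adds up so fast: a small, constant accelerometer bias, integrated twice, yields a position error that grows with the square of time. The numbers below are purely illustrative and are not tied to any Vector Atomic or Honeywell device.

```python
# Toy 1-D dead-reckoning example: integrate a constant accelerometer bias
# twice (bias -> velocity error -> position error) with no GPS corrections.
def position_error_from_bias(bias_m_s2: float, seconds: float, dt: float = 0.01) -> float:
    velocity_error = 0.0
    position_error = 0.0
    for _ in range(int(seconds / dt)):
        velocity_error += bias_m_s2 * dt       # bias accumulates into velocity error
        position_error += velocity_error * dt  # velocity error accumulates into position error
    return position_error

# A bias of just 0.001 m/s^2 (roughly 100 millionths of g) grows into about
# 6.5 km of position error after one hour (0.5 * bias * t^2).
print(position_error_from_bias(0.001, 3600))
```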

Quantum sensors that use atomic clocks improve accuracy and don’t drift over time like conventional systems, which makes them desirable for military applications, said Abo-Shaeer.

“They come calibrated out of the box and they stay calibrated,” he said. “The atomic systems will drift as well but you can navigate longer. They’re not doing anything fundamentally different from current technology. They’re just potentially doing it better.”

When the atomic sensor reaches orbit, it will operate autonomously. “It turns itself on, takes measurements, and pushes data back down to the ground,” said Abo-Shaeer.

Launch date TBD

Lt. Col. Nicholas Estep, program manager at DIU, said he could not discuss the specifics of the space mission that will fly Vector Atomic’s sensor, or the projected date for the launch.

The recent delivery of the quantum sensor marks a “compelling milestone for the quantum sensing community,” he told SpaceNews. “Atomic clocks have been flying on GPS for a long time, but other than atomic clocks, other forms of quantum sensing have not materialized outside the lab.”

The physics and the phenomenology of quantum sensing are “fun and very exciting,” said Estep. What DIU is doing is moving the technology to the next phase, which is systems engineering and prototyping.

“What we’ve shown so far is that you can put it together in an integrated package that’s actually reliable and can go through the rigors of qualification,” Estep said. 

“We’ve gone through the qualification in order to be manifested and incorporated into a space demonstration,” he added. “We’ve delivered an actual experimental payload that has an atomic gyroscope.”


Not everyone’s mental health took a hit during COVID

The COVID-19 pandemic did not affect everyone’s mental health in the same way, a new study shows.

At the height of the pandemic, the media and some mental health professionals began to speculate about the ways lockdown, isolation, and worry about the disease would worsen people’s mental health. But researchers found not everyone took the same hit.

“As much as we found evidence of symptom increase across different kinds of samples and different symptoms, we also found a lot of evidence of resilience,” says lead author Mary Blendermann, a psychology PhD student working under the supervision of Lauren Hallion, assistant professor of psychology at the University of Pittsburgh.

Early reports during the pandemic suggested ballooning mental health diagnoses, Blendermann notes, but without being able to compare pre-COVID-19 rates of mental illness with rates reported at the height of stay-at-home orders, there wasn’t scientific evidence to confirm an increase in diagnoses.

To make that comparison, Blendermann and other students analyzed comparable studies to find out how symptom severity and prevalence for six different kinds of mental health conditions changed from before the pandemic to the period after January 29, 2020—the day before the World Health Organization designated COVID-19 a public health emergency of international concern.

For the paper, published in the journal Psychological Medicine, the researchers considered six symptom clusters: obsessive-compulsive disorder (OCD), post-traumatic stress disorder (PTSD), fear, anxiety, depression, and general distress. Across the 97 studies analyzed, Blendermann found a trend toward increasing symptoms of mental illness at the population level for all illnesses other than PTSD. (That exception may be because the team didn’t find many studies that considered PTSD, Blendermann says.)

When the team analyzed by demographic groups and across different illnesses, more patterns began to emerge. Some groups fared better than others, and some illnesses tended to increase more. Those with OCD showed an increase in symptoms, for instance, including increases in washing and other contamination-related symptoms. It makes sense that a contagious disease may have disproportionately affected people with preexisting fears of contamination, Blendermann says.

But save for people with OCD, those with preexisting mental health conditions were not any more affected than people without. Like others, they tended to show increases in anxiety and general distress.

“I see that as being a response to a global pandemic,” Blendermann says. “People might just be more anxious because the environment had more threats.”

When the researchers analyzed data by demographic groups, they found healthy young people showed the biggest increase in symptoms of anxiety and depression. Another group that showed a marked increase in anxiety and depression was 20- to 40-year-old women. Blendermann suggests that might be the result of the increased duties—teaching and caretaking, for instance—women took on during the pandemic.

Despite the overall increased incidence of some symptoms of mental illness, Blendermann says, the team also found plenty of evidence of resilience. In some studies that collected data as the pandemic wore on, early symptoms of certain conditions were followed by returns to baseline.

Some people’s mental health may have benefited from the conditions imposed by COVID-19, says Hallion, the paper’s senior author. “Students and people with pre-existing mental health conditions generally did not show an overall worsening of anxiety and depression from before to during the pandemic. It may be that lowered academic, social, and professional expectations due to the pandemic had a protective effect for these groups.”

Being able to work from home, for instance, may have led to reduced stress for those with anxiety disorders. It would be a mistake to assume that even a disaster on the scale of COVID-19 would have an enduring negative effect on everyone, Blendermann says.

“People do adapt, they do find ways to find stability, even when something this disruptive happens.”

Source: University of Pittsburgh

