10 things to know about the updated COVID vaccines

Experts from Yale University have answers for you about the newly updated COVID-19 vaccines.

There will be better protection against severe disease, hospitalization, and death from COVID-19 in the coming months now that newly updated (2023–2024 formula) mRNA COVID vaccines are available. The new shots are expected to keep more people from getting seriously ill with the virus through the winter, when infections and hospitalizations tend to tick upwards. And unlike the spring booster that targeted people ages 65 and older, these updated vaccines are for everyone ages 6 months and older.

The Food and Drug Administration (FDA) and the Centers for Disease Control and Prevention (CDC) approved the updated vaccines by Pfizer-BioNTech and Moderna in mid-September. (In early October, they also authorized an updated Novavax vaccine for use in individuals 12 and older; more on that below.)

The vaccines target XBB.1.5, a subvariant of Omicron, the variant that has dominated the United States—and the world—since November 2021. The CDC says the updated vaccines should also work against currently circulating variants of the SARS-CoV-2 virus—many of which descended from, or are related to, the XBB strain. This includes EG.5, the dominant strain in the US, and BA.2.86, a new subvariant sparking concern because it has more than 30 mutations to its spike protein.

While COVID-19 has been causing mostly mild illness recently, Yale Medicine infectious diseases specialist Onyema Ogbuagu reminds people that the disease can still lead to hospitalization and death.

“Infections can have long-term consequences,” Ogbuagu says, adding that even healthy people can develop Long COVID—a condition in which new, continuing, or recurring (and sometimes debilitating) symptoms are present four or more weeks after an initial coronavirus infection.

Here, Yale experts tell you what you need to know about the updated COVID vaccine:

Note: Information in this article was accurate at the time of original publication. Because information about COVID-19 changes rapidly, we encourage you to visit the websites of the Centers for Disease Control & Prevention (CDC), World Health Organization (WHO), and your state and local government for the latest information.

Information provided in Yale Medicine articles is for general informational purposes only. No content in the articles should ever be used as a substitute for medical advice from your doctor or other qualified clinician. Always seek the individual advice of your health care provider with any questions you have regarding a medical condition.

Source: Yale University


India shifted launch of its Chandrayaan-3 moon lander to avoid space objects

BAKU, Azerbaijan — The liftoff time for India’s historic Chandrayaan-3 lunar lander was shifted by four seconds to avoid close approaches to other space objects.

Chandrayaan-3 launched July 14 on an LVM-3 heavy-lift rocket from Satish Dhawan Space Centre into an initial, highly elliptical Earth orbit.

This began a circuitous journey to the moon that culminated in the highest-latitude lunar landing to date and made India the fourth country to soft-land safely on the moon.

The nominal launch time, however, was changed after an analysis of the orbits of tracked space objects.

“Based on collision avoidance analysis the liftoff time was shifted by four seconds so that there was no collision threat,” Anil Kumar, chief general manager of safe and sustainable space operations management at the Indian Space Research Organisation (ISRO), said at the 74th International Astronautical Congress (IAC) here in Baku, Oct. 6.

“This is a mandatory practice, not only for ISRO but for all space launch vehicles. That is, before launch, the liftoff should be cleared,” Kumar said.

“The number of space objects in orbit, especially low Earth orbit, is so high. U.S. Space Force is tracking and cataloging more than 30,000 objects with a size of more than 10 centimeters.

“For this LVM-3 launch on that particular day… when we analyzed, we found that during the first orbital phase of Chandrayaan-3 many objects, including operational objects, were coming closer than one kilometer.” 

This led to a shift of launch time by four seconds to avoid a number of close approaches.
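The process Kumar describes is straightforward to sketch: propagate the planned ascent and early orbital path, propagate the catalog of tracked objects, compute the closest predicted approach for a candidate liftoff time, and nudge the liftoff until no approach falls under the threshold. Below is a minimal Python sketch of that loop; the trajectory, propagate, and catalog inputs are hypothetical placeholders rather than ISRO's actual tools, and a real screen would run against full catalog ephemerides.

```python
import numpy as np

def min_miss_km(liftoff_s, trajectory, catalog, propagate):
    """Smallest predicted separation (km) between the vehicle's planned path
    and any tracked object during the first orbital phase, for a candidate
    liftoff time given in seconds."""
    return min(
        np.linalg.norm(trajectory(liftoff_s, dt) - propagate(obj, liftoff_s + dt))
        for dt in range(0, 3600, 10)   # sample the first orbital phase every 10 s
        for obj in catalog
    )

def cleared_liftoff(nominal_s, trajectory, catalog, propagate,
                    threshold_km=1.0, max_shift_s=60):
    """Delay liftoff second by second until no object is predicted to come
    within threshold_km; Chandrayaan-3 needed a shift of four seconds."""
    for shift_s in range(max_shift_s + 1):
        if min_miss_km(nominal_s + shift_s, trajectory, catalog, propagate) > threshold_km:
            return nominal_s + shift_s, shift_s
    raise RuntimeError("no clear liftoff time found in the window")
```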

A panel presenting Chandrayaan-3 initial results at IAC 2023, Oct. 6, 2023. Credit: IAF/YouTube

Chandrayaan-3 was then successfully launched into an orbit similar to a geostationary transfer orbit. This orbit was raised a number of times before a translunar injection burn, after which the spacecraft was captured into an elliptical lunar orbit. That orbit was then lowered to line up the landing attempt on Aug. 23.

The details were provided at a late-breaking news event at IAC. Initial science results from data collected by Chandrayaan-3’s Vikram lander and Pragyan rover during their single lunar day of operations were discussed, including the first-ever measurements of the near-surface lunar plasma environment near the lunar south pole.

Plasma levels were found to be sparse, with tens to hundreds of electrons per cubic centimeter. A detailed analysis is in progress, with potential implications for what frequencies should be used for lunar radio communications.
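A plasma reflects radio waves below its characteristic plasma frequency, which scales with the square root of electron density (roughly f_p = 8,980 × sqrt(n_e) Hz for n_e in electrons per cubic centimeter). A back-of-the-envelope calculation, not drawn from the mission team's analysis, shows the frequency range such a sparse plasma would affect:

```python
import math

def plasma_frequency_hz(n_e_per_cc: float) -> float:
    """Electron plasma frequency: f_p ~ 8,980 * sqrt(n_e) Hz for n_e in cm^-3.
    Radio waves below f_p are reflected rather than transmitted by the plasma."""
    return 8980.0 * math.sqrt(n_e_per_cc)

# Chandrayaan-3 reported densities of tens to hundreds of electrons per cm^3.
for n_e in (10, 100):
    print(f"n_e = {n_e:3d} cm^-3  ->  f_p ~ {plasma_frequency_hz(n_e) / 1e3:.0f} kHz")
# n_e = 10 cm^-3 -> ~28 kHz; n_e = 100 cm^-3 -> ~90 kHz
```

By this rough estimate, densities in the reported range would interfere only with signals in the tens-of-kilohertz band, well below typical communication frequencies.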

A propulsive “hop” performed Sept. 2 by Vikram also allowed ISRO to repeat science measurements in a new location, 40 centimeters from the original landing point. This included new temperature measurements from Chandra’s Surface Thermophysical Experiment (ChaSTE), a thermal conductivity and temperature payload.

The Chandrayaan-3 Vikram lander imaged by Pragyan rover, Aug. 30, 2023. Credit: ISRO

The maneuver, which echoed the hop by the U.S. Surveyor 6 mission in 1967, also provided a chance to further verify accuracy and performance of the science payloads.

Scientists had hoped that solar-powered Vikram and Pragyan would reawaken in late September after sunrise following the deep cold of lunar night, despite a lack of radioisotope heater units. ISRO attempted to contact the pair but without success.


AI Jesus? Experts question wisdom of godbots

The question we ask is, “What is it about the chatbot that makes it seem like a good place to turn for answers?” Our answer is that the design of the chatbots invites us to treat them like more-than-human oracles. Why? First of all, they’re opaque. They don’t show you their workings. And so they can trigger a cognitive response in people that actually has a very long history. They do what oracles, prophets, spirit mediums, and divination practitioners have always done. They have access to a source that is totally mysterious. It can seem to be tapping into something that just knows more than I do. A source like that seems more than human. It can appear to be divine.

If you go through the history of human divination techniques, you see this is repeated over and over again, whether it’s ancient Chinese casting the I-Ching or Yoruba casting cowrie shells. An example we use in our article is sacrificing animals and then studying their entrails to find marks that come from the spirit world, a very common practice found from ancient Rome to many contemporary societies. Or the Delphic Oracle, who seems to have been a spirit medium, someone who went into a trance and whose words, sometimes quite enigmatic, seemed to come from elsewhere.

You don’t have to believe in divine authority for this to work. You just need to feel that AI surpasses humans. The urge to turn to it for answers can start with no more than that. I really want to stress this point: we are not saying “Well, some suckers will fall for this.” The godbots are just an extreme case of something that’s actually much more common. People who pride themselves on their scientific rationality are susceptible, too.

Now, the second aspect of the chatbots is that they’re designed to give you one answer and to give it with complete authority, without any doubts. When Harry Truman was president, he supposedly complained about his economic advisers: “When I ask them for advice, they say, ‘Well, on the one hand, this, on the other hand, that.’” Truman said, “I want someone to find me a one-armed economist!”

Right now that’s what chatbots do. This is one way they’re more dangerous—and perhaps more appealing—than, say, the Google search function. Google says, “Well look, here’s a whole bunch of sources.” So it’s at least implying that there isn’t necessarily just one answer. Look at all these different sources! And if you want, you can go further into them, even compare them to one another.

Chatbots in their current state aren’t like that. In effect they say, “I’m not going to tell you where I got the answer. You just have to accept it. And there’s only one answer.” Life is complex, often bewildering, and there’s an irresistible attraction to things that promise to make it simpler.

And again, it’s the design of the chatbot at work: because of its opacity, it carries all the authority of crowdsourcing. For better or for worse, we’ve come to place a huge amount of faith in the wisdom of the crowd, and we have projected that faith onto chatbots. As a result, a chatbot seems to know more than any human could possibly know. So, how can you doubt it?

And its inner workings are opaque—even computer programmers will tell you that some of the things going on in these algorithms are just too complex to explain. It’s not necessarily that they don’t understand their own devices, but that the explanation can be just as complicated as the thing it’s meant to explain.


Northrop and Voyager emphasize benefits of commercial space station partnership

WASHINGTON — Northrop Grumman and Voyager Space executives said their decision to work together on a commercial space station project, rather than pursue competing efforts, is a natural progression for an emerging industry.

The two companies announced a partnership Oct. 3 under which Northrop Grumman will develop a version of its Cygnus spacecraft capable of docking autonomously with Voyager’s Starlab space station, along with other potential future contributions. Northrop will also end development of its own proposed commercial station.

In a joint interview, Steve Krein, vice president of civil and commercial space at Northrop Grumman, and Dylan Taylor, chief executive of Voyager Space, said the partnership emerged from discussions the two companies had about using Cygnus vehicles to provide cargo transportation to Starlab.

“We started talking about if we could combine the best elements of both teams — our cargo logistics and human spaceflight experience with the capabilities of Voyager — to develop what I’ll call the ‘dream team,’” Krein said. “We realized in short order that was a really good combination.”

Taylor said Voyager started considering a partnership with Northrop after setting up a joint venture with Airbus to develop Starlab that was announced in August. “We have a good relationship with Northrop. They are, in my opinion, the best hardware manufacturer in aerospace and defense,” he said. “We were in a position to have a conversation with them regarding how they might be able to help our project along.”

“Adding Northrop to the team further strengthens the project, and I think further strengthens the likelihood that this will be the first station flying and will be the right solution to replace the International Space Station when it’s deorbited,” he said of Starlab.

Krein said Northrop concluded there was a stronger business case to work as a partner with Voyager rather than go it alone with its own station. “Although we’ve made a significant amount of progress and understood the business case,” he said, “there was just a stronger case to be made for a combination of the talents, expertise and subject matter experts with ourselves, Voyager and their partners.”

He said that a partnership provided greater assurance that a station could be ready in time to meet NASA’s need to have at least one commercial station ready before the end of the decade. Such partnerships, he added, were part of the “natural consolidation” of any industry.

Taylor agreed. “It’s natural that there will be consolidation of skill sets and talent in the commercialization of private space stations in LEO,” he said. “Northrop was a very obvious, compelling partner to enhance the project.”

Under the partnership, Voyager will pay Northrop an unspecified amount to upgrade Cygnus for automated docking and will agree to purchase a set number of Cygnus flights. Taylor said the companies are “actively scoping” additional Northrop contributions to Starlab, looking at “a lot of different areas to get Northrop and the technical capability involved in the project.”

Northrop will be able to provide Cygnus cargo resupply services to other customers as part of the agreement. “We’re in discussions with just about everybody that’s going to be having commercial space stations,” Krein said, as well as talks with NASA about how to ensure support for the ISS “to 2030 and beyond.”

While the partnership means that one fewer company is now pursuing a station to succeed the ISS, NASA saw the arrangement as a positive development. “Refining strategies and evolving partnerships are part of the process as we build a robust low Earth orbit economy where NASA is one of many customers,” said Angela Hart, manager of NASA’s commercial LEO development program, in an agency statement Oct. 3.

Both Krein and Taylor said they were pleased with NASA’s approach to supporting commercial space station development. “We’re very comfortable with where NASA is right now,” Taylor said. He added he would like to see, as the effort advances, more details from NASA about its requirements and its expected demand for commercial space station facilities. “The clearer that demand signal is, the easier it is to get the market excited about what it is we’re doing.”

“What’s important is that we have no space station gap. So we’re cheering everybody on and don’t want anyone to fail,” he said. “But we’re highly confident that Starlab is the right solution for this commercial opportunity.”


India hopes reforms will make it a global space hub

BAKU, Azerbaijan — The Indian government is continuing a series of reforms aimed at increasing private involvement in the space sector and attracting global capital.

“A transition is happening in India. We are moving from ISRO being the sole player in the space sector to the private sector taking on a more meaningful role,” Pawan Goenka, chairman of the Indian National Space Promotion and Authorisation Centre (IN-SPACe), said at a forum at the 74th International Astronautical Congress in Baku, Oct. 5.

The Indian government approved the Indian Space Policy 2023 in April this year, which follows a number of developments in recent years. 

“What the Indian Space Policy did was take everything to do with space — satellite communication, remote sensing, space operations, transportation, navigation, everything — and put it into one comprehensive document only 12 pages long,” Goenka said.

The reforms define clearer roles for the Indian Space Research Organisation (ISRO), IN-SPACe, and NewSpace India Limited (NSIL), while also removing barriers to participation for non-government entities. ISRO will continue as a civil space agency, focusing on research and development of advanced space technologies in areas including human spaceflight, while IN-SPACe will regulate and authorize space activities in India, nurture startups, and facilitate cooperation with ISRO.

The policy removes almost all restrictions on private participation in India’s space sector, which was previously all but closed to non-government players. Building rockets, launching, owning and operating satellites, developing services, and acquiring and disseminating Earth observation data are now all permissible.

Furthermore, India is also set to finalize a new foreign direct investment (FDI) policy for the space sector. This is expected to liberalize rules for foreign ownership in a bid to attract global investment, and will affect areas including satellite manufacturing, ground segment, launch vehicles, subsystems and more. 

While these measures are broad in scope, India has specific areas in mind where the country can look to build strength, according to Goenka.

A decadal vision and strategy for the Indian space economy is set to be released in the coming days. This will set a target for where India sees itself in 10 years in terms of its space economy, and where efforts will be focused.

“Currently, the Indian space economy is measured at about eight billion dollars, which is only about 2% of the global space economy,” Goenka said, adding that the country aspires to grow that figure multifold.

“There are a few things that we think India can have a competitive advantage in. The first one is manufacturing: India can become a small satellite manufacturing hub.”

Goenka also said the country can become “pretty big in small launch vehicles,” and in launching low Earth orbit constellations. He noted ground station services, Earth observation data, and space applications as other areas of potential growth.

Goenka enumerated several advantages that could bolster India’s position in these areas: new institutional support, state-level policies, a vast domestic market, a large pool of STEM graduates, and competitive labor costs. These, he said, can ultimately grow India’s share of the global space economy.


Cooking fuels linked to developmental delays in kids

New research links unclean cooking fuels, including natural gas, propane, and wood, to developmental delays in children.

Researchers looked at indoor air pollution exposure and early childhood development in a sample of more than 4,000 mother-child pairs in the United States.

“Exposure to unclean cooking fuel and passive smoke during pregnancy and in early life are associated with developmental delays in children,” says Alexandra Grippo, first author of the study in the journal Environmental Research.

“While cigarette smoke is known to be harmful during pregnancy, cooking fuel may not be viewed the same way,” says Grippo, who worked on the study while pursuing her master’s in epidemiology in the School of Public Health and Health Professions at the University at Buffalo.

“Gas stoves are a main contributor to indoor carbon monoxide and nitrogen dioxide levels, with some families using them multiple times a day. Infants and young children spend more time indoors and are particularly vulnerable to indoor pollutants because they are not fully developed.”

In the study, clean fuel use meant cooking with electricity (including a microwave) and heating with either electricity or solar power; unclean fuel users were those who used one or more fuels other than electricity.

While gas stoves have come under fire in recent years as cities around the country have moved to ban them in new buildings, the researchers stress that this study focused on more than just natural gas.

“We found that children exposed to any unclean cooking fuel had an increased risk of developmental delays,” says Kexin Zhu, co-first author who worked on the research as an epidemiology PhD student. Zhu is now a postdoctoral associate in the Rutgers Center for Pharmacoepidemiology and Treatment Science.

Due to the small number of cases, the researchers were not able to examine the associations for specific fuel types, Zhu says. “Based on our study, future research with large sample sizes is needed to investigate the relationship between the use of gas stoves and child development.”

The analysis included 4,735 mother-child pairs enrolled between 2008 and 2010 in the Upstate KIDS Study, a large population birth cohort that followed childhood development milestones through three years of age. Study participants self-reported indoor air pollution information during pregnancy and the postnatal period.

Indoor air pollution exposure was assessed by collecting information on child exposure to cooking fuels, heating fuels, and passive smoking at approximately 4, 12, and 36 months of age. The researchers asked participants what fuel was usually used for cooking and heating and whether they lived with someone who smokes.

The researchers used the Ages and Stages Questionnaire, a parental rating instrument used for screening children’s development and milestone achievement, to measure child development in five developmental domains: communication, gross motor, fine motor, personal-social, and problem-solving.

It is believed to be the first study to examine the impact of cooking fuels, heating fuels, and passive smoking on child development measured in the five developmental domains in the US, says senior author Lina Mu, an associate professor of epidemiology and environmental health in the University at Buffalo’s School of Public Health and Health Professions. Mu was also part of the Upstate KIDS Study research team.

Unclean cooking fuel exposure from pregnancy to 36 months of age increased the odds of failing any developmental domain by 28%, the gross motor domain by 52%, and the personal-social domain by 36%. Researchers observed significant associations of unclean cooking with failing any domains and specific domains among infants of young mothers, singletons (a pregnancy with one baby) and male infants, but not among infants of older mothers, non-singletons, or female infants.
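Those figures are increases in odds rather than in probability: a 28% increase in odds is an odds ratio of 1.28, and the absolute change in risk depends on the baseline rate. A minimal sketch of the conversion, using a purely hypothetical 10% baseline failure rate that is not taken from the study:

```python
def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Convert a baseline probability to odds, scale by the odds ratio,
    then convert back to a probability."""
    odds = baseline_prob / (1.0 - baseline_prob)
    shifted = odds * odds_ratio
    return shifted / (1.0 + shifted)

# Odds ratios as reported in the study; the 10% baseline is hypothetical.
for domain, odds_ratio in [("any domain", 1.28),
                           ("gross motor", 1.52),
                           ("personal-social", 1.36)]:
    risk = apply_odds_ratio(0.10, odds_ratio)
    print(f"{domain}: 10.0% baseline -> {risk * 100:.1f}%")
# any domain -> 12.5%, gross motor -> 14.4%, personal-social -> 13.1%
```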

In this study, 21.5% of women reported exposure to passive smoke during pregnancy, and 14.2% reported being active smokers during pregnancy. Passive smoke exposure was positively associated with failing the problem-solving domain among children of non-smoking mothers.

“Passive smoking or secondhand smoking is also an important source of indoor air pollution and should not be ignored,” Zhu says.

“Passive smoke contains toxicants, such as lead, that can harm children’s development,” Zhu adds. “We found that passive smoke exposure may increase the likelihood of failing the problem-solving domain among young children of non-smoking mothers. Protecting children from passive smoke is, therefore, important for improving their health and well-being.”

Additional coauthors are from the University at Buffalo, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, and the University at Albany School of Public Health.

Source: University at Buffalo


New head of smallsat supplier Blue Canyon sets sights on defense market growth

WASHINGTON — Chris Winslett for three years ran one of Lockheed Martin’s small-satellite business sectors, helping the company win major orders from the Space Development Agency for its low Earth orbit constellation.

Winslett last month was named general manager of Blue Canyon Technologies, a small-satellite manufacturer founded in 2008 and acquired by the defense contractor Raytheon Technologies in 2020. 

Growing Blue Canyon’s defense footprint is a key goal for Winslett, he told SpaceNews in a recent interview. The company is currently producing small satellites and cubesats mostly for NASA and commercial customers, and is pursuing orders from spacecraft manufacturers competing for Space Development Agency orders. 

“I think there’s enormous opportunities in defense, civil, on the intelligence side, as well as commercial space,” said Winslett. 

Chris Winslett. Credit: Blue Canyon Technologies

Based in Boulder, Colorado, Blue Canyon in 2018 won a high-profile military contract when it was selected by the Defense Advanced Research Projects Agency to supply buses for the Blackjack technology demonstration in low Earth orbit.

The contract with DARPA had options to supply up to 20 buses. But the program was beset by delays and supply chain problems and was ultimately downsized to only four satellites, which launched to orbit in June. 

The Blackjack satellites were built with Blue Canyon’s largest satellite bus, the ESPA Grande-sized Saturn class. Although DARPA’s program didn’t generate as many orders as Blue Canyon had anticipated, these buses are now in increasing demand by other defense agencies, said Winslett.

Raytheon is using Blue Canyon’s Saturn buses to build seven satellites for the Space Development Agency. The smallsat subsidiary, however, does not expect to supply exclusively to its parent company, and will be a merchant supplier of buses and other hardware to any manufacturer, Winslett said.

“SDA over the last three years has really disrupted the government market, buying satellites in a very short amount of time and at an affordable cost,” he said. “I think you’ll see a lot more government agencies move in that direction.”

Winslett said an “enormous amount of lessons were learned from the Blackjack program,” particularly about how to manage the supply chain.

An industry-wide shortage of parts and components, aggravated by the COVID-19 pandemic, resulted in Blackjack delays and also affected Space Development Agency programs.

As a result, Raytheon and Blue Canyon started a new effort, called Project Sunrise, to ensure supply shortages are averted in the future, said Winslett. The program puts greater focus on inventories and ensuring key parts are always available. “It’s going to help us really shorten our cycle times,” he added. “Our customers are looking at building capabilities quickly, probably much faster than has historically been done.”

More demand for cubesats

Although defense customers like SDA and other satellite operators are interested in larger smallsats, there is still significant demand for tiny cubesats used primarily for experiments.

In response, Blue Canyon last year opened an expanded cubesat factory in Boulder, increasing capacity from 50 to 85 a year. The company also operates a manufacturing facility in Lafayette, Colorado. 

To accommodate larger payloads on cubesats, Blue Canyon introduced a bigger 16U cubesat, named XB16.

“We are in discussion with various customers regarding the XB16,” said Winslett.

“We continue to see a large demand for cubesats, especially in the 12U to 16U size,” he said. Many customers are moving toward larger microsats, he added, “but I don’t think you’ll see cubesats go away. I actually think there’s still a large market for that.”

There are more manufacturers today supplying small satellites, Winslett said, “but the pie is continuing to grow. If you look at the market today versus three or five years ago, it’s substantially bigger,” he added. “It’s a competitive market, but I do think there’s opportunities for all of us.”

Upcoming deliveries

Winslett said Blue Canyon has recently delivered 11 smallsats for various government customers.

The company supplied satellites for several missions that are projected to launch in the near future. 

MethaneSAT, projected to launch in 2024, was built on a Saturn-class microsat. The Environmental Defense Fund will use it to measure methane emissions.

NASA’s Pandora mission, projected to launch in 2024 or 2025, uses a Saturn-class microsat. NASA and the Department of Energy’s Lawrence Livermore National Laboratory designed the mission to study the atmospheres of exoplanets.

Blue Canyon also supplied cubesats for NASA’s CLICK B/C mission, projected to launch in 2024, designed to demonstrate optical communication crosslink between two small spacecraft in LEO. It also built cubesats for the upcoming VISORS telescope demonstration mission planned by NASA and other partners, and the EZIE mission funded by NASA and the Johns Hopkins Applied Physics Lab to image the Earth’s magnetic footprint.


New Mexico footprints really are from the last Ice Age

Footprints preserved in mud in New Mexico were made by humans thousands of years before any people were thought to be in the Americas, a new analysis confirms.

The standard story of the peopling of the Americas has Asians migrating across a land bridge into Alaska some 14,000 years ago, after Ice Age glaciers melted back, and gradually spreading southward across a land never before occupied by humankind.

But the claim in 2021 that human footprints discovered in mud in what is now New Mexico were between 23,000 and 21,000 years old turned that theory on its head.

Now, the new analysis of these footprints, using two different techniques, confirms the date, providing seemingly incontrovertible proof that humans were already living in North America during the height of the last Ice Age.

The study appears in the journal Science.

“When the original paper was published in 2021, the authors were very cautious about claiming a paradigm shift, which is what this is all about,” says David Wahl, an adjunct associate professor of geography at the University of California, Berkeley, and a USGS scientist specializing in pollen analysis. “I mean, if people were here 7,000 years prior to the Clovis culture, why don’t we see more evidence?”

The Clovis culture, named after a 13,500-year-old site in New Mexico, is characterized by distinct stone and bone tools found in close association with Pleistocene or Ice Age animals, including mammoths, and is considered by many archeologists to be the first human culture in the Americas.

Even though there are increasing reports from other sites around North and South America where humans appear to have been killing animals or occupying settlements as early as 16,000 years ago, the Clovis-first theory is still dominant.

So, when US Geological Survey (USGS) researchers and an international team of scientists claimed an age of 23,000 to 21,000 years for seeds found in the footprints—thousands of these footprints are preserved in an alkali flat in White Sands National Park—the pushback from archeologists was extreme.

One point of contention was that the seeds came from a common aquatic plant, spiral ditchgrass (Ruppia cirrhosa). Dating aquatic plants can be problematic, Wahl says, because if the plants grow totally submerged, they take up dissolved carbon rather than carbon from the air. Dissolved carbon can be older because it comes from surrounding bedrock, potentially causing the age measured by radiocarbon dating to be older than the material analyzed actually is. Hence, the reanalysis.
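The size of that potential bias follows from the radiocarbon age equation itself: a conventional age is t = -8033 ln(F), where F is the sample's carbon-14 content as a fraction of the modern atmospheric level and 8,033 years is the Libby mean life. A minimal sketch, with hypothetical dead-carbon fractions rather than values from the study:

```python
import math

LIBBY_MEAN_LIFE_YR = 8033.0  # conventional radiocarbon ages use t = -8033 ln(F)

def radiocarbon_age_yr(fraction_modern: float) -> float:
    """Conventional radiocarbon age from the measured 14C fraction."""
    return -LIBBY_MEAN_LIFE_YR * math.log(fraction_modern)

def reservoir_offset_yr(dead_carbon_fraction: float) -> float:
    """Apparent extra age for a sample whose carbon includes a fraction of
    14C-free ('dead') carbon dissolved from bedrock, as can happen with
    fully submerged aquatic plants."""
    return radiocarbon_age_yr(1.0 - dead_carbon_fraction)

for dead in (0.01, 0.05, 0.10):  # hypothetical dead-carbon fractions
    print(f"{dead:.0%} dead carbon inflates the age by ~{reservoir_offset_yr(dead):.0f} years")
# 1% -> ~81 years, 5% -> ~412 years, 10% -> ~846 years
```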

“The immediate reaction in some circles of the archeological community was that the accuracy of our dating was insufficient to make the extraordinary claim that humans were present in North America during the Last Glacial Maximum. But our targeted methodology in this current research really paid off,” says Jeff Pigati, a USGS research geologist and co-lead author of the new study.

For the newly published follow-up study, the researchers focused on radiocarbon dating of conifer pollen—more than 75,000 pollen grains per sample from fir, spruce, and pine, which are terrestrial.

Wahl, Marie Champagne, and Jeff Honke of USGS worked with Susan Zimmerman at Lawrence Livermore National Laboratory to painstakingly isolate the pollen grains through several steps, including physical and chemical separation, followed by purification via flow cytometry at Indiana University’s Flow Cytometry Core Facility.

The carbon isotope composition of each sample was then determined using accelerator mass spectrometry at Lawrence Livermore’s Center for Accelerator Mass Spectrometry (CAMS). Importantly, the pollen samples were collected from the exact same layers as the original Ruppia seeds, so a direct comparison could be made. In each case, the pollen age was statistically identical to the corresponding seed age.

“Pollen samples also helped us understand the broader environmental context at the time the footprints were made,” Wahl says. “The pollen in the samples came from plants typically found in cold and wet glacial conditions, in stark contrast with pollen from the modern playa, which reflects the desert vegetation found there today.”

In addition, Harrison Gray on the USGS team used a different type of dating called optically stimulated luminescence, which dates the last time quartz grains were exposed to sunlight. Using this method, the team found that quartz samples collected within the footprint-bearing layers had a minimum age of about 21,500 years, providing further support to the radiocarbon results.
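The luminescence method rests on a simple ratio: buried quartz accumulates radiation dose from its surroundings at a roughly constant rate, and exposure to sunlight resets the stored signal, so the burial age is the measured equivalent dose divided by the environmental dose rate. A minimal sketch with purely illustrative numbers, not the study's measurements:

```python
def osl_age_ka(equivalent_dose_gy: float, dose_rate_gy_per_ka: float) -> float:
    """Optically stimulated luminescence age: the time since quartz grains
    last saw sunlight equals the accumulated (equivalent) radiation dose
    divided by the environmental dose rate."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# Illustrative only: a 43 Gy equivalent dose at 2.0 Gy per thousand years
# gives ~21.5 thousand years.
print(f"{osl_age_ka(43.0, 2.0):.1f} thousand years")
```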

“We were confident in our original ages, as well as the strong geologic, hydrologic, and stratigraphic evidence, but we knew that independent chronologic control was critical,” says Kathleen Springer, USGS research geologist and co-lead author of the paper.

With three separate lines of evidence pointing to the same approximate age, it is highly unlikely that all are incorrect or biased and, taken together, they provide strong support for the 23,000-to-21,000-year age range for the footprints.

“Critics asked us to provide more than one piece of evidence, and we’ve given them a total of three,” Wahl says. “So, we’re feeling really good.”

One implication of 23,000-year-old human footprints in North America is that America’s first settlers may have originally come through Alaska before the land was covered by glaciers 20,000 years ago rather than afterward, as presumed when human occupancy was thought to have begun 14,000 years ago. Another alternative is that the settlers traveled along the coast.

Additional researchers are from the National Park Service (NPS) and Bournemouth University in Poole, United Kingdom.

Source: UC Berkeley


Trees could make air pollution worse as the planet heats up

Trees including oak and poplar will emit more isoprene—a compound that worsens air pollution—as global temperatures rise, according to new research.

It’s a simple question that sounds a little like a modest proposal.

“Should we cut down all the oak trees?” asks Tom Sharkey, a professor in the Plant Resilience Institute at Michigan State University.

To be clear, Sharkey wasn’t sincerely suggesting that we should cut down all the oaks. Still, his question was an earnest one, prompted by his team’s latest research, which appears in the Proceedings of the National Academy of Sciences.

The team discovered that, on a warming planet, plants like oaks and poplars will emit more of a compound that exacerbates poor air quality, contributing to problematic particulate matter and low-atmosphere ozone.

The rub is that the same compound, called isoprene, can also improve the quality of clean air while making plants more resistant to stressors including insects and high temperatures.

“Do we want plants to make more isoprene so they’re more resilient, or do we want them making less so it’s not making air pollution worse? What’s the right balance?” asks Sharkey, who also works at the Plant Research Laboratory and in the biochemistry and molecular biology department.

“Those are really the fundamental questions driving this work. The more we understand, the more effectively we can answer them.”

Sharkey has been studying isoprene and how plants produce it since the 1970s, when he was a doctoral student at Michigan State.

Isoprene from plants is the second-most emitted hydrocarbon on Earth, behind only methane emissions from human activity. Yet most people have never heard of it, Sharkey says.

“It’s been behind the scenes for a long time, but it’s incredibly important,” Sharkey says.

It gained a little notoriety in the 1980s, when then-president Ronald Reagan falsely claimed trees were producing more air pollution than automobiles. Yet there was a kernel of truth in that assertion.

Isoprene interacts with nitrogen oxide compounds found in air pollution produced by coal-fired power plants and internal combustion engines in vehicles. These reactions create ozone, aerosols, and other byproducts that are unhealthy for both humans and plants.

“There’s this interesting phenomenon where you have air moving across a city landscape, picking up nitrogen oxides, then moving over a forest to give you this toxic brew,” Sharkey says. “The air quality downwind of a city is often worse than the air quality in the city itself.”

Now, with support from the National Science Foundation, Sharkey and his team are working to better understand the biomolecular processes plants use to make isoprene. The researchers are particularly interested in how those processes are affected by the environment, especially in the face of climate change.

Prior to the team’s new publication, researchers understood that certain plants produce isoprene as they carry out photosynthesis. They also knew the changes that the planet is facing were having competing effects on isoprene production.

That is, increasing carbon dioxide in the atmosphere drives the rate down, while increasing temperatures accelerate the rate. One of the questions behind the team’s new publication was essentially which one of these effects will win out.

“We were looking for a regulation point in the isoprene biosynthesis pathway under high carbon dioxide,” says Abira Sahu, the lead author of the new report and a postdoctoral research associate in Sharkey’s research group.

“Scientists have been trying to find this for a long time,” Sahu says. “And, finally, we have the answer.”

“For the biologists out there, the crux of the paper is that we identified the specific reaction slowed by carbon dioxide, CO2,” Sharkey says.

“With that, we can say the temperature effect trumps the CO2 effect,” he says. “By the time you’re at 95 degrees Fahrenheit—35 degrees Celsius—there’s basically no CO2 suppression. Isoprene is pouring out like crazy.”

In their experiments, which used poplar plants, the team also found that when a leaf experienced warming of 10 degrees Celsius, its isoprene emission increased more than tenfold, Sahu says.
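A tenfold increase per 10 degrees Celsius is an unusually steep exponential temperature response; in the standard Q10 formulation, E(T) = E0 * q10^((T - T0)/10), it corresponds to a Q10 of about 10. The sketch below illustrates that scaling; the 25-degree base temperature and the simple Q10 form are assumptions for illustration, not the authors' model:

```python
def isoprene_emission(temp_c: float, base_rate: float = 1.0,
                      base_temp_c: float = 25.0, q10: float = 10.0) -> float:
    """Q10-style exponential temperature response: emission scales by a
    factor of q10 for every 10 degrees C of warming above base_temp_c."""
    return base_rate * q10 ** ((temp_c - base_temp_c) / 10.0)

for t in (25, 30, 35):
    print(f"{t} C -> relative emission {isoprene_emission(t):.1f}x")
# 25 C -> 1.0x, 30 C -> 3.2x, 35 C -> 10.0x
```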

“Working with Tom, you realize plants really do emit a lot of isoprene,” says Mohammad Mostofa, an assistant professor who works in Sharkey’s lab and was another author of the new report.

The discovery will help researchers better anticipate how much isoprene plants will emit in the future and better prepare for the impacts of that. But the researchers also hope it can help inform the choices people and communities make in the meantime.

“We could be doing a better job,” Mostofa says.

That could mean planting fewer oaks in the future to limit isoprene emissions. As for what we do about the trees already emitting isoprene, Sharkey does have an idea that doesn’t involve cutting them down.

“My suggestion is that we should do a better job controlling nitrogen oxide pollution,” Sharkey says.

Sarathi Weraduwage, a former postdoctoral researcher in Sharkey’s lab who is now an assistant professor at Bishop’s University in Quebec, also contributed to the research.

Source: Michigan State


Machine learning predicts how drugs will affect cancer cells

A new approach can predict how individual cells react to specific treatments, offering hope for more accurate diagnoses and therapeutics.

Cancer is triggered by changes in cells that lead to the proliferation of pathogenic tumor cells. In order to find the most effective combination and dosage of drugs, it is advantageous if physicians can see inside the body, so to speak, and determine what effect the drugs will have on individual cells.

The new machine learning approach allows such cell changes and drug effects to be modeled and predicted with much greater accuracy and nuance than before.

In the battle against cancer, a fine-grained understanding of the behavior of individual cells towards a drug is key. After all, a medication ideally ought to destroy only tumor cells. However, if the effect of a drug is known only as a statistical average of a larger cell population, an analysis of the drug’s effect might not detect that certain tumor cells survive the drug due to their nature or obtained resistances, and the cancer will continue to spread.

The new approach recognizes the distinct reactions individual cells can have to a drug within a larger population. This understanding of cell variation is pivotal for advancing more effective cancer treatments.

“The diversity within a group of cells greatly influences their sensitivity or resistance to changes. Instead of basing our understanding on the average response of a cell group, our method can precisely describe—and even predict—how each cell reacts to disturbances like those from a drug,” says Gunnar Rätsch, professor of biomedical informatics at ETH Zurich and the University Hospital Zurich.

CellOT method

Researchers refer to the molecular reactions with which cells respond to chemical, physical, or genetic influences as perturbations. Such disturbances alter the affected cells and can, for example, trigger their death. The effect a given drug has on a cancer cell can also be seen as a perturbation.

Understanding which cancer cells respond to a drug and identifying the traits of those that form resistance to a drug is crucial for developing new treatment approaches and strategies. Such new treatments could be more effective at inhibiting cell growth or even causing pathogenic cells to die.

In their study, published in the journal Nature Methods, the researchers demonstrate that their method works not only on cancer cells but also on other pathogenic cells—for example, in the case of lupus erythematosus. This autoimmune disease is typically accompanied by a red rash and can lead to inflammation of the chest, heart, or ribs.

Another key innovation to emerge from the study is the ability to make predictions: the Zurich researchers are calling their new machine learning method CellOT. Besides evaluating existing cell measurement data and thus expanding the knowledge of cellular perturbation reactions, CellOT can also predict how individual cells will respond to a perturbation whose reactions have not yet been measured in the laboratory.

The new method paves the way towards more targeted and personalized treatments: the predictions allow for the forecasting of a perturbation’s effect on unseen cells, and thus indicate how well a patient’s cells respond to the drug in question.

Comprehensive clinical trials are still required before the approach can be used in a hospital setting. At present, the researchers have demonstrated the method’s ability to provide highly accurate predictions.

Gradual cancer cell changes

Machine learning is what made such predictions possible. For CellOT, the researchers use novel machine learning algorithms and train these with both data from unperturbed cells and data from cells that changed in response to a perturbation. In the process, the algorithm learns how cellular perturbation reactions arise, how they progress, and the likely phenotypes of altered cell states.

The researchers collaborated with a research group led by Lucas Pelkmans, professor of cellular systems biology at the University of Zurich. Gabriele Gut, formerly a postdoc in Pelkmans’ lab and now senior scientist in the Medical Oncology and Haematology Clinic at the University Hospital Zurich, measured the specific cell changes using a technique called 4i multiplex protein imaging.

“CellOT works particularly well on data acquired with this technique,” Pelkmans says. In addition, the researchers obtained single-cell RNA data from public databases.

“Mathematically speaking, our machine learning model is based on the assumption that cells change gradually after a perturbation,” says Charlotte Bunne, who, along with Stefan Stark and Gabriele Gut, is the lead author of the study and is working on her doctorate under Andreas Krause, professor of computer science. “These gradual changes in cell states can be described and predicted well using the mathematical theory of optimal transport.”

CellOT is now the first approach to use optimal transport and machine learning to predict the perturbation responses of cells from new samples. “Established OT methods do not allow for out-of-sample or out-of-measurement predictions. But that’s exactly what CellOT can do,” Bunne says.
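To make the optimal-transport idea concrete, the sketch below pairs a population of unperturbed cell states with a perturbed population via entropic optimal transport (Sinkhorn iterations), then maps each control cell to a predicted post-perturbation state by barycentric projection. This is a classical, from-scratch illustration of the underlying idea only; CellOT itself learns a neural parameterization of the transport map, which is what enables the out-of-sample predictions Bunne describes.

```python
import numpy as np

def sinkhorn_coupling(X, Y, reg=1.0, n_iter=200):
    """Entropic optimal transport between control cell states X (n x d) and
    perturbed states Y (m x d); returns the n x m coupling matrix P."""
    n, m = len(X), len(Y)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)      # uniform cell weights
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared-distance cost
    K = np.exp(-C / reg)                                 # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):                              # Sinkhorn iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

def predict_perturbed(X, Y, reg=1.0):
    """Map each control cell to its expected post-perturbation state
    (barycentric projection of the coupling)."""
    P = sinkhorn_coupling(X, Y, reg)
    return (P @ Y) / P.sum(axis=1, keepdims=True)

# Toy demo: a "perturbation" that shifts cell states by +2 along one axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                 # unperturbed cells
Y = rng.normal(size=(60, 2)) + [2.0, 0.0]    # perturbed cells
print(predict_perturbed(X, Y).mean(axis=0))  # mean prediction lands near [2, 0]
```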

Source: ETH Zurich
