Why go back to the moon?

In a new book, Joseph Silk explores what the moon can offer humans over the next half century.

As our nearest celestial neighbor, the moon has forever captured the awe of human beings. Some ancient cultures worshipped it as a deity or believed its eclipses to be omens. It was Galileo peering through an early telescope in 1609 who discovered the moon’s rocky surface, and NASA’s Apollo 11 mission in 1969 that sent the first humans to walk upon it.

A half-century has now passed since humans last made direct contact with the moon, with Apollo 17 in 1972. But a new era of exploration has begun with zeal, as a number of space agencies and commercial ventures worldwide launch ambitious lunar projects.

Look forward another half-century or so, says Silk, a Johns Hopkins University astrophysicist, and the moon could be teeming with activity: hotels and villages, lunar mining, ports into deeper space, and giant telescopes that could make the James Webb technology look amateur.

“We will build on the moon. We will colonize the moon. We will exploit the moon. We will do science on the moon,” Silk writes in his new book, Back to the Moon: The Next Giant Leap for Humankind (Princeton University Press, 2022). “Lunar science will open up new vistas on the most profound questions we have ever posed.”

As Back to the Moon hits shelves, there is tangible progress on this front. The Japanese company ispace intends to become the first private venture to make a cargo delivery to the moon, aboard a SpaceX rocket. At the same time, NASA is commencing the first test phase of its $93 billion Artemis program, which will send four astronauts to the moon in 2025 and establish a permanent base there, with the grand ambition to use the moon as a launchpad for the first-ever crewed mission to Mars.

A professor of physics and astronomy, Silk has penned previous books on the big bang, infinity, and other weighty cosmological topics. In Back to the Moon, he posits that the moon in fact offers our only pathway to surpassing the current limits of astronomy. “We’re running out of resources on Earth for it,” he says, “but the moon provides a site for achieving much more.”

The low gravity on the moon, for instance, could allow for easier manufacturing of megatelescopes 10 times larger than what’s possible on Earth, and the lack of a lunar atmosphere would allow those telescopes to peer farther afield with exquisite precision, Silk says. These features will be crucial for studying far larger samples of Earth-like planets beyond our own solar system—and in turn for tackling one of humanity’s most profound mysteries: Are we alone in this universe?

In searching for exoplanets that could feasibly host life, astronomers know what to look for, as Silk writes: “the reflected glints of oceans, the green glows of forests, the presence of oxygen in the atmospheres, and even more advanced but subtle signs of intelligent life such as… industrial pollution of planetary atmospheres.” The megatelescopes, Silk says, could also help us understand the very origins of the cosmos, the dark ages before the first stars appeared.

A quarter of a million miles and three days from Earth, the moon can also serve as an improved launch site for deeper travels into space—in part because of the prohibitive payload required for rocket fuel to achieve interplanetary transport from Earth. On the moon, we’ll be able to produce that fuel directly, by splitting the abundant water ice found in the depths of permanently shadowed polar craters into hydrogen and oxygen and liquefying them.

To pursue these endeavors, human settlement on the moon is necessary, Silk says. NASA already intends to build its Artemis base camp at the lunar south pole, where China, too, has plans for an international research station.

Silk also envisions denser habitats, villages or even cities, constructed within the vast lava tubes beneath the moon’s surface, protected from meteorites and other harms. But within the next 15 or 20 years, he says, moon resorts may be the first civilian projects we’ll see—”a very sophisticated tourism that opens up the moon to many more people than astronauts and engineers.” He can imagine lunar golfing and rover rides over lunar terrain. “At first, this will be accessible only to the very wealthy,” Silk says, likening it to the early days of airplane travel. “But just wait a decade or two.”

Silk acknowledges that humans are likely to carry their earthly failings onto the moon, and that intense international competition could erupt over commercial, military, and mining interests. The Outer Space Treaty, adopted by the United Nations in 1967, does prohibit any nation from claiming sovereignty over any part of outer space, but Silk says we need something more detailed and enforceable. “We have to get our act together in the next decade to sort out how different countries can collaborate when they do… anything that involves territorial claims,” he says.

The most pressing argument Silk raises for our investment in the moon is chillingly existential: Ultimately, it may offer humankind its best chance of long-term survival. Silk points to extinction-level threats—global warming, pandemics, and wars, among them—that could force us to seek shelter elsewhere. The moon’s barren landscape and extreme temperatures make it far from ideal for large or permanent populations, but it can serve as a steppingstone toward distant planets that humans could potentially colonize. It’s the stuff of sci-fi.

“Whether through cryogenic preservation of humans or genetic rebirth, the centurylong travel times to the nearest stars will not deter future generations of astronauts,” he writes, adding that the limitless potential of robotics and artificial intelligence will also open more doors than we can possibly imagine.

“There’s so much to learn,” Silk says. “Humanity has always been interested in discovering distant realms, in solving difficult questions that haven’t been answered. The moon offers us that vista.”

Source: Katie Pearce for Johns Hopkins University


Unexpected reactions happen when light and nanoplastics meet

Researchers have analyzed how light breaks down polystyrene, a nonbiodegradable plastic used in packing peanuts, DVD cases, and disposable utensils.

The researchers find that nanoplastic particles can play active roles in environmental systems.

Plastics are ubiquitous in our society, found in packaging and bottles and making up more than 18% of solid waste in landfills. Many of these plastics also make their way into the oceans, where they can take hundreds of years to break down into pieces that harm wildlife and the aquatic ecosystem.

When exposed to light, nanoplastics derived from polystyrene unexpectedly facilitated the oxidation of aqueous manganese ions and the formation of manganese oxide solids, which can affect the fate and transport of organic contaminants in natural and engineered water systems.

The research shows how light absorption drives a photochemical reaction that generates peroxyl and superoxide radicals on nanoplastic surfaces, initiating the oxidation of manganese into manganese oxide solids.

“As more plastic debris accumulates in the natural environment, there are increasing concerns about its adverse effects,” says research team leader Young-Shin Jun, professor of energy, environmental, and chemical engineering in the McKelvey School of Engineering at Washington University in St. Louis, who leads the Environmental Nanochemistry Laboratory.

“However, in most cases, we have been concerned about the physical presence of nanoplastics rather than their active roles as reactants. We found that such small plastic particles can more easily interact with neighboring substances, such as heavy metals and organic contaminants, and can be more reactive than we previously thought.”

Jun and her former student, Zhenwei Gao, a postdoctoral scholar at the University of Chicago, experimentally demonstrated that the different surface functional groups on polystyrene nanoplastics affected manganese oxidation rates by influencing the generation of highly reactive peroxyl and superoxide radicals.

The production of these reactive oxygen species from nanoplastics can endanger marine life and human health and may affect the mobility of the nanoplastics in the environment via redox reactions, which in turn might hamper their environmental remediation.

The team also looked at the size effects of polystyrene nanoplastics on manganese oxidation, using 30 nanometer, 100 nanometer, and 500 nanometer particles. The two larger-sized nanoparticles took longer to oxidize manganese than the smaller particles. Eventually, the nanoplastics become surrounded by newly formed manganese oxide fibers, which can make them aggregate more easily and can change their reactivity and transport.

“The smaller particle size of the polystyrene nanoplastics may more easily decompose and release organic matter because of their larger surface area,” Jun says. “This dissolved organic matter may quickly produce reactive oxygen species in light and facilitate manganese oxidation.”

“This experimental work also provides useful insights into the heterogeneous nucleation and growth of manganese oxide solids on such organic substrates, which benefits our understanding of manganese oxide occurrences in the environment and engineered materials syntheses,” Jun says. “These manganese solids are excellent scavengers of redox-active species and heavy metals, further affecting geochemical element redox cycling, carbon mineralization, and biological metabolisms in nature.”

Jun’s team plans to study the breakdown of diverse common plastic sources that can release nanoplastics and reactive oxidizing species and to investigate their active roles in the oxidation of transition and heavy metal ions in the future.

The research appears in ACS Nano. Partial funding for this research came from the National Science Foundation and the McDonnell International Scholars Academy at Washington University in St. Louis.

Source: Washington University in St. Louis


Elite divers could shed light on lung disease

Researchers are studying elite free divers to understand the limits of human physiology.

The insights could lead to better treatments for lung disease.

The world’s top free divers can hold their breath for minutes at a time, embarking on extended underwater adventures without the aid of scuba equipment.

People with chronic obstructive pulmonary disease often struggle to get enough oxygen. In response, the arterioles—tiny arterial branches that bring blood to the lungs—constrict. That leads to high pulmonary blood pressure and strains the heart.

“It’s mostly thought of as a beneficial adaptation; if you inhale something blocking an airway, blood vessels going to that area will constrict and send blood elsewhere where it can pick up oxygen,” says Andy Lovering of the University of Oregon. “But the problem is that if you deplete the oxygen from the entire lung, the pressure inside increases, causing pulmonary hypertension.”

Free divers, on the other hand, intentionally put themselves into an oxygen-deprived state. During long dives, their blood oxygen levels sink to extreme lows. That could cause organ damage in some people. But trained divers can quickly bounce back, ready for another dive.

In studies of Croatian divers, Lovering’s team has identified a few distinctive adaptations, described in two recent papers. Together, those adaptations might help divers keep their heart and lungs working effectively under extremely low oxygen conditions.

In a study in Experimental Physiology, the researchers placed both trained divers and healthy nondivers into a low-oxygen environment for 20 to 30 minutes.

“The normal response to low oxygen is for arterioles in lungs to constrict,” raising pulmonary blood pressure, says Tyler Kelly, a graduate student in Lovering’s lab who led the work. “But we found that these athlete divers had a minimal response, if any.”

The arterioles in their lungs didn’t constrict as much in response to low oxygen, reducing the strain on the heart that diminished oxygen usually causes.

“It’s a really unique adaptation,” Lovering says.

In a study in the Journal of Science and Medicine in Sport, the researchers found that the divers were also more likely than nondivers to have a patent foramen ovale, a hole that creates a passageway between the left and right sides of the upper chambers of the heart. This hole is present in all babies in utero, allowing blood to circumvent the developing lungs. It usually closes soon after birth once the lungs kick into action. But in a small number of people, it remains partially open.

In divers, this hole could act like a relief valve, helping to reduce pressure on the right side of the heart under low-oxygen conditions, Lovering suggests.

Lovering’s team isn’t sure yet whether these are adaptations that arise due to extensive training or whether people who have the differences from birth are simply more likely to succeed as divers.

Divers often build their stamina via dryland training, essentially practicing holding their breath for increasingly long periods while out of the water. In follow-up work, Lovering wants to test whether putting ordinary people through a breath-holding training program can induce the same physiological changes seen in the divers.

If so, structured breath-holding exercises could be a treatment for people with chronic lung disease, dampening their body’s response to low oxygen and minimizing the strain on the heart and lungs.

Source: Laurel Hamers for University of Oregon


Podcast: Is another US Civil War on the way?

Just days after the second anniversary of the January 6 attack on the United States Capitol, a new podcast episode reflects on some daunting questions.

Is democracy on the brink of collapse? Why are US politics so polarized? And are we headed for another civil war? William Howell, a University of Chicago professor and director of the Center for Effective Government, has been thinking about these questions, along with political scientists across the country.

In this episode of the Big Brains podcast, Howell explains why claims of another civil war are exaggerated and instead offers some correctives.

Subscribe to Big Brains on Apple Podcasts, Stitcher, and Spotify.

Source: University of Chicago


SARS-CoV-2 viral toxin may make COVID worse

A new study reveals how a viral toxin the SARS-CoV-2 virus produces may contribute to severe COVID-19 infections.

The study shows how a portion of the SARS-CoV-2 “spike” protein can damage cell barriers that line the inside of blood vessels within organs of the body, such as the lungs, contributing to what is known as vascular leak.

Blocking the activity of this protein may help prevent some of COVID-19’s deadliest symptoms, including pulmonary edema, which contributes to acute respiratory distress syndrome (ARDS).

“In theory, by specifically targeting this pathway, we could block pathogenesis that leads to vascular disorder and acute respiratory distress syndrome without needing to target the virus itself,” says lead author Scott Biering, a postdoctoral scholar at the University of California, Berkeley.

“In light of all the different variants that are emerging and the difficulty in preventing infection from each one individually, it might be beneficial to focus on these triggers of pathogenesis in addition to blocking infection altogether.”

The spike protein and vascular leak

While many vaccine skeptics have stoked fears about potential dangers of the SARS-CoV-2 spike protein—which is the target of COVID-19 mRNA vaccines—the researchers say that their work provides no evidence that the spike protein can cause symptoms in the absence of viral infection. Instead, their study suggests that the spike protein may work in tandem with the virus and the body’s own immune response to trigger life-threatening symptoms.

In addition, the amount of spike protein circulating in the body after vaccination is far lower than the amounts that have been observed in patients with severe COVID-19 and that were used in the study.

“The amount of spike protein that you would have in a vaccine would never be able to cause leak,” says senior author Eva Harris, a professor of infectious diseases and vaccinology. “In addition, there’s no evidence that [the spike protein] is pathogenic by itself. The idea is that it’s able to aid and abet an ongoing infection.”

By examining the impact of the SARS-CoV-2 spike protein on human lung and vascular cells, and on the lungs of mice, the research team was able to uncover the molecular pathways that allow the spike protein to disrupt critical internal barriers in the body. In addition to opening new avenues for the treatment of severe COVID-19, understanding how the spike protein contributes to vascular leak could shed light on the pathology behind other emerging infectious diseases.

“We think that a lot of viruses that cause severe disease may encode a viral toxin,” Biering says. “These proteins, independent of viral infection, interact with barrier cells, and cause these barriers to malfunction. This allows the virus to disseminate, and that amplification of virus and vascular leak is what triggers severe disease. I’m hoping that we can use the principles that we’ve learned from the SARS-CoV-2 virus to find ways to block this pathogenesis so that we are more prepared when the next pandemic happens.”

Vascular leak occurs when the cells that line blood vessels and capillaries are disrupted, allowing plasma and other fluids to leak out of the bloodstream. In addition to causing the lung and heart damage observed in severe COVID-19, vascular leak can also lead to hypovolemic shock, the primary cause of death from dengue.

Dengue and SARS-CoV-2

Before the COVID-19 pandemic, Biering and other members of the Harris Research Program were studying the role of dengue virus protein NS1 in triggering vascular leak and contributing to hypovolemic shock. When the pandemic hit, the team wondered if a similar viral toxin in SARS-CoV-2 could also be contributing to the acute respiratory distress syndrome that was killing COVID-19 patients.

“People are aware of the role of bacterial toxins, but the concept of a viral toxin is still a really new idea,” Harris says. “We had identified this protein secreted from dengue virus-infected cells that, even in the absence of the virus, is able to cause endothelial permeability and disrupt internal barriers. So, we wondered if a SARS-CoV-2 protein, like spike, might be able to do similar things.”

Spike proteins coat the outer surface of SARS-CoV-2, giving the virus its knobby appearance. They play a critical role in helping the virus infect its hosts: The spike protein binds to a receptor called ACE2 on human and other mammalian cells, which—like a key turning a lock—allows the virus to enter the cell and hijack cellular function. The SARS-CoV-2 virus sheds a large portion of the spike protein containing the receptor-binding domain (RBD) when it infects a cell.

“What’s really interesting is that circulating spike protein correlates with severe COVID-19 cases in the clinic,” Biering says. “We wanted to ask if this protein was also contributing to any vascular leak we saw in the context of SARS-CoV-2.”

Currently, scientists attribute the heart and lung damage associated with severe COVID-19 to an overactive immune response called a cytokine storm. To test the theory that the spike protein might also play a role, Biering and other team members used thin layers of human endothelial and epithelial cells to mimic the linings of blood vessels in the body. They found that exposing these cellular layers to the spike protein increased their permeability, a hallmark of vascular leak.

Using CRISPR-Cas9 gene editing technology, the team showed that this increased permeability occurred even in cells that did not express the ACE2 receptor, indicating that it could occur independently of viral infection. In addition, they found that mice that were exposed to the spike protein also exhibited vascular leak, even though mice do not express the human ACE2 receptor and cannot be infected with SARS-CoV-2.

Finally, with the help of RNA sequencing, the researchers found that the spike protein triggers vascular leak through a molecular signaling pathway that involves glycans, integrins, and transforming growth factor beta (TGF-beta). By blocking the activity of integrins, the team was able to reverse the vascular leak in mice.

“We identified a new pathogenic mechanism of SARS-CoV-2 in which the spike protein can break down the barriers lining our vasculature. The resulting increase in permeability can lead to vascular leak, as is commonly observed in severe COVID-19 cases, and we could recapitulate those disease manifestations in our mouse models,” says coauthor Felix Pahmeier, a graduate student in the Harris lab. “It was interesting to see the similarities and differences between spike and dengue virus protein NS1. Both are able to disrupt endothelial barriers, but the timelines and host pathways involved seem to differ between the two.”

Looking ahead

While blocking the activity of integrins may be a promising target for treating severe COVID-19, Harris says more work needs to be done to understand the exact role of this pathway in disease progression. While increased vascular permeability can accelerate infection and lead to internal bleeding, it can also help the body fight off the virus by giving immune machinery better access to infected cells.

“SARS-CoV-2 evolved to have a spike surface protein with increased capacity of interacting with host cell membrane factors, such as integrins, by acquiring an RGD motif. This motif is a common integrin-binding factor exploited by many pathogens, including bacteria and other viruses, to infect host cells,” says Francielle Tramontini Gomes de Sousa, former assistant project scientist in Harris’s lab and co-first author of the study.

“Our study shows how spike RGD interacts with integrins, resulting in TGF-beta release and activation of TGF-beta signaling. Using in vitro and in vivo models of epithelial, endothelial, and vascular permeability, we were able to improve understanding of the cellular mechanisms of increased levels of TGF-beta in COVID-19 patients and how spike-host cell interactions could contribute to disease.”

The team is continuing to study the molecular mechanisms that lead to vascular leak and is also investigating possible viral toxins in other viruses that cause severe disease in humans.

“COVID-19 is not gone. We have better vaccines now, but we don’t know how the virus is going to mutate in the future,” Biering says.

“Studying this process may be able to help us develop a new arsenal of drugs so that if someone is experiencing vascular leak, we can just target that. Maybe it doesn’t stop the virus from replicating, but it could stop that person from dying.”

The research appears in Nature Communications. Additional coauthors are from UC Berkeley; the Chan Zuckerberg Biohub; the University of California, San Francisco; the University of California, San Diego; Cornell University; and the University of North Carolina at Chapel Hill.

Support for the work came from the National Institute of Allergy and Infectious Diseases (NIAID); a Fast Grant from Emergent Ventures; the National Science Foundation; the National Heart, Lung, and Blood Institute; the National Institutes of Health; the Innovative Genomics Institute; and the Life Sciences Research Foundation.

Source: UC Berkeley


City lizards sport genomic markers in common

City lizards have parallel genomic markers when compared to neighboring forest lizards, a study finds.

The genetic variations linked to urbanization underlie physical differences in the urban lizards, including longer limbs and larger toepads, showing how these lizards have evolved to adapt to city environments.

Urbanization has dramatically transformed landscapes around the world—changing how animals interact with nature, creating “heat islands” with higher temperatures, and hurting local biodiversity. Yet many organisms survive and even thrive in these urban environments, taking advantage of new types of habitat created by humans. Researchers studying evolutionary changes in urban species have found that some populations, for example, undergo metabolic changes from new diets or develop an increased tolerance of heat.

“Urbanization impacts roughly two-thirds of the Earth and is expected to continue to intensify, so it’s important to understand how organisms might be adapting to changing environments,” says Kristin Winchell, assistant professor of biology at New York University and the study’s first author.

“In many ways, cities provide us with natural laboratories for studying adaptive change, as we can compare urban populations with their non-urban counterparts to see how they respond to similar stressors and pressures over short periods of time.”

Urban lizards had significantly longer limbs and larger toe pads with more specialized scales on their toes. (Credit: NYU)

Anolis cristatellus lizards—a small-bodied species also known as the Puerto Rican crested anole—are common in both urban and forested areas of Puerto Rico. Prior studies by Winchell and her colleagues found that urban Anolis cristatellus have evolved certain traits to live in cities: they have larger toepads with more specialized scales that allow them to cling to smooth surfaces like walls and glass, and have longer limbs that help them sprint across open areas.

In the new study, the researchers looked at 96 Anolis cristatellus lizards from three regions of Puerto Rico—San Juan, Arecibo, and Mayagüez—comparing lizards living in urban centers with those living in forests surrounding each city. Their findings appear in PNAS.

They first confirmed that the lizard populations in the three regions were genetically distinct from one another, so any similarities they found among lizards across the three cities could be attributed to urbanization. They then measured the lizards’ toepads and legs and found that urban lizards had significantly longer limbs and larger toepads with more specialized scales on their toes, supporting their earlier research that these traits have evolved to enable urban lizards to thrive in cities.

To understand the genetic basis of these trait differences, the researchers conducted several genomic analyses on exomic DNA, the regions of the genome that code for proteins. They identified a set of 33 genes found in three regions of the lizard genome that were repeatedly associated with urbanization across populations, including genes related to immune function and metabolism.

“While we need further analysis of these genes to really know what this finding means, we do have evidence that urban lizards get injured more and have more parasites, so changes to immune function and wound healing would make sense. Similarly, urban anoles eat human food, so it is possible that they could be experiencing changes to their metabolism,” says Winchell.

In an additional analysis, they found 93 genes in the urban lizards that are important for limb and skin development, offering a genomic explanation for the observed increases in limb length and toepad size.

“The physical differences we see in the urban lizards appear to be mirrored at the genomic level,” says Winchell. “If urban populations are evolving with parallel physical and genomic changes, we may even be able to predict how populations will respond to urbanization just by looking at genetic markers.”

“Understanding how animals adapt to urban environments can help us focus our conservation efforts on the species that need it the most, and even build urban environments in ways that maintain all species,” adds Winchell.

Do the differences in urban lizards apply to people living in cities? Not necessarily, according to Winchell, as humans aren’t at the whim of predators like lizards are. But humans are subject to some of the same urban factors, including pollution and higher temperatures, that seem to be contributing to adaptation in other species.

Additional study authors are from Princeton University; Washington University in St. Louis; the University of Massachusetts Boston and Universidad Católica de la Santísima Concepción in Chile; Virginia Commonwealth University; and Rutgers University-Camden. The research had funding in part from the National Science Foundation, and from the University of Massachusetts Boston Bollinger Memorial Research Grant.

Source: NYU


Team pinpoints genetic cause of late-onset ataxia

Researchers have identified a previously unknown genetic cause of a late-onset cerebellar ataxia.

The discovery will improve diagnosis and open new treatment avenues for this progressive condition.

Late-onset cerebellar ataxias (LOCA) are a heterogeneous group of neurodegenerative diseases that manifest in adulthood with unsteadiness. One to three in 100,000 people worldwide will develop a late-onset ataxia.

Until recently, most patients with late-onset ataxia had remained without a genetic diagnosis.

Researchers led by Bernard Brais, a neurologist and researcher at The Neuro (Montreal Neurological Institute-Hospital) of McGill University, and Stephan Züchner of the University of Miami’s Miller School of Medicine in collaboration with neurologists from the Universities of Montreal and Sherbrooke, studied a group of 66 Quebec patients from different families who had late-onset ataxia for which an underlying genetic cause had not yet been found.

Using the most advanced genetic technologies, the team found that 40 (61%) of the patients carried the same novel disease-causing variant in the gene FGF14, making it the most common genetic cause of late-onset ataxia in Quebec. They found that a small stretch of repetitive DNA underwent a large size increase in patients, a phenomenon known as repeat expansion.

To confirm their initial findings, the team reached out to international collaborators in Tübingen in Germany, Perth in Australia, London in the United Kingdom, and Bengaluru in India. They found that 10-18% of LOCA patients in these independent cohorts also carried the same FGF14 error. These results confirmed that a repeat expansion in FGF14 is one of the most common genetic causes of late-onset ataxia described to date.

Going beyond gene discovery, the team also studied brains from deceased patients and neurons derived from patients and found that the error causes decreased expression of the gene and its protein.

Patients with a FGF14-related late-onset ataxia experience unsteadiness (ataxia) usually beginning in their fifties. The disease may start with short episodes of ataxia that can be precipitated by exercise and alcohol intake. The coordination problems become permanent on average around age 59. The disease usually progresses slowly and affects walking, speech, and hand coordination, often requiring walking aids over time. The condition is most often transmitted by an affected parent, although it can appear in families with no previous history of ataxia.

The GAA repeat sequence of the FGF14 mutation has attracted much interest, since it is identical to the repeat that causes Friedreich ataxia in the FXN gene, the most common cause of autosomal recessive ataxia worldwide. Because of this similarity, it is possible that some of the new treatments under development for Friedreich ataxia may be used to treat patients with an expansion in FGF14.

This study also suggests that patients may benefit from a drug called aminopyridine, which is already marketed for other neurological conditions. This is especially promising since some patients with an expansion in FGF14 have responded well to this treatment.

“This opens the door to a clinical trial of this drug in these patients,” says Brais. “It’s great news for patients in Canada and worldwide. It also makes genetic testing possible so people and families can arrive at the end of their long diagnostic journey.”

The research appears in the New England Journal of Medicine. Funding for the study came from many national agencies, including the NIH, the Fondation Groupe Monaco, and the Montreal General Hospital Foundation.

Source: McGill University


How many people does a scoop of wastewater represent?

Researchers have developed a machine learning model that uses the assortment of microbes found in a wastewater sample to tease out how many individual people it represents.

Research from the lab of Fangqiong Ling showed earlier this year that the amount of SARS-CoV-2 in a wastewater system was correlated with the burden of disease—COVID-19—in the region it served.

But before that work could be done, Ling needed to know: How can you figure out the number of individuals represented in a random sample of wastewater?

A chance encounter with a colleague helped Ling, an assistant professor in the department of energy, environmental, and chemical engineering at the McKelvey School of Engineering at Washington University in St. Louis, create a method to do just that.

Going forward, this method may be able to link other properties in wastewater to individual-level data.

The problem was straightforward: “If you just take one scoop of wastewater, you don’t know how many people you’re measuring,” Ling says. This is counter to the way studies are typically designed.

“Usually when you design your experiment, you design your sample size, you know how many people you’re measuring,” Ling says. Before she could look for a correlation between SARS-CoV-2 and the number of people with COVID, she had to figure out how many people were represented in the water she was testing.

Initially, Ling thought that machine learning might be able to uncover a straightforward relationship between the diversity of microbes and the number of people it represented, but simulations done with an "off-the-shelf" machine learning model didn't pan out.

Then Ling had a chance encounter with Likai Chen, an assistant professor of mathematics and statistics. The two realized they shared an interest in working with novel, complex data. Ling mentioned that she was working on a project that Chen might be able to contribute to.

"She shared the problem with me and I said, that's indeed something we can do," Chen says. It happened that Chen was working on a problem that used a technique that Ling also found helpful.

The key to teasing out how many individual people are represented in a sample is the fact that the bigger the sample, the more likely it is to resemble the mean, or average. But in reality, individuals tend not to be exactly "average." Therefore, if a sample looks like an average sample of microbiota, it's likely to be made up of many people. The farther it is from the average, the more likely it is to represent an individual.

"But now we are dealing with high-dimensional data, right?" Chen says. There is a near-endless number of ways you can group these different microbes to form a sample. "So that means we have to find out, how do we aggregate that information across different locations?"

Using this basic intuition, and a lot of math, Chen worked with Ling to develop a more tailored machine learning algorithm that, after being trained on real samples of microbiota from more than 1,100 people, could determine how many people were represented in a wastewater sample (the test samples were unrelated to the training data).
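The intuition described above, that larger pools look more like the population average, can be sketched in a toy simulation. Everything here is invented for illustration (random Dirichlet "microbiome" profiles, 200 taxa, equal mixing), not the study's actual model or data; it just shows that a pooled sample's distance from the population mean shrinks as more people contribute, which is the signal a regression model can learn from.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "microbiome": each person's gut community is a random composition
# over 200 taxa; a wastewater sample pools n people in equal parts.
n_taxa = 200
population_mean = rng.dirichlet(np.ones(n_taxa))  # hypothetical average profile

def pooled_sample(n_people: int) -> np.ndarray:
    """Average the profiles of n randomly drawn individuals."""
    people = rng.dirichlet(population_mean * 50, size=n_people)
    return people.mean(axis=0)

# The pooled profile drifts toward the population mean as n grows
# (roughly like 1/sqrt(n)), so distance-to-mean carries information
# about how many people the sample represents.
for n in (1, 10, 100):
    d = np.linalg.norm(pooled_sample(n) - population_mean)
    print(n, round(float(d), 4))
```

A model trained on pools of known size can then invert this relationship, estimating n from how "average" an unseen sample looks across many taxa at once.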

"It's much faster and it can be trained on a laptop," Ling says. And it's not only useful for the microbiome: given sufficient training data, the algorithm could also use viruses from the human virome or metabolic chemicals to link individuals to wastewater samples.

“This method was used to test our ability to measure population size,” Ling says. But it goes much further. “Now we are developing a framework to allow validation across studies.”

The research appears in the journal PLOS Computational Biology.

Source: Washington University in St. Louis


1 opioid-use disorder med may be safest during pregnancy

People with opioid-use disorder who are pregnant may have more favorable neonatal health outcomes when using buprenorphine compared with methadone, a new study shows.

Buprenorphine is an active ingredient in suboxone and other medications approved for treatment of opioid-use disorder.

The study, published in the New England Journal of Medicine, compared the safety of the two medications—buprenorphine and methadone—used to treat opioid-use disorder during pregnancy.

“Our results may encourage increasing access to buprenorphine treatment specifically among pregnant people,” says lead author Elizabeth Suarez, a pharmacoepidemiologist who is a faculty member at the Center for Pharmacoepidemiology and Treatment Science at the Rutgers Institute for Health, Health Care Policy and Aging Research.

“It’s essential for the general public to understand the importance of opioid-use disorder treatment during pregnancy to avoid harms associated with lack of treatment.”

Overdose deaths from opioids continue to increase and opioid-use disorder remains a prevalent issue in the United States, according to the National Center for Health Statistics and research in Drug and Alcohol Dependence.

Pregnant people with untreated opioid-use disorder, and their infants, are at greater risk of harm from withdrawal, continued opioid use, and overdose. Previous studies found buprenorphine may be safer for the infant than methadone, but results were uncertain.

Using a large national database of people insured by Medicaid, researchers examined outcomes in a sample of pregnant individuals with opioid-use disorder. Maternal outcomes included delivery by cesarean section and severe pregnancy complications; infant outcomes included preterm birth, low birth weight, small size for gestational age, and neonatal abstinence syndrome, which occurs when an infant exposed to certain drugs in pregnancy experiences withdrawal after delivery.

They found that using buprenorphine to treat opioid-use disorder during pregnancy may result in better outcomes for the baby than methadone. Buprenorphine use was associated with lower risk of preterm birth, small size for gestational age, low birth weight, and neonatal abstinence syndrome compared with methadone use.

“These results may guide clinical recommendations for people with opioid-use disorder who are pregnant or are hoping to be pregnant,” says Suarez, who also is an instructor of epidemiology with the department of biostatistics and epidemiology at the Rutgers School of Public Health.

Future research should explore whether pregnant people with opioid-use disorder have a better experience taking buprenorphine or methadone and if they are more likely to stay on one medication longer than the other, Suarez says.

Additional coauthors are from Brigham and Women’s Hospital, Harvard Medical School, and Stanford University School of Medicine. The National Institute on Drug Abuse supported the work.

Source: Rutgers University


How to wake up alert and refreshed

Researchers have discovered that you can wake up each morning without feeling sluggish by paying attention to three key factors: sleep, exercise, and breakfast.

Do you feel groggy until you’ve had your morning joe? Do you battle sleepiness throughout the workday?

You're not alone. Many people struggle with morning alertness, but the new study demonstrates that waking refreshed each day is not just something a lucky few are born with.

“From car crashes to work-related accidents, the cost of sleepiness is deadly.”

The findings come from a detailed analysis of the behavior of 833 people who, over a two-week period, were given a variety of breakfast meals; wore wristwatches to record their physical activity and sleep quantity, quality, timing, and regularity; kept diaries of their food intake; and recorded their alertness levels from the moment they woke up and throughout the day.

The researchers included twins—identical and fraternal—in the study to disentangle the influence of genes from environment and behavior.

The researchers found that the secret to alertness is a three-part prescription requiring substantial exercise the previous day, sleeping longer and later into the morning, and eating a breakfast high in complex carbohydrates, with limited sugar.

The researchers also discovered that a healthy controlled blood glucose response after eating breakfast is key to waking up more effectively.

"All of these have a unique and independent effect," says Raphael Vallat, a postdoctoral fellow at the University of California, Berkeley, and first author of the study. "If you sleep longer or later, you're going to see an increase in your alertness. If you do more physical activity on the day before, you're going to see an increase. You can see improvements with each and every one of these factors."

Morning grogginess is more than just an annoyance. It has major societal consequences: Many auto accidents, job injuries, and large-scale disasters are caused by people who cannot shake off sleepiness. The Exxon Valdez oil spill in Alaska, the Three Mile Island nuclear meltdown in Pennsylvania, and an even worse nuclear accident in Chernobyl, Ukraine, are well-known examples.

“Many of us think that morning sleepiness is a benign annoyance. However, it costs developed nations billions of dollars every year through loss of productivity, increased health care utilization, work absenteeism. More impactful, however, is that it costs lives—it is deadly,” says senior author Matthew Walker, professor of neuroscience and psychology and author of Why We Sleep (Simon & Schuster, 2018).

“From car crashes to work-related accidents, the cost of sleepiness is deadly. As scientists, we must understand how to help society wake up better and help reduce the mortal cost to society’s current struggle to wake up effectively each day.”

What you eat

Walker and Vallat teamed up with researchers in the United Kingdom, the US, and Sweden to analyze data acquired by a UK company, Zoe Ltd., that has followed hundreds of people for two-week periods in order to learn how to predict individualized metabolic responses to foods based on a person’s biological characteristics, lifestyle factors, and the foods’ nutritional composition.

The researchers gave participants preprepared meals, with different proportions of nutrients incorporated into muffins, for the entire two weeks to see how they responded to different diets upon waking. A standardized breakfast, with moderate amounts of fat and carbohydrates, was compared to a high protein (muffins plus a milkshake), high carbohydrate, or high sugar (glucose drink) breakfast. The subjects also wore continuous glucose monitors to measure blood glucose levels throughout the day.

“…there are still some basic, modifiable, yet powerful ingredients to the awakening equation that people can focus on…”

The worst type of breakfast, on average, contained high amounts of simple sugar; it was associated with an inability to wake up effectively and maintain alertness. When given this sugar-infused breakfast, participants struggled with sleepiness.

In contrast, the high carbohydrate breakfast—which contained large amounts of carbohydrates, as opposed to simple sugar, and only a modest amount of protein—was linked to individuals revving up their alertness quickly in the morning and sustaining that alert state.

"A breakfast rich in carbohydrates can increase alertness, so long as your body is healthy and capable of efficiently disposing of the glucose from that meal, preventing a sustained spike in blood sugar that otherwise blunts your brain's alertness," Vallat says.

“We have known for some time that a diet high in sugar is harmful to sleep, not to mention being toxic for the cells in your brain and body,” Walker adds. “However, what we have discovered is that, beyond these harmful effects on sleep, consuming high amounts of sugar in your breakfast, and having a spike in blood sugar following any type of breakfast meal, markedly blunts your brain’s ability to return to waking consciousness following sleep.”

How you sleep

It wasn’t all about food, however. Sleep mattered significantly. In particular, Vallat and Walker discovered that sleeping longer than you usually do, and/or sleeping later than usual, resulted in individuals ramping up their alertness very quickly after awakening from sleep.

According to Walker, between seven and nine hours of sleep is ideal for ridding the body of “sleep inertia,” the inability to transition effectively to a state of functional cognitive alertness upon awakening. Most people need this amount of sleep to remove a chemical called adenosine that accumulates in the body throughout the day and brings on sleepiness in the evening, something known as sleep pressure.

“Considering that the majority of individuals in society are not getting enough sleep during the week, sleeping longer on a given day can help clear some of the adenosine sleepiness debt they are carrying,” Walker speculates.

“In addition, sleeping later can help with alertness for a second reason,” he says. “When you wake up later, you are rising at a higher point on the upswing of your 24-hour circadian rhythm, which ramps up throughout the morning and boosts alertness.”

It's unclear, however, how physical activity improves alertness the following day.

“It is well known that physical activity, in general, improves your alertness and also your mood level, and we did find a high correlation in this study between participants’ mood and their alertness levels,” Vallat says. “Participants that, on average, are happier also feel more alert.”

But Vallat also notes that exercise is generally associated with better sleep and a happier mood.

“It may be that exercise-induced better sleep is part of the reason exercise the day before, by helping sleep that night, leads to superior alertness throughout the next day,” Vallat says.

Walker notes that the restoration of consciousness from non-consciousness—from sleep to wake—is unlikely to be a simple biological process.

“If you pause to think, it is a non-trivial accomplishment to go from being nonconscious, recumbent, and immobile to being a thoughtful, conscious, attentive, and productive human being, active, awake, and mobile. It’s unlikely that such a radical, fundamental change is simply going to be explained by tweaking one single thing,” he says. “However, we have discovered that there are still some basic, modifiable, yet powerful ingredients to the awakening equation that people can focus on—a relatively simple prescription for how best to wake up each day.”

It’s under your control

Comparisons of data between pairs of identical and non-identical twins showed that genetics plays only a minor and insignificant role in next-day alertness, explaining only about 25% of the differences across individuals.
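One classical way twin comparisons yield a "variance explained" figure like this is Falconer's formula: identical twins share essentially all of their genes while fraternal twins share about half, so doubling the gap between the two groups' trait correlations estimates heritability. The sketch below uses that textbook method with hypothetical correlations chosen to land near the roughly 25% the study reports; these are not the paper's actual numbers or necessarily its statistical approach.

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's estimate of heritability from twin correlations:
    h^2 = 2 * (r_identical - r_fraternal), since identical twins
    share ~100% of genes and fraternal twins ~50%."""
    return 2 * (r_mz - r_dz)

# Hypothetical correlations for next-day alertness (illustrative only).
h2 = falconer_h2(0.45, 0.325)
print(round(h2, 3))  # → 0.25, i.e., genes explain ~25% of the variance
```

The remaining ~75% of the variance is then attributed to environment and behavior, which is the study's central, optimistic point.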

“We know there are people who always seem to be bright-eyed and bushy-tailed when they first wake up,” Walker says. “But if you’re not like that, you tend to think, ‘Well, I guess it’s just my genetic fate that I’m slow to wake up. There’s really nothing I can do about it, short of using the stimulant chemical caffeine, which can harm sleep.

“But our new findings offer a different and more optimistic message. How you wake up each day is very much under your own control, based on how you structure your life and your sleep. You don’t need to feel resigned to any fate, throwing your hands up in disappointment because, ‘…it’s my genes, and I can’t change my genes.’ There are some very basic and achievable things you can start doing today, and tonight, to change how you awake each morning, feeling alert and free of that grogginess.”

Walker, Vallat, and their colleagues continue their collaboration with the Zoe team, examining novel scientific questions about how sleep, diet, and physical exercise change people’s brain and body health, steering them away from disease and sickness.

Additional coauthors of the paper are from King's College London; Lund University in Malmö, Sweden; Zoe Ltd.; the University of Nottingham in the UK; and Massachusetts General Hospital and Harvard Medical School in Boston. Zoe Ltd. and the Department of Twin Studies at King's College London funded the study.

The research appears in Nature Communications.

Source: UC Berkeley