How to wake up alert and refreshed

Researchers have discovered that you can wake up each morning without feeling sluggish by paying attention to three key factors: sleep, exercise, and breakfast.

Do you feel groggy until you’ve had your morning joe? Do you battle sleepiness throughout the workday?

You’re not alone. Many people struggle with morning alertness, but the new study demonstrates that waking up refreshed each day is not just something a lucky few are born with.

“From car crashes to work-related accidents, the cost of sleepiness is deadly.”

The findings come from a detailed analysis of the behavior of 833 people who, over a two-week period, were given a variety of breakfast meals; wore wristwatches to record their physical activity and sleep quantity, quality, timing, and regularity; kept diaries of their food intake; and recorded their alertness levels from the moment they woke up and throughout the day.

The researchers included twins—identical and fraternal—in the study to disentangle the influence of genes from environment and behavior.

The researchers found that the secret to alertness is a three-part prescription requiring substantial exercise the previous day, sleeping longer and later into the morning, and eating a breakfast high in complex carbohydrates, with limited sugar.

The researchers also discovered that a healthy controlled blood glucose response after eating breakfast is key to waking up more effectively.

“All of these have a unique and independent effect,” says Raphael Vallat, a postdoctoral fellow at the University of California, Berkeley and first author of the study. “If you sleep longer or later, you’re going to see an increase in your alertness. If you do more physical activity on the day before, you’re going to see an increase. You can see improvements with each and every one of these factors.”

Morning grogginess is more than just an annoyance. It has major societal consequences: Many auto accidents, job injuries, and large-scale disasters are caused by people who cannot shake off sleepiness. The Exxon Valdez oil spill in Alaska, the Three Mile Island nuclear meltdown in Pennsylvania, and an even worse nuclear accident in Chernobyl, Ukraine, are well-known examples.

“Many of us think that morning sleepiness is a benign annoyance. However, it costs developed nations billions of dollars every year through loss of productivity, increased health care utilization, work absenteeism. More impactful, however, is that it costs lives—it is deadly,” says senior author Matthew Walker, professor of neuroscience and psychology and author of Why We Sleep (Simon & Schuster, 2018).

“From car crashes to work-related accidents, the cost of sleepiness is deadly. As scientists, we must understand how to help society wake up better and help reduce the mortal cost to society’s current struggle to wake up effectively each day.”

What you eat

Walker and Vallat teamed up with researchers in the United Kingdom, the US, and Sweden to analyze data acquired by a UK company, Zoe Ltd., that has followed hundreds of people for two-week periods in order to learn how to predict individualized metabolic responses to foods based on a person’s biological characteristics, lifestyle factors, and the foods’ nutritional composition.

The researchers gave participants preprepared meals, with different proportions of nutrients incorporated into muffins, for the entire two weeks to see how they responded to different diets upon waking. A standardized breakfast, with moderate amounts of fat and carbohydrates, was compared to a high protein (muffins plus a milkshake), high carbohydrate, or high sugar (glucose drink) breakfast. The subjects also wore continuous glucose monitors to measure blood glucose levels throughout the day.

“…there are still some basic, modifiable, yet powerful ingredients to the awakening equation that people can focus on…”

The worst type of breakfast, on average, contained high amounts of simple sugar; it was associated with an inability to wake up effectively and maintain alertness. When given this sugar-infused breakfast, participants struggled with sleepiness.

In contrast, the high carbohydrate breakfast—which contained large amounts of carbohydrates, as opposed to simple sugar, and only a modest amount of protein—was linked to individuals revving up their alertness quickly in the morning and sustaining that alert state.

“A breakfast rich in carbohydrates can increase alertness, so long as your body is healthy and capable of efficiently disposing of the glucose from that meal, preventing a sustained spike in blood sugar that otherwise blunts your brain’s alertness,” Vallat says.

“We have known for some time that a diet high in sugar is harmful to sleep, not to mention being toxic for the cells in your brain and body,” Walker adds. “However, what we have discovered is that, beyond these harmful effects on sleep, consuming high amounts of sugar in your breakfast, and having a spike in blood sugar following any type of breakfast meal, markedly blunts your brain’s ability to return to waking consciousness following sleep.”

How you sleep

It wasn’t all about food, however. Sleep mattered significantly. In particular, Vallat and Walker discovered that sleeping longer than you usually do, and/or sleeping later than usual, resulted in individuals ramping up their alertness very quickly after awakening from sleep.

According to Walker, between seven and nine hours of sleep is ideal for ridding the body of “sleep inertia,” the inability to transition effectively to a state of functional cognitive alertness upon awakening. Most people need this amount of sleep to remove a chemical called adenosine that accumulates in the body throughout the day and brings on sleepiness in the evening, something known as sleep pressure.

“Considering that the majority of individuals in society are not getting enough sleep during the week, sleeping longer on a given day can help clear some of the adenosine sleepiness debt they are carrying,” Walker speculates.

“In addition, sleeping later can help with alertness for a second reason,” he says. “When you wake up later, you are rising at a higher point on the upswing of your 24-hour circadian rhythm, which ramps up throughout the morning and boosts alertness.”

It’s unclear, however, how physical activity the day before improves alertness the following day.

“It is well known that physical activity, in general, improves your alertness and also your mood level, and we did find a high correlation in this study between participants’ mood and their alertness levels,” Vallat says. “Participants that, on average, are happier also feel more alert.”

But Vallat also notes that exercise is generally associated with better sleep and a happier mood.

“It may be that exercise-induced better sleep is part of the reason exercise the day before, by helping sleep that night, leads to superior alertness throughout the next day,” Vallat says.

Walker notes that the restoration of consciousness from non-consciousness—from sleep to wake—is unlikely to be a simple biological process.

“If you pause to think, it is a non-trivial accomplishment to go from being nonconscious, recumbent, and immobile to being a thoughtful, conscious, attentive, and productive human being, active, awake, and mobile. It’s unlikely that such a radical, fundamental change is simply going to be explained by tweaking one single thing,” he says. “However, we have discovered that there are still some basic, modifiable, yet powerful ingredients to the awakening equation that people can focus on—a relatively simple prescription for how best to wake up each day.”

It’s under your control

Comparisons of data between pairs of identical and non-identical twins showed that genetics plays only a minor and insignificant role in next-day alertness, explaining only about 25% of the differences across individuals.

“We know there are people who always seem to be bright-eyed and bushy-tailed when they first wake up,” Walker says. “But if you’re not like that, you tend to think, ‘Well, I guess it’s just my genetic fate that I’m slow to wake up. There’s really nothing I can do about it, short of using the stimulant chemical caffeine, which can harm sleep.

“But our new findings offer a different and more optimistic message. How you wake up each day is very much under your own control, based on how you structure your life and your sleep. You don’t need to feel resigned to any fate, throwing your hands up in disappointment because, ‘…it’s my genes, and I can’t change my genes.’ There are some very basic and achievable things you can start doing today, and tonight, to change how you awake each morning, feeling alert and free of that grogginess.”

Walker, Vallat, and their colleagues continue their collaboration with the Zoe team, examining novel scientific questions about how sleep, diet, and physical exercise change people’s brain and body health, steering them away from disease and sickness.

Additional coauthors of the paper are from King’s College London; Lund University in Malmö, Sweden; Zoe Ltd.; the University of Nottingham in the UK; and Massachusetts General Hospital and Harvard Medical School in Boston. Zoe Ltd. and the Department of Twin Studies at King’s College London funded the study.

The research appears in Nature Communications.

Source: UC Berkeley


New solar tech is nearly 10X more efficient at splitting water

A new kind of solar panel has achieved 9% efficiency in converting water into hydrogen and oxygen—mimicking a crucial step in natural photosynthesis.

Outdoors, it represents a major leap in the technology, nearly 10 times more efficient than solar water-splitting experiments of its kind.

But the biggest benefit is driving down the cost of sustainable hydrogen. This is enabled by shrinking the semiconductor, typically the most expensive part of the device. The team’s self-healing semiconductor withstands concentrated light equivalent to 160 suns.

Currently, humans produce hydrogen from the fossil fuel methane, using a great deal of fossil energy in the process. However, plants harvest hydrogen atoms from water using sunlight. As humanity tries to reduce its carbon emissions, hydrogen is attractive as both a standalone fuel and as a component in sustainable fuels made with recycled carbon dioxide. Likewise, it is needed for many chemical processes, producing fertilizers for instance.
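
For context, the fossil route and the solar water-splitting route that this device mimics rest on very different overall reactions (standard textbook chemistry, not detailed in the article):

```latex
% Current industrial route: steam-methane reforming plus the water-gas shift,
% driven by fossil heat and releasing CO2:
\mathrm{CH_4 + 2\,H_2O \;\rightarrow\; CO_2 + 4\,H_2}
% Overall solar water splitting, the reaction this device performs,
% storing solar energy in the H-H bond:
\mathrm{2\,H_2O \;\xrightarrow{\;h\nu\;}\; 2\,H_2 + O_2}
```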

“In the end, we believe that artificial photosynthesis devices will be much more efficient than natural photosynthesis, which will provide a path toward carbon neutrality,” says Zetian Mi, a University of Michigan professor of electrical and computer engineering who led the study in Nature.

The outstanding result comes from two advances. The first is the ability to concentrate the sunlight without destroying the semiconductor that harnesses the light.

“We reduced the size of the semiconductor by more than 100 times compared to some semiconductors only working at low light intensity,” says Peng Zhou, a research fellow in electrical and computer engineering and first author of the study. “Hydrogen produced by our technology could be very cheap.”

And the second is using both the higher energy part of the solar spectrum to split water and the lower part of the spectrum to provide heat that encourages the reaction. The magic is enabled by a semiconductor catalyst that improves itself with use, resisting the degradation that such catalysts usually experience when they harness sunlight to drive chemical reactions.

In addition to handling high light intensities, it can thrive in high temperatures that are punishing to computer semiconductors. Higher temperatures speed up the water splitting process, and the extra heat also encourages the hydrogen and oxygen to remain separate rather than renewing their bonds and forming water once more. Both of these helped the team to harvest more hydrogen.

For the outdoor experiment, Zhou set up a lens about the size of a house window to focus sunlight onto an experimental panel just a few inches across. Within that panel, the semiconductor catalyst was covered in a layer of water, bubbling with the hydrogen and oxygen gasses it separated.

The catalyst is made of indium gallium nitride nanostructures, grown onto a silicon surface. That semiconductor wafer captures the light, converting it into free electrons and holes—positively charged gaps left behind when electrons are liberated by the light. The nanostructures are peppered with nanoscale balls of metal, 1/2000th of a millimeter across, that use those electrons and holes to help direct the reaction.

A simple insulating layer atop the panel keeps the temperature at a toasty 75 degrees Celsius, or 167 degrees Fahrenheit, warm enough to help encourage the reaction while also being cool enough for the semiconductor catalyst to perform well. The outdoor version of the experiment, with less reliable sunlight and temperature, achieved 6.1% efficiency at turning the energy from the sun into hydrogen fuel. However, indoors, the system achieved 9% efficiency.
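
As a rough sketch of how a solar-to-hydrogen efficiency figure like 9% is usually defined (the definition is standard in the field; the production rate below is a placeholder, not a measurement from the study):

```python
# Illustrative solar-to-hydrogen (STH) efficiency calculation.
# The hydrogen production rate is a hypothetical placeholder chosen only
# to show the arithmetic; it is not a number reported in the paper.
GIBBS_H2 = 237_000       # J per mole of H2 (Gibbs free energy of water splitting)
solar_irradiance = 1000  # W/m^2, standard "one sun" illumination
panel_area = 1e-3        # m^2, a panel a few centimeters across
h2_rate = 3.8e-7         # mol of H2 produced per second (placeholder)

sth = (h2_rate * GIBBS_H2) / (solar_irradiance * panel_area)
print(f"STH efficiency: {sth:.1%}")  # ~9% with these placeholder numbers
```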

The next challenges the team intends to tackle are to further improve the efficiency and to achieve ultrahigh purity hydrogen that can be directly fed into fuel cells.

Some of the intellectual property related to this work has been licensed to NS Nanotech Inc. and NX Fuels Inc., which were co-founded by Mi. The University of Michigan and Mi have a financial interest in both companies.

Support for the work came from the National Science Foundation, the Department of Defense, the Michigan Translational Research and Commercialization Innovation Hub, the Blue Sky Program in the College of Engineering at the University of Michigan, and the Army Research Office.

Source: University of Michigan


Guys who don’t feel pain seem more muscular

People perceive men described as insensitive to pain as larger and stronger than men described as sensitive to pain, research finds.

Before any physical conflict, people assess their opponent’s features to determine if the ideal tactical response is to fight, flee, or attempt to negotiate.

Throughout evolution, bigger, stronger animals have won fights with smaller, weaker animals. Because of this, when people think about the features that determine who will win a fight, they summarize those features by adjusting a mental picture of their opponent’s size and strength.

According to a new study co-led by Wilson Merrell, a doctoral candidate in psychology at the University of Michigan, and Daniel Fessler, professor of anthropology at UCLA, how we picture an opponent is affected by a psychological feature of the opponent—namely how sensitive they are to pain.

Because it allows people to persist longer in violent conflict, insensitivity to pain can be a valuable characteristic when it comes to winning fights, and this is reflected in how we picture an opponent, the researchers say.

Merrell and Fessler conducted three studies with nearly 1,000 online crowdsource workers in the United States.

In the first set of studies, participants read about a man who was either highly insensitive to pain (e.g., someone who didn’t feel pain very strongly during events like getting an injection at the doctor or stubbing their toe) or highly sensitive to pain (e.g., someone who felt excruciating pain during those same events).

Participants who read about the pain-insensitive man envisioned him to be taller and more muscular than participants who read about the pain-sensitive man. As the researchers expected, knowing that someone is insensitive to pain causes that person to be seen as more physically imposing.

In a final study, the researchers tested whether a man’s access to a tool that could be used as a weapon affected how sensitive to pain he appeared to be. Participants either saw a picture of a man holding an object that could be used to hurt someone (like a kitchen knife) or an object that could not (like a spatula). The men holding dangerous tools were seen as more insensitive to painful situations like getting a paper cut or bumping their head on a piece of furniture than men holding harmless tools.

The research suggests that representations of physical characteristics like height and muscularity are also subject to assessments of psychological traits, like pain sensitivity.

“Perceptions of others’ sensitivity to pain may play an important role in a variety of social interactions,” Fessler says. “When I first started exploring this topic, I was surprised that so little research had been done outside of medical contexts.

“It was particularly exciting to discover that the relationship between how intimidating someone seems and their sensitivity to pain works both ways—knowing that someone is insensitive to pain makes them seem more formidable, and, conversely, knowing that someone is intimidating makes them seem less sensitive to pain.”

Merrell says the relationship between assessments of pain insensitivity and physical size may have implications for social contexts where judgments about pain, size, and threat influence decision-making.

For example, future work could explore how stereotypes about high pain tolerance, which are often applied to Black men in the United States, play into stereotypes about physical size and influence decision-making in power-imbalanced situations, such as health care and policing.

The study’s other authors are from the University of California, Merced and the University of Michigan. The findings appear in the current issue of Evolution and Human Behavior.

Source: University of Michigan


After long decline, stroke deaths are rising again

An analysis of stroke deaths in the United States from 1975 to 2019 finds both a dramatic decline and the potential for an important resurgence.

Stroke mortality (per 100,000) plummeted from 88 to 31 for women and 112 to 39 for men between 1975 and 2019 in the United States.

Total stroke deaths kept falling despite rising risk among younger birth cohorts because stroke rates skyrocket as people get older, so improvements among older patients dominate the totals. A 10% reduction in the fatality rate for 75-year-olds would more than offset a doubling of the fatality rate among 35-year-olds, because strokes are 100 times more common in 75-year-olds.
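
A back-of-the-envelope calculation makes that age-weighting concrete (the baseline rate and equal group sizes below are hypothetical, chosen only for illustration):

```python
# Why a small improvement at 75 outweighs a big deterioration at 35.
# Assumes two equally sized age groups and a hypothetical baseline rate.
rate_35 = 1.0             # fatal strokes per 100,000 people aged 35 (placeholder)
rate_75 = 100 * rate_35   # article: strokes are 100 times more common at 75

before = rate_35 + rate_75                # combined deaths per 100,000 across the two groups
after = (2 * rate_35) + (0.9 * rate_75)   # doubling at 35, 10% reduction at 75

print(before, after)  # 101.0 vs 92.0 -> total deaths still fall
```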

However, barring further improvements in stroke prevention or treatment, the most recent figures demonstrate that total stroke fatalities will rise as millennials age. Age-adjusted stroke deaths per 100,000 people bottomed out in 2014 and climbed again during the last five years of the study period.

“Starting around 1960, the later you were born, the higher your risk of suffering a fatal ischemic stroke at any particular age,” says lead author Cande Ananth, chief of the division of epidemiology and biostatistics in the department of obstetrics, gynecology, and reproductive sciences at Rutgers Robert Wood Johnson Medical School.

“This study didn’t identify a cause for this trend, but other research suggests the main culprits are increasing rates of obesity and diabetes.”

The analysis used a comprehensive death-certificate database to identify virtually every adult under the age of 85 who died from a stroke during the 44 years—4,332,220 deaths in all.

It was the first stroke-death analysis to divide patients by their year of birth (cohort) and the first to identify the steady rise in age-adjusted ischemic stroke risk from the late 1950s to the early 1990s.

This “age-period-cohort analysis,” which further divided patients by their age at death, also allowed the study team to make two other novel insights:

  • Stroke fatality rates have fallen more for ischemic strokes, which occur when blood vessels to the brain are blocked, than hemorrhagic strokes, which occur when blood vessels leak or burst. The ischemic stroke fatality rate fell roughly 80% over the study period, while the hemorrhagic stroke fatality rate fell roughly 65%.
  • The disparity between male and female stroke fatality rates diminishes as patient age increases. At age 55, men are more than twice as likely as women to suffer a fatal stroke, but the rates of fatal stroke are virtually identical at age 85.

“After nearly four decades of declining stroke-related mortality, the risk appears to be increasing in the United States,” Ananth says. “Our research underscores the need for novel strategies to combat this alarming trend.”

The study appears in the International Journal of Epidemiology.

Source: Rutgers University


Maya people shopped at places like today’s supermarkets

More than 500 years ago in the midwestern Guatemalan highlands, Maya people bought and sold goods at markets with far less oversight from their rulers than archeologists previously thought.

That’s according to a new study that shows the ruling K’iche’ elite took a hands-off approach when it came to managing the procurement and trade of obsidian by people outside their region of central control.

In these areas, access to nearby sources of obsidian, a glasslike rock used to make tools and weapons, was managed by local people through independent and diverse acquisition networks. Over time, the availability of obsidian resources and the prevalence of craftsmen to shape it resulted in a system that is in many ways suggestive of contemporary market-based economies.

“Scholars have generally assumed that the obsidian trade was managed by Maya rulers, but our research shows that this wasn’t the case at least in this area,” says Rachel Horowitz, assistant professor of anthropology at Washington State University and lead author of the study published in the journal Latin American Antiquity.

“People seem to have had a good deal of economic freedom including being able to go to places similar to the supermarkets we have today to buy and sell goods from craftsmen.”

While there are extensive written records from the Maya Postclassic Period (1200-1524 AD) on political organization, much less is known about how societal elites wielded economic power. Horowitz set out to address this knowledge gap for the K’iche’ by examining the production and distribution of obsidian artifacts, which are used as a proxy by archeologists to determine the level of economic development in a region.

She performed geochemical and technological analysis on obsidian artifacts excavated from 50 sites around the K’iche’ capital of Q’umarkaj and the surrounding region to determine where the raw material originally came from and the techniques used to manufacture it.

Her results show that the K’iche’ acquired their obsidian from similar sources in the Central K’iche’ region and Q’umarkaj, indicating a high degree of centralized control. The ruling elite also seemed to manage the trade of more valuable forms of nonlocal obsidian, particularly Pachuca obsidian from Mexico, based on its abundance in these central sites.

Outside this core region, though, in areas conquered by the K’iche’, there was less similarity in obsidian economic networks. Horowitz’s analysis suggests these sites had access to their own sources of obsidian and developed specialized places where people could go to buy blades and other useful implements made from the rock by experts.

“For a long time, there has been this idea that people in the past didn’t have market economies, which when you think about it is kind of weird. Why wouldn’t these people have had markets in the past?” she says. “The more we look into it, the more we realize there were a lot of different ways in which these peoples’ lives were similar to ours.”

The Middle American Research Institute at Tulane University loaned Horowitz the obsidian blades and other artifacts she used for her study. The artifacts were excavated in the 1970s.

Moving forward, Horowitz says she plans to examine more of the collection, the rest of which is housed in Guatemala, to discover further details about how the Maya conducted trade, managed their economic systems, and generally went about their lives.

Source: Washington State University


Gene-editing gets fungi to spill secrets to new drugs

A high-efficiency gene-editing tool can get fungi to produce significantly more natural compounds, including some previously unknown to the scientific community, say researchers.

Using the approach that simultaneously modifies multiple sites in fungal genomes, Rice University chemical and biomolecular engineer Xue Sherry Gao and collaborators coax fungi into revealing their best-kept secrets, ramping up the pace of new drug discovery.

It is the first time that the technique, multiplex base-editing (MBE), has been deployed as a tool for mining fungal genomes for medically useful compounds. Compared to single-gene editing, the MBE platform reduces the research timeline by over 80% in equivalent experimental settings, from an estimated three months to roughly two weeks.

Fungi and other organisms produce bioactive small molecules such as penicillin to protect themselves from disease agents. These bioactive natural products (NPs) can be used as drugs or as molecular blueprints for designing new drugs.

The study appears in the Journal of the American Chemical Society.

Gene-editing fungi

Base-editing refers to the use of CRISPR-based tools in order to modify a rung in the spiral ladder of DNA known as a base pair. Previously, gene modifications using base-editing had to be carried out one at a time, making the research process more time-consuming. “We created a new machinery that enables base-editing to work on multiple genomic sites, hence the ‘multiplex,’” Gao says.

Gao and her team first tested the efficacy of their new base-editing platform by targeting genes encoding for pigment in a fungal strain known as Aspergillus nidulans. The effectiveness and precision of MBE-enabled genome edits were readily visible in the changed color displayed by A. nidulans colonies.

‘Cryptic’ genes

“To me, the fungal genome is a treasure,” Gao says, referring to the significant medical potential of fungi-derived compounds. “However, under most circumstances, fungi ‘keep to themselves’ in the laboratory and don’t produce the bioactive small molecules we are looking for. In other words, the majority of genes or biosynthetic gene clusters of interest to us are ‘cryptic,’ meaning they do not express their full biosynthetic potential.

“The genetic, epigenetic, and environmental factors that instruct organisms to produce these medically useful compounds are extremely complicated in fungi,” Gao says. Enabled by the MBE platform, her team can easily delete several of the regulatory genes that restrict the production of bioactive small molecules. “We can observe the synergistic effects of eliminating those factors that make the biosynthetic machinery silent,” she says.

Disinhibited, the engineered fungal strains produce more bioactive molecules, each with their own distinct chemical profiles. Five of the 30 NPs generated in one assay were new, never-before-reported compounds.

“These compounds could be useful antibiotics or anticancer drugs,” Gao says. “We are in the process of figuring out what the biological functions of these compounds are and we are collaborating with groups in the Baylor College of Medicine on pharmacological small-molecule drug discovery.”

Gao’s research plumbs fungal genomes in search of gene clusters that synthesize NPs. “Approximately 50% of clinical drugs approved by the US Food and Drug Administration are NPs or NP-derivatives,” and fungi-derived NPs “are an essential pharmaceutical source,” she says. Penicillin, lovastatin, and cyclosporine are some examples of drugs derived from fungal NPs.

The National Institutes of Health and the Robert A. Welch Foundation supported the research.

Source: Rice University


Newfound part of the brain acts as shield and watchdog

Researchers have discovered a previously unknown part of brain anatomy that acts as both a protective barrier and platform from which immune cells monitor the brain for infection and inflammation.

From the complexity of neural networks to basic biological functions and structures, the human brain only reluctantly reveals its secrets.

Advances in neuro-imaging and molecular biology have only recently enabled scientists to study the living brain at a level of detail not previously achievable, unlocking many of its mysteries.

The new study comes from the labs of Maiken Nedergaard, co-director of the Center for Translational Neuromedicine at the University of Rochester and the University of Copenhagen, and Kjeld Møllgård, a professor of neuroanatomy at the University of Copenhagen.

Nedergaard and her colleagues have transformed our understanding of the fundamental mechanics of the human brain and made significant findings in the field of neuroscience, including detailing the many critical functions of previously overlooked cells in the brain called glia and the brain’s unique process of waste removal, which the lab named the glymphatic system.

“The discovery of a new anatomic structure that segregates and helps control the flow of cerebrospinal fluid (CSF) in and around the brain now provides us much greater appreciation of the sophisticated role that CSF plays not only in transporting and removing waste from the brain, but also in supporting its immune defenses,” says Nedergaard.

The study focuses on the series of membranes that encase the brain, creating a barrier from the rest of the body and keeping the brain bathed in CSF. The traditional understanding of what is collectively called the meningeal layer identifies the three individual layers as the dura, arachnoid, and pia mater.

The new layer discovered by the US- and Denmark-based research team further divides the space between the arachnoid and pia layers, the subarachnoid space, into two compartments, separated by the newly described layer, which the researchers named SLYM, an abbreviation of Subarachnoidal LYmphatic-like Membrane.

While much of the research in the paper describes the function of SLYM in mice, the researchers report its presence in the adult human brain as well.

SLYM is a type of membrane called a mesothelium, which lines other organs in the body, including the lungs and heart. These membranes typically surround and protect organs, and harbor immune cells. The idea that a similar membrane might exist in the central nervous system was a question first posed by Møllgård, the first author of the study, whose research focuses on developmental neurobiology and on the systems of barriers that protect the brain.

The new membrane is very thin and delicate, consisting of only a few cells in thickness. Yet SLYM is a tight barrier, allowing only very small molecules to transit, and it also seems to separate “clean” and “dirty” CSF.

This last observation hints at the likely role played by SLYM in the glymphatic system, which requires a controlled flow and exchange of CSF, allowing the influx of fresh CSF while flushing the toxic proteins associated with Alzheimer’s and other neurological diseases from the central nervous system.

This discovery will help researchers more precisely understand the mechanics of the glymphatic system.

The SLYM also appears important to the brain’s defenses. The central nervous system maintains its own native population of immune cells, and the membrane’s integrity prevents outside immune cells from entering. In addition, the membrane appears to host its own population of central nervous system immune cells that use SLYM as an observation point close to the surface of the brain from which to scan passing CSF for signs of infection or inflammation.

Discovery of the SLYM opens the door for further study of its role in brain disease. For example, the researchers note that larger and more diverse concentrations of immune cells congregate on the membrane during inflammation and aging. Furthermore, when the membrane was ruptured during traumatic brain injury, the resulting disruption in the flow of CSF impaired the glymphatic system and allowed non-central nervous system immune cells to enter the brain.

These and similar observations suggest that diseases as diverse as multiple sclerosis, central nervous system infections, and Alzheimer’s might be triggered or worsened by abnormalities in SLYM function. They also suggest that the delivery of drugs and gene therapeutics to the brain may be affected by SLYM, which will need to be considered as new generations of biologic therapies are being developed.

The research appears in the journal Science. Additional coauthors are from the University of Copenhagen.

Support for the study came from the Lundbeck Foundation, Novo Nordisk Foundation, the National Institute of Neurological Disorders and Stroke, the US Army Research Office, the Human Frontier Science Program, the Dr. Miriam and Sheldon G. Adelson Medical Research Foundation, and the Simons Foundation.

Source: University of Rochester


3D imaging tracks cancer radiation in real time

Precise 3D imaging makes it possible to track radiation, used to treat half of all cancer patients, in real time.

By capturing and amplifying tiny sound waves created when X-rays heat tissues in the body, medical professionals can map the radiation dose within the body, giving them new data to guide treatments. It’s a first-of-its-kind view of an interaction doctors have previously been unable to “see.”

[Image: Dose accumulation in an imitation patient made of lard over a delivery time of around 19 seconds, as continuously monitored by the iRAI system. Credit: U. Michigan Optical Imaging Laboratory]

“Once you start delivering radiation, the body is pretty much a black box,” says Xueding Wang, professor of biomedical engineering and professor of radiology who leads the Optical Imaging Laboratory at the University of Michigan.

“We don’t know exactly where the X-rays are hitting inside the body, and we don’t know how much radiation we’re delivering to the target. And each body is different, so making predictions for both aspects is tricky,” says Wang, corresponding author of the study in Nature Biotechnology.

Radiation is used in treatment for hundreds of thousands of cancer patients each year, bombarding an area of the body with high energy waves and particles, usually X-rays. The radiation can kill cancer cells outright or damage them so that they can’t spread.

These benefits are undermined by a lack of precision, as radiation treatment often kills and damages healthy cells in the areas surrounding a tumor. It can also raise the risk of developing new cancers.

With real-time 3D imaging, doctors can more accurately direct the radiation toward cancerous cells and limit the exposure of adjacent tissues. To do that, they simply need to “listen.”

When X-rays are absorbed by tissues in the body, they are turned into thermal energy. That heating causes the tissue to expand rapidly, and that expansion creates a sound wave.
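
This is the thermoacoustic effect; a commonly used first-order relation from thermoacoustic imaging (general background physics, not a formula quoted from the paper) ties the initial pressure of that sound wave to the locally absorbed energy, and hence to dose:

```latex
% Initial pressure rise generated by a short radiation pulse:
%   p_0 = Gamma * E_abs, with E_abs ~ rho * D for ionizing radiation
p_0 \;=\; \Gamma\, E_{\mathrm{abs}} \;\approx\; \Gamma\, \rho\, D
% Gamma : Grueneisen parameter of the tissue (dimensionless)
% E_abs : absorbed energy per unit volume (J/m^3)
% rho, D: tissue density (kg/m^3) and dose per pulse (Gy = J/kg)
```

Because the Grüneisen parameter and density of soft tissue are roughly known, a 3D reconstruction of these acoustic sources can be read, in effect, as a map of where dose was deposited.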

The acoustic wave is weak and usually undetectable by typical ultrasound technology. The new ionizing radiation acoustic imaging (iRAI) system detects the wave with an array of ultrasonic transducers positioned on the patient’s side. The signal is amplified and then transferred into an ultrasound device for image reconstruction.

With the images in-hand, an oncology clinic can alter the level or trajectory of radiation during the process to ensure safer and more effective treatments.

“In the future, we could use the imaging information to compensate for uncertainties that arise from positioning, organ motion, and anatomical variation during radiation therapy,” says first author Wei Zhang, a research investigator in biomedical engineering. “That would allow us to deliver the dose to the cancer tumor with pinpoint accuracy.”

Another benefit of the new technology is that it can be easily added to current radiation therapy equipment without drastically changing the processes that clinicians are used to.

“In future applications, this technology can be used to personalize and adapt each radiation treatment to assure normal tissues are kept to a safe dose and that the tumor receives the dose intended,” says Kyle Cuneo, associate professor of radiation oncology at Michigan Medicine.

“This technology would be especially beneficial in situations where the target is adjacent to radiation sensitive organs such as the small bowel or stomach.”

The University of Michigan has applied for patent protection and is seeking partners to help bring the technology to market. The National Cancer Institute and the Michigan Institute for Clinical and Health Research supported the work.

Source: University of Michigan


Coral bleaching makes it hard for fish to spot foes

Mass coral bleaching events make it harder for some species of reef fish to identify competitors, new research reveals.

Scientists studying reefs across five Indo-Pacific regions found that the ability of butterfly fish individuals to identify competitor species and respond appropriately was compromised after widespread loss of coral caused by bleaching.

This change means they make poorer decisions that leave them less able to avoid unnecessary fights, using up precious limited energy. The scientists believe this increases the likelihood of coral loss.

“By recognizing a competitor, individual fish can make decisions about whether to escalate, or retreat from, a contest—conserving valuable energy and avoiding injuries,” says lead author Sally Keith, a senior lecturer in marine biology at Lancaster University.

“These rules of engagement evolved for a particular playing field, but that field is changing. Repeated disturbances, such as bleaching events, alter the abundance and identity of corals—the food source of butterfly fish. It’s not yet clear whether these fish have the capacity to update their rule book fast enough to recalibrate their decisions.”

“The impacts of global change on biodiversity are increasingly obvious,” says coauthor Nate Sanders, a professor in the ecology and evolutionary biology department at the University of Michigan. “This work highlights the importance of studying the behavioral responses of individuals in light of global change.”

For the study in Proceedings of the Royal Society B, the researchers took more than 3,700 observations of 38 species of butterfly fish on reefs before and after coral bleaching events and compared their behaviors.

After coral mortality caused by the bleaching event, signaling between fish of different species was less common, with encounters escalating to chases in more than 90% of cases—up from 72% before the event. Researchers also found the distance of these chases increased following bleaching, with fish expending more energy chasing away potential competitors than they would have done previously.

The researchers believe the environmental disturbances are affecting fish recognition and responses because the bleaching events, in which many corals die, are forcing fish species to change and diversify their diets and territories. Therefore, these large-scale environmental changes are disrupting long-established and co-evolved relationships that allow multiple fish species to coexist.

“We know that biodiversity is being lost—species are vanishing and populations are declining,” Sanders says. “Perhaps by focusing more on how the behavior of individuals responds to global change, we can start to predict how biodiversity might change in the future. And better yet, try to do something about it.”

Additional coauthors are from Lancaster University and the University of Queensland. The Natural Environment Research Council, the Australian Research Council, and the Villum Foundation funded the work.

Source: University of Michigan


Impulse Space announces first orbital transfer vehicle mission

WASHINGTON — Impulse Space announced Jan. 4 it will launch its first orbital transfer vehicle late this year on a SpaceX rideshare mission.

Impulse Space said its LEO Express-1 mission, using a transfer vehicle it is developing called Mira, is manifested for launch on SpaceX’s Transporter-9 rideshare mission currently scheduled for launch in the fourth quarter of 2023. LEO Express-1 will carry a primary payload for an undisclosed customer.

Barry Matsumori, chief operating officer of Impulse Space, said in an interview that the mission can accommodate additional payloads, like cubesats. The mission profile is still being finalized, but he said the vehicle, after making some initial deployments, may raise its orbit, then lower it to demonstrate operations in what’s known as very low Earth orbit, around 300 kilometers.

The performance of Mira depends on how much payload it is carrying, but he estimated that the vehicle can provide about 1,000 meters per second of delta-v, or change in velocity, with a payload of 300 kilograms. Its propulsion system, using storable propellants, has been extensively tested, with more than 1,000 seconds of runtime, while other elements of the vehicle are in various stages of design and manufacturing.
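
For a sense of scale, the ideal rocket equation relates that delta-v figure to propellant mass; the specific impulse below is an assumed placeholder, since the article does not give Mira’s engine performance:

```python
import math

# Tsiolkovsky rocket equation: delta_v = Isp * g0 * ln(m0 / mf)
g0 = 9.81        # m/s^2, standard gravity
isp = 285        # s, assumed specific impulse for a small storable-propellant engine (placeholder)
delta_v = 1000   # m/s, the figure quoted for a 300 kg payload

mass_ratio = math.exp(delta_v / (isp * g0))   # initial mass / final mass
propellant_fraction = 1 - 1 / mass_ratio      # share of initial mass burned as propellant
print(f"mass ratio: {mass_ratio:.2f}, propellant fraction: {propellant_fraction:.0%}")
# ~1.43 and ~30% under these assumptions
```

Under these assumed numbers, roughly 30% of the vehicle’s initial mass would be propellant for that maneuver; the real figure depends on Mira’s actual engine and dry mass, which the article does not specify.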

Impulse Space plans additional missions in 2024, he said. The company will take advantage of future SpaceX Transporter missions as well as opportunities on other vehicles like Relativity Space’s Terran.

Matsumori said the company is seeing growing demand for in-space transportation services. “The market for customers for either LEO transfers or other orbit transfers is developing at about the same pace as the in-space transportation capabilities are developing,” he said. “In the last three months, we’ve seen many more customers than we did in the prior six months.”

The number of options for in-space transportation services is also growing. On the Transporter-6 mission SpaceX launched Jan. 3, D-Orbit flew two of its ION satellite carriers that will deploy nine cubesats and support three hosted payloads. Momentus flew Vigoride-5, its second transfer vehicle carrying one cubesat and one hosted payload. Launcher flew its first Orbiter vehicle, with eight customers on board.

Matsumori said that Impulse Space plans to stand out from competitors based on performance. “Most everyone out there has fairly low delta-v’s for the mass they’re carrying,” he said. “We’re pretty much on the high end of the capabilities of the vehicles.”

Mira is the first in a series of vehicles Impulse Space is developing, with future vehicles capable of placing payloads into geostationary transfer orbits or direct insertions into geostationary orbit. “In-space is an infrastructure of capabilities, just like on Earth,” he said. “We have pickups, we have larger vans, and then we have 18-wheelers to be able to do logistics on Earth. Space is going to be no different.”
