City lizards share the same genomic markers

City lizards in different populations show parallel genomic markers when compared with neighboring forest lizards, a study finds.

The genetic variations linked to urbanization underlie physical differences in the urban lizards, including longer limbs and larger toepads, showing how these lizards have evolved to adapt to city environments.

Urbanization has dramatically transformed landscapes around the world—changing how animals interact with nature, creating “heat islands” with higher temperatures, and hurting local biodiversity. Yet many organisms survive and even thrive in these urban environments, taking advantage of new types of habitat created by humans. Researchers studying evolutionary changes in urban species have found that some populations, for example, undergo metabolic changes from new diets or develop an increased tolerance of heat.

“Urbanization impacts roughly two-thirds of the Earth and is expected to continue to intensify, so it’s important to understand how organisms might be adapting to changing environments,” says Kristin Winchell, assistant professor of biology at New York University and the study’s first author.

“In many ways, cities provide us with natural laboratories for studying adaptive change, as we can compare urban populations with their non-urban counterparts to see how they respond to similar stressors and pressures over short periods of time.”

Urban lizards had significantly longer limbs and larger toe pads with more specialized scales on their toes. (Credit: NYU)

Anolis cristatellus lizards—a small-bodied species also known as the Puerto Rican crested anole—are common in both urban and forested areas of Puerto Rico. Prior studies by Winchell and her colleagues found that urban Anolis cristatellus have evolved certain traits to live in cities: they have larger toepads with more specialized scales that allow them to cling to smooth surfaces like walls and glass, and have longer limbs that help them sprint across open areas.

In the new study, the researchers looked at 96 Anolis cristatellus lizards from three regions of Puerto Rico—San Juan, Arecibo, and Mayagüez—comparing lizards living in urban centers with those living in forests surrounding each city. Their findings appear in PNAS.

They first confirmed that the lizard populations in the three regions were genetically distinct from one another, so any similarities they found among lizards across the three cities could be attributed to urbanization. They then measured the lizards’ limbs and toepads and found that urban lizards had significantly longer limbs and larger toepads with more specialized scales on their toes, supporting the earlier finding that these traits evolved to help urban lizards thrive in cities.

To understand the genetic basis of these trait differences, the researchers conducted several genomic analyses on exomic DNA, the regions of the genome that code for proteins. They identified a set of 33 genes found in three regions of the lizard genome that were repeatedly associated with urbanization across populations, including genes related to immune function and metabolism.

“While we need further analysis of these genes to really know what this finding means, we do have evidence that urban lizards get injured more and have more parasites, so changes to immune function and wound healing would make sense. Similarly, urban anoles eat human food, so it is possible that they could be experiencing changes to their metabolism,” says Winchell.

In an additional analysis, they found 93 genes in the urban lizards that are important for limb and skin development, offering a genomic explanation for their longer limbs and larger toepads.

“The physical differences we see in the urban lizards appear to be mirrored at the genomic level,” says Winchell. “If urban populations are evolving with parallel physical and genomic changes, we may even be able to predict how populations will respond to urbanization just by looking at genetic markers.”

“Understanding how animals adapt to urban environments can help us focus our conservation efforts on the species that need it the most, and even build urban environments in ways that maintain all species,” adds Winchell.

Do the differences in urban lizards apply to people living in cities? Not necessarily, according to Winchell, as humans aren’t at the whim of predators like lizards are. But humans are subject to some of the same urban factors, including pollution and higher temperatures, that seem to be contributing to adaptation in other species.

Additional study authors are from Princeton University; Washington University in St. Louis; the University of Massachusetts Boston and Universidad Católica de la Santísima Concepción in Chile; Virginia Commonwealth University; and Rutgers University-Camden. The research had funding in part from the National Science Foundation, and from the University of Massachusetts Boston Bollinger Memorial Research Grant.

Source: NYU


Team pinpoints genetic cause of late-onset ataxia

Researchers have identified a previously unknown genetic cause of a late-onset cerebellar ataxia.

The discovery will improve diagnosis and open new treatment avenues for this progressive condition.

Late-onset cerebellar ataxias (LOCA) are a heterogeneous group of neurodegenerative diseases that manifest in adulthood with unsteadiness. One to three in 100,000 people worldwide will develop a late-onset ataxia.

Until recently, most patients with late-onset ataxia had remained without a genetic diagnosis.

Researchers led by Bernard Brais, a neurologist and researcher at The Neuro (Montreal Neurological Institute-Hospital) of McGill University, and Stephan Züchner of the University of Miami’s Miller School of Medicine, in collaboration with neurologists from the Universities of Montreal and Sherbrooke, studied a group of 66 Quebec patients from different families who had late-onset ataxia for which an underlying genetic cause had not yet been found.

Using the most advanced genetic technologies, the team found that 40 (61%) of the patients carried the same novel disease-causing variant in the gene FGF14, making it the most common genetic cause of late-onset ataxia in Quebec. They found that a small stretch of repetitive DNA underwent a large size increase in patients, a phenomenon known as repeat expansion.

To confirm their initial findings, the team reached out to international collaborators in Tübingen in Germany, Perth in Australia, London in the United Kingdom, and Bengaluru in India. They found that 10-18% of LOCA patients in these independent cohorts also carried the same FGF14 error. These results confirmed that a repeat expansion in FGF14 is one of the most common genetic causes of late-onset ataxia described to date.

Going beyond gene discovery, the team also studied brains from deceased patients and neurons derived from patients and found that the error causes decreased expression of the gene and its protein.

Patients with FGF14-related late-onset ataxia experience unsteadiness (ataxia) usually beginning in their fifties. The disease may start with short episodes of ataxia that can be precipitated by exercise and alcohol intake. The coordination problems become permanent at an average age of 59. The disease usually progresses slowly and affects walking, speech, and hand coordination, and with time patients often need walking aids. The condition is most often transmitted from an affected parent, although it can appear in families with no previous history of ataxia.

The GAA repeat sequence of the FGF14 mutation has drawn much interest because it is identical to the repeat that causes Friedreich ataxia in the FXN gene, the most common cause of autosomal recessive ataxia worldwide. Because of this similarity, it is possible that some of the new treatments under development for Friedreich ataxia may be used to treat patients with an expansion in FGF14.

This study also suggests that patients may benefit from a drug called aminopyridine that is already marketed for other neurological conditions. This is especially promising since some patients with an expansion in FGF14 have responded well to this treatment.

“This opens the door to a clinical trial of this drug in these patients,” says Brais. “It’s great news for patients in Canada and worldwide. It also makes genetic testing possible so people and families can arrive at the end of their long diagnostic journey.”

The research appears in the New England Journal of Medicine. Funding for the study came from many national agencies, including the NIH, the Fondation Groupe Monaco, and the Montreal General Hospital Foundation.

Source: McGill University


How many people does a scoop of wastewater represent?

Researchers have developed a machine learning model that uses the assortment of microbes found in wastewater to tease out how many individual people they represent.

Research from the lab of Fangqiong Ling showed earlier this year that the amount of SARS-CoV-2 in a wastewater system was correlated with the burden of disease—COVID-19—in the region it served.

But before that work could be done, Ling needed to know: How can you figure out the number of individuals represented in a random sample of wastewater?

A chance encounter with a colleague helped Ling, an assistant professor in the department of energy, environmental, and chemical engineering at the McKelvey School of Engineering at Washington University in St. Louis, create a method to do just that.

Going forward, this method may be able to link other properties in wastewater to individual-level data.

The problem was straightforward: “If you just take one scoop of wastewater, you don’t know how many people you’re measuring,” Ling says. This is counter to the way studies are typically designed.

“Usually when you design your experiment, you design your sample size, you know how many people you’re measuring,” Ling says. Before she could look for a correlation between SARS-CoV-2 and the number of people with COVID, she had to figure out how many people were represented in the water she was testing.

Initially, Ling thought that machine learning might be able to uncover a straightforward relationship between the diversity of microbes and the number of people a sample represented, but the simulations, done with an “off-the-shelf” machine learning model, didn’t pan out.

Then Ling had a chance encounter with Likai Chen, an assistant professor of mathematics and statistics. The two realized they shared an interest in working with novel, complex data. Ling mentioned that she was working on a project that Chen might be able to contribute to.

“She shared the problem with me and I said, that’s indeed something we can do,” Chen says. It happened that Chen was working on a problem that used a technique that Ling also found helpful.

The key to being able to tease out how many individual people were represented in a sample is related to the fact that the bigger the sample, the more likely it is to resemble the mean, or average. But in reality, individuals tend not to be exactly “average.” Therefore, if a sample looks like an average sample of microbiota, it’s likely to be made up of many people. The farther away it is from the average, the more likely it is to represent an individual.

“But now we are dealing with high-dimensional data, right?” Chen says. There are near-endless ways that you can group these different microbes to form a sample. “So that means we have to find out, how do we aggregate that information across different locations?”

Using this basic intuition—and a lot of math—Chen worked with Ling to develop a more tailored machine learning algorithm that, once trained on real samples of microbiota from more than 1,100 people, could determine how many people were represented in a wastewater sample (these samples were unrelated to the training data).
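As a rough illustration of that intuition (a toy sketch, not the published model), the Python snippet below pools synthetic microbiome profiles and shows that a pooled sample’s distance from the population average shrinks as more people contribute; the number of taxa, the Dirichlet profile distribution, and the pool sizes are all assumptions made only for demonstration.

```python
# Toy sketch of the pooling intuition only; not the algorithm from the paper.
# Synthetic "relative abundance" profiles stand in for individual microbiomes.
import numpy as np

rng = np.random.default_rng(0)
n_taxa = 200                                              # assumed number of microbial taxa
people = rng.dirichlet(np.full(n_taxa, 0.5), size=1100)   # 1,100 simulated individuals
population_mean = people.mean(axis=0)

def pooled_distance(n_people: int) -> float:
    """Distance between a random n-person pooled sample and the population average."""
    idx = rng.choice(len(people), size=n_people, replace=False)
    pooled = people[idx].mean(axis=0)                     # mixing n microbiomes ~ averaging them
    return float(np.linalg.norm(pooled - population_mean))

for n in (1, 5, 25, 100, 500):
    avg_d = np.mean([pooled_distance(n) for _ in range(50)])
    print(f"{n:4d} people -> distance from the average profile: {avg_d:.4f}")

# The distance shrinks roughly like 1/sqrt(n), so a model trained on
# (profile, known pool size) pairs can invert the relationship to estimate
# how many people a wastewater sample represents.
```

The published method learns this relationship from real microbiome data and aggregates information across many taxa, as Chen describes above; the toy version only makes the shrinking-deviation idea concrete.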

“It’s much faster and it can be trained on a laptop,” Ling says. And it’s not only useful for the microbiome: with sufficient examples—training data—this algorithm could also use viruses from the human virome or metabolic chemicals to link individuals to wastewater samples.

“This method was used to test our ability to measure population size,” Ling says. But it goes much further. “Now we are developing a framework to allow validation across studies.”

The research appears in the journal PLOS Computational Biology.

Source: Washington University in St. Louis


1 opioid-use disorder med may be safest during pregnancy

People with opioid-use disorder who are pregnant may have more favorable neonatal health outcomes when using buprenorphine compared with methadone, a new study shows.

Buprenorphine is an active ingredient in Suboxone and other medications approved for treatment of opioid-use disorder.

The study, published in the New England Journal of Medicine, compared the safety of the two medications—buprenorphine and methadone—used to treat opioid-use disorder during pregnancy.

“Our results may encourage increasing access to buprenorphine treatment specifically among pregnant people,” says lead author Elizabeth Suarez, a pharmacoepidemiologist who is a faculty member at the Center for Pharmacoepidemiology and Treatment Science at the Rutgers Institute for Health, Health Care Policy and Aging Research.

“It’s essential for the general public to understand the importance of opioid-use disorder treatment during pregnancy to avoid harms associated with lack of treatment.”

Overdose deaths from opioids continue to increase and opioid-use disorder remains a prevalent issue in the United States, according to the National Center for Health Statistics and research in Drug and Alcohol Dependence.

People who have untreated opioid-use disorders while pregnant and their infants are at greater risk of harms because of withdrawal, continued opioid use, and overdose. Previous studies found buprenorphine may be safer for the infant than methadone, but results were uncertain.

Using a large national database of people insured by Medicaid, researchers examined maternal outcomes in a sample of pregnant individuals with opioid-use disorder, including delivery by cesarean section and severe pregnancy complications, and infant outcomes, including preterm birth, low birth weight, small size for gestational age, and neonatal abstinence syndrome, which occurs when an infant exposed to certain drugs during pregnancy experiences withdrawal after delivery.

They found that using buprenorphine to treat opioid-use disorder during pregnancy may result in better outcomes for the baby than methadone. Buprenorphine use was associated with lower risk of preterm birth, small size for gestational age, low birth weight, and neonatal abstinence syndrome compared with methadone use.

“These results may guide clinical recommendations for people with opioid-use disorder who are pregnant or are hoping to be pregnant,” says Suarez, who also is an instructor of epidemiology with the department of biostatistics and epidemiology at the Rutgers School of Public Health.

Future research should explore whether pregnant people with opioid-use disorder have a better experience taking buprenorphine or methadone and if they are more likely to stay on one medication longer than the other, Suarez says.

Additional coauthors are from Brigham and Women’s Hospital, Harvard Medical School, and Stanford University School of Medicine. The National Institute on Drug Abuse supported the work.

Source: Rutgers University


How to wake up alert and refreshed

Researchers have discovered that you can wake up each morning without feeling sluggish by paying attention to three key factors: sleep, exercise, and breakfast.

Do you feel groggy until you’ve had your morning joe? Do you battle sleepiness throughout the workday?

You’re not alone. Many people struggle with morning alertness, but the new study demonstrates that waking up refreshed each day is not just something a lucky few are born with.

“From car crashes to work-related accidents, the cost of sleepiness is deadly.”

The findings come from a detailed analysis of the behavior of 833 people who, over a two-week period, were given a variety of breakfast meals; wore wristwatches to record their physical activity and sleep quantity, quality, timing, and regularity; kept diaries of their food intake; and recorded their alertness levels from the moment they woke up and throughout the day.

The researchers included twins—identical and fraternal—in the study to disentangle the influence of genes from environment and behavior.

The researchers found that the secret to alertness is a three-part prescription requiring substantial exercise the previous day, sleeping longer and later into the morning, and eating a breakfast high in complex carbohydrates, with limited sugar.

The researchers also discovered that a healthy controlled blood glucose response after eating breakfast is key to waking up more effectively.

“All of these have a unique and independent effect,” says Raphael Vallat, a postdoctoral fellow at the University of California, Berkeley, and first author of the study. “If you sleep longer or later, you’re going to see an increase in your alertness. If you do more physical activity on the day before, you’re going to see an increase. You can see improvements with each and every one of these factors.”

Morning grogginess is more than just an annoyance. It has major societal consequences: Many auto accidents, job injuries, and large-scale disasters are caused by people who cannot shake off sleepiness. The Exxon Valdez oil spill in Alaska, the Three Mile Island nuclear meltdown in Pennsylvania, and an even worse nuclear accident in Chernobyl, Ukraine, are well-known examples.

“Many of us think that morning sleepiness is a benign annoyance. However, it costs developed nations billions of dollars every year through loss of productivity, increased health care utilization, and work absenteeism. More impactful, however, is that it costs lives—it is deadly,” says senior author Matthew Walker, professor of neuroscience and psychology and author of Why We Sleep (Simon & Schuster, 2018).

“From car crashes to work-related accidents, the cost of sleepiness is deadly. As scientists, we must understand how to help society wake up better and help reduce the mortal cost to society’s current struggle to wake up effectively each day.”

What you eat

Walker and Vallat teamed up with researchers in the United Kingdom, the US, and Sweden to analyze data acquired by a UK company, Zoe Ltd., which has followed hundreds of people for two-week periods in order to learn how to predict individualized metabolic responses to foods based on a person’s biological characteristics, lifestyle factors, and the foods’ nutritional composition.

The researchers gave participants preprepared meals, with different proportions of nutrients incorporated into muffins, for the entire two weeks to see how they responded to different diets upon waking. A standardized breakfast, with moderate amounts of fat and carbohydrates, was compared to a high protein (muffins plus a milkshake), high carbohydrate, or high sugar (glucose drink) breakfast. The subjects also wore continuous glucose monitors to measure blood glucose levels throughout the day.

“…there are still some basic, modifiable, yet powerful ingredients to the awakening equation that people can focus on…”

The worst type of breakfast, on average, contained high amounts of simple sugar; it was associated with an inability to wake up effectively and maintain alertness. When given this sugar-infused breakfast, participants struggled with sleepiness.

In contrast, the high carbohydrate breakfast—which contained large amounts of carbohydrates, as opposed to simple sugar, and only a modest amount of protein—was linked to individuals revving up their alertness quickly in the morning and sustaining that alert state.

“A breakfast rich in carbohydrates can increase alertness, so long as your body is healthy and capable of efficiently disposing of the glucose from that meal, preventing a sustained spike in blood sugar that otherwise blunts your brain’s alertness,” Vallat says.

“We have known for some time that a diet high in sugar is harmful to sleep, not to mention being toxic for the cells in your brain and body,” Walker adds. “However, what we have discovered is that, beyond these harmful effects on sleep, consuming high amounts of sugar in your breakfast, and having a spike in blood sugar following any type of breakfast meal, markedly blunts your brain’s ability to return to waking consciousness following sleep.”

How you sleep

It wasn’t all about food, however. Sleep mattered significantly. In particular, Vallat and Walker discovered that sleeping longer than you usually do, and/or sleeping later than usual, resulted in individuals ramping up their alertness very quickly after awakening from sleep.

According to Walker, between seven and nine hours of sleep is ideal for ridding the body of “sleep inertia,” the inability to transition effectively to a state of functional cognitive alertness upon awakening. Most people need this amount of sleep to remove a chemical called adenosine that accumulates in the body throughout the day and brings on sleepiness in the evening, something known as sleep pressure.

“Considering that the majority of individuals in society are not getting enough sleep during the week, sleeping longer on a given day can help clear some of the adenosine sleepiness debt they are carrying,” Walker speculates.

“In addition, sleeping later can help with alertness for a second reason,” he says. “When you wake up later, you are rising at a higher point on the upswing of your 24-hour circadian rhythm, which ramps up throughout the morning and boosts alertness.”

It’s unclear, however, what physical activity does to improve alertness the following day.

“It is well known that physical activity, in general, improves your alertness and also your mood level, and we did find a high correlation in this study between participants’ mood and their alertness levels,” Vallat says. “Participants that, on average, are happier also feel more alert.”

But Vallat also notes that exercise is generally associated with better sleep and a happier mood.

“It may be that exercise-induced better sleep is part of the reason exercise the day before, by helping sleep that night, leads to superior alertness throughout the next day,” Vallat says.

Walker notes that the restoration of consciousness from non-consciousness—from sleep to wake—is unlikely to be a simple biological process.

“If you pause to think, it is a non-trivial accomplishment to go from being nonconscious, recumbent, and immobile to being a thoughtful, conscious, attentive, and productive human being, active, awake, and mobile. It’s unlikely that such a radical, fundamental change is simply going to be explained by tweaking one single thing,” he says. “However, we have discovered that there are still some basic, modifiable, yet powerful ingredients to the awakening equation that people can focus on—a relatively simple prescription for how best to wake up each day.”

It’s under your control

Comparisons of data between pairs of identical and non-identical twins showed that genetics plays only a minor and insignificant role in next-day alertness, explaining only about 25% of the differences across individuals.
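As a loose illustration of how twin comparisons can put a number like 25% on genetic influence, the sketch below applies Falconer’s classic twin formula to hypothetical within-pair correlations; the study’s own statistical model is not described here, and the correlation values are invented purely so the arithmetic lands near the reported figure.

```python
# Illustrative twin-design arithmetic (Falconer's formula); not the study's actual model.
r_identical = 0.45   # hypothetical alertness correlation within identical twin pairs
r_fraternal = 0.325  # hypothetical correlation within fraternal (non-identical) pairs

# Identical twins share essentially all their segregating genes, fraternal twins about half,
# so doubling the difference in within-pair correlations approximates heritability.
heritability = 2 * (r_identical - r_fraternal)
print(f"Share of alertness differences attributable to genes: {heritability:.0%}")  # ~25%
```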

“We know there are people who always seem to be bright-eyed and bushy-tailed when they first wake up,” Walker says. “But if you’re not like that, you tend to think, ‘Well, I guess it’s just my genetic fate that I’m slow to wake up. There’s really nothing I can do about it, short of using the stimulant chemical caffeine, which can harm sleep.’

“But our new findings offer a different and more optimistic message. How you wake up each day is very much under your own control, based on how you structure your life and your sleep. You don’t need to feel resigned to any fate, throwing your hands up in disappointment because, ‘…it’s my genes, and I can’t change my genes.’ There are some very basic and achievable things you can start doing today, and tonight, to change how you awake each morning, feeling alert and free of that grogginess.”

Walker, Vallat, and their colleagues continue their collaboration with the Zoe team, examining novel scientific questions about how sleep, diet, and physical exercise change people’s brain and body health, steering them away from disease and sickness.

Additional coauthors of the paper are from King’s College London; Lund University in Malmö, Sweden; Zoe Ltd.; the University of Nottingham in the UK; and Massachusetts General Hospital and Harvard Medical School in Boston. Zoe Ltd. and the Department of Twin Studies at King’s College London funded the study.

The research appears in Nature Communications.

Source: UC Berkeley


New solar tech is nearly 10X more efficient at splitting water

A new kind of solar panel has achieved 9% efficiency in converting water into hydrogen and oxygen—mimicking a crucial step in natural photosynthesis.

Outdoors, it represents a major leap in the technology: it is nearly 10 times more efficient than previous solar water-splitting experiments of its kind.

But the biggest benefit is driving down the cost of sustainable hydrogen. This is enabled by shrinking the semiconductor, typically the most expensive part of the device. The team’s self-healing semiconductor withstands concentrated light equivalent to 160 suns.

Currently, humans produce hydrogen from the fossil fuel methane, using a great deal of fossil energy in the process. However, plants harvest hydrogen atoms from water using sunlight. As humanity tries to reduce its carbon emissions, hydrogen is attractive as both a standalone fuel and as a component in sustainable fuels made with recycled carbon dioxide. Likewise, it is needed for many chemical processes, producing fertilizers for instance.

“In the end, we believe that artificial photosynthesis devices will be much more efficient than natural photosynthesis, which will provide a path toward carbon neutrality,” says Zetian Mi, a University of Michigan professor of electrical and computer engineering who led the study in Nature.

The outstanding result comes from two advances. The first is the ability to concentrate the sunlight without destroying the semiconductor that harnesses the light.

“We reduced the size of the semiconductor by more than 100 times compared to some semiconductors only working at low light intensity,” says Peng Zhou, a research fellow in electrical and computer engineering and first author of the study. “Hydrogen produced by our technology could be very cheap.”

And the second is using both the higher energy part of the solar spectrum to split water and the lower part of the spectrum to provide heat that encourages the reaction. The magic is enabled by a semiconductor catalyst that improves itself with use, resisting the degradation that such catalysts usually experience when they harness sunlight to drive chemical reactions.

In addition to handling high light intensities, it can thrive in high temperatures that are punishing to computer semiconductors. Higher temperatures speed up the water splitting process, and the extra heat also encourages the hydrogen and oxygen to remain separate rather than renewing their bonds and forming water once more. Both of these helped the team to harvest more hydrogen.

For the outdoor experiment, Zhou set up a lens about the size of a house window to focus sunlight onto an experimental panel just a few inches across. Within that panel, the semiconductor catalyst was covered in a layer of water, bubbling with the hydrogen and oxygen gasses it separated.

The catalyst is made of indium gallium nitride nanostructures, grown onto a silicon surface. That semiconductor wafer captures the light, converting it into free electrons and holes—positively charged gaps left behind when electrons are liberated by the light. The nanostructures are peppered with nanoscale balls of metal, 1/2000th of a millimeter across, that use those electrons and holes to help direct the reaction.

A simple insulating layer atop the panel keeps the temperature at a toasty 75 degrees Celsius, or 167 degrees Fahrenheit, warm enough to help encourage the reaction while also being cool enough for the semiconductor catalyst to perform well. The outdoor version of the experiment, with less reliable sunlight and temperature, achieved 6.1% efficiency at turning the energy from the sun into hydrogen fuel. However, indoors, the system achieved 9% efficiency.

The next challenges the team intends to tackle are to further improve the efficiency and to achieve ultrahigh purity hydrogen that can be directly fed into fuel cells.

Some of the intellectual property related to this work has been licensed to NS Nanotech Inc. and NX Fuels Inc., which were co-founded by Mi. The University of Michigan and Mi have a financial interest in both companies.

Support for the work came from the National Science Foundation, the Department of Defense, the Michigan Translational Research and Commercialization Innovation Hub, the Blue Sky Program in the College of Engineering at the University of Michigan, and the Army Research Office.

Source: University of Michigan


Guys who don’t feel pain seem more muscular

People perceive men described as insensitive to pain as larger and stronger than men described as sensitive to pain, research finds.

Before any physical conflict, people assess their opponent’s features to determine if the ideal tactical response is to fight, flee, or attempt to negotiate.

Throughout evolution, bigger, stronger animals have won fights with smaller, weaker animals. Because of this, when people think about the features that determine who will win a fight, they summarize those features by adjusting a mental picture of their opponent’s size and strength.

According to a new study co-led by Wilson Merrell, a doctoral candidate in psychology at the University of Michigan, and Daniel Fessler, professor of anthropology at UCLA, how we picture an opponent is affected by a psychological feature of the opponent—namely how sensitive they are to pain.

Because it allows people to persist longer in violent conflict, insensitivity to pain can be a valuable characteristic when it comes to winning fights, and this is reflected in how we picture an opponent, the researchers say.

Merrell and Fessler conducted three studies with nearly 1,000 United States online crowdsource workers.

In the first set of studies, participants read about a man who was either highly insensitive to pain (e.g., someone who didn’t feel pain very strongly during events like getting an injection at the doctor or stubbing their toe) or highly sensitive to pain (e.g., someone who felt excruciating pain during those same events).

Participants who read about the pain-insensitive man envisioned him to be taller and more muscular than participants who read about the pain-sensitive man. As the researchers expected, knowing that someone is insensitive to pain causes that person to be seen as more physically imposing.

In a final study, the researchers tested whether a man’s access to a tool that could be used as a weapon affected how sensitive to pain he appeared to be. Participants either saw a picture of a man holding an object that could be used to hurt someone (like a kitchen knife) or an object that could not (like a spatula). The men holding dangerous tools were seen as more insensitive to painful situations like getting a paper cut or bumping their head on a piece of furniture than men holding harmless tools.

The research suggests that representations of physical characteristics like height and muscularity are also subject to assessments of psychological traits, like pain sensitivity.

“Perceptions of others’ sensitivity to pain may play an important role in a variety of social interactions,” Fessler says. “When I first started exploring this topic, I was surprised that so little research had been done outside of medical contexts.

“It was particularly exciting to discover that the relationship between how intimidating someone seems and their sensitivity to pain works both ways—knowing that someone is insensitive to pain makes them seem more formidable, and, conversely, knowing that someone is intimidating makes them seem less sensitive to pain.”

Merrell says the relationship between assessments of pain insensitivity and physical size may have implications for social contexts where judgments about pain, size, and threat influence decision-making.

For example, future work could explore how stereotypes about high pain tolerance, which are often applied to Black men in the United States, play into stereotypes about physical size and influence decision-making in power-imbalanced situations, such as health care and policing.

The study’s other authors are from the University of California, Merced and the University of Michigan. The findings appear in the current issue of Evolution and Human Behavior.

Source: University of Michigan


After long decline, stroke deaths are rising again

An analysis of stroke deaths in the United States from 1975 to 2019 finds both a dramatic decline and the potential for an important resurgence.

Stroke mortality (per 100,000) plummeted from 88 to 31 for women and 112 to 39 for men between 1975 and 2019 in the United States.

Total stroke deaths fell despite the rise in age-adjusted risk because stroke rates skyrocket as people get older. A 10% reduction in the fatality rate for 75-year-olds would more than offset a doubling of the fatality rate among 35-year-olds because strokes are 100 times more common in 75-year-olds.
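To make that offset concrete, here is a back-of-the-envelope version of the arithmetic, assuming equal-sized age groups and an arbitrary baseline rate; the numbers are illustrative, not figures from the study.

```python
# Illustrative arithmetic: why a small gain at older ages outweighs a big loss at younger ages.
rate_35 = 1.0              # fatal-stroke rate among 35-year-olds (arbitrary units)
rate_75 = 100 * rate_35    # strokes are roughly 100 times more common at age 75

extra_deaths_young = 2 * rate_35 - rate_35   # doubling the 35-year-old rate adds 1 unit
deaths_avoided_old = 0.10 * rate_75          # a 10% cut at age 75 removes 10 units

print(deaths_avoided_old - extra_deaths_young)   # 9.0 -> a net decline despite rising young-age risk
```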

However, barring further improvements in stroke prevention or treatment, the most recent figures demonstrate that total stroke fatalities will rise as millennials age. Age-adjusted stroke deaths per 100,000 people bottomed out in 2014 and climbed again during the last five years of the study period.

“Starting around 1960, the later you were born, the higher your risk of suffering a fatal ischemic stroke at any particular age,” says lead author Cande Ananth, chief of the division of epidemiology and biostatistics in the department of obstetrics, gynecology, and reproductive sciences at Rutgers Robert Wood Johnson Medical School.

“This study didn’t identify a cause for this trend, but other research suggests the main culprits are increasing rates of obesity and diabetes.”

The analysis used a comprehensive death-certificate database to identify virtually every adult under the age of 85 who died from a stroke during the 44 years—4,332,220 deaths in all.

It was the first stroke-death analysis to divide patients by their year of birth (cohort) and the first to identify the steady rise in age-adjusted ischemic stroke risk from the late 1950s to the early 1990s.

This “age-period-cohort analysis,” which further divided patients by their age at death, also allowed the study team to make two other novel insights:

  • Stroke fatality rates have fallen more for ischemic strokes, which occur when blood vessels to the brain are blocked, than hemorrhagic strokes, which occur when blood vessels leak or burst. The ischemic stroke fatality rate fell roughly 80% over the study period, while the hemorrhagic stroke fatality rate fell roughly 65%.
  • The disparity between male and female stroke fatality rates diminishes as patient age increases. At age 55, men are more than twice as likely as women to suffer a fatal stroke, but by age 85 the rates of fatal stroke are virtually identical.

“After nearly four decades of declining stroke-related mortality, the risk appears to be increasing in the United States,” Ananth says. “Our research underscores the need for novel strategies to combat this alarming trend.”

The study appears in the International Journal of Epidemiology.

Source: Rutgers University


Maya people shopped at places like today’s supermarkets

More than 500 years ago in the midwestern Guatemalan highlands, Maya people bought and sold goods at markets with far less oversight from their rulers than archeologists previously thought.

That’s according to a new study that shows the ruling K’iche’ elite took a hands-off approach when it came to managing the procurement and trade of obsidian by people outside their region of central control.

In these areas, access to nearby sources of obsidian, a glasslike rock used to make tools and weapons, was managed by local people through independent and diverse acquisition networks. Over time, the availability of obsidian resources and the prevalence of craftsmen to shape it resulted in a system that is in many ways suggestive of contemporary market-based economies.

“Scholars have generally assumed that the obsidian trade was managed by Maya rulers, but our research shows that this wasn’t the case at least in this area,” says Rachel Horowitz, assistant professor of anthropology at Washington State University and lead author of the study published in the journal Latin American Antiquity.

“People seem to have had a good deal of economic freedom including being able to go to places similar to the supermarkets we have today to buy and sell goods from craftsmen.”

While there are extensive written records from the Maya Postclassic Period (1200-1524 AD) on political organization, much less is known about how societal elites wielded economic power. Horowitz set out to address this knowledge gap for the K’iche’ by examining the production and distribution of obsidian artifacts, which are used as a proxy by archeologists to determine the level of economic development in a region.

She performed geochemical and technological analysis on obsidian artifacts excavated from 50 sites around the K’iche’ capital of Q’umarkaj and the surrounding region to determine where the raw material originally came from and the techniques used to work it.

Her results show that the K’iche’ acquired their obsidian from similar sources in the Central K’iche’ region and Q’umarkaj, indicating a high degree of centralized control. The ruling elite also seemed to manage the trade of more valuable forms of nonlocal obsidian, particularly Pachuca obsidian from Mexico, based on its abundance in these central sites.

Outside this core region, though, in areas conquered by the K’iche’, there was less similarity in obsidian economic networks. Horowitz’s analysis suggests these sites had access to their own sources of obsidian and developed specialized places where people could go to buy blades and other useful implements made from the rock by experts.

“For a long time, there has been this idea that people in the past didn’t have market economies, which when you think about it is kind of weird. Why wouldn’t these people have had markets in the past?” she says. “The more we look into it, the more we realize there were a lot of different ways in which these peoples’ lives were similar to ours.”

The Middle American Research Institute at Tulane University loaned Horowitz the obsidian blades and other artifacts she used for her study. The artifacts were excavated in the 1970s.

Moving forward, Horowitz says she plans to examine more of the collection, the rest of which is housed in Guatemala, to discover further details about how the Maya conducted trade, managed their economic systems, and generally went about their lives.

Source: Washington State University


Gene-editing gets fungi to spill secrets to new drugs

A high-efficiency gene-editing tool can get fungi to produce significantly more natural compounds, including some previously unknown to the scientific community, say researchers.

Using the approach that simultaneously modifies multiple sites in fungal genomes, Rice University chemical and biomolecular engineer Xue Sherry Gao and collaborators coax fungi into revealing their best-kept secrets, ramping up the pace of new drug discovery.

It is the first time that the technique, multiplex base-editing (MBE), has been deployed as a tool for mining fungal genomes for medically useful compounds. Compared to single-gene editing, the MBE platform reduces the research timeline by over 80% in equivalent experimental settings, from an estimated three months to roughly two weeks.

Fungi and other organisms produce bioactive small molecules such as penicillin to protect themselves from disease agents. These bioactive natural products (NPs) can be used as drugs or as molecular blueprints for designing new drugs.

The study appears in the Journal of the American Chemical Society.

Gene-editing fungi

Base-editing refers to the use of CRISPR-based tools in order to modify a rung in the spiral ladder of DNA known as a base pair. Previously, gene modifications using base-editing had to be carried out one at a time, making the research process more time-consuming. “We created a new machinery that enables base-editing to work on multiple genomic sites, hence the ‘multiplex,’” Gao says.

Gao and her team first tested the efficacy of their new base-editing platform by targeting genes encoding for pigment in a fungal strain known as Aspergillus nidulans. The effectiveness and precision of MBE-enabled genome edits were readily visible in the changed color displayed by A. nidulans colonies.

‘Cryptic’ genes

“To me, the fungal genome is a treasure,” Gao says, referring to the significant medical potential of fungi-derived compounds. “However, under most circumstances, fungi ‘keep to themselves’ in the laboratory and don’t produce the bioactive small molecules we are looking for. In other words, the majority of genes or biosynthetic gene clusters of interest to us are ‘cryptic,’ meaning they do not express their full biosynthetic potential.

“The genetic, epigenetic, and environmental factors that instruct organisms to produce these medically useful compounds are extremely complicated in fungi,” Gao says. Enabled by the MBE platform, her team can easily delete several of the regulatory genes that restrict the production of bioactive small molecules. “We can observe the synergistic effects of eliminating those factors that make the biosynthetic machinery silent,” she says.

Disinhibited, the engineered fungal strains produce more bioactive molecules, each with their own distinct chemical profiles. Five of the 30 NPs generated in one assay were new, never-before-reported compounds.

“These compounds could be useful antibiotics or anticancer drugs,” Gao says. “We are in the process of figuring out what the biological functions of these compounds are and we are collaborating with groups in the Baylor College of Medicine on pharmacological small-molecule drug discovery.”

Gao’s research plumbs fungal genomes in search of gene clusters that synthesize NPs. “Approximately 50% of clinical drugs approved by the US Food and Drug Administration are NPs or NP-derivatives,” and fungi-derived NPs “are an essential pharmaceutical source,” she says. Penicillin, lovastatin, and cyclosporine are some examples of drugs derived from fungal NPs.

The National Institutes of Health and the Robert A. Welch Foundation supported the research.

Source: Rice University
