Global reservoirs are getting emptier

Over the past two decades, global reservoirs have become increasingly empty despite an overall increase in total storage capacity due to the construction of new reservoirs, research finds.

Led by Huilin Gao, associate professor in the department of civil and environmental engineering at Texas A&M University, researchers used a new approach with satellite data to estimate the storage variations of 7,245 global reservoirs from 1999 to 2018. The researchers report findings in Nature Communications.

Overall, global reservoir storage increased at an annual rate of 28 cubic kilometers, attributed to the construction of new reservoirs. However, despite these efforts, the data reveal that the rate of reservoir filling is lower than anticipated.

“As the global population continues to grow in the 21st century, surface water reservoirs are increasingly being relied on to meet rising demands in the context of a changing climate,” Gao says. “However, the amount of water available in reservoirs and its trends have not been well quantified at the global scale.”

The researchers developed the Global Reservoir Storage dataset, freely available online to benefit decision-makers and the wider science community.

Given the projected decline in water runoff and rising water demand, the observed trend of diminishing storage returns from reservoir construction is expected to continue, with significant implications for water supplies. These findings indicate that future water demands cannot be met solely by constructing new reservoirs, underscoring the need for novel management strategies.

“Through this research, we share a new perspective for reevaluating the socio-economic benefits of new reservoir construction and the tension between growing water demand and lessening water availability in developing countries,” says Yao Li, a former postdoctoral researcher at Texas A&M who is now a professor at Southwest University.

The decline in reservoir storage is particularly prominent in the global south, including South Asia, Africa, and South America. Despite efforts to construct new reservoirs, the data show that they fall short of expected filling levels.

The most significant decline is in South America and Africa, where growing populations contribute to an escalated water demand.

In contrast, reservoirs in the global north, including regions in North America and Europe, are experiencing an upward trend in reaching their maximum capacity. Reservoirs in high-latitude regions like the Great Lakes and Siberia exhibit comparatively higher storage capacities, primarily attributed to their lower population densities and lesser effects from human activities.

The analysis did not consider the sedimentation process, and therefore the overall storage decline presented in this study is conservative.

Additional contributors to this research are from the Institute of Geographic Sciences and Natural Resources Research and Virginia Polytechnic Institute and State University.

This research has funding from NASA and the Texas A&M President’s Excellence Fund X-Grants Program.

Source: Texas A&M University


DNA barcodes identify plants people ate via their poop

A new technique using DNA barcoding to identify the plant matter in human feces may get at the truth, improving clinical trials, nutrition studies, and more.

What people say they’ve eaten and what they’ve actually eaten are often two very different lists of foods. The new technique could reveal the truth.

Building on earlier studies that attempted to compare DNA found in feces with reported diets, researchers in the lab of Lawrence David, an associate professor of molecular genetics and microbiology in the Duke University School of Medicine, have developed a genetic marker for plant-based foods that can be retrieved from poop.

“We can go back after the fact and detect what foods were eaten,” says Brianna Petrone, an MD/PhD student who led the project.

The marker is a region of the DNA that powers chloroplasts, the organelles that convert sunlight into sugars. Every plant has this genomic region, called trnL-P6, but it varies slightly from species to species. In a series of experiments, the researchers tested the marker on more than 1,000 fecal samples from 324 study participants across five different studies, about twenty of whom had high-quality records of their diet.

In findings in the Proceedings of the National Academy of Sciences, the researchers show that these DNA markers can indicate not only what was consumed, but the relative amounts of certain food species, and that the diversity of plant DNA found in feces varies according to a person’s diet, age, and household income.

David’s lab relied on a reference database of dietary plants that contains markers for 468 species typically eaten by Americans to connect versions of trnL-P6 detected in poop to specific plant sources. After some tweaking, their barcode was able to distinguish 83% of all major crop families.
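The matching step described above can be pictured as a lookup: detected sequence variants of the marker region are compared against a reference table of known dietary plants. The sketch below uses made-up sequences for illustration; real trnL-P6 references come from curated databases like the one the lab built.

```python
# Toy sketch of DNA-barcode matching: map sequence variants of a marker
# region to plant sources via a reference table. The sequences here are
# invented for illustration, not real trnL-P6 variants.
reference = {
    "ATCCGTTGAGT": "wheat",
    "ATCCGAGTCGT": "corn",
    "ATGCGTTGACT": "potato family",
}

def identify(reads):
    """Count exact matches of sequencing reads against the reference."""
    counts = {}
    for read in reads:
        plant = reference.get(read)
        if plant is not None:  # unmatched reads are simply skipped
            counts[plant] = counts.get(plant, 0) + 1
    return counts

sample = ["ATCCGTTGAGT", "ATCCGTTGAGT", "ATGCGTTGACT", "GGGGGGGGGGG"]
print(identify(sample))  # → {'wheat': 2, 'potato family': 1}
```

In practice the comparison tolerates sequencing noise and the reference covers hundreds of species, but the principle is the same: a variant that isn't in the database (like the last read above) simply goes unidentified, which is why crop families absent from the reference could not be detected.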

Petrone says the subset of crop families that could not currently be detected tended to be consumed in other parts of the world. The lab is now working to add crops such as pearl millet and pili nuts to their database.

They also haven’t tracked meat intake yet, though the technology is capable of that as well, David says. “That relative ratio of plant to animal intake is probably one of the most important nutritional factors we might look at.”

The scientists first tried the marker out on fecal samples from four individuals in a weight loss intervention, where the researchers knew exactly what participants had been fed a day or two before. Knowing the patients had been given a dish called mushroom wild rice pilaf, for example, they looked for the markers of its components: wild rice, white rice, portobello mushrooms, onion, pecans, thyme, parsley, and sage.

In this and a second intervention group, they found that barcoding could not only identify the plants, it also could identify relative amounts consumed for some kinds of plants. “When big portions of grains or berries were recorded in the meal, we also saw more trnL from those plants in stool,” Petrone says.

Then they looked at samples from 60 adults who had taken part in two studies of fiber supplementation and kept track of what they were eating with surveys. The number of plants detected by trnL was in good agreement with dietary diversity and quality estimated from participants’ survey responses.

Next, they applied the barcoding to a study of 246 adolescents with and without obesity from diverse racial, ethnic, and socioeconomic backgrounds. There was only a minimal record of diet in this cohort.

“Dietary data collection was challenging because some traditional surveys are 140 pages long and take up to an hour to fill out, families are busy, and a child might not be able to fill it out alone,” David says. “But because they had banked stool, we were able to reanalyze those samples and then gather information about diet that could be used to better understand health and lifestyle patterns between kids. What really struck me was that we could recapitulate things that were known as well as get new insights that might not have been as obvious.”

They found 111 different markers from 46 plant families and 72 species in the adolescents’ diet. Four kinds of plants were eaten by more than two thirds of subjects: wheat, found in 96% of participants, chocolate (88%), corn (87%), and the potato family (71%), a group of closely related plants that includes potato and tomatillo.

David says the barcode isn’t able to distinguish individual members of the cabbage family—the brassicas—such as broccoli, Brussels sprouts, kale, and cauliflower, which are closely related.

Still, the large adolescent cohort showed that dietary variety was greater for the higher-income study participants. The older the adolescents were, however, the lower their intake of fruits, vegetables, and whole grain foods, potentially because of a known pattern of older children eating with their families less often.

David says the barcode is readily able to identify the diversity of plants found in a sample as a proxy for dietary diversity, a known marker of nutrient adequacy and better heart health.

David says that in each of these cohorts, the genomic analyses had been carried out on samples that had been collected years in the past, so the technique opens up the possibility of reconstructing dietary data for studies that have already been finished.

The authors think the new methodology should be a boon for all sorts of studies of human nutrition. “We are limited in how we can track our diets, or participate in nutrition research, or improve our own health, because of the current techniques by which diet is tracked,” David says. “Now we can use genomics to help gather data on what people eat around the world, regardless of differences in age, literacy, culture, or health status.”

The team anticipates extending the technique to studies of disease across the globe, as well as monitoring food biodiversity in settings facing climate instability or ecological distress.

Funding for this work came from the National Institute of Diabetes and Digestive and Kidney Diseases, the Burroughs Wellcome Fund Pathogenesis of Infectious Disease Award, the Duke Microbiome Center, the Springer Nature Limited Global Grant for Gut Health, the Chan Zuckerberg Initiative, the Triangle Center for Evolutionary Medicine, the Integrative Bioinformatics for Investigating and Engineering Microbiomes Graduate Student Fellowship, and the Ruth L. Kirschstein National Research Service Award to the Duke Medical Scientist Training Program. This work used a high-performance computing facility partially supported by grants from the North Carolina Biotechnology Center.

Source: Duke


Ancient food scraps in rockshelter clarify shift to farming

Botanical macrofossils, such as maize cobs, avocado seeds, and rinds, from the El Gigante rockshelter in western Honduras offer clues about Holocene life in hunter-gatherer societies.

El Gigante is among only a handful of archaeological sites in the Americas that contain well-preserved botanical remains spanning the last 11,000 years. Considered one of the most important archaeological sites discovered in Central America in the last 40 years, El Gigante was recently nominated as a UNESCO World Heritage site.

“No other location shows, as clearly as El Gigante,” state UNESCO materials about the site’s universal value, “the dynamic character of hunter-gatherer societies, and their adaptive way of life in the Central American highlands, and in Mesoamerica broadly during the early and middle Holocene.”

Now, anthropologists Douglas Kennett and Amber VanDerwarker of UC Santa Barbara, UCSB postdoc Richard George, and colleagues from other institutions have excavated and analyzed botanical macrofossils from El Gigante using modern technologies. Their results appear in the journal PLOS ONE.

Macrobotanical remains from El Gigante. (Credit: Thomas Harper/UCSB)

“Our work at El Gigante demonstrates that the early use and management of tree crops like wild avocado and plums by at least 11,000 years ago,” Kennett says, “set the stage for the development of later systems of arboriculture that, when combined with field cropping of maize, beans, and squash, fueled human population growth, the development of settled agricultural villages, and the first urban centers in Mesoamerica after 3,000 years ago.”

The study provides a major update to the chronology of tree and field crop use evident at El Gigante, drawing on 375 radiocarbon dates. Tree fruits and squash appeared early, around 11,000 years ago, with most other field crops appearing later in time—maize around 4,500 years ago, beans around 2,200 years ago.

The initial focus on tree fruits and squash, Kennett notes, is consistent with early coevolutionary partnering with humans as seed dispersers in the wake of megafaunal extinction in Central America. Tree crops predominated through much of the Holocene, and there was an overall shift to field crops after 4,000 years ago that was largely driven by increased reliance on maize farming.

“The transition to agriculture is one of the most significant transformations of our Earth’s environmental and cultural history,” Kennett says. “The domestication of plants and animals in multiple independent centers worldwide resulted in a major demographic transition in human populations that fueled the transition to more intensive forms of agriculture during the last 10,000 years. Agriculture also provided the economic foundation for urbanism and the development of state institutions after 5,000 years ago in many of these same regions.”

The botanical materials at El Gigante, remarkably well preserved, reflect the transition from foraging to farming, providing a rare glimpse of early foraging strategies and changes in subsistence. Unique in its location along the southern periphery of Mesoamerica, and for its lower elevation than the dry caves of central Mexico, the authors note, El Gigante serves as a macrobotanical archive for interactions and the flow of domesticated plants between Mesoamerica, Central America, and South America. Broader still, it enables researchers to examine the long-term evolutionary and demographic processes involved in the domestication of multiple tree and field crops.

“The quality of the plant preservation at El Gigante is simply unmatched, giving us a deeper understanding of how ancient Hondurans managed their forests, domesticated a variety of plant species, and intensified their cultivation of key resources over millennia,” says VanDerwarker. “What seems clear is that practices of forest management and field cultivation were closely linked and evolved in tandem.”

We can infer some lessons for modern society, says Kennett:

“Our work shows that different types of agricultural systems supported human populations in Central America and that some were more sustainable than others,” he says. “Forest management and arboriculture persisted for thousands of years before it was eclipsed in importance by the expansion of maize farming after 4,000 years ago. The archaeological record provides an archive of human adaptation that should be considered in the context of anthropogenic alteration of our Earth’s climate today. These ancient archives could help rural farmers in Central America adapt to changing conditions moving into the future.”

Source: UC Santa Barbara


Hummingbirds get a bit of alcohol with their food

New research digs into how much alcohol hummingbirds consume.

Your backyard hummingbird feeder filled with sugar water is a natural experiment in fermentation—yeast settle in and turn some of the sugar into alcohol.

The same is true of nectar-filled flowers, which are an ideal gathering place for yeast—a type of fungus—and for bacteria that metabolize sugar and produce ethanol.

To biologist Robert Dudley, this raises a host of questions. How much alcohol do hummingbirds consume in their daily quest for sustenance? Are they attracted to alcohol or repelled by it? Since alcohol is a natural byproduct of the sugary fruit and floral nectar that plants produce, is ethanol an inevitable part of the diet of hummingbirds and many other animals?

“Hummingbirds are eating 80% of their body mass a day in nectar,” says Dudley, professor of integrative biology at University of California, Berkeley. “Most of it is water and the remainder sugar. But even if there are very low concentrations of ethanol, that volumetric consumption would yield a high dosage of ethanol, if it were out there. Maybe, with feeders, we’re not only [feeding] hummingbirds, we’re providing a seat at the bar every time they come in.”

During the worst of the COVID-19 pandemic, when it became difficult to test these questions in the wilds of Central America and Africa, where there are nectar-feeding sunbirds, he tasked several undergraduate students with experimenting on the hummers visiting the feeder outside his office window to find out whether alcohol in sugar water was a turn-off or a turn-on. All three of the test subjects were male Anna’s hummingbirds (Calypte anna), year-round residents of the Bay Area.

The results of that study, which appears in the journal Royal Society Open Science, demonstrate that hummingbirds happily sip from sugar water with up to 1% alcohol by volume, finding it just as attractive as plain sugar water.

They appear to be only moderate tipplers, however, because they sip only half as much as normal when the sugar water contains 2% alcohol.

“They’re consuming the same total amount of ethanol, they’re just reducing the volume of the ingested 2% solution. So that was really interesting,” Dudley says. “That was a kind of a threshold effect and suggested to us that whatever’s out there in the real world, it’s probably not exceeding 1.5%.”

When he and his colleagues tested the alcohol level in sugar water that had sat in the feeder for two weeks, they found a much lower concentration: about 0.05% by volume.

“Now, 0.05% just doesn’t sound like much, and it’s not. But again, if you’re eating 80% of your body weight a day, at 0.05% of ethanol you’re getting a substantial load of ethanol relative to your body mass,” he says. “So it’s all consistent with the idea that there’s a natural, chronic exposure to physiologically significant levels of ethanol derived from this nutritional source.”
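Dudley's point can be checked with back-of-the-envelope arithmetic. The 4-gram body mass and the assumption that nectar has roughly the density of water are illustrative figures, not values from the study; only the 80% daily intake and the 0.05% feeder measurement come from the article.

```python
# Rough daily ethanol dose for a hummingbird drinking from a feeder.
# Assumed (illustrative): 4 g body mass, nectar density ~1 g/mL.
body_mass_g = 4.0
nectar_g = 0.8 * body_mass_g      # drinks ~80% of body mass per day
nectar_ml = nectar_g              # treat nectar density as ~1 g/mL
ethanol_frac = 0.0005             # 0.05% ethanol by volume (feeder measurement)
ethanol_density_g_per_ml = 0.789  # density of pure ethanol

ethanol_g = nectar_ml * ethanol_frac * ethanol_density_g_per_ml
dose_g_per_kg = ethanol_g / (body_mass_g / 1000)
print(round(dose_g_per_kg, 2))  # → 0.32
```

Even at that seemingly tiny concentration, the sheer volume consumed works out to roughly 0.3 g of ethanol per kilogram of body mass per day, which is why Dudley calls the exposure "physiologically significant" despite the birds metabolizing it too quickly to get drunk.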

“They burn the alcohol and metabolize it so quickly. Likewise with the sugars. So they’re probably not seeing any real effect. They’re not getting drunk,” he adds.

The research is part of a long-term project by Dudley and his colleagues—herpetologist Jim McGuire and bird expert Rauri Bowie, both professors of integrative biology and curators at UC Berkeley’s Museum of Vertebrate Zoology. They seek to understand the role that alcohol plays in animal diets, particularly in the tropics, where fruits and sugary nectar easily ferment, and alcohol cannot help but be consumed by fruit-eating or nectar-sipping animals.

“Does alcohol have any behavioral effect? Does it stimulate feeding at low levels? Does it motivate more frequent attendance of a flower if they get not just sugar, but also ethanol? I don’t have the answers to these questions. But that’s experimentally tractable,” he says.

Part of this project, funded by the National Science Foundation, involves testing the alcohol content of fruits in Africa and nectar in flowers in the UC Botanical Garden. No systematic studies of the alcohol content of fruits and nectars, or of alcohol consumption by nectar-sipping birds, insects, or mammals, or by fruit-eating animals—including primates—have been done.

But several isolated studies are suggestive. A 2008 study found that the nectar in palm flowers consumed by pen-tailed tree shrews, which are small, ratlike animals in West Malaysia, had levels of alcohol as high as 3.8% by volume. Another study, published in 2015, found a relatively high alcohol concentration—up to 3.8%—in the nectar eaten by the slow loris, a type of primate, and that both slow lorises and aye-ayes, another primate, preferred nectar with higher alcohol content.

The new study shows that birds are also likely consuming alcohol produced by natural fermentation.

“This is the first demonstration of ethanol consumption by birds, quote, in the wild. I’ll use that phrase cautiously because it’s a lab experiment and feeder measurement,” Dudley says. “But the linkage with the natural flowers is obvious. This just demonstrates that nectar-feeding birds, not just nectar-feeding mammals, not just fruit-eating animals, are all potentially exposed to ethanol as a natural part of their diet.”

The next step, he says, is to measure how much ethanol is naturally found in flowers and determine how frequently it’s being consumed by birds. He plans to extend his study to include Old World sunbirds and honeyeaters in Australia, both of which occupy the nectar-sipping niche that hummingbirds have in the Americas.

Dudley has been obsessed with alcohol use and misuse for years, and in his book, The Drunken Monkey: Why We Drink and Abuse Alcohol (University of California Press, 2014), presented evidence that humans’ attraction to alcohol is an evolutionary adaptation to improve survival among primates. Only with the coming of industrial alcohol production has our attraction turned, in many cases, into alcohol abuse.

“Why do humans drink alcohol at all, as opposed to vinegar or any of the other 10 million organic compounds out there? And why do most humans actually metabolize it, burn it, and use it pretty effectively, often in conjunction with food, but then some humans also consume to excess?” he asks.

“I think, to get a better understanding of human attraction to alcohol, we really have to have better animal model systems, but also a realization that the natural availability of ethanol is actually substantial, not just for primates that are feeding on fruit and nectar, but also for a whole bunch of other birds and mammals and insects that are also feeding on flowers and fruits,” he says. “The comparative biology of ethanol consumption may yield insight into modern day patterns of consumption and abuse by humans.”

This work received support from the National Science Foundation and UC Berkeley’s Undergraduate Research Apprentice Program.

Source: UC Berkeley


Microwaving insecticide could keep bed nets working

To make the insecticide deltamethrin more effective, researchers are turning to the microwave.

Deltamethrin is an insecticide that is commonly incorporated into bed nets to fight mosquitoes that carry malaria. But some mosquitoes have become resistant to it, making the nets less effective and increasing the risk of disease.

Now, experiments by New York University chemistry professor Bart Kahr and colleagues show that heating up insecticides can rearrange their crystal structure, yielding new forms that may work better against mosquitoes. They started their research with DDT before moving on to deltamethrin.

Science News reported on this encouraging development to counter the problem of insecticide resistance.

Kahr and colleagues report their findings in Malaria Journal.

Source: NYU


How the Langya virus enters human cells

New findings shed light on how the highly infectious Langya virus, which has recently transferred from animals to people, is able to enter human cells.

Ariel Isaacs and Yu Shang Low of the University of Queensland have uncovered the structure of the fusion protein of Langya virus, which was discovered in people in eastern China in August 2022.

Isaacs says the virus caused fever and severe respiratory symptoms and was from the same class of viruses as the deadly Nipah and Hendra viruses.

“We’re at an important juncture with viruses from the Henipavirus genus, as we can expect more spillover events from animals to people,” Isaacs says. “It’s important we understand the inner workings of these emerging viruses, which is where our work comes in.”

The team used molecular clamp technology to hold the fusion protein of the Langya virus in place to uncover the atomic structure using cryogenic electron microscopy at the university’s Centre for Microscopy & Microanalysis.

“Understanding the structure and how it enters cells is a critical step towards developing vaccines and treatments to combat Henipavirus infections,” says Isaacs. “There are currently no treatments or vaccines for them, and they have the potential to cause a widespread outbreak.”

Associate professor Daniel Watterson, a senior researcher on the project, says they also saw that the Langya virus fusion protein structure is similar to the deadly Hendra virus, which first emerged in southeast Queensland in 1994.

“These are viruses that can cause severe disease and have the potential to get out of control if we’re not properly prepared,” Watterson says. “We saw with COVID-19 how unprepared the world was for a widespread viral outbreak and we want to be better equipped for the next outbreak.”

The researchers will now work to develop broad-spectrum human vaccines and treatments for Henipaviruses, such as Langya, Nipah, and Hendra.

The research appears in Nature Communications. Support came from the Coalition for Epidemic Preparedness Innovations, the Queensland and Australian governments, and philanthropic partners.

Source: University of Queensland


Poor sense of smell tied to higher depression risk in older adults

Researchers say they have significant new evidence of a link between decreased sense of smell and risk of developing late-life depression.

Their findings in the Journal of Gerontology: Medical Sciences do not demonstrate that loss of smell causes depression, but suggest that it may serve as a potent indicator of overall health and well-being.

“We’ve seen repeatedly that a poor sense of smell can be an early warning sign of neurodegenerative diseases such as Alzheimer’s disease and Parkinson’s disease, as well as a mortality risk. This study underscores its association with depressive symptoms,” says Vidya Kamath, associate professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.

“Additionally, this study explores factors that might influence the relationship between olfaction and depression, including poor cognition and inflammation,” Kamath says.

The study used data gathered from 2,125 participants in a federal government study known as the Health, Aging and Body Composition Study (Health ABC). This cohort was composed of a group of healthy older adults ages 70–73 at the start of the eight-year study period in 1997–98. Participants showed no difficulties in walking 0.25 miles, climbing 10 steps, or performing normal activities at the start of the study, and were assessed in person annually and by phone every six months. Tests included assessments of the ability to detect certain odors, depressive symptoms, and mobility.

In 1999, when smell was first measured, 48% of participants displayed a normal sense of smell, 28% showed a decreased sense of smell, known as hyposmia, and 24% had a profound loss of the sense, known as anosmia. Participants with a better sense of smell tended to be younger than those reporting significant loss or hyposmia.

Over follow-up, 25% of participants developed significant depressive symptoms. When analyzed further, researchers found that individuals with decreased or significant loss of smell had a greater risk of developing significant depressive symptoms at longitudinal follow-up than those in the normal olfaction group.

Researchers also identified three depressive symptom “trajectories” in the study group: stable low, stable moderate, and stable high depressive symptoms. Poorer sense of smell was associated with an increased chance of a participant falling into the moderate or high depressive symptoms groups, meaning that the worse a person’s sense of smell, the higher their depressive symptoms. These findings persisted after adjusting for age, income, lifestyle, health factors, and use of antidepressant medication.

“Losing your sense of smell influences many aspects of our health and behavior, such as sensing spoiled food or noxious gas, and eating enjoyment. Now we can see that it may also be an important vulnerability indicator of something in your health gone awry,” says Kamath. “Smell is an important way to engage with the world around us, and this study shows it may be a warning sign for late-life depression.”

Humans’ sense of smell is one of two chemical senses. It works through specialized sensory cells, called olfactory neurons, which are found in the nose. Each of these neurons has one odor receptor, which picks up molecules released by substances around us; these signals are then relayed to the brain for interpretation. The higher the concentration of these smell molecules, the stronger the smell, and different combinations of molecules result in different sensations.

Smell is processed in the brain’s olfactory bulb, which is believed to interact closely with the amygdala, hippocampus, and other brain structures that regulate and enable memory, decision-making, and emotional responses.

The researchers say their study suggests that olfaction and depression may be linked through both biological (e.g., altered serotonin levels, brain volume changes) and behavioral (e.g., reduced social function and appetite) mechanisms.

The researchers plan to replicate their findings from this study in more groups of older adults, and examine changes to individuals’ olfactory bulbs to determine if this system is in fact altered in those diagnosed with depression. They also plan to examine if smell can be used in intervention strategies to mitigate risk of late-life depression.

Additional scientists who contributed to this research are from the Johns Hopkins University School of Medicine and Bloomberg School of Public Health; the University of Connecticut; the University of California, San Francisco; the National Institute on Aging; and Michigan State University.

No authors declared conflicts of interest related to this research under Johns Hopkins University School of Medicine policies.

Support for the work came from the National Institute on Aging, the National Institute of Nursing Research and the Intramural Research Program of the National Institutes of Health: National Institute on Aging.

Source: Johns Hopkins


In Europe, ancestral family ties predict politics today

The stronger your ancestral family ties, the more likely you are to hold right-wing cultural policy preferences, research finds.

A new study from Neil Fasching and Yphtach Lelkes of the University of Pennsylvania’s Annenberg School for Communication finds that the family structure of one’s ancestors—sometimes dating back thousands of years—reliably predicts one’s political beliefs today. If you come from a family line with strong kinship ties (people who typically live with extended family and marry within their community), you’re likely to hold right-wing cultural values and, in some groups, left-wing economic attitudes.

“Interestingly, we find evidence that the association isn’t just with the individual beliefs,” says Fasching, a doctoral student studying political communication. “It plays out even at the country and legislative level. If a country’s population is rooted in close, tight-knit families, that country is less likely to pass LGBTQ-friendly laws, for example.”

In order to trace the effects of ancestral kinship strength on contemporary political attitudes, Fasching and Lelkes assigned “kinship tightness scores” to more than 20,000 second-generation immigrants living in 32 European countries. They chose second-generation immigrants in order to disentangle the respondents’ current location from their ancestors’ location.

The researchers crunched the numbers, using data on individuals’ beliefs, values, ethnic groups, and the degree to which a person’s ethnic group has historically depended on hunter-gathering versus agriculture, to determine the association between right-wing beliefs and family structure. Fasching and Lelkes also measured individuals’ political engagement.

Modern family kinship structure is determined by many factors, the researchers say, but most commonly, it is based on who family members are allowed to marry, geographic distance to extended family, and trust of outsiders.

These structures can date back to early civilization, when hunter-gatherers searched for game and Neolithic humans farmed crops and bred animals. Hunter-gatherers tended to have relatively weak family ties due to their need to travel to find food. It was easiest for these societies to revolve around the nuclear family, rather than establish permanent settlements with extended families and strong kinship ties.

The researchers found that kinship strength is strongly associated with more anti-LGBT laws. Countries low on kinship tightness, such as Norway, Finland, Germany, and the United States, were much less likely to have implemented anti-LGBT laws, while countries high on kinship tightness, such as the Democratic Republic of the Congo, Grenada, and Liberia, were much more likely to have anti-LGBT laws.

“While policy decisions may seem like they’re driven by current or transient factors, our research shows that public policy is also deeply rooted in the cultural fabric of a society,” says Lelkes, associate professor of communication at Annenberg and co-director of the Polarization Research Lab and the Center for Information Networks and Democracy.

“Ancestral family kinship structures are a part of that cultural fabric and can shape the fundamental values and social norms that underpin our societies. These structures were formed based on the environment in which our ancestors lived and can have a significant influence on the policies that are ultimately adopted.”

The findings appear in the British Journal of Political Science.

Source: Penn


Findings suggest future meds for natural killer/T-cell lymphoma

A new study finds an increase in transcription factor TOX2 in people with natural killer/T-cell lymphoma.

The increased TOX2 level leads to the growth and spread of natural killer/T-cell lymphoma (NKTL), as well as the overproduction of PRL-3, an oncogenic phosphatase that is a known key player in the survival and metastasis of several other types of cancers.

This breakthrough discovery presents a potential novel therapeutic target to treat NKTL.

NKTL is an Epstein-Barr virus (EBV)-associated, aggressive non-Hodgkin lymphoma (NHL) with very poor treatment outcomes in the advanced stages. It is prevalent in Asia and Latin America but rare in Europe and North America. Combined radiation therapy and chemotherapy is the consensus standard treatment for NKTL; however, it is often associated with a high relapse rate and serious side effects. A better understanding of the molecular mechanisms driving NKTL progression, along with the development of novel targeted therapy strategies, is urgently needed.

Professor Chng Wee Joo and associate professor Takaomi Sanda from the Cancer Science Institute of Singapore (CSI Singapore), along with Ong Choon Kiat from Duke-NUS Medical School, report the findings in the journal Molecular Cancer.

The findings are also the first to show the involvement of TOX2 and PRL-3 in NKTL. The researchers validated the findings both in cell lines and in a large set of patient tumor samples. In addition, the team analyzed the clinical features of 42 NKTL cases in an independent cohort and found that TOX2 was not only overexpressed in NKTL primary tumors but also negatively associated with patient survival.

Currently, there are no TOX2-specific inhibitors. As such, targeting TOX2, or its downstream PRL-3, could be a valuable therapeutic intervention for NKTL patients and warrants further study.

Chng, who is the co-lead author of the study, says, “We have now identified novel treatment targets, TOX2 and the downstream PRL-3, in NKTL, where new treatment is greatly needed. We can use different strategies to target these. A proteolysis-targeting chimera (PROTAC) that degrades TOX2 protein may be a viable NKTL therapy option.

“A humanized antibody, PRL3-zumab, has been approved for phase 2 clinical trials in Singapore, the US, and China to treat all solid tumors. With our findings from this study, it is definitely timely to evaluate PRL3-zumab’s effect in patients with NKTL.”

The group is currently testing novel agents targeting TOX2 and PRL-3 in NKTL, with the long-term goal of bringing these agents into clinical trials.

Source: National University of Singapore


Force of hits, not just number, raises CTE risk

The clearest predictor of whether a person will develop the brain disease CTE later in life is the cumulative force of thousands of hits to the head, not just the sheer number of concussions, research shows.

For years, researchers studying chronic traumatic encephalopathy, or CTE, believed its primary cause was repetitive hits to the head, whether or not those hits caused concussions. They believed that the more frequently a person sustained head blows, the more likely they were to develop neurological and cognitive problems later in life.

The new study in Nature Communications adds a wrinkle to the research.

The study is the largest to date examining the root causes of CTE, which is associated with everything from memory loss to impulsive behavior to suicidal thoughts and depression.

Using data from 34 published studies that tracked blows to the head measured by sensors inside football helmets, the researchers were able to see how 631 former football players, whose brains were donated to Boston University for research, were affected.

The study found that 71% of the brains examined—451 of the 631—had some level of CTE, while 180 showed no sign of the disease. The worst forms of CTE showed up in players who had absorbed the greatest cumulative force of hits to the head, meaning they were hit often and hard. (The individuals who absorb the hardest hits to the head are defensive backs, wide receivers, and running backs.)

Senior author Jesse Mez is an associate professor of neurology at the Chobanian & Avedisian School of Medicine, as well as associate director of Boston University’s Alzheimer’s Disease Research Center and codirector of clinical research at the BU CTE Center.

The National Institutes of Health, the Department of Veterans Affairs, and the Department of Defense funded the work.

Source: Doug Most for Boston University
