Elephants mix it up when it comes to food

A new analysis of the dietary habits of elephants showed surprising variation from meal to meal.

Elephants eat plants. That’s common knowledge. Yet figuring out exactly what kind of plants the iconic herbivores eat is more complicated.

For the new study, a team of conservation biologists used innovative methods to efficiently and precisely analyze the dietary habits of two groups of elephants in Kenya, down to the specific types of plants eaten by which animals in the group.

Their findings on the habits of individual elephants help answer important questions about the foraging behaviors of groups, and help biologists understand the conservation approaches that best keep elephants not only sated but satisfied.

“It’s really important for conservationists to keep in mind that when animals don’t get enough of the foods that they need, they may survive—but they may not prosper,” says Tyler Kartzinel, an assistant professor of environmental studies and of ecology, evolution, and organismal biology at Brown University and author of the study in the journal Royal Society Open Science.

“By better understanding what each individual eats, we can better manage iconic species like elephants, rhinos, and bison to ensure their populations can grow in sustainable ways.”

What’s an elephant’s favorite food?

One of the main tools that the scientists used to conduct their study is called DNA metabarcoding, a cutting-edge genetic technique that allows researchers to identify the composition of biological samples by matching the extracted DNA fragments representing an elephant’s food to a library of plant DNA barcodes.
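The matching step can be pictured as a lookup of sequenced fragments against a reference library. The sketch below is purely illustrative: the sequences, plant names, and exact-match logic are invented for this example, and real metabarcoding pipelines align reads against curated barcode databases rather than matching strings exactly.

```python
# Toy illustration of DNA metabarcoding's matching step.
# All sequences and taxa below are made up for demonstration.

# A tiny "library" of plant DNA barcodes: sequence -> plant taxon.
BARCODE_LIBRARY = {
    "ATCGGCTA": "Acacia tortilis",
    "GGATTCCA": "Cynodon dactylon",
    "TTAGGCAT": "Solanum incanum",
}

def identify_diet(sample_fragments):
    """Match DNA fragments from a fecal sample to known plant barcodes,
    counting how often each taxon appears."""
    counts = {}
    for fragment in sample_fragments:
        taxon = BARCODE_LIBRARY.get(fragment)  # exact match, for simplicity
        if taxon is not None:
            counts[taxon] = counts.get(taxon, 0) + 1
    return counts

sample = ["ATCGGCTA", "ATCGGCTA", "TTAGGCAT", "CCCCCCCC"]  # last is unmatched
print(identify_diet(sample))
# {'Acacia tortilis': 2, 'Solanum incanum': 1}
```

Fragments with no library match (like the last one above) are simply uncounted, which is why the completeness of the reference library matters so much in practice.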

Brown has been developing applications for this technology, says Kartzinel, bringing together researchers in molecular and computational biology to solve problems faced by conservationists in the field.

This is the first use of DNA metabarcoding to answer a long-term question about social foraging ecology, which is how members of a social group—such as a family—decide what foods to eat, Kartzinel says.

“When I talk to non-ecologists, they are stunned to learn that we have never really had a clear picture of what all of these charismatic large mammals actually eat in nature,” Kartzinel says. “The reason is that these animals are difficult and dangerous to observe from up close, they move long distances, they feed at night and in thick bush, and a lot of the plants they feed on are quite small.”

Not only are the elephants hard to monitor, but their food can be nearly impossible to identify by eye, even for an expert botanist, according to Kartzinel, who has conducted field research in Kenya.

The researchers compared the new genetic technique to a method called stable isotope analysis, which involves a chemical analysis of animal hair. Two of the study authors, George Wittemyer at Colorado State University and Thure Cerling at the University of Utah, had previously shown that elephants switch from eating fresh grasses when it rains to eating trees during the long dry season.

While this advanced the study by allowing researchers to identify broad-scale dietary patterns, they still couldn't discern the different types of plants in the elephants' diets.

Clues in elephants’ poo

The scientists had saved fecal samples that had been collected in partnership with the non-profit organization Save the Elephants when Wittemyer and Cerling were conducting the stable isotopes analyses almost 20 years ago. Study author Brian Gill, then a Brown postdoctoral associate, determined that the samples were still usable even after many years in storage.

The team combined analyses of carbon stable isotopes from the feces and hair of elephants with dietary DNA metabarcoding, GPS-tracking, and remote-sensing data to evaluate the dietary variation of individual elephants in two groups.

They matched each unique DNA sequence in the sample to a collection of reference plants—developed with the botanical expertise of Paul Musili, director of the East Africa Herbarium at the National Museums of Kenya—and compared the diets of individual elephants through time.
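One simple way to quantify how much two individuals' diets differ, once each animal's detected plant taxa are known, is a set-overlap measure such as Jaccard dissimilarity. This is a hedged sketch with invented data, not the study's actual analysis, which is more sophisticated.

```python
# Illustrative comparison of two elephants' diets via Jaccard dissimilarity.
# The plant taxa below are made up for this example.

def jaccard_dissimilarity(diet_a, diet_b):
    """Return 1 - |intersection| / |union| of two sets of plant taxa.
    0.0 means identical diets; 1.0 means no overlap at all."""
    a, b = set(diet_a), set(diet_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Taxa detected in samples from two family members on the same day.
elephant_1 = {"Acacia tortilis", "Cynodon dactylon", "Solanum incanum"}
elephant_2 = {"Acacia tortilis", "Grewia bicolor"}

print(jaccard_dissimilarity(elephant_1, elephant_2))  # 0.75
```

A high value, even between family members sampled on the same day, is the kind of individual-level variation the study found to be greater than previously assumed.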

In their analysis, they showed that dietary differences among individuals were often far greater than had been previously assumed, even among family members that foraged together on a given day.

This study helps address a classic paradox in wildlife ecology, Kartzinel says: “How do social bonds hold family groups together in a world of limited resources?”

In other words, given that elephants all seemingly eat the same plants, it’s not obvious why competition for food doesn’t push them apart and force them to forage independently.

The simple answer is that elephants vary their diets based not only on what's available but also on their preferences and physiological needs, says Kartzinel. A pregnant elephant, for example, may have different cravings and requirements at various times in her pregnancy.

While the study wasn’t designed to explain social behavior, these findings help inform theories of why a group of elephants may forage together: The individual elephants don’t always eat exactly the same plants at the same time, so there will usually be enough plants to go around.

These findings may offer valuable insights for conservation biologists. To protect elephants and other major species and create environments in which they can successfully reproduce and grow their populations, the animals need a variety of plants to eat. This may also decrease the chances of inter-species competition and keep the animals from raiding human food sources, such as crops.

“Wildlife populations need access to diverse dietary resources to prosper,” Kartzinel says. “Each elephant needs variety, a little bit of spice—not literally in their food, but in their dietary habits.”

The National Science Foundation supported the work.

Source: Brown University


These animal interactions are risks for future pandemics

Animal industries in the United States pose a serious risk of future pandemics, and the US government lacks a comprehensive strategy to address these threats, a new study concludes.

The analysis calls for tightening existing regulations and implementing new ones in order to prevent zoonotic-driven outbreaks.

The report is the first to comprehensively map networks of animal commerce that fuel zoonotic disease risk in the US. It analyzes 36 different animal industries, including fur-farming, the exotic pet trade, hunting and trapping, industrial animal agriculture, backyard chicken production, roadside zoos, and more, to assess the risks each poses of generating a large-scale disease outbreak.

The report states that, far from being a problem that only exists elsewhere, many high-risk interactions between humans and animals happen routinely inside the US and could spark future pandemics. All of the animal industries the report examines are far less regulated than they should be, and far less than the public believes they currently are. Today, wide regulatory gaps exist through which pathogens can spill over and spread, leaving the public constantly vulnerable to zoonotic disease.

“COVID has infected more than 100 million Americans and killed over a million of them. But the next pandemic may be far worse and might happen sooner than we think. The stakes are simply too high for the problem to be ignored,” says Ann Linder, one of the report’s lead authors and a research fellow with the Brooks McCormick Jr. Animal Law and Policy Program at Harvard Law School.

The immense and increasing scale of animal use in the United States makes the country uniquely vulnerable to zoonotic outbreaks. For example, the US is the largest importer of live wildlife in the world, importing more than 220 million wild animals a year, many without any health checks or disease testing.

The US also produces more livestock than almost any other nation. In 2022, the US processed more than 10 billion livestock animals, the largest number ever recorded. Yet the USDA does not regulate on-farm production of livestock. At slaughterhouses, inspections are cursory, with each inspector tasked with examining more than 600 animals per hour for signs of disease.

The US is one of the world’s largest producers of pigs and poultry, two important carriers of influenza viruses—viruses that scientists believe are most likely to produce a large-scale human pandemic.

  • The largest avian influenza outbreak in US history is currently ongoing and has left 58 million poultry dead since it began in 2022. The virus has spread to several species of mammals in the US and has infected a man in Colorado. Even a slight shift in the virus's composition could allow it to move rapidly through human populations.
  • The US also has recorded more “swine flu” infections than any other country since 2011. Most of these infections occurred in children exhibiting pigs at state and county fairs, which attract 150 million visitors each year and have given rise to multistate outbreaks of influenza. Despite this, animal fairs remain largely unregulated.

In addition, the people most vulnerable to zoonotic disease in the US are those who work hands-on with farmed animals. Such jobs tend to be disproportionately staffed by people of color and those in rural communities who may be the least likely and the least able to report disease or seek medical care.

Studies estimate that swine workers have a 30 times greater risk of zoonotic influenza infection than the general public, but these viruses have the potential to spread far beyond livestock workers. The CDC estimates the 2009 “swine flu” pandemic hospitalized over 900,000 Americans.

Live animal markets in the US (elsewhere called “wet markets”), where animals are kept alive and slaughtered onsite for customers, also pose serious disease risks. New York City alone is home to at least 84 live animal markets.

  • A detailed study of pigs in two live animal food markets in Minneapolis found high rates of influenza viruses not only in and on the animals but also in the air and on surfaces throughout the market.
  • A shocking 65% of workers at the market tested positive for influenza during the 12-week study, as did a 12-year-old customer who became sick after touching the railings of a pig pen and one of the animals.

Wildlife also poses significant risk. Hundreds of millions of live wild animals are imported into the US each year, many without ever being inspected. Only scant and incomplete information exists about these animals, where they originate, and where they go after they arrive. For example:

  • The $15 billion US exotic pet trade brings high-risk species of wildlife into American homes, initiating close human-animal interactions that serve as potential flashpoints for spillover of zoonotic disease—with roughly 14% of American households owning one or more exotic animal from among hundreds of species that range from monkeys to monitor lizards.
  • Animals carrying zoonotic disease are sold through legal channels such as pet stores without health checks or veterinary oversight, as well as through the black market.
  • Some exotic animal dealers keep more than 25,000 wild animals together at a single facility, often in poor conditions that facilitate disease spread, before they are shipped off to customers across the country.
  • During a major mpox outbreak, which originated in one of these facilities after it received a shipment of exotic animals from overseas, CDC agents were not able to track down a large number of infected prairie dogs that had been sold through pet stores and swap meets.

Even lesser-known animal industries in the US pose serious risks to human health. Crocodile farms have facilitated the spread of West Nile Virus to humans and mink in fur farms have transmitted COVID-19 to humans.

Still, many industries that generate risk are loosely regulated or not regulated at all. Policy change is often reactive, the report explains, happening only after outbreaks occur. Rarely, it says, do agencies take proactive steps to address zoonotic risk, even when they are aware of the danger to the public. For many industries, the government lacks even basic data and has no system to screen animals for disease or to identify zoonotic threats proactively. In some industries, government action actually drives zoonotic risk and increases human exposure to pathogens.

“While zoonotic risks cannot be eliminated, they can be managed and reduced in ways that make all of us safer. But we need to look them in the eye. The risks that these markets present have been ignored or downplayed for far too long,” says Dale Jamieson, director of New York University’s Center for Environmental and Animal Protection.

This US report is being released ahead of a larger global policy report overseen by the same researchers from Harvard Law School’s Brooks McCormick Jr. Animal Law & Policy Program and New York University’s Center for Environmental and Animal Protection. The full report, which will be released later this year, examines global policy responses to live animal markets in 15 countries and the role these markets play in zoonotic disease transmission. The project aims to provide a comprehensive assessment that will assist global policymakers and increase public awareness of the dangers posed by zoonotic diseases.

Source: NYU


Structural racism worsens diabetes crisis

Structural racism and geographic inequity are advancing the global crisis of diabetes, report researchers.

These factors leave people with diabetes 50% more likely to develop cardiovascular disease and twice as likely to die compared to those without diabetes, especially among minority populations.

A narrative literature review recently published in The Lancet addresses and summarizes the current understanding of diabetes disparities by examining differences between and within racial and ethnic groups and among young people aged 18 years and younger. The review was led by Saria Hassan, assistant professor at Emory University School of Medicine and Rollins School of Public Health, and coauthored by faculty of the Emory Global Diabetes Research Center (EGDRC) and Morehouse School of Medicine.

As part of the newly published Lancet series, “Global Inequity in Diabetes,” the study also evaluates structural racism's prominent role in diabetes disparities and offers recommendations to improve equity in diabetes care.

“Focusing solely on adults overlooks the degree to which the accelerating epidemic of type 2 diabetes in children and adolescents is contributing to the growing burden of disease and worsening disparities across the US,” Hassan told The Lancet.

The researchers used a conceptual framework to categorize the causes of diabetes disparities across the lifespan, which looked at factors in five domains. In terms of structural racism, the researchers found that it affects diabetes disparities at all levels, from policies and interpersonal relationships to the community level.

“Your environment in many ways dictates your health,” says Hassan. “It’s well-established that obesity, health behaviors, lifestyle, and access to quality care are risk factors for diabetes. However, at the community level, neighborhoods with primarily Black and Hispanic individuals tend to have little space for physical activity, have more food deserts, and higher levels of toxic environmental exposures.”

Estimates indicate that rates of diabetes are almost 1.5 times higher among minority ethnic groups, such as American Indians and Alaska Natives, Blacks, Hispanics, and Asians, than among the white population.

Significant diabetes disparities persist in the US, from who suffers from the disease and its complications to who has access to effective medications. The medical costs and lost wages of people with diabetes total an estimated $327 billion annually, further burdening minority groups and those of lower socioeconomic status.

African Americans are 19% less likely, and American Indians and Native Americans 41% less likely, to access newer diabetes treatments such as GLP-1 agonists.

“Because of structural racism, Black and Hispanic Americans are more likely to be low-income, which means that they are less likely to be able to afford high co-pay medications, or are uninsured or under-insured,” says Hassan.

As a result of their review, the researchers provided key recommendations to community partners, researchers, practitioners, health system administrators, and policy makers to reduce disparities.

These recommendations include:

  • Research needs to be action-oriented, community-based, and multidisciplinary.
  • Those who fund research and activities to address diabetes disparities must ensure equitable, sustainable, and cost-effective research that adopts a health equity plan.
  • Practitioners on the front lines need to know and understand the multilevel factors contributing to diabetes disparities.
  • Policy makers need to recognize how policies historically have contributed to health disparities, and work to ensure future policies dismantle these disparities and do not worsen them.

Source: Emory University


‘Mask’ lets hepatitis C virus evade immune system

A new way to examine the hepatitis C virus has helped clarify how it evades the human immune system and spreads through the body: it puts on a “mask.”

An estimated 50 million people worldwide are infected with chronic hepatitis C. The hepatitis C virus can cause inflammation and scarring of the liver, and in the worst case, liver cancer.

Hepatitis C was discovered in 1989 and is one of the most studied viruses on the planet. Yet for decades, how it manages to evade the human immune system and spread through the body has been a riddle.

Donning a mask allows the virus to remain hidden while making copies of itself to infect new cells. The mask cloaks the virus in a molecule already present in our cells. Disguised by the molecule, the virus is mistaken by our immune systems for something harmless that requires no response.

“How the hepatitis C virus manages to hide in our liver cells without being detected by the immune system has always been a bit of a mystery,” says co-lead researcher Jeppe Vinther, associate professor in the biology department at the University of Copenhagen.

“Our revelation of the virus’ masking strategy is important, as it could pave the way for new ways of treating viral infections. And it is likely that other types of viruses use the same trick.”

Hepatitis C mystery solved

The mask the hepatitis virus uses to hide in our cells is called FAD, a molecule composed of vitamin B2 and the energy-carrying molecule ATP. FAD is vital for our cells to convert energy. The FAD molecule's importance and familiarity to our cells make it ideal camouflage for a malicious virus.

For several years, the research team suspected that FAD was helping the virus hide in infected cells, but they lacked a clear way to prove it. To solve the challenge, they turned to Arabidopsis, a well-known experimental plant among researchers.

“We were getting desperate to find a way to prove our hypothesis, which is when we purified an enzyme from the Arabidopsis plant that can split the FAD molecule in two,” says Anna Sherwood from the biology department, who, together with Lizandro Rene Rivera Rangel, is a first author of the study.

Using the enzyme, the researchers were able to split the FAD and prove that the hepatitis C virus used it as a mask.

Hiding from immune system

Like both the coronavirus and influenza virus, hepatitis C is an RNA virus. Its genetic material consists of RNA that must be copied once the virus enters its host organism. New RNA copies are used to take over new cells, and one end of the RNA’s genetic material is masked by the FAD.

It is very realistic that other RNA viruses use similar masking techniques to spread without being detected by cellular control systems, Vinther says. In fact, researchers have already found another virus that uses the same strategy. And there are likely more.

“All RNA viruses have the same need to hide from the immune system and there is a good chance that this is just the beginning. Now that we’re attuned to this trick, it opens up the possibility of developing new and perhaps improved methods of tracking and treating viral infections in the future,” Vinther says.

The study is published in Nature.

Independent Research Fund Denmark funded the work.

Source: University of Copenhagen


The moon holds an Earth-like granite system

New research shows a likely large Earth-like granite system is present on the moon.

The finding, which appears in a Nature paper, may help expand knowledge of geothermal lunar processes.

Granites, which crystallize from magma during igneous activity, are nearly absent in our solar system outside Earth. Yet over the past decade, evidence from remote sensing systems used by geoscientists, including Timothy Glotch of Stony Brook University, has shown that notable silicic features, like granites within a volcanic complex, exist on the moon.

Previously, only a sampling of granite grains had been detected in the hundreds of kilograms of rocks returned by Apollo astronauts, and remote sensing studies since have found only a few small granite or granite-like features on the moon.

For the new work, the research team used remote sensing measurements, specifically orbital microwave radiometry and gravity measurements, to detect a large granitic body, greater than 50 kilometers (about 31 miles) in diameter, underneath the Compton-Belkovich Volcanic Complex (CBVC) on the moon's far side.

“Typically, granites require either plate tectonics or water-bearing magmas to form,” says Glotch, coauthor and professor in the geosciences department. “While the lunar interior contains small amounts of water, the moon has never undergone plate tectonics. Therefore, this discovery of the granitic complex, or batholith underneath the CBVC, points to some not-yet-understood process that is responsible for the granitic formation.”

While Glotch and coauthors are not yet sure what the process is, there could be a number of possibilities.

These could include a fractionation of KREEP (potassium, rare-earth elements, and phosphorus) basaltic liquids, or partial melting of KREEP-rich crust. If either of these were the case, it would require an abnormally hydrous mantle underneath the CBVC and a compositionally heterogeneous lunar mantle.

The authors theorize that “the surprising magnitude and geographic extent of this feature imply an Earth-like, evolved granite system larger than believed possible on the moon, especially outside of the Procellarum region… a phenomenon previously documented only on Earth.”

In addition to the discovery, the authors say of the remote sensing method that “this work illustrates a new tool for mapping planetary geothermal gradient from orbit through passive microwave radiometry, which can provide a window into crustal and interior heat-producing structures.”

Furthermore, they say that the methods used are generalizable, and “similar uses of passive radiometric data could vastly expand our knowledge of geothermal processes on the moon and other planetary bodies.”

Glotch worked with the lead author, Matthew A. Siegler, of the Planetary Science Institute in Tucson, Arizona, to conceptualize the study. The research builds on more than 10 years of work by Glotch and other collaborators nationally using remote sensing measurements of the moon to map the presence and properties of anomalously silicic features on the lunar surface and interior.

The researchers previously identified a number of volcanoes, including CBVC and the Gruithuisen domes, as having granite-like compositions. These regions will be the target of a NASA rover mission in 2026.

Source: Stony Brook University


Krill inspire robots for ocean exploration

Pleobot is a krill-inspired robot offering potential solutions for underwater locomotion and ocean exploration, both on Earth and moons throughout the solar system.

Picture a network of interconnected, autonomous robots working together in a coordinated dance to navigate the pitch-black surroundings of the ocean while carrying out scientific surveys or search-and-rescue missions.

In a new study in Scientific Reports, a team led by Brown University researchers has presented important first steps in building these types of underwater navigation robots.

In the study, the researchers outline the design of a small robotic platform called Pleobot that can serve as both a tool to help researchers understand the krill-like swimming method and as a foundation for building small, highly maneuverable underwater robots.

Pleobot is currently made of three articulated sections that replicate the krill-like technique called metachronal swimming. To design Pleobot, the researchers took inspiration from krill, which are remarkable aquatic athletes that display mastery in swimming, accelerating, braking, and turning. In the study, they demonstrate Pleobot's ability to emulate the legs of swimming krill and provide new insights into the fluid-structure interactions needed to sustain steady forward swimming in krill.

According to the study, Pleobot has the potential to allow the scientific community to understand how to take advantage of 100 million years of evolution to engineer better robots for ocean navigation.

“Experiments with organisms are challenging and unpredictable,” says Sara Oliveira Santos, a PhD candidate at Brown’s School of Engineering and lead author of the new study. “Pleobot allows us unparalleled resolution and control to investigate all the aspects of krill-like swimming that help it excel at maneuvering underwater. Our goal was to design a comprehensive tool to understand krill-like swimming, which meant including all the details that make krill such athletic swimmers.”

The effort is a collaboration between Brown researchers in the lab of Monica Martinez Wilhelmus, assistant professor of engineering, and scientists in the lab of Francisco Cuenca-Jimenez at the Universidad Nacional Autónoma de México.

A major aim of the project is to understand how metachronal swimmers, like krill, manage to function in complex marine environments and perform massive vertical migrations of over 1,000 meters (about 3281 feet)—equivalent to stacking three Empire State Buildings—twice daily.

“We have snapshots of the mechanisms they use to swim efficiently, but we do not have comprehensive data,” says Nils Tack, a postdoctoral associate in the Wilhelmus lab. “We built and programmed a robot that precisely emulates the essential movements of the legs to produce specific motions and change the shape of the appendages. This allows us to study different configurations to take measurements and make comparisons that are otherwise unobtainable with live animals.”

The metachronal swimming technique can produce the remarkable maneuverability that krill frequently display through the sequential deployment of their swimming legs in a back-to-front, wave-like motion. The researchers believe that in the future, deployable swarm systems could be used to map Earth's oceans, participate in search-and-recovery missions by covering large areas, or be sent to moons in the solar system, such as Europa, to explore their oceans.

“Krill aggregations are an excellent example of swarms in nature: they are composed of organisms with a streamlined body, traveling up to one kilometer each way, with excellent underwater maneuverability,” Wilhelmus says. “This study is the starting point of our long-term research aim of developing the next generation of autonomous underwater sensing vehicles. Being able to understand fluid-structure interactions at the appendage level will allow us to make informed decisions about future designs.”

The researchers can actively control the two leg segments and have passive control of Pleobot’s biramous fins. This is believed to be the first platform that replicates the opening and closing motion of these fins. The construction of the robotic platform was a multi-year project, involving a multi-disciplinary team in fluid mechanics, biology, and mechatronics.

The researchers built their model at 10 times the scale of krill, which are usually about the size of a paperclip. The platform is primarily made of 3D printable parts and the design is open-access, allowing other teams to use Pleobot to continue answering questions on metachronal swimming not just for krill but for other organisms like lobsters.

In the study, the group reveals the answer to one of the many unknown mechanisms of krill swimming: how they generate lift in order not to sink while swimming forward. If krill are not swimming constantly, they will start sinking because they are a little heavier than water. To avoid this, they still have to create some lift even while swimming forward to be able to remain at that same height in the water, says Oliveira Santos.

“We were able to uncover that mechanism by using the robot,” says Yunxing Su, a postdoctoral associate in the lab. “We identified an important effect of a low-pressure region at the back side of the swimming legs that contributes to the lift force enhancement during the power stroke of the moving legs.”

In the coming years, the researchers hope to build on this initial success and further build and test the designs presented in the article. The team is currently working to integrate morphological characteristics of shrimp into the robotic platform, such as flexibility and bristles around the appendages.

A NASA Rhode Island EPSCoR Seed Grant partially funded the work.

Source: Brown University


AI could find best meds for high blood pressure

A new artificial intelligence program may help doctors better match people with high blood pressure to the medication most likely to work for them.

For the nearly half of Americans with hypertension, it’s a potential death sentence—close to 700,000 deaths in 2021 were caused by high blood pressure, according to the US Centers for Disease Control and Prevention. It also increases the risk of stroke and chronic heart failure.

But while it’s relatively easy to prevent or moderate if caught early—eat well, exercise more, drink less—it can be tough to treat. Although physicians have a bevy of potential hypertension medications to choose from, each comes with pros and cons, making prescribing the most effective one a challenge: beta-blockers slow the heart but can aggravate asthma; ACE inhibitors relax blood vessels but can lead to a hacking cough.

The new data-driven model aims to give clinicians real-time hypertension treatment recommendations based on patient-specific characteristics, including demographics, vital signs, past medical history, and clinical test records.

The model, described in a study published in BMC Medical Informatics and Decision Making, has the potential to help reduce systolic blood pressure—measured when the heart is beating rather than resting—more effectively than the current standard of care.

The program’s approach to transparency could also help improve physicians’ trust in artificial intelligence–generated results, the researchers say.

“This is a new machine learning algorithm leveraging information in electronic health records and showcasing the power of AI in health care,” says Ioannis Paschalidis, professor and director of the Rafik B. Hariri Institute for Computing and Computational Science & Engineering at Boston University. “Our data-driven model is not just predicting an outcome, it is suggesting the most appropriate medication to use for each patient.”

Currently, when choosing which medication to prescribe a patient, a doctor considers the patient’s history, treatment goals, and the benefits and risks associated with specific medicines. Oftentimes, selecting which drug to prescribe when there are multiple options—none clearly better or worse than the others—can be a bit of a coin toss.

By contrast, the new model generates a custom hypertension prescription using an individual patient’s profile, giving physicians a list of suggested medications with an associated probability of success. The researchers’ aim was to highlight the treatment that best controls systolic blood pressure for each patient based on its effectiveness in a group of similar patients.

“Our goal is to facilitate a personalization approach for hypertension treatment based on machine learning algorithms seeking to maximize the effectiveness of hypertensive medications at the individual level,” Paschalidis says.

The researchers developed the model using de-identified data from 42,752 hypertensive patients of Boston Medical Center (BMC), Boston University’s primary teaching hospital, collected between 2012 and 2020. Patients were sorted into affinity groups, based on similarities of clinically relevant characteristics, such as demographics, past blood pressure records, and past medical history.
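The study does not publish its algorithm, but the core idea—rank each candidate medication by how well it controlled systolic blood pressure among an affinity group of similar patients—can be sketched in a few lines of Python. This is only an illustration: the record fields, grouping key, and success criterion here are all hypothetical, and the actual model is learned from tens of thousands of de-identified records.

```python
from collections import defaultdict

def build_affinity_groups(patients, key_fn):
    """Bucket patient records by clinically relevant traits (demographics,
    blood-pressure history, etc.). key_fn maps a record to a hashable key."""
    groups = defaultdict(list)
    for p in patients:
        groups[key_fn(p)].append(p)
    return groups

def recommend(patient, groups, key_fn):
    """Rank medications by observed success rate among similar patients,
    where 'success' means systolic blood pressure was brought under control."""
    peers = groups.get(key_fn(patient), [])
    stats = defaultdict(lambda: [0, 0])  # medication -> [successes, trials]
    for p in peers:
        stats[p["med"]][1] += 1
        stats[p["med"]][0] += p["controlled"]
    ranked = [(med, succ / total) for med, (succ, total) in stats.items()]
    ranked.sort(key=lambda x: -x[1])
    return ranked  # list of (medication, estimated probability of success)
```

In the actual study, the grouping and ranking were derived from 42,752 records and clinically validated; this sketch only shows the shape of the recommendation step—peer lookup, per-drug success rates, ranked output.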

During the study, the model’s effectiveness was compared to the current standard of care, as well as to three other algorithms designed to predict appropriate treatment plans. The researchers found that it achieved a 70.3% larger reduction in systolic blood pressure than the standard of care and performed 7.08% better than the second-best model. The algorithm was also clinically validated, with the researchers manually reviewing a random sample of 350 cases.

The model also showed the benefits of de-prescribing—reducing or stopping prescriptions for some patients taking multiple medications. According to the researchers, because the algorithm provides physicians with several suggested optimal therapies, it could give valuable insights when the medical community is divided on the effectiveness of one drug versus another, a situation known as clinical equipoise.

“These advanced predictive analytics have the ability to augment a clinician’s decision making and to have a positive impact on the quality of care we deliver, and therefore the outcomes for our patients,” says Rebecca Mishuris, who was previously an assistant professor at Boston University Chobanian & Avedisian School of Medicine and is now Mass General Brigham’s chief medical information officer. “This is an important first step that shows that these models actually perform better than standard of care, and could help us be better doctors.”

While many recognize that machine learning’s ability to handle large amounts of data and uncover patterns and correlations could benefit healthcare, its adoption has been limited, in part due to difficulties interpreting the results—and because of low levels of trust in artificial intelligence.

In the past, machine learning in health care has also been hampered by incomplete or inaccurate data, as well as sparse patient histories, which can skew prediction results. An important aspect of this study was to ensure data was transparent and that clinicians—particularly those without technical expertise—clearly understood how the algorithm worked, and how and why the model proposed specific therapeutic recommendations.

“Using data from the diverse patient population of Boston Medical Center, this model provides the opportunity to tailor care for underrepresented populations, with individualized recommendations to improve outcomes for these patients,” says Nicholas J. Cordella, assistant professor at the Chobanian & Avedisian School of Medicine and medical director for quality and patient safety at Boston Medical Center.

“Personalized medicine and models like this are an opportunity to better serve populations that aren’t necessarily well represented in the national studies or weren’t taken into account when the guidelines were being made.”

The National Science Foundation funded the study.

Source: Boston University

Computer problems are a big waste of time

On average, people waste between 11% and 20% of their time in front of computers that don’t work or that are so difficult to understand that users can’t perform the tasks they want to, a new study shows.

The findings show there are major gains to be achieved for society by rethinking systems and involving users more in their development, the researchers say.

“It’s incredible that the figure is so high. However, most people experience frustration when using computers and can tell a horror story about an important PowerPoint presentation that was not saved or a system that crashed at a critical moment,” says Kasper Hornbæk, a professor at the University of Copenhagen.

“Everyone knows that it is difficult to create IT systems that match people’s needs, but the figure should be much lower, and one thing that it shows is that ordinary people aren’t involved enough when the systems are developed,” he says.

Most frustrations are experienced in connection with the performance of completely ordinary tasks, says Morten Hertzum, a professor at Roskilde University.

“The frustrations are not due to people using their computers for something highly advanced, but because they experience problems in their performance of everyday tasks. This makes it easier to involve users in identifying problems. But it also means that problems that are not identified and solved will probably frustrate a large number of users.”

The study included 234 participants who spend six to eight hours a day in front of a computer in their day-to-day work. Over the course of one hour, the researchers asked them to report any situations in which the computer would not work properly, or in which they were frustrated at being unable to perform a task.

The problems the participants reported most often included: “the system was slow,” “the system froze temporarily,” “the system crashed,” and “it is difficult to find things.” The participants had backgrounds such as student, accountant, or consultant, but several of them actually worked in the IT industry.

“A number of the participants in the survey were IT professionals, while most of the other participants were highly competent IT and computer users. Nevertheless, they encountered these problems, and it turns out that this involves some fundamental functions,” Hornbæk says.

The participants in the survey also responded that 84% of the episodes had occurred before and that 87% of the episodes could happen again. And, according to Hornbæk, we face the same fundamental problems today that we faced 15 to 20 years ago.

“The two biggest categories of problems are still about insufficient performance and lack of user-friendliness,” he says.

“Our technology can do more today, and it has also become better, but, at the same time, we expect more from it. Even though downloads are faster now, they are often still experienced as frustratingly slow,” Hertzum says.

According to Statistics Denmark, 88% of Danes used computers, laptops, smartphones, tablets, or other mobile devices at work in 2018. In this context, the new study indicates that a half to a whole day of a normal working week may be wasted on computer problems.
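The “half to a whole day” figure follows from simple arithmetic, which can be checked directly. A minimal sketch, assuming a 37-hour week (the common Danish full-time norm; the article itself states only the percentage range):

```python
# Back-of-envelope check: 11-20% of a working week lost to computer trouble.
# The 37-hour week is an assumption; the study reports only the percentages.
week_hours = 37
low, high = 0.11 * week_hours, 0.20 * week_hours
print(f"{low:.1f} to {high:.1f} hours lost per week")  # → 4.1 to 7.4 hours lost per week
```

Since a working day in a 37-hour, five-day week is roughly 7.4 hours, 4.1 to 7.4 lost hours is indeed about half a day to a full day.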

“There is a lot of productivity lost in workplaces throughout Denmark because people are unable to perform their ordinary work because the computer is not running as it should. It also causes a lot of frustrations for the individual user,” Hornbæk says.

This means there are major gains to be had for society if people experienced fewer problems in front of their computers. These gains could be achieved, for example, by investing more resources in rethinking how faults are presented to us on screen, he says.

“Part of the solution may be to shield us from knowing that the computer is working to solve a problem. In reality, there is no reason why we need to look at an incomprehensible box with commands or a frozen computer. The computer could easily solve the problems without displaying this, while it provided a back-up version of the system for us, so that we could continue to work with our tasks undisturbed,” Hornbæk says.

At the same time, IT developers should involve the users even more when designing the systems to make them as easy to use—and understand—as possible. There are no poor IT users, only poor systems, he says.

“When we’re all surrounded by IT systems that we’re cursing, it’s very healthy to ascertain that it’s probably not the users that are the problem, but those who make the systems. The study clearly shows that there is still much room for improvement, and we therefore hope that it can create more focus on making more user-friendly systems in the future.”

The study appears in the journal ACM Transactions on Computer-Human Interaction.

Source: University of Copenhagen

Access to prescription opioids may reduce overdose deaths

Increasing access to prescription opioid painkillers may reduce opioid overdose deaths in the United States, according to a new study.

“When access to prescription opioids is heavily restricted, people will seek out opioids that are unregulated,” says Grant Victor, an assistant professor in the Rutgers School of Social Work and lead author of the study in the Journal of Substance Use and Addiction Treatment.

“The opposite may also be true; our findings suggest that restoring easier access to opioid pain medications may protect against fatal overdoses.”

America’s opioid crisis has evolved across several waves, each deadlier than the last. Wave one, which began in the 1990s, was marked by overdose deaths from the misuse of prescription opioid medications.

A policy implemented during the initial wave created prescription drug monitoring programs (PDMPs), state-based initiatives that track controlled substance prescribing. While the policy made it more difficult to access prescription opioids and rates of prescribing did decrease, it had the unintended consequence of pushing people toward off-market opioids, raising the risk of accidental death, Victor says.

This led to wave two of the crisis, a surge in heroin-related deaths, beginning around 2010, followed by wave three (which started in 2013), fueled by synthetic opioids such as fentanyl.

To measure trends and sociodemographic disparities in access to buprenorphine—a common treatment for opioid use disorder—and opioid painkillers, the researchers examined toxicology data, death records, and available PDMPs from 2,682 accidental overdose deaths that occurred from 2016 to 2021 in Indianapolis, Indiana.

The researchers found that fewer than half of all decedents (43.3%) had a PDMP record of any kind—meaning that most had never even tried to access prescription opioids through legitimate channels. Of the 10.6% who had been prescribed buprenorphine, most (64.7%) had been prescribed the treatment more than 30 days before death, suggesting they were not actively engaged in treatment at the time.

Victor and collaborators also found racial disparities in buprenorphine and opioid prescription trends, with both dispensed to Black decedents at significantly lower rates than to white decedents (7.3% and 21.9% versus 92.7% and 77.7%, respectively).

“Buprenorphine uptake is associated with significantly reduced rates of nonfatal and fatal overdose,” the researchers write. “Despite these positive treatment outcomes, several barriers remain to the widespread uptake of [medications for opioid use disorder] in the United States,” such as stigma and cost.

“For these reasons, a lack of adequate buprenorphine prescribing, combined with reductions in the availability of opioid analgesics, have left individuals contending with [opioid use disorder] at an elevated risk of overdose,” the researchers conclude.

Given these trends and past research, Victor says it is time to re-evaluate policies that make it nearly impossible to obtain opioid prescriptions, even for those with a legitimate need.

“A big reason that we have such a problem with addiction in this country is because people can’t access legitimate pain medication,” he says. “Our findings support a change in policy.”

Source: Rutgers University

Even in protected areas, humans take a toll on wildlife

Tropical mammals living inside protected areas are not spared the effects of human activity even when it occurs outside of the protected boundaries, according to a new study.

By 2030, if the 30 by 30 initiative supported by more than 100 countries is successful, 30% of our land and ocean ecosystems will be designated protected areas meant to safeguard biodiversity and help limit the impacts of climate change.

Based on the largest long-term camera-trap wildlife survey of its kind to date, the new research sheds light on how anthropogenic stressors such as human population density and habitat fragmentation affect 159 mammal species in 16 protected areas across three biogeographic regions.

The study, published in Nature Ecology & Evolution, could inform biodiversity policymaking decisions by 30 by 30 participants.

Comprising millions of images collected over multiple years from over 1,000 camera-trap sites, the data set was assembled by a large-scale network of research stations that agreed to implement a consistent data-collection protocol as part of a partnership between Conservation International, the Wildlife Conservation Society, and the Smithsonian Institution.

“This data set is just phenomenal—it was a herculean effort unlike anything attempted before,” says Lydia Beaudrot, an assistant professor of biosciences at Rice University.

The study found that specialist species—which occupy specific habitats only—thrive when habitat fragmentation is low and are generally more susceptible to the negative impacts of human activities like hunting and land use than generalist species, which are able to live in more diverse habitats.

Thus, a white-bellied pangolin living in the Bwindi Impenetrable National Park in Uganda should shuffle closer to its center, since specialists are likely to fare better the farther inward they are from the edge of a protected area.

“Habitats are more varied at the edge of the protected area,” says lead author Asunción Semper-Pascual, a postdoctoral researcher at the Norwegian University of Life Sciences.

“There is usually this difference between forest cover and open landscape, such as an area used for agriculture, etc. Some generalist species thrive in this kind of diverse setting because it provides access to different resources.”

Generalist species, such as the tayra—a dog-sized omnivore in the weasel family that is at home both under forest cover and in grasslands or cropland—only thrive near the edge of protected areas if human population density there is low.

Understanding species-specific responses to different anthropogenic stressors can help set conservation priorities and guide protected-area management—locally by focusing on the most vulnerable species in a region and globally by highlighting how landscape-scale factors affect biodiversity beyond the protected perimeter.

“We have to think about the situation holistically,” Beaudrot says. “Conservation is going to work best when it’s tackled in specific contexts and in concert with the people who live there so as to create win-win situations for both the people and the wildlife.”

“As more protected areas are created, we need to think carefully about the factors both within and outside protected areas that influence biodiversity,” Semper-Pascual says.

The Research Council of Norway and the National Science Foundation supported the research.

Source: Rice University
