AI could find best meds for high blood pressure

A new artificial intelligence program may help doctors better match people with high blood pressure to the medication most likely to work for them.

For the nearly half of Americans with hypertension, it’s a potential death sentence—close to 700,000 deaths in 2021 were caused by high blood pressure, according to the US Centers for Disease Control and Prevention. It also increases the risk of stroke and chronic heart failure.

But while it’s relatively easy to prevent or moderate if caught early—eat well, exercise more, drink less—it can be tough to treat. Although physicians have a bevy of potential hypertension medications to choose from, each comes with pros and cons, making it a challenge to prescribe the most effective one: beta-blockers slow the heart but can aggravate asthma; ACE inhibitors relax blood vessels but can lead to a hacking cough.

The new data-driven model aims to give clinicians real-time hypertension treatment recommendations based on patient-specific characteristics, including demographics, vital signs, past medical history, and clinical test records.

The model, described in a study published in BMC Medical Informatics and Decision Making, has the potential to help reduce systolic blood pressure—measured when the heart is beating rather than resting—more effectively than the current standard of care.

The program’s approach to transparency could also help improve physicians’ trust in artificial intelligence–generated results, the researchers say.

“This is a new machine learning algorithm leveraging information in electronic health records and showcasing the power of AI in health care,” says Ioannis Paschalidis, professor and director of the Rafik B. Hariri Institute for Computing and Computational Science & Engineering at Boston University. “Our data-driven model is not just predicting an outcome, it is suggesting the most appropriate medication to use for each patient.”

Currently, when choosing which medication to prescribe a patient, a doctor considers the patient’s history, treatment goals, and the benefits and risks associated with specific medicines. Oftentimes, when there are multiple options and none is clearly better or worse than the others, selecting which drug to prescribe can be a bit of a coin toss.

By contrast, the new model generates a custom hypertension prescription using an individual patient’s profile, giving physicians a list of suggested medications with an associated probability of success. The researchers’ aim was to highlight the treatment that best controls systolic blood pressure for each patient based on its effectiveness in a group of similar patients.

“Our goal is to facilitate a personalization approach for hypertension treatment based on machine learning algorithms seeking to maximize the effectiveness of hypertensive medications at the individual level,” Paschalidis says.

The researchers developed the model using de-identified data from 42,752 hypertensive patients of Boston Medical Center (BMC), Boston University’s primary teaching hospital, collected between 2012 and 2020. Patients were sorted into affinity groups, based on similarities of clinically relevant characteristics, such as demographics, past blood pressure records, and past medical history.
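
A minimal sketch of how such an affinity-group recommendation could work appears below. It is illustrative only and not the study’s algorithm: the patient fields, the similarity measure, and the group size are hypothetical assumptions.

```python
# Illustrative sketch of an affinity-group recommendation, NOT the study's
# actual algorithm. Patient fields, the similarity measure, and the group
# size are hypothetical assumptions for demonstration only.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class PatientRecord:
    age: float
    baseline_sbp: float          # systolic blood pressure before treatment
    has_diabetes: bool
    medication: str = ""         # drug class prescribed (historical records only)
    sbp_reduction: float = 0.0   # observed drop in systolic BP (historical records only)


def similarity(a: PatientRecord, b: PatientRecord) -> float:
    """Crude closeness score over a few clinical features (higher = more similar)."""
    return -(abs(a.age - b.age) / 10.0
             + abs(a.baseline_sbp - b.baseline_sbp) / 20.0
             + (0.0 if a.has_diabetes == b.has_diabetes else 1.0))


def recommend(new_patient: PatientRecord,
              history: list[PatientRecord],
              group_size: int = 50) -> list[tuple[str, float]]:
    """Rank drug classes by mean SBP reduction within the patient's affinity group."""
    group = sorted(history, key=lambda p: similarity(new_patient, p),
                   reverse=True)[:group_size]
    outcomes: dict[str, list[float]] = defaultdict(list)
    for p in group:
        outcomes[p.medication].append(p.sbp_reduction)
    ranked = [(drug, sum(v) / len(v)) for drug, v in outcomes.items()]
    return sorted(ranked, key=lambda x: x[1], reverse=True)
```

In the study itself, each suggested medication also comes with an associated probability of success; the simple ranking above merely stands in for that output.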

During the study, the model’s effectiveness was compared to the current standard of care, as well as three other algorithms designed to predict appropriate treatment plans. The researchers found it achieved a 70.3% larger reduction in systolic blood pressure than the standard of care and performed 7.08% better than the second-best model. The algorithm was clinically validated, with the researchers manually reviewing a random sample of 350 cases.

The model also showed the benefits of de-prescribing—reducing or stopping prescriptions for some patients taking multiple medications. According to the researchers, because the algorithm provides physicians with several suggested optimal therapies, it could give valuable insights when the medical community is divided on the effectiveness of one drug versus another, a situation known as clinical equipoise.

“These advanced predictive analytics have the ability to augment a clinician’s decision making and to have a positive impact on the quality of care we deliver, and therefore the outcomes for our patients,” says Rebecca Mishuris, who was previously an assistant professor at Boston University Chobanian & Avedisian School of Medicine and is now Mass General Brigham’s chief medical information officer. “This is an important first step that shows that these models actually perform better than standard of care, and could help us be better doctors.”

While many recognize that machine learning’s ability to handle large amounts of data and uncover patterns and correlations could benefit healthcare, its adoption has been limited, in part due to difficulties interpreting the results—and because of low levels of trust in artificial intelligence.

In the past, machine learning in health care has also been hampered by incomplete or inaccurate data, as well as sparse patient histories, which can skew prediction results. An important aspect of this study was to ensure data was transparent and that clinicians—particularly those without technical expertise—clearly understood how the algorithm worked, and how and why the model proposed specific therapeutic recommendations.

“Using data from the diverse patient population of Boston Medical Center, this model provides the opportunity to tailor care for underrepresented populations, with individualized recommendations to improve outcomes for these patients,” says Nicholas J. Cordella, assistant professor at the Chobanian & Avedisian School of Medicine and medical director for quality and patient safety at Boston Medical Center.

“Personalized medicine and models like this are an opportunity to better serve populations that aren’t necessarily well represented in the national studies or weren’t taken into account when the guidelines were being made.”

The National Science Foundation funded the study.

Source: Boston University

Computer problems are a big waste of time

On average, people waste between 11% and 20% of their time in front of computers that don’t work properly or that are so difficult to use that they can’t perform the task they want to, a new study shows.

The findings show there are major gains to be achieved for society by rethinking systems and involving users more in their development, the researchers say.

“It’s incredible that the figure is so high. However, most people experience frustration when using computers and can tell a horror story about an important PowerPoint presentation that was not saved or a system that crashed at a critical moment,” says Kasper Hornbæk, a professor at the University of Copenhagen.

“Everyone knows that it is difficult to create IT systems that match people’s needs, but the figure should be much lower, and one thing that it shows is that ordinary people aren’t involved enough when the systems are developed,” he says.

Most frustrations are experienced in connection with the performance of completely ordinary tasks, says Morten Hertzum, a professor at Roskilde University.

“The frustrations are not due to people using their computers for something highly advanced, but because they experience problems in their performance of everyday tasks. This makes it easier to involve users in identifying problems. But it also means that problems that are not identified and solved will probably frustrate a large number of users.”

The study included 234 participants who spend between six and eight hours in front of a computer in their day-to-day work. Over the course of one hour, the researchers asked them to report every situation in which the computer did not work properly or in which they were frustrated at not being able to perform a task they wanted to do.

The problems the participants most often experienced included that: “the system was slow,” “the system froze temporarily,” “the system crashed,” or “it is difficult to find things.” The participants had backgrounds such as student, accountant, or consultant, but several of them actually worked in the IT industry.

“A number of the participants in the survey were IT professionals, while most of the other participants were highly competent IT and computer users. Nevertheless, they encountered these problems, and it turns out that this involves some fundamental functions,” Hornbæk says.

The participants in the survey also responded that 84% of the episodes had occurred before and that 87% of the episodes could happen again. And, according to Hornbæk, we are having the same fundamental problems today that we had 15 to 20 years ago.

“The two biggest categories of problems are still about insufficient performance and lack of user-friendliness,” he says.

“Our technology can do more today, and it has also become better, but, at the same time, we expect more from it. Even though downloads are faster now, they are often still experienced as frustratingly slow,” Hertzum says.

According to Statistics Denmark, 88% of Danes used computers, laptops, smartphones, tablets, or other mobile devices at work in 2018. In this context, the new study indicates that a half to a whole day of a normal working week may be wasted on computer problems.
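
A rough back-of-the-envelope check of that estimate, using the six to eight daily computer hours reported in the study; the five-day week and 7.4-hour working day below are illustrative assumptions rather than figures from the paper.

```python
# Rough arithmetic behind the "half to a whole day per week" estimate.
# The five-day week and the 7.4-hour working day are illustrative assumptions;
# the study itself reports only the 11-20% range and the 6-8 daily hours.
for daily_hours in (6, 8):
    for wasted_share in (0.11, 0.20):
        wasted_per_week = daily_hours * 5 * wasted_share
        print(f"{daily_hours} h/day at {wasted_share:.0%} wasted "
              f"-> {wasted_per_week:.1f} h/week "
              f"(~{wasted_per_week / 7.4:.1f} working days)")
```

The low end works out to roughly three to four hours a week and the high end to about a full working day, which is where the half-day-to-whole-day range comes from.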

“There is a lot of productivity lost in workplaces throughout Denmark because people are unable to perform their ordinary work because the computer is not running as it should. It also causes a lot of frustrations for the individual user,” Hornbæk says.

This means that there are major benefits to be gained for society if we experience fewer problems in front of our computers. The gains could be achieved, for example, by investing more resources in rethinking how faults are presented to us on the computer, he says.

“Part of the solution may be to shield us from knowing that the computer is working to solve a problem. In reality, there is no reason why we need to look at an incomprehensible box of commands or a frozen computer. The computer could easily solve the problems without displaying this, while providing a back-up version of the system so that we could continue working on our tasks undisturbed,” Hornbæk says.

At the same time, IT developers should involve the users even more when designing the systems to make them as easy to use—and understand—as possible. There are no poor IT users, only poor systems, he says.

“When we’re all surrounded by IT systems that we’re cursing, it’s very healthy to ascertain that it’s probably not the users that are the problem, but those who make the systems. The study clearly shows that there is still much room for improvement, and we therefore hope that it can create more focus on making more user-friendly systems in the future.”

The study appears in the journal ACM Transactions on Computer-Human Interaction.

Source: University of Copenhagen

Access to prescription opioids may reduce overdose deaths

Increasing access to prescription opioid painkillers may reduce opioid overdose deaths in the United States, according to a new study.

“When access to prescription opioids is heavily restricted, people will seek out opioids that are unregulated,” says Grant Victor, an assistant professor in the Rutgers School of Social Work and lead author of the study  in the Journal of Substance Use and Addiction Treatment.

“The opposite may also be true; our findings suggest that restoring easier access to opioid pain medications may protect against fatal overdoses.”

America’s opioid crisis has evolved across several waves, each more deadly than the last. Wave one, which began in the 1990s, was associated with overdose deaths from the misuse of prescription opioid medications.

A policy implemented during the initial wave created prescription drug monitoring programs (PDMPs), state-based initiatives that track controlled substance prescribing. While the policy made it more difficult to access prescription opioids and rates of prescribing did decrease, it had the unintended consequence of pushing people toward off-market opioids, raising the risk of accidental death, Victor says.

This led to wave two of the crisis, a surge in heroin-related deaths, beginning around 2010, followed by wave three (which started in 2013), fueled by synthetic opioids such as fentanyl.

To measure trends and sociodemographic disparities in access to buprenorphine—a common treatment for opioid use disorder—and opioid painkillers, the researchers examined toxicology data, death records, and available PDMPs from 2,682 accidental overdose deaths that occurred from 2016 to 2021 in Indianapolis, Indiana.

The researchers found that fewer than half of all decedents (43.3%) had a PDMP record of any kind, meaning most had never even tried to access prescription opioids. Of the 10.6% who had been prescribed buprenorphine, most (64.7%) were prescribed the treatment more than 30 days prior to death, suggesting they were not actively in treatment at the time.
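
A small sketch of the kind of tabulation this implies is below; the record fields are hypothetical assumptions standing in for the study’s linked PDMP, toxicology, and death-record data, not its actual data model.

```python
# Illustrative tabulation only, NOT the study's code. The record fields below
# are hypothetical assumptions standing in for linked PDMP/toxicology data.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Decedent:
    has_pdmp_record: bool                        # any controlled-substance prescription on file
    bup_days_before_death: Optional[int] = None  # None if buprenorphine was never prescribed


def summarize(decedents: list[Decedent]) -> dict[str, float]:
    n = len(decedents)
    with_record = sum(d.has_pdmp_record for d in decedents)
    bup = [d for d in decedents if d.bup_days_before_death is not None]
    bup_over_30 = [d for d in bup if d.bup_days_before_death > 30]
    return {
        "pct_any_pdmp_record": 100.0 * with_record / n,
        "pct_prescribed_buprenorphine": 100.0 * len(bup) / n,
        "pct_of_those_prescribed_30plus_days_before_death":
            100.0 * len(bup_over_30) / len(bup) if bup else 0.0,
    }
```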

Victor and collaborators also found racial disparities in buprenorphine and opioid prescription trends: Black decedents accounted for a far smaller share of the prescriptions than white decedents (7.3% versus 92.7% for buprenorphine, and 21.9% versus 77.7% for opioid painkillers).

“Buprenorphine uptake is associated with significantly reduced rates of nonfatal and fatal overdose,” the researchers write. “Despite these positive treatment outcomes, several barriers remain to the widespread uptake of [medications for opioid use disorder] in the United States,” such as stigma and cost.

“For these reasons, a lack of adequate buprenorphine prescribing, combined with reductions in the availability of opioid analgesics, have left individuals contending with [opioid use disorder] at an elevated risk of overdose,” the researchers conclude.

Given these trends and past research, Victor says it is time to re-evaluate policies that make it nearly impossible to obtain opioid prescriptions, even for those with a legitimate need.

“A big reason that we have such a problem with addiction in this country is because people can’t access legitimate pain medication,” he says. “Our findings support a change in policy.”

Source: Rutgers University

Even in protected areas, humans take a toll on wildlife

Tropical mammals living inside protected areas are not spared the effects of human activity even when it occurs outside of the protected boundaries, according to a new study.

By 2030, if the 30 by 30 initiative supported by more than 100 countries is successful, 30% of our land and ocean ecosystems will be designated protected areas meant to safeguard biodiversity and help limit the impacts of climate change.

Based on the largest long-term camera-trap wildlife survey of its kind to date, the new research sheds light on how anthropogenic stressors such as human population density and habitat fragmentation affect 159 mammal species in 16 protected areas across three biogeographic regions.

The study, published in Nature Ecology and Evolution, could inform biodiversity policymaking decisions by 30 by 30 participants.

Comprising millions of images collected over multiple years from over 1,000 camera-trap sites, the data set was assembled by a large-scale network of research stations that agreed to implement a consistent data-collection protocol as part of a partnership between Conservation International, the Wildlife Conservation Society, and the Smithsonian Institution.

“This data set is just phenomenal—it was a herculean effort unlike anything attempted before,” says Lydia Beaudrot, an assistant professor of biosciences at Rice University.

The study found that specialist species—which occupy specific habitats only—thrive when habitat fragmentation is low and are generally more susceptible to the negative impacts of human activities like hunting and land use than generalist species, which are able to live in more diverse habitats.

Thus, a white-bellied pangolin living in the Bwindi Impenetrable National Park in Uganda should shuffle closer to its center, since specialists are likely to fare better the farther inward they are from the edge of a protected area.

“Habitats are more varied at the edge of the protected area,” says lead author Asunción Semper-Pascual, a postdoctoral researcher at the Norwegian University of Life Sciences.

“There is usually this difference between forest cover and open landscape, such as an area used for agriculture, etc. Some generalist species thrive in this kind of diverse setting because it provides access to different resources.”

Generalist species, such as the tayra—a dog-sized omnivore in the weasel family that is at home both under forest cover and in grasslands or cropland—only thrive near the edge of protected areas if human population density there is low.

Understanding species-specific responses to different anthropogenic stressors can help set conservation priorities and guide protected-area management—locally by focusing on the most vulnerable species in a region and globally by highlighting how landscape-scale factors affect biodiversity beyond the protected perimeter.

“We have to think about the situation holistically,” Beaudrot says. “Conservation is going to work best when it’s tackled in specific contexts and in concert with the people who live there so as to create win-win situations for both the people and the wildlife.”

“As more protected areas are created, we need to think carefully about the factors both within and outside protected areas that influence biodiversity,” Semper-Pascual says.

The Research Council of Norway and the National Science Foundation supported the research.

Source: Rice University

Heating method may get PFAS off wastewater filters

A new method could break down PFAS on wastewater treatment filters.

In a recent study, researchers demonstrate an innovative method using thermal induction heating to rapidly break down PFAS left on the surface of two solid materials—granular activated carbon and anion exchange resins—after these materials have been used to filter PFAS from municipal water systems. The goal is to clean the materials before they are properly disposed of.

PFAS is a group of synthetic chemicals commonly found in household and industrial products such as firefighting foam, food packaging, and nonstick cookware. The method is based on the Joule heating effect, which uses the process of electromagnetic induction inside a metallic reactor.

“In this study, we explored the use of an engineering technique used to melt metals,” says Feng “Frank” Xiao of the University of Missouri. “This method produced 98% degradation of PFAS on the surface of adsorbents like granular activated carbon and anion exchange resins after just 20 seconds, which makes this process highly energy efficient and much faster than conventional methods.”

In recent years, experts have raised concerns about the risks to human health from environmental exposure to PFAS, including development of cancer and other serious health issues. Xiao, whose appointment is in the department of civil and environmental engineering, says that while PFAS can be filtered out of water using adsorbents, the disposal of used or “spent” adsorbents also creates issues of environmental contamination.

“Since the group of chemicals known as PFAS generally resist degradation, they pose considerable challenges to established treatment processes, including the waste disposal practices for materials used as filters like granular activated carbon and anion exchange resins,” Xiao says.

Xiao has spent his career focused on researching ways to safely remove PFAS from the environment, including recently demonstrating similar efficiency with the use of induction heating to rapidly degrade PFAS in soil. He says the current study also draws inspiration from recent proposed regulation by the Environmental Protection Agency (EPA) that, if finalized, would require public water systems in the United States to monitor and reduce PFAS contamination in drinking water and spent adsorbents.

Potential drawbacks of this method include by-products created during the process—organic fluorinated species and hydrogen fluoride. While these by-products are considered toxic if inhaled or ingested, Xiao has a solution.

“If the gaseous organic fluorinated products are not degraded during induction heating, abatement treatment will be necessary to remove or degrade them,” Xiao says. “However, based on my previous studies, some of these products can be degradable by regular thermal approaches. Simultaneously, the generation of hydrogen fluoride is increased, which is desirable because it means greater mineralization, or decomposition, of PFAS. We’ve found hydrogen fluoride can be removed simply using clay or soil at moderate temperatures.”

The study, “Thermal phase transition and rapid degradation of forever chemicals (PFAS) in spent media using induction heating,” appears in the journal ACS ES&T Engineering. It had funding from the US National Science Foundation CAREER Program, the US Department of Defense SERDP, and the US Geological Survey. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.

Source: University of Missouri

Does preschool shortchange kids from working-class backgrounds?

Preschoolers from middle- and upper-class backgrounds are more likely to participate in classroom discussions than are equally capable students from working-class backgrounds, a study finds.

The new study of preschoolers in France also shows that these differences may shape how students are perceived by their peers.

The results, which appear in the Journal of Experimental Psychology: General, shed new light on the persistent and early emerging disparities in education linked to socioeconomic status (SES).

“While preschool attendance has been shown to be beneficial for low-SES students’ achievement, our results suggest that early childhood education is not currently maximizing its potential as an equalizing force,” says lead author Sébastien Goudeau, an assistant professor at Université de Poitiers.

“Early schooling contexts provide unequal opportunities for engagement to children in ways linked to their socioeconomic status, which could serve to maintain or even exacerbate social class disparities in achievement,” says coauthor Andrei Cimpian, a professor in the psychology department at New York University.

“These and other findings call for redesigning aspects of early childhood in ways that foster engagement among all students, regardless of their social class.”

Preschooler engagement

Previous research has primarily focused on deficits in low-SES parents’ knowledge, practices, or resources to explain disparities found in early childhood education. The new study examined how schooling itself at this age might be shortchanging children from lower-income backgrounds.

In doing so, the researchers examined students’ behavioral engagement during whole-class discussions—a core part of the preschool curriculum in Europe and North America.

One study included nearly 100 preschoolers, who were anonymous to the researchers, across four classrooms of Grande Section—the last year in French preschools before first grade—in France’s Nouvelle-Aquitaine region. The classrooms selected had a high degree of SES variability among the students as determined by their parents’ occupation. The researchers videotaped whole-classroom discussions—ranging from eight to 19 in each classroom—and recorded the frequency and duration of each child’s participation.

The results showed that low-SES students spoke less frequently and for less time compared to high-SES students. Notably, these differences were not accounted for by SES differences in oral language proficiency, indicating that low-SES students did not talk less because they lacked the proficiency to do so.

Classmate perceptions

In a second study, the authors sought to understand how preschool children perceive differences among their peers in their levels of school engagement. To do so, they drew a new group of Grande Section participants from the same region; it included nearly 100 preschool students across five classrooms.

To determine the children’s perceptions of their classmates, the researchers posed scenarios involving fictional students aimed at surfacing the students’ views of the types of students who are called upon and who speak longer than others.

For instance, “When the teacher asks the class a question, several children raise their hands. However, the teacher calls on [Theodore/Zélie] more often than other children.” After each scenario, children were asked to explain the protagonist’s behavior: for instance, “Why do you think [Theodore/Zélie] is called on more often than other children?”

The research team then coded the open-ended responses the children provided, looking in particular for whether children mentioned inherent factors having to do with the protagonist’s own characteristics (e.g., “because she/he is smart,” “because she/he has a lot to tell”) or extrinsic factors having to do with the protagonist’s background or the classroom context (e.g., “because the teacher likes her/him,” “because the other children are disobedient”).

For each scenario, after the open-ended explanation question, children were also asked to evaluate the fictional student along the two fundamental dimensions of social judgments: competence and warmth. These included perceived intelligence (“Do you think [the fictional child] is more intelligent than the other children, or less intelligent than the other children?”) as well as how they thought the teacher viewed the fictional student (“Do you think the teacher likes [the child] more than the other children, or less than the other children?”). These comparisons were made with the fictional student’s classmates in mind.

Overall, the fictional child who made frequent and longer contributions to classroom discussions was perceived as possessing more positive characteristics than other children in their class.

“Preschoolers explained differences in engagement during whole-class discussions as a consequence of children’s inherent characteristics, including their competence and warmth,” says Cimpian.

“These results suggest that the patterns of school engagement typical of middle- and high-SES students increase the extent to which they are valued by their preschool peers and—conversely—may undermine low-SES students’ psychological experiences.”

Additional coauthors are from Northwestern University’s Kellogg School of Management and Stanford University’s psychology department.

Source: NYU

What should we make of Russia’s brief mutiny?

Following the aborted uprising in Russia, Sovietologist Marcia Beck gauges President Vladimir Putin’s power base, the motives of the mercenary commander, and the ramifications for the nation.

In an emergency televised address to the Russian people on June 24, as Yevgeny Prigozhin’s private army of mercenaries rumbled nearly 500 miles toward Moscow on its “march for justice,” Russian President Vladimir Putin denounced the traitors, vowed punishment, and compared the scenario to the turmoil that resulted in the Russian Revolution.

“A blow like this was dealt to Russia in 1917 when the country was fighting in World War I. But the victory was stolen from it: Intrigues, squabbles, and politicking behind the backs of the army and the nation turned into the greatest turmoil, the destruction of the army and collapse of the state, and the loss of vast territories,” said Putin, who vowed that “this will not happen again.”

Beck, a political scientist in the College of Arts and Sciences at the University of Miami, highlights that the starting point for understanding Putin’s reference is that Prigozhin’s Wagner Group mutiny involved Russians shedding the blood of Russians.

“Putin’s reference is to the Russian Revolution and the ensuing civil war in which Russians brutally shed the blood of other Russians, and which eventually led to declarations of independence on the part of the Russian Empire’s subjugated peoples, including the Ukrainians in 1918, and Russia losing three of its Baltic territories, most of Belarus, and all of Ukraine,” she says.

“What he didn’t mention—but what must be foremost on his mind—is that the civil strife resulted in the overthrow of Russia’s last tsar, Nicholas II,” Beck adds. “That is the outcome that Putin most fears from the reverberations of Prigozhin’s insurrection, regardless of any deal that ended it for the moment.”

While Putin later negotiated an end to the uprising and granted immunity to Prigozhin and his troops, Beck doubts that he would allow the Wagner Group paramilitaries to continue as a force, at least in Russia.

“That would be too much of a threat to Putin at this point, especially after the positive reception its members received upon entering Rostov-on-Don. It’s especially notable that no Russians came out on the streets to show support for Putin or opposition to Prigozhin or the Wagner revolt,” she notes.

Most Russians, too cynical about power in Russia and about the ability of any independent group to break through the power structure, tend to react with apathy to any machinations within Putin’s “power vertical.”

“Putin thus can’t afford to let any independent group, even one dependent on him for funding and resources, wield the kind of influence and capability of challenging his power that the Wagner revolt exhibited,” Beck says.

Prigozhin, an oligarch who made his fortune in the food industry in the 1990s, was known as “Putin’s chef” after Putin came to power because of all the lucrative catering and other contracts he secured with Russian state institutions. He formed the Wagner Group of mercenaries in 2014, when Russia annexed Ukraine’s Crimea region and fomented unrest among pro-Russian separatists in the eastern Ukrainian regions of Donetsk and Luhansk, Beck explains.

Since that time, he has remained in the shadows and in fact did not acknowledge his leadership of the Wagner Group until after Russia invaded Ukraine in 2022.

In comments Monday, Prigozhin maintained that he never intended to overthrow Putin or challenge the state. He said the “march for justice” was in retaliation for a missile strike on one of his militia’s camps and was meant to save his private army from being subsumed into the Russian military. His soldiers had been ordered to sign contracts integrating them into the Russian army by July 1.

Prigozhin has been increasingly outspoken in his criticism of how the Russian war in Ukraine is being executed—poor planning, faulty logistics, lack of equipment and supplies, and incompetent command structure. His vociferous and very public criticisms have been directed at Defense Minister Sergei Shoigu and Chief of the General Staff of the Russian Armed Forces Valery Gerasimov.

Though managed for now, the mutiny poses the most significant threat to Putin’s rule in his 23 years in power. “Whatever happens in the short term, this is likely the beginning of the end of the Putin era,” Beck says.

“Putin has basically eviscerated all state institutions and structures, to the point where they serve only his interest and the interests of those clan members dependent on him,” she adds. “There are no independent institutions of state that have, so far, been capable of rising above the personal interests of Putin or his clan dependents to truly act ‘in the interests of the state.’ ”

The aborted uprising could call into question Putin’s hold on power, which up to this point has been based on a combination of oppression and co-optation.

“Putin either has his enemies killed or jailed. He retains power by managing the various clans that dominate Russia today; taking advantage of the many conflicts among the clans and between their strongest personalities; and doling out state, industry, and economic positions as perks for those who remain loyal to him,” Beck points out.

The Russian leader has continued to maintain power by strengthening his power vertical—all potential opponents are completely dependent on him to maintain their positions, their wealth, and their influence.

Does Prigozhin pose a political threat?

“It’s very unlikely, and we should hope that it never happens,” Beck says. “Prigozhin is a thug in the worst meaning of that term. The man known as ‘Putin’s personal banker,’ the billionaire Yuri Kovalchuk, is supposedly Russia’s ‘second most powerful man,’ but it’s open to question whether he or anyone else could ever control the competing clan members in the way Putin has. Putin has no designated successor and there are no institutional foundations within which a viable successor could have risen through the ranks.”

Beck highlights that Russians historically are known to have a deep and almost visceral fear of disorder and chaos in their own land.

“That’s one reason the broader population tends to support autocratic rule. With Putin’s weaknesses so clearly exposed by the Wagner mercenary revolt, that fear may once again prove justified,” she says. “With no institutional foundation for determining a successor to Putin, the clan battles may well result in Russians fighting Russians as the modern-day tsar suffers an ignominious defeat.”

Source: University of Miami

Work requirements for SNAP don’t get people into jobs

Work requirements for SNAP benefits don’t boost employment, a study finds.

As the negotiations to extend the federal debt ceiling neared the early June deadline, a notable change concerning SNAP benefits, commonly known as food stamps, was inserted into the final measure. Work requirements, which have been part of the program since the 1990s, would be imposed on a larger swath of beneficiaries, just as the House-passed bill had mandated. But in the final version of the Fiscal Responsibility Act of 2023, signed into law on June 3, homeless recipients would be exempt.

That was good news to Elena Prager, an assistant professor of economics at the University of Rochester’s Simon Business School. Prager was part of a team of economists whose research shows that SNAP work requirements have a disproportionate, adverse effect on homeless recipients.

Their study was published in February in American Economic Journal: Economic Policy. The group reports that SNAP work requirements have had “no effects on employment.” Instead, the team shows that the requirements have led to a reduction in beneficiaries, chiefly among homeless recipients with limited to no employment history.

“I don’t know for sure whether our paper caused [the exemption],” Prager says, “but as far as I know, we were the first research team to document that work requirements disproportionately impact homeless people. And it’s pretty plausible that people working in Congress who have been reading our work and reaching out to us made that happen.”

The federal Supplemental Nutrition Assistance Program dates back to the Great Depression. Then called the Food Stamp Program, it has become one of the largest antipoverty programs in the United States, distributing, the researchers note, nearly $70 billion to more than 45 million people, or 14% of the nation’s population, as of 2015. The program provides subsidies that recipients can use exclusively for food.

The program instituted work requirements for most nondisabled recipients in 1996, with passage of the Personal Responsibility and Work Opportunity Reconciliation Act. The rationale for work requirements, as articulated in the bill, was that they would help unemployed recipients make a transition into the paid labor force. Once in the paid labor force, recipients would gain skills, leading to a full-time living wage—and an end to SNAP dependency.

The assumption behind the policy, says Prager, “is that without the requirements, SNAP will discourage people from working because their food needs are already partially taken care of, or because making too much money would mean losing their SNAP eligibility.” In that case, instituting work requirements would increase employment. But the research shows instead “that for SNAP recipients who do not have a job, the barriers to working are something other than a lack of work requirements.”

Because work requirements have been the norm in SNAP for almost three decades now, it was a challenge to design a study that would show what a world without SNAP work requirements would look like. But the Great Recession provided an opportunity.

The federal response to the recession, the American Recovery and Reinvestment Act, included a waiver of SNAP work requirements, which remained in force until November 2013. Using Virginia as a case study, Prager and her coauthors used administrative data from the Virginia Department of Social Services spanning 2007 to 2015. For each SNAP recipient, they collected demographic information, disability and employment status, housing type, earned and unearned income, and the first and last calendar months of SNAP participation periods.

The researchers found that the reintroduction of work requirements caused a quarter of recipients subject to them to lose their SNAP benefits. The loss of benefits was especially acute among those who “prior to the reinstatement of work requirements, are homeless or have no earned income.”

Prager notes that there are documented health effects from the loss of SNAP benefits. But she says there are additional, unanswered questions. For example, does the loss of benefits, which would place added strain on a recipient’s budget, have any impact on evictions? “We know that homeless recipients are disproportionately affected by work requirements,” she says. “We don’t know whether work requirements actually cause more homelessness.”

There’s also the question of whether the policy simply shifts costs.

“You spend less money on SNAP benefits because people leave the program, but you probably create social costs elsewhere,” Prager says.

Coauthors of the study are from MIT, Harvard, the University of California, Berkeley, and the University of Maryland.

Source: University of Rochester

Common blood condition may protect against Alzheimer’s

Researchers have found that a common blood condition associated with several diseases may have a protective effect against Alzheimer’s disease.

In the condition, clonal hematopoiesis of indeterminate potential, or CHIP, certain blood stem cells acquire mutations that strengthen their ability to survive and multiply.

As a result, the mutant cells dominate, and just a few cells can give rise to much or even all of the body’s blood and immune cells. In most cases of CHIP, a dominant blood stem cell gives rise to between 4% and 30% of blood and immune cells.

Studies by Stanford University Medicine assistant professor of pathology Siddhartha Jaiswal and others have shown that people with CHIP are at much higher risk of developing various diseases. By analyzing medical databases and stored blood samples, Jaiswal and his colleagues have shown that people with CHIP are about twice as likely to develop coronary heart disease, twice as likely to develop chronic liver disease, and 10 times as likely to develop blood cancers such as myeloid leukemias.

Researchers don’t yet fully understand why CHIP is linked to diseases other than blood cancer, though some studies have suggested that CHIP mutations cause increased activation of the immune system.

Jaiswal and his colleagues investigated an association between CHIP and Alzheimer’s disease, expecting to see either no association or a positive association with Alzheimer’s disease.

He and colleagues analyzed a cohort of participants who had been followed over a period of 10 to 15 years, examining their medical records and stored blood samples.

“We were surprised to find that CHIP was actually associated with a substantially lower risk of Alzheimer’s disease,” Jaiswal says. People with CHIP were found to have a 30% to 50% lower risk of developing the neurodegenerative disorder, compared with those who did not have the CHIP mutation, he says.

“The degree of protection from Alzheimer’s dementia seen in CHIP carriers is similar to carrying an APOE ε2 allele,” says Jaiswal, referring to a genetic variant that’s known to decrease the risk of Alzheimer’s.

The team saw the negative association between CHIP and Alzheimer’s even when they accounted for other risk factors. “We thought there might be some kind of survivor bias—that people with CHIP were more likely to die before developing Alzheimer’s disease—but the decrease in risk still held after adjusting for that,” Jaiswal says. They also analyzed the association in another way, to see if people who had Alzheimer’s disease were less likely to have CHIP. The researchers confirmed that they were.

A paper in Nature Medicine details the findings.

Of course, an association doesn’t mean that there is a cause-and-effect relationship. So, Jaiswal and his colleagues conducted different forms of genetic analyses, finding evidence that CHIP could causally inhibit the development of Alzheimer’s.

The connection between CHIP and Alzheimer’s implied a somewhat unexpected link between brain cells and the cells that give rise to blood and the immune system, Jaiswal says.

The brain has its own immune cells, called microglia, and there is some evidence that microglia combat the inflammation and buildup of toxins that are associated with Alzheimer’s disease. But over the last 10 years, scientists have come to believe that all microglia become established in the brain very early in human development, during the embryo stage. Since the blood-brain barrier normally prevents blood cells from crossing into the brain, there has been no known way for blood stem cells to become brain cells like microglia.

Yet, when Jaiswal and his colleagues looked at brain samples of people with CHIP for the CHIP-associated blood cell mutations, they saw them. And these mutations seem to exist in microglia cells. The investigators found that between 30% and 90% of the microglia in brain samples of those with CHIP harbored the CHIP mutations. The proportion of mutant microglia in any individual brain tended to match the proportion of mutant blood cells in the rest of the body.

“This suggests that cells are migrating from the blood into the brain,” says Jaiswal, adding that the finding is in contradiction to the accepted dogma. “It’s a remarkable finding.”

How this affects the development of Alzheimer’s disease is not yet clear, but researchers know that microglia help fight microbial invaders and clean up waste products in the brain. “One hypothesis is that the mutations that promote a growth advantage in blood stem cells also promote microglial expansion and activity, boosting microglia’s ability to fight the conditions that lead to brain disease,” Jaiswal says.

The scientists also saw that, in the brain samples of people with CHIP, levels of neurofibrillary tangles and amyloid plaques, both associated with Alzheimer’s disease and thought to be causative, were lower.

Jaiswal and his colleagues are planning follow-up studies to learn more about how the mutated microglia might be protecting against Alzheimer’s disease. Jaiswal plans to work with professor of pathology Marius Wernig to transform CHIP blood stem cells into microglia so they can understand how these cells behave differently than normal microglia.

Although there’s much work to be done, the researchers hope that understanding these mechanisms could help guide the development of new therapeutics that could one day protect against Alzheimer’s disease.

Source: Stanford

Dog breeds differ in pain sensitivity, but not how vets may expect

Dog breeds differ in pain sensitivity, but these differences don’t always match up with the beliefs people—including veterinarians—hold about breed-specific pain sensitivity.

The results appear in a new study, which also found that a dog’s temperament (specifically, the way it interacts with strangers) may influence the way veterinarians view breed pain sensitivity.

“Veterinarians have a fairly strong consensus in their ratings of pain sensitivity in dogs of different breeds, and those ratings are often at odds with ratings from members of the public,” says Margaret Gruen, associate professor of behavioral medicine at North Carolina State and co-corresponding author of a paper describing the research.

“So we wanted to know—first—is any of it true? If we take 15 dogs of 10 breeds rated as high, medium, and low sensitivity and test their sensitivity thresholds, would we see differences, and if so, would they be consistent with what veterinarians believe? Or is it possible that these views are the result of a dog’s emotional reactivity and behavior while interacting with a veterinarian?”

To answer these questions, the researchers looked at both male and female healthy adult dogs from 10 breeds subjectively rated by veterinarians as having high (chihuahua, German shepherd, Maltese, Siberian husky), average (border collie, Boston terrier, Jack Russell terrier), or low (golden retriever, pit bull, Labrador retriever) pain sensitivity. A total of 149 dogs participated in the study.

To measure pain sensitivity, the researchers looked to methods used in human clinical medicine.

“Reactivity to external stimuli is a measure commonly used in neurology and pain clinics for humans,” says Duncan Lascelles, professor of translational pain research and co-corresponding author of the work. “We have adapted these measures for pet dogs and used them in this study.”

Each dog’s sensitivity to pressure and temperature was tested by pressing a pressure tool (think of both ends of a ballpoint pen—pointed and blunt) and then a warm thermal probe against the top of the back paw. The stimulus was withdrawn immediately when the dog moved its paw. Each test was repeated five times, and the results were used to measure sensitivity.
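
As a rough sketch of how such repeated withdrawal tests could be turned into per-dog and per-breed scores, consider the following. Averaging the five trials per dog and then averaging within each breed are simplifying assumptions for illustration, not the study’s analysis.

```python
# Illustrative aggregation only, NOT the study's analysis. Averaging the five
# trials per dog and then averaging within each breed are simplifying assumptions.
from collections import defaultdict
from statistics import mean


def dog_threshold(trial_values: list[float]) -> float:
    """Mean stimulus level (e.g., force or probe temperature) at which
    the dog withdrew its paw across repeated trials."""
    return mean(trial_values)


def breed_thresholds(dogs: list[dict]) -> dict[str, float]:
    """Average the per-dog thresholds within each breed.
    Each dog is a dict like {"breed": "Maltese", "trials": [1.2, 1.4, 1.1, 1.3, 1.2]}."""
    per_breed: dict[str, list[float]] = defaultdict(list)
    for dog in dogs:
        per_breed[dog["breed"]].append(dog_threshold(dog["trials"]))
    return {breed: mean(vals) for breed, vals in per_breed.items()}

# Lower mean thresholds indicate quicker withdrawal, i.e. higher pain sensitivity.
```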

The researchers also conducted two tests of emotional reactivity that were designed to see how the dogs reacted to unfamiliar things or people and to mimic some of the stressful aspects of a visit to the vet: the novel object test and the “disgruntled stranger” test. The novel object was a stuffed monkey that moved and made noise. The disgruntled stranger was a person involved in a loud phone conversation prior to noticing and calling the dog over.

The sensitivity test results were compared to questionnaires that veterinarians and the general public had filled out on breed pain sensitivity.

The researchers found that there are real breed differences in pain sensitivity thresholds, but that those differences don’t always match up with rankings from veterinarians.

For example, Maltese tended to have high pain sensitivity, or a low pain tolerance, which meant they reacted quickly to the pressure and temperature stimuli. This finding was in line with how veterinarians ranked them.

However, veterinarians also thought Siberian huskies were highly sensitive—but test results placed huskies in the mid-range. In fact, several of the larger breeds veterinarians ranked as sensitive actually had an average-to-high pain tolerance.

The researchers note that dogs who were less likely to engage in the novel object and disgruntled stranger scenarios were also sometimes rated as having a lower pain tolerance, which raises the question of whether an animal’s stress level and emotional reactivity at a vet visit could influence a veterinarian’s pain tolerance rating for that breed.

“These behavioral differences might explain the different veterinarian ratings, but not actual pain tolerance between breeds,” says Lascelles. “This study is exciting because it shows us that there are biological differences in pain sensitivity between breeds. Now we can begin looking for potential biological causes to explain these differences, which will enable us to treat individual breeds more effectively.”

“On the behavioral side, these findings show that we need to think about not just pain, but also a dog’s anxiety in the veterinary setting,” Gruen says. “And they can help explain why veterinarians may think about certain breeds’ sensitivity the way they do.”

The research appears in Frontiers in Pain Research. Researchers from Kansas State University and Brigham Young University also contributed to the work.

Support for the work came from the American Kennel Club Canine Health Foundation.

Source: NC State
