Russia blames Luna-25 crash on computer glitch

WASHINGTON — Russia says its first lunar lander mission in nearly half a century crashed in August because of faulty commands in an onboard computer during a maneuver.

The Luna-25 spacecraft crashed Aug. 19 during a maneuver to lower the spacecraft’s orbit around the moon to set up a landing planned for two days later. Roscosmos said the spacecraft suffered an “emergency condition” that caused its main engine to fire for 127 seconds instead of the planned 84.

In an Oct. 3 statement posted on social media, Roscosmos said that the most likely cause of the crash was “abnormal functioning” of the onboard computer. Specifically, the computer failed to turn on an accelerometer in a device called BIUS-L, which measures the angular velocity of the spacecraft.

As a result, “the on-board control complex received zero signals from the accelerometers of the BIUS-L device,” according to a translation of the Roscosmos statement. “This did not allow, when issuing a corrective pulse, to record the moment the required speed was reached and to timely turn off the spacecraft propulsion system, as a result of which its shutdown occurred according to a temporary setting.”
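The failure mode Roscosmos describes (a burn that should end when accelerometers confirm the required velocity change, but falls back to a time limit when they read zero) can be sketched as a simple cutoff rule. The following is an illustrative reconstruction using the figures reported above, not actual flight software:

```python
# Schematic sketch of a burn-termination rule: cut the engine when the
# measured delta-v reaches the target, or fall back to a timeout.
# Illustrative only; values are the publicly reported figures.

def burn_duration(accel_reading: float, target_dv: float, timeout_s: float) -> float:
    """Return how long the engine fires, given a constant accelerometer
    reading in m/s^2. A zero reading means the velocity-reached condition
    never triggers, so the burn runs to its time limit."""
    if accel_reading > 0:
        planned = target_dv / accel_reading  # time to reach required delta-v
        return min(planned, timeout_s)
    return timeout_s  # dead accelerometer: engine runs to the time limit

# With a healthy accelerometer the burn ends at the planned ~84 seconds;
# with the BIUS-L accelerometer returning zero, it runs the full 127 seconds.
nominal = burn_duration(accel_reading=1.0, target_dv=84.0, timeout_s=127.0)
failed = burn_duration(accel_reading=0.0, target_dv=84.0, timeout_s=127.0)
print(nominal, failed)  # 84.0 127.0
```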

Luna-25 launched Aug. 10 on a Soyuz-2.1b rocket from Russia’s Vostochny Cosmodrome and arrived in lunar orbit six days later. The lander, weighing an estimated 1,750 kilograms at launch, carried a package of Russian scientific instruments weighing 30 kilograms, and had planned to land at Boguslawsky crater at approximately 70 degrees south latitude.

NASA released Aug. 31 an image from its Lunar Reconnaissance Orbiter (LRO) spacecraft showing the likely impact site of Luna-25, on the steep inner rim of Pontécoulant G crater about 400 kilometers from the planned landing site. The impact site, a crater 10 meters across, does not appear in the previous LRO image of the area taken in June 2022. NASA estimated the impact took place at 7:58 a.m. Eastern Aug. 19.

Luna-25 was the first mission to the moon by Russia or the former Soviet Union since the Luna-24 sample return mission in 1976. Luna-25 encountered years of development delays, and the European Space Agency dropped out of participating on the mission after Russia’s invasion of Ukraine in February 2022.

Yuri Borisov, head of Roscosmos, said the agency would continue its lunar exploration efforts. “No one is going to fold their arms, and we are determined to continue the lunar program,” he said in a translated statement. “Moreover, we are considering the possibility of shifting the Luna-26 and Luna-27 missions to the left in order to get the results we need as quickly as possible.” He did not offer a revised schedule for those future missions.


Did comet’s blast spark agriculture in Syria 12,800 years ago?

Agriculture in Syria started with a bang 12,800 years ago as a fragmented comet slammed into the Earth’s atmosphere, say researchers.

The explosion and subsequent environmental changes forced hunter-gatherers in the prehistoric settlement of Abu Hureyra to adopt agricultural practices to boost their chances for survival.

A group of scientists makes these assertions in one of four related research papers, all appearing in the journal Airbursts and Cratering Impacts, published on the ScienceOpen platform. The papers are the latest results in the investigation of the Younger Dryas Impact Hypothesis, the idea that an anomalous cooling of the Earth almost 13 millennia ago was the result of a cosmic impact.

Life at Abu Hureyra

“In this general region, there was a change from more humid conditions that were forested and with diverse sources of food for hunter-gatherers, to drier, cooler conditions when they could no longer subsist only as hunter-gatherers,” says earth scientist James Kennett, a professor emeritus of the University of California, Santa Barbara. The settlement at Abu Hureyra is famous among archaeologists for its evidence of the earliest known transition from foraging to farming. “The villagers started to cultivate barley, wheat, and legumes,” he notes. “This is what the evidence clearly shows.”

These days, Abu Hureyra and its rich archaeological record lie under Lake Assad, a reservoir created by construction of the Tabqa Dam on the Euphrates River in the 1970s. But before this flooding, archaeologists managed to extract loads of material to study. “The village occupants,” the researchers state in the paper, “left an abundant and continuous record of seeds, legumes, and other foods.” By studying these layers of remains, the scientists were able to discern the types of plants that were being collected in the warmer, humid days before the climate changed and in the cooler, drier days after the onset of what we now know as the Younger Dryas cool period.

Before the impact, the researchers found, the inhabitants’ prehistoric diet included wild legumes and wild-type grains, and “small but significant amounts of wild fruits and berries.” In the layers corresponding to the time after cooling, fruits and berries disappeared and the diet shifted toward more domestic-type grains and lentils, as the people experimented with early cultivation methods. By about 1,000 years later, all of the Neolithic “founder crops”—emmer wheat, einkorn wheat, hulled barley, rye, peas, lentils, bitter vetch, chickpeas, and flax—were being cultivated in what is now called the Fertile Crescent. Drought-resistant plants, both edible and inedible, became more prominent in the record as well, reflecting the drier climate that followed the sudden impact winter at the onset of the Younger Dryas.

The evidence also indicates a significant drop in the area’s population, and changes in the settlement’s architecture to reflect a more agrarian lifestyle, including the initial penning of livestock and other markers of animal domestication.

To be clear, Kennett says, agriculture eventually arose in several places on Earth during the Neolithic Era, but it arose first in the Levant (present-day Syria, Jordan, Lebanon, Palestine, Israel, and parts of Turkey), initiated by the severe climate conditions that followed the impact.

And what an impact it must have been.

The ‘black mat’

In the 12,800-year-old layers corresponding to the shift from hunting and gathering to agriculture, the record at Abu Hureyra shows evidence of massive burning. The evidence includes a carbon-rich “black mat” layer with high concentrations of platinum, nanodiamonds, and tiny metallic spherules that could only have been formed under extremely high temperatures—higher than any that could have been produced by human technology at the time. The airburst flattened trees and straw huts, splashing meltglass onto cereals and grains, as well as on the early buildings, tools, and animal bones found in the mound—and most likely on people, too.

This is not the only evidence of a cosmic airburst striking a human settlement. The authors previously reported a smaller but similar event that destroyed the biblical city at Tall el-Hammam in the Jordan Valley about 1600 BCE.

The black mat layer, nanodiamonds, and melted minerals have also been found at about 50 other sites across North and South America and Europe, the collection of which has been called the Younger Dryas strewnfield. According to the researchers, it’s evidence of a widespread simultaneous destructive event, consistent with a fragmented comet that slammed into the Earth’s atmosphere. The explosions, fires, and subsequent impact winter, they say, caused the extinction of most large animals, including the mammoths, saber-toothed cats, American horses, and American camels, as well as the collapse of the North American Clovis culture.

Explosion in the sky

Because the impact appears to have produced an aerial explosion, there is no evidence of craters in the ground. “But a crater is not required,” Kennett says. “Many accepted impacts have no visible crater.” The scientists continue to compile evidence of relatively lower-pressure cosmic explosions—the kind that occur when the shockwave originates in the air and travels downward to the Earth’s surface.

“Shocked quartz is well known and is probably the most robust proxy for a cosmic impact,” he continues. Only forces on par with cosmic-level explosions could have produced the microscopic deformations within quartz sand grains at the time of the impacts, and these deformations have been found in abundance in the minerals gathered from impact craters.

This “crème de la crème” of cosmic impact evidence has also been identified at Abu Hureyra and at other Younger Dryas Boundary (YDB) sites, despite an absence of craters. However, it has been argued that the kind of shock-fractured quartz found in the YDB sites is not equivalent to that found in the large crater-forming sites, so the researchers worked to link these deformations to lower-pressure cosmic events.

Evidence from nuclear tests

To do so, they turned to human-made explosions of the magnitude of cosmic airbursts: nuclear tests conducted at the Alamogordo Bombing Range in New Mexico in 1945 and in Kazakhstan in 1949 and 1953. Similar to cosmic airbursts, the nuclear explosions occurred above ground, sending shockwaves toward Earth.

“In the papers, we characterize what the morphologies are of these shock fractures in these lower-pressure events,” Kennett says. “And we did this because we wanted to compare it with what we have in the shock-fractured quartz in the Younger Dryas Boundary, to see if there was any comparison or similarities with what we see at the Trinity atomic test site and other atomic bomb explosions.” Between the shocked quartz at the nuclear test sites and the quartz found at Abu Hureyra, the scientists found close associations in their characteristics, namely glass-filled shock fractures, indicative of temperatures greater than 2,000 degrees Celsius, above the melting point of quartz.

“For the first time, we propose that shock metamorphism in quartz grains exposed to an atomic detonation is essentially the same as during a low-altitude, lower-pressure cosmic airburst,” Kennett says. However, the so-called “lower pressure” is still very high—probably greater than 3 GPa or about 400,000 pounds per square inch, equivalent to about five 737 airplanes stacked on a small coin. The novel protocol the researchers developed for identifying shock fractures in quartz grains will be useful in identifying previously unknown airbursts that are estimated to recur every few centuries to millennia.
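The pressure conversion quoted here can be sanity-checked in a few lines (a quick unit check, not part of the study itself):

```python
# Sanity check: convert the quoted 3 GPa threshold to pounds per square inch.
# 1 psi = 6894.757 Pa, so 1 GPa is roughly 145,000 psi.
PA_PER_PSI = 6894.757

def gpa_to_psi(gpa: float) -> float:
    """Convert gigapascals to pounds per square inch."""
    return gpa * 1e9 / PA_PER_PSI

# 3 GPa works out to roughly 435,000 psi, consistent with the article's
# "greater than 3 GPa or about 400,000 pounds per square inch."
print(round(gpa_to_psi(3.0)))
```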

Taken together, the evidence presented by these papers, according to the scientists, “implies a novel causative link among extraterrestrial impacts, hemispheric environmental and climatic change, and transformative shifts in human societies and culture, including agricultural development.”

Source: UC Santa Barbara


Japan’s SLIM moon lander makes lunar flyby

BAKU, Azerbaijan — Japan’s SLIM spacecraft has completed a flyby of the moon as part of a months-long deep space journey to set up a lunar landing attempt.

The Smart Lander for Investigating Moon (SLIM) lander made its closest approach to the moon at 2:47 a.m. Eastern, Oct. 4. It passed just under 5,000 kilometers from the lunar surface at a relative speed of 1.47 kilometers per second. 

The spacecraft is now on a long, looping orbit that will bring it back to the moon late in the year. This trajectory will allow it to enter lunar orbit using far less propellant than a long braking burn during the recent flyby would have required.

Japan Aerospace Exploration Agency (JAXA) officials here at the 74th International Astronautical Congress say a landing attempt is expected in January. 

SLIM launched Sept. 6 on an H-2A rocket from Japan’s Tanegashima Space Center along with the XRISM space telescope. The SLIM spacecraft then went through a series of systems checks and orbit-raising maneuvers as part of its circuitous voyage to the moon.

The spacecraft performed its translunar injection burn Sept. 30, firing its main engines for 39 seconds, a post from JAXA’s SLIM social media account confirmed.

The main objective of SLIM is to demonstrate a highly accurate lunar soft landing with a lightweight architecture.

The spacecraft has a dry mass of 200 kilograms and a wet mass of 700 to 730 kilograms. The expected development cost was 18 billion yen ($120 million).

The lander will aim to set down within 100 meters of its target point on the slope of the mid-latitude Shioli crater. It features five crushable aluminum lattice legs, which will help absorb the impact of setting down on a slope.

It will use a vision-based navigation system, drawing on observational data from Japan’s SELENE orbiter, launched in 2007, to identify its landing zone during its autonomous descent and landing.

The lander could lead to lower cost exploration missions in the future, according to JAXA. The accuracy of landings will be useful for accessing areas of high scientific interest instead of more general, safer landing zones.

If successful, SLIM would make Japan the fifth country to soft-land on the moon. In August, India became the fourth nation to achieve the feat when its Chandrayaan-3 mission landed at a high southern latitude.

SLIM may not make the next landing on the moon, however. Houston-based Intuitive Machines this week unveiled its completed first lunar lander. The Nova-C lander is scheduled for a mid-November launch from Kennedy Space Center on a Falcon 9.

That spacecraft will embark on a direct, five-day journey to the moon and enter lunar orbit. It will attempt to set down in Malapert crater, 300 kilometers from the lunar south pole. A safe landing would make it the first non-governmental spacecraft to successfully land on the moon.


U.S. Space Force awards Booz Allen $630 million contract for satellite systems support

WASHINGTON — Booz Allen Hamilton won a seven-year, $630 million contract from the U.S. Space Force for systems engineering and integration of satellite systems used for missile warning, environmental monitoring and surveillance, the company announced Oct. 4.

Based in McLean, Virginia, Booz Allen is a large consulting firm that provides management and technology services. The Space Force contract is for support services of satellite programs run by the Space Systems Command’s space sensing program office, including the Next Generation Overhead Persistent Infrared (OPIR) and legacy constellations. 

The company originally won the Space Sensing Systems Engineering and Integration contract in 2022 but the award was put on hold due to protests.

Booz Allen will be responsible for the integration of different elements of major space sensing programs, such as the satellites and ground stations used for missile warning and missile tracking, environmental monitoring and other data collection from space, Eric Hoffman, vice president and leader of Booz Allen’s space business, told SpaceNews.

The role of the systems integrator is significant in a program like Next-Generation OPIR where there are different contractors working on the satellites and the ground systems, he said. 

“The government has to essentially act as the integrator and make sure all these things come together and ultimately deliver warfighter capability,” Hoffman added. “Our role is to help make that happen, make sure that digital engineering environments are set up and can talk to each other, and the ground systems are synchronized.”

Another task under this contract is to support the users of the infrared imagery and other data collected by satellites. “There’s so much more that can be done with the data that comes off of the various platforms,” Hoffman said. Booz Allen will apply artificial intelligence and machine learning technologies to help operators analyze data, he said.

Contract under protest for 16 months

The Space Sensing SE&I contract was previously held by SAIC. The Space Force recompeted it and selected Booz Allen in January 2022. Shortly after, the award was protested by SAIC and ManTech International.

After the bid protests were settled, the Space Force was cleared to move forward with the contract award to Booz Allen in June 2023. 


Savanna & grassland carbon storage slows climate change

Savannas and grasslands in drier climates around the world store more heat-trapping carbon than scientists thought they did, helping to slow the rate of climate warming, according to a new study.

The findings, published in Nature Climate Change, are based on a reanalysis of datasets from 53 long-term fire-manipulation experiments worldwide, as well as a field-sampling campaign at six of those sites.

The researchers looked at where and why fire has changed the amount of carbon stored in topsoil and found that within savanna-grassland regions, drier ecosystems were more vulnerable to changes in wildfire frequency than humid ecosystems.

“The potential to lose soil carbon with very high fire frequencies was the greatest in dry areas, and the potential to store carbon when fires were less frequent was also the greatest in dry areas,” says lead author Adam Pellegrini, currently an IGCB Exchange Professor at the University of Michigan’s Institute for Global Change Biology. His primary appointment is at the University of Cambridge.

Over the last 20 years, fire suppression driven by population expansion, along with landscape fragmentation caused by the introduction of roads, croplands, and pastures, has led to smaller wildfires and less burned area in drier savannas and grasslands.

In dryland savannas, the reduction in the size and frequency of wildfires has led to an estimated 23% increase in stored topsoil carbon. The increase was not foreseen by most of the state-of-the-art ecosystem models used by climate researchers, according to second author Peter Reich, a forest ecologist and professor and director of the Institute for Global Change Biology at the University of Michigan School for Environment and Sustainability.

As a result, the climate-buffering impacts of dryland savannas have likely been underestimated, Reich says. The new study estimates that soils in savanna-grassland regions worldwide have gained 640 million metric tons of carbon over the past two decades.

“Ongoing declines in fire frequencies have probably created an extensive carbon sink in the soils of global drylands that may have been underestimated by ecosystem models,” Reich says. “In other words, in the past couple of decades, global savannas and grasslands have slowed climate warming more than they have accelerated it—despite fires. But there is absolutely no guarantee that will continue in the future.”

Savannas are tropical or subtropical grasslands—in eastern Africa, northern South America, and elsewhere—that contain scattered trees and drought-resistant undergrowth. The new study looked at recent changes in burned area and fire frequency in savannas, other grasslands, seasonal woodlands, and some forests.

Across 888,000 square miles (2.3 million square kilometers) of dryland savanna-grasslands, where fire frequency and burned area declined over the past two decades, soil carbon rose by an estimated 23%.

But in more humid savanna-grassland regions covering 533,000 square miles (1.38 million square kilometers), more frequent wildfires and increased burned area resulted in an estimated 25% loss in soil carbon over the past two decades.

The net change, during that time, was a gain of 0.64 petagrams, or 640 million metric tons, of soil carbon. That works out to a 0.038 petagram (38 million metric ton) increase per year.
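The unit conversions behind these figures can be cross-checked with a short script (a sanity check only; it assumes 1 petagram equals 10^9 metric tons and 1 square mile is about 2.59 square kilometers):

```python
# Cross-check the unit conversions used in the figures above.
KM2_PER_MI2 = 2.589988   # square kilometers per square mile
TONNES_PER_PG = 1e9      # 1 petagram = 10^15 grams = 1e9 metric tons

# Area conversions for the dry vs. humid savanna-grassland regions.
dry_km2 = 888_000 * KM2_PER_MI2     # about 2.30 million square kilometers
humid_km2 = 533_000 * KM2_PER_MI2   # about 1.38 million square kilometers

# Net soil-carbon gain: 0.64 petagrams expressed in metric tons.
net_tonnes = 0.64 * TONNES_PER_PG   # 640 million metric tons

print(round(dry_km2 / 1e6, 2), round(humid_km2 / 1e6, 2), net_tonnes)
```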

“In the grand scheme of things, no, this is not really a massive amount of carbon that will put a dent in heat-trapping anthropogenic emissions,” Pellegrini says. “But no one region—neither the Amazon rainforest nor the US Great Plains grasslands nor Canada’s boreal forest nor dozens of other biomes around the world—can alone store sufficient carbon to make a large contribution to slowing climate change. However, in aggregate, they can.

“Plus, there are several savanna and grassland regions that have soil carbon-credit projects being developed, so understanding their capacity to sequester carbon is relevant to the region—even if it’s not a massive flux globally.”

The US Department of Agriculture, United Kingdom Research and Innovation, the Gordon and Betty Moore Foundation, and the US Department of Energy funded the work. The US National Science Foundation funded the Cedar Creek Long Term Ecological Research program in Minnesota. The US National Park Service, the Sequoia Parks Conservancy, and South African National Parks funded sampling at other sites.

Source: University of Michigan


Parents and coaches believe girls aren’t as good at chess

Parents and coaches of youth chess players peg the highest potential rating of girl players to be lower than that of boy players, researchers report.

The Queen’s Gambit miniseries portrayed the life of a fictional chess prodigy, Beth Harmon, who is continuously underestimated in male-dominated competitions. The new study highlights “real-life” evidence of what Harmon faced as a younger player.

Moreover, the study’s authors found that coaches who think “brilliance” is required to succeed in chess also believe that their female mentees would be more likely to stop playing the game due to lack of ability than their male mentees would.

But, at the same time, coaches and parents don’t think girls encounter a less supportive environment than boys do—or that girls might be more likely to stop playing as a result.

“While it is inspiring to see a fictional woman winning in a space dominated by men, real-world women remain underrepresented in chess,” says Sophie Arnold, a doctoral student at New York University and lead author of the paper, which appears in the Journal of Experimental Psychology: General.

“This study identifies one contributing reason as to why: Parents and coaches are biased against the female youth players in their own lives.”

“It is striking that even the parents and coaches who have a vested interest in girls’ success hold biases against them and may also have some blind spots about the barriers to girls’ success,” says senior author Andrei Cimpian, a professor in the psychology department at NYU.

“These beliefs are likely to be harmful both to girls who already play chess and to those who could want to: Would you be interested in participating in an activity where your potential is downgraded by your parents and by your coaches before you have even started?” says Jennifer Shahade, an NYU alumna, two-time US Women’s Chess champion, and author of Chess Queens (Hachette, 2022) and Play Like a Girl! (Mongoose Press, 2011), who was involved in the study design.

In the US Chess Federation (“US Chess”), only 13% of players are women, raising questions about what drives the gender disparity. Previous studies have largely focused on potential deficits in chess ability among girls, which overlooks the role of adult leadership.

“This line of scholarship can make the overrepresentation of men in chess seem like it’s a ‘girls and women problem’ rather than a ‘chess problem,’” Arnold says.

In the new study, by contrast, the researchers considered how the important people in girls’ lives—coaches and parents—may be biased against them when assessing their potential, even at a young age, and how these perceptions may help explain the huge gender gap in who plays chess.

To do so, the team surveyed nearly 300 parents and mentors—90% of whom were men—recruited through the US Chess Federation. In the survey, they reported their evaluations of, and investment in, approximately 650 youth players.

In addition, parents and coaches were asked if they thought aptitude in chess requires brilliance—a measure Cimpian and his colleagues have used in the past to detect stereotyping and gender bias in academic fields.

The researchers found bias against girls across multiple measures. Parents and coaches thought that female youth players’ highest potential ratings were on average lower than those of male players—a bias that was exacerbated among parents and mentors who believed that success in chess requires brilliance.

The researchers note that the sample of mothers and female coaches was too small to analyze separately—a reflection of women’s underrepresentation in chess more generally.

Notably, these coaches and parents didn’t recognize that their own presumptions may function as a barrier to girls succeeding in the game. Specifically, coaches who thought brilliance was required to succeed in chess also thought their female mentees would be more likely to stop playing chess due to a lack of ability than their male mentees. At the same time, parents and coaches did not believe that girls—relative to boys—encounter a less supportive environment in chess and might stop playing chess as a result.

However, not all news was bad. For example, the researchers found no bias in the amount of resources—such as time and money—coaches and parents reported being willing to invest in female relative to male youth players.

“This study provides the first large-scale investigation of bias against young female players and holds implications for the role of parents and mentors in science and technology—areas that, like chess, are culturally associated with intellectual ability and exhibit substantial gender imbalances,” notes Arnold.

Additional coauthors are from NYU and the University of New Hampshire.

Source: NYU


Test clarifies form and function in organelles

Scientists have long understood that parts of cells, called organelles, evolved to have certain shapes and sizes because their forms are closely related to how they function. Now, researchers have developed a bacteria-based tool to test whether, as the axiom goes, form follows function.

The tool, which researchers say may someday have practical applications in treating illness, works by precisely targeting and dismantling the outer membrane surrounding organelles, and is being made freely available to other scientists. In an interesting twist, say the researchers, the tool may also be able to dismantle aggregated proteins in cells that often characterize neurodegenerative conditions, such as amyotrophic lateral sclerosis (ALS).

The team focused their work on mitochondria, organelles that serve as the energy engines or powerhouses of cells, including human ones. They also focused on so-called Golgi bodies that act as factories and packagers of a variety of proteins and the nucleus, or control center of a cell.

Results of the researchers’ work appear in Cell Reports.

“We developed a scientific tool to test why cell organelles look the way they do in order to have a certain function,” says Takanari Inoue, a professor of cell biology at the Johns Hopkins University School of Medicine. The tool, says Inoue, may also help reveal why function may change—for better or worse—when an organelle’s shape is different.

In the case of mitochondria, for example, in people with Alzheimer’s disease, they enlarge and become disorganized inside, according to Inoue. In people with an accelerated aging disease called progeria, the nucleus is misshapen.

To develop the tool, Hideki Nakamura, then a postdoctoral researcher in Inoue’s laboratory, recruited Listeria, bacteria responsible for a range of food-borne illnesses. When Listeria invade an animal cell, they hijack the cell’s stores of actin, a protein that helps the bacteria move through the cell, where they soak up nutrients and find a way to escape and infect other cells.

Working with physics experts who can identify and measure the physical force generated by Listeria’s seizing of actin, Nakamura engineered the Listeria-linked actin to assemble proteins and other molecules that connect with the surface of an organelle within a cell, exert force on the surface, and break it open.

Scientists already have other methods to break open organelles inside cells, says Inoue, such as using so-called optical tweezers or stretching out the cell to flatten it. However, those methods probe the cell from the outside, and none of them, he says, can target organelles from inside the cell.

Nakamura, who is now at Kyoto University, Inoue, and their team dubbed the new tool ActuAtor.

In their new set of experiments, the team tested ActuAtor on human epithelial cells, which line and cover the surfaces of the skin and other organs, and were able to completely fragment the cells’ mitochondria within 10 minutes of the Listeria-based tool entering the cells. When the team compared mitochondrial function before and after the change in shape, they found no large differences in the mitochondria’s ability to generate power for the cell, but they did find that the cell “recognizes” the altered shape and slightly increases its efforts to get rid of the misshapen organelles.

“In this case, our team concluded that function may not follow form in mitochondria,” says Inoue.

The team also tested the tool on brain cells and different organelles, including nuclei and Golgi bodies, and were able to use ActuAtor to break open the organelles.

Inoue and Nakamura further repurposed ActuAtor to disperse the accumulation of protein granules inside cells that form because of “environmental stress” such as changes in temperature or lack of oxygen. The team says they’ll test this application of the tool on its ability to disperse protein aggregates that clump in brain cells in efforts to treat neurodegenerative diseases such as ALS.

Funding for the research came from the Japanese Science and Technology Agency; the Japan Society for the Promotion of Science; the Mochida Memorial Foundation for Medical and Pharmaceutical Research; the National Institutes of Health; the Human Frontier Science Program; the Department of Defense; the Alfred P. Sloan, McKnight, Klingenstein and Simons, and Vallee awards; the Kavli Institute; and the Air Force Research Laboratory.

Source: Johns Hopkins University


Suction cup delivers drugs through your cheek

Researchers have developed a suction cup that allows medications to be absorbed through the mucosal lining of the cheeks.

This new approach could spare millions of patients the pain and fear associated with injections.

Many of today’s medications belong to groups of relatively large molecules such as peptides. They are used to treat a wide range of diseases, including diabetes, obesity, and prostate cancer.

Unfortunately, taking these medications in tablet form is out of the question in most cases because they would break down in the digestive tract or remain too large to reach the bloodstream. Consequently, the patient’s only option is to receive their medication via injection.

The suction cup, shown held on a fingertip, measures around 10 millimeters in diameter and six millimeters in height (about 0.39 and 0.24 inches, respectively). (Credit: Transire Bio)

“It’s an entirely new method of delivering medications that could spare millions of people the fear and pain associated with injections,” says Nevena Paunović, who works at the Chair of Drug Formulation and Delivery at ETH Zurich.

The mucosal lining of the cheek isn’t particularly suitable for delivering medication to the bloodstream. Its dense tissue has so far presented a major obstacle, especially for large molecules like peptides. But the researchers are now about to change this with the suction cup.

Patients press the suction cup—which measures around 10 millimeters in diameter and six millimeters in height—onto the lining of their cheek with two fingers. This produces a vacuum that stretches the lining, making it more permeable to the drug contained within the cup’s dome-shaped hollow. But that alone isn’t enough for the drug to reach the blood vessels.

The researchers supplemented the drug with an endogenous agent that makes the cell membranes more fluid, allowing the drug to penetrate into the deeper layers of tissue. Patients are advised to keep the suction cup on the inside of their cheek for a few minutes. That’s enough time for the drug to dissolve in saliva and enter their bloodstream directly via the now permeable mucosal lining.

Compared to the few oral formulations of peptides on the market, the suction cup supports the delivery of a wide range of medications without the need for any major technological adjustments.

The original idea for the suction cup came from Zhi Luo, a former postdoc working with Jean-Christophe Leroux, professor and lead of the Chair of Drug Formulation & Delivery.

At dinner with friends, he suddenly noticed half a peppercorn stuck to the inside of his cheek. Although uncomfortable, the experience gave him the idea for how to keep drugs in place on slippery surfaces. But before the team could turn the idea into a working prototype, they had a few problems to solve. The biggest challenge was to identify the right shape for the suction cup.

“We had to find out what geometry and how much of a vacuum were required to hold the suction cup in place on the mucosal lining of the cheek and to stretch it sufficiently without causing any damage,” says David Klein Cerrejon of the Chair of Drug Formulation & Delivery.

In addition to producing several prototypes, which the researchers designed and 3D printed themselves, this called for numerous tests using the mucosal lining of a pig’s cheek. To find the right penetration-promoting agent, the researchers tested a broad range of substances at varying concentrations and evaluated under a microscope how the different mixtures penetrated the tissue.

“Since the suction cup is a completely new delivery system, we had to experiment extensively before finding the right substance. It turned out that natural, endogenous substances are extremely well suited for this task,” Klein Cerrejon says.

The researchers then moved on to testing their suction cup and the penetration-promoting agent in authorized trials on dogs, because dogs and humans have very similar mucosal lining in their cheeks. No dogs were harmed by the testing. The researchers were pleased with the results.

“We could see from the blood samples that the suction cup efficiently delivered medications to the dogs’ bloodstreams,” Klein Cerrejon says.

So far, the team has also tested the empty suction cup on 40 people. Not only did the suction cup remain attached for 30 minutes, but it also received positive feedback from the people testing it. Most of the volunteers said that they would by far prefer the new delivery system over an injection.

The researchers will have to carry out further tests with this new delivery system in preparation for conducting a clinical trial on healthy volunteers. There are also several regulatory hurdles to clear before the suction cup hits the market. For this, the researchers need strong partners and sufficient funding.

The study is published in Science Translational Medicine.

Source: Christoph Elhardt for ETH Zurich


Vascular cells help form long-term memories

A new study reveals the crucial role of vascular system cells—known as pericytes—in the formation of long-term memories of life events.

The research shows that pericytes, which wrap around the capillaries—the body’s small blood vessels—work in concert with neurons to help ensure that long-term memories are formed.

“We now have a firmer understanding of the cellular mechanisms that allow memories to be both formed and stored,” says Cristina Alberini, a professor in New York University’s Center for Neural Science and senior author of the paper in Neuron. “It’s important because understanding the cooperation among different cell types will help us advance therapeutics aimed at addressing memory-related afflictions.”

“This work connects important dots between the newly discovered function of pericytes in memory and previous studies showing that pericytes are either lost or malfunction in several neurodegenerative diseases, including Alzheimer’s disease and other dementias,” explains author Benjamin Bessières, a postdoctoral researcher in NYU’s Center for Neural Science.

Pericytes help maintain the structural integrity of the capillaries. Specifically, they control the amount of blood flowing in the brain and play a key role in maintaining the barrier that stops pathogens and toxic substances from leaking out of the capillaries and into brain tissue.

The discovery of the pericytes’ significance in long-term memory emerged because Alberini, Bessières, Kiran Pandey, and their colleagues examined the role of insulin-like growth factor 2 (IGF2)—a protein that was known to increase following learning in brain regions, such as the hippocampus, and to play a critical role in the formation and storage of memories.

They found that IGF2’s highest levels in the brain cells of the hippocampus do not come from neurons, glial cells, or other vascular cells, but rather from pericytes.

IGF2’s presence in pericytes, then, raises the following question: How is this linked to memory?

The scientists conducted a series of cognitive experiments using mice, comparing behaviors of those with pericytes that produced IGF2 and those that did not. By removing IGF2-producing capacity in some, the researchers could isolate the significance of both pericytes and IGF2 in neurological processes.

In these experiments, the mice were subjected to a series of memory tests—learning to associate a mild foot shock with a specific context or learning to identify objects placed in a new location.

Their results showed that production of IGF2 by pericytes in the hippocampus was enhanced by the learning event. More specifically, this increase in pericytic IGF2 took place in response to the activity of neurons, revealing a concerted, neuron-pericytic action. In addition, IGF2 produced by pericytes was shown to circle back and influence biological responses of neurons that are critical for memory.

“IGF2 produced from pericytes and acting on neurons support the idea that a neurovascular unit regulates neuronal responses as well as functions of the blood barrier and may have repercussions on brain injury and inflammation,” notes Pandey, a postdoctoral researcher in NYU’s Center for Neural Science.

“Cooperation between neurons and pericytes is necessary to assure that long-term memories are formed,” observes Alberini. “Our study provides a new view of the biology of memory—though more research is needed to further understand the roles of pericytes and the vascular system in memory and its diseases.”

The research also included scientists from Cold Spring Harbor Laboratory and the University of Cambridge.

The research had support from the National Institutes of Health, the United Kingdom’s Biotechnology and Biological Sciences Research Council, and the Medical Research Council.

Alberini is a founder and equity holder of Ritrova Therapeutics, Inc., a company exploring new treatments for neurodegenerative diseases and neurodevelopmental disorders. The company did not fund this study.

Source: NYU


Why are carrots orange?

A new study of the genetic blueprints of more than 600 types of carrot shows that three specific genes are required to give carrots an orange color.

Surprisingly, the three required genes all need to be recessive, or turned off. The paper’s findings shed light on the traits important to carrot improvement efforts and could lead to better health benefits from the vegetable.

“Normally, to make some function, you need genes to be turned on,” says Massimo Iorizzo, an associate professor of horticultural science with North Carolina State University’s Plants for Human Health Institute and co-corresponding author of the paper in Nature Plants. “In the case of the orange carrot, the genes that regulate orange carotenoids—the precursor of vitamin A that have been shown to provide health benefits—need to be turned off.”

Carrots, especially orange ones, contain high quantities of carotenoids, which can help reduce the risk of conditions such as eye disease. The orange carrot is the most abundant plant source of pro-vitamin A in the American diet.

Researchers worked with colleagues from the University of Wisconsin-Madison to sequence 630 carrot genomes in a continuing examination of the history and domestication of the orange carrot.

A 2016 study published in Nature Genetics by the same team provided the first carrot genome sequence and uncovered the gene involved in the pigmentation of yellow carrot.

The researchers performed so-called selective sweep analyses among five different carrot groups to find regions of the genome that are under heavy selection in certain groups. They found that many genes involved in flowering were under selection—mostly to delay the flowering process. Flowering causes the taproot, the edible root that we consume, to turn woody and inedible.

“We found many genes involved in flowering regulation that were selected in multiple populations in orange carrot, likely to adapt to different geographic regions,” Iorizzo says.

The study also adds further evidence that carrots were domesticated in the 9th or 10th century in western and central Asia.

“Purple carrots were common in central Asia along with yellow carrots,” Iorizzo says. “Both were brought to Europe, but yellow carrots were more popular, likely due to their taste.”

Orange carrots, which made their appearance in western Europe in about the 15th or 16th century, may have resulted from crossing a white and yellow carrot, Iorizzo says.

“This study basically reconstructed the chronology of when carrot was domesticated and then orange carrot was selected,” he says. “Orange carrot could have resulted from white and yellow carrot crosses, as white and yellow carrots are at the base of the phylogenetic tree for the orange carrot.”

The color and sweeter flavor of the orange carrot drove its popularity and farmers selected for those traits. Different types of orange carrots were developed in northern Europe in the 16th and 17th centuries, which matches the appearance of different shades of orange carrots in paintings from that era.

Orange carrots later grew in popularity as greater understanding of alpha- and beta-carotenes, the precursor of vitamin A in the diet, progressed in the late 19th and early 20th centuries.

“Carotenoids got their name because they were first isolated from carrots,” Iorizzo says.

Additional coauthors are from NC State and the University of Wisconsin-Madison.

The National Institute of Food and Agriculture and the US Department of Agriculture supported the work.

Source: NC State
