Maya people shopped at places like today’s supermarkets

More than 500 years ago in the midwestern Guatemalan highlands, Maya people bought and sold goods at markets with far less oversight from their rulers than archeologists previously thought.

That’s according to a new study that shows the ruling K’iche’ elite took a hands-off approach when it came to managing the procurement and trade of obsidian by people outside their region of central control.

In these areas, access to nearby sources of obsidian, a glasslike rock used to make tools and weapons, was managed by local people through independent and diverse acquisition networks. Over time, the availability of obsidian resources and the prevalence of craftsmen to shape it resulted in a system that is in many ways suggestive of contemporary market-based economies.

“Scholars have generally assumed that the obsidian trade was managed by Maya rulers, but our research shows that this wasn’t the case at least in this area,” says Rachel Horowitz, assistant professor of anthropology at Washington State University and lead author of the study published in the journal Latin American Antiquity.

“People seem to have had a good deal of economic freedom including being able to go to places similar to the supermarkets we have today to buy and sell goods from craftsmen.”

While there are extensive written records from the Maya Postclassic Period (1200-1524 AD) on political organization, much less is known about how societal elites wielded economic power. Horowitz set out to address this knowledge gap for the K’iche’ by examining the production and distribution of obsidian artifacts, which archeologists use as a proxy for the level of economic development in a region.

She performed geochemical and technological analyses on obsidian artifacts excavated from 50 sites around the K’iche’ capital of Q’umarkaj and the surrounding region to determine where the raw material originally came from and how it was worked.

Her results show that the K’iche’ acquired their obsidian from similar sources in the Central K’iche’ region and Q’umarkaj, indicating a high degree of centralized control. The ruling elite also seemed to manage the trade of more valuable forms of nonlocal obsidian, particularly Pachuca obsidian from Mexico, based on its abundance in these central sites.

Outside this core region, though, in areas conquered by the K’iche’, there was less similarity in obsidian economic networks. Horowitz’s analysis suggests these sites had access to their own sources of obsidian and developed specialized places where people could go to buy blades and other useful implements made from the rock by experts.

“For a long time, there has been this idea that people in the past didn’t have market economies, which when you think about it is kind of weird. Why wouldn’t these people have had markets in the past?” she says. “The more we look into it, the more we realize there were a lot of different ways in which these peoples’ lives were similar to ours.”

The Middle American Research Institute at Tulane University loaned Horowitz the obsidian blades and other artifacts she used for her study. The artifacts were excavated in the 1970s.

Moving forward, Horowitz says she plans to examine more of the collection, the rest of which is housed in Guatemala, to discover further details about how the Maya conducted trade, managed their economic systems, and generally went about their lives.

Source: Washington State University

Gene-editing gets fungi to spill secrets to new drugs

A high-efficiency gene-editing tool can get fungi to produce significantly more natural compounds, including some previously unknown to the scientific community, say researchers.

Using an approach that simultaneously modifies multiple sites in fungal genomes, Rice University chemical and biomolecular engineer Xue Sherry Gao and collaborators coax fungi into revealing their best-kept secrets, ramping up the pace of new drug discovery.

It is the first time that the technique, multiplex base-editing (MBE), has been deployed as a tool for mining fungal genomes for medically useful compounds. Compared to single-gene editing, the MBE platform reduces the research timeline by over 80% in equivalent experimental settings, from an estimated three months to roughly two weeks.

Fungi and other organisms produce bioactive small molecules such as penicillin to protect themselves from disease agents. These bioactive natural products (NPs) can be used as drugs or as molecular blueprints for designing new drugs.

The study appears in the Journal of the American Chemical Society.

Gene-editing fungi

Base-editing refers to the use of CRISPR-based tools to modify a rung in the spiral ladder of DNA known as a base pair. Previously, gene modifications using base-editing had to be carried out one at a time, making the research process time-consuming. “We created a new machinery that enables base-editing to work on multiple genomic sites, hence the ‘multiplex,’” Gao says.

Gao and her team first tested the efficacy of their new base-editing platform by targeting genes encoding for pigment in a fungal strain known as Aspergillus nidulans. The effectiveness and precision of MBE-enabled genome edits were readily visible in the changed color displayed by A. nidulans colonies.

‘Cryptic’ genes

“To me, the fungal genome is a treasure,” Gao says, referring to the significant medical potential of fungi-derived compounds. “However, under most circumstances, fungi ‘keep to themselves’ in the laboratory and don’t produce the bioactive small molecules we are looking for. In other words, the majority of genes or biosynthetic gene clusters of interest to us are ‘cryptic,’ meaning they do not express their full biosynthetic potential.

“The genetic, epigenetic, and environmental factors that instruct organisms to produce these medically useful compounds are extremely complicated in fungi,” Gao says. Enabled by the MBE platform, her team can easily delete several of the regulatory genes that restrict the production of bioactive small molecules. “We can observe the synergistic effects of eliminating those factors that make the biosynthetic machinery silent,” she says.

Disinhibited, the engineered fungal strains produce more bioactive molecules, each with its own distinct chemical profile. Five of the 30 NPs generated in one assay were new, never-before-reported compounds.

“These compounds could be useful antibiotics or anticancer drugs,” Gao says. “We are in the process of figuring out what the biological functions of these compounds are, and we are collaborating with groups at the Baylor College of Medicine on pharmacological small-molecule drug discovery.”

Gao’s research plumbs fungal genomes in search of gene clusters that synthesize NPs. “Approximately 50% of clinical drugs approved by the US Food and Drug Administration are NPs or NP-derivatives,” and fungi-derived NPs “are an essential pharmaceutical source,” she says. Penicillin, lovastatin, and cyclosporine are some examples of drugs derived from fungal NPs.

The National Institutes of Health and the Robert A. Welch Foundation supported the research.

Source: Rice University

Newfound part of the brain acts as shield and watchdog

Researchers have discovered a previously unknown part of brain anatomy that acts as both a protective barrier and platform from which immune cells monitor the brain for infection and inflammation.

From the complexity of neural networks to basic biological functions and structures, the human brain only reluctantly reveals its secrets.

Advances in neuro-imaging and molecular biology have only recently enabled scientists to study the living brain at a level of detail not previously achievable, unlocking many of its mysteries.

The new study comes from the labs of Maiken Nedergaard, co-director of the Center for Translational Neuromedicine at the University of Rochester and the University of Copenhagen, and Kjeld Møllgård, a professor of neuroanatomy at the University of Copenhagen.

Nedergaard and her colleagues have transformed our understanding of the fundamental mechanics of the human brain and made significant findings in the field of neuroscience, including detailing the many critical functions of previously overlooked cells in the brain called glia and the brain’s unique process of waste removal, which the lab named the glymphatic system.

“The discovery of a new anatomic structure that segregates and helps control the flow of cerebrospinal fluid (CSF) in and around the brain now provides us much greater appreciation of the sophisticated role that CSF plays not only in transporting and removing waste from the brain, but also in supporting its immune defenses,” says Nedergaard.

The study focuses on the series of membranes that encase the brain, creating a barrier from the rest of the body and keeping the brain bathed in CSF. The traditional understanding of what is collectively called the meningeal layer identifies the three individual layers as the dura, arachnoid, and pia mater.

The new layer discovered by the US- and Denmark-based research team further divides the space between the arachnoid and pia layers, known as the subarachnoid space, into two compartments. The researchers named the membrane SLYM, an abbreviation of Subarachnoidal LYmphatic-like Membrane.

While much of the research in the paper describes the function of SLYM in mice, the researchers also report its presence in the adult human brain.

SLYM is a mesothelium, a type of membrane that lines other organs in the body, including the lungs and heart. These membranes typically surround and protect organs and harbor immune cells. The question of whether a similar membrane might exist in the central nervous system was first posed by Møllgård, the first author of the study, whose research focuses on developmental neurobiology and on the systems of barriers that protect the brain.

The new membrane is very thin and delicate, only a few cells thick. Yet SLYM is a tight barrier, allowing only very small molecules to transit, and it also seems to separate “clean” and “dirty” CSF.

This last observation hints at the likely role played by SLYM in the glymphatic system, which requires a controlled flow and exchange of CSF, allowing the influx of fresh CSF while flushing the toxic proteins associated with Alzheimer’s and other neurological diseases from the central nervous system.

This discovery will help researchers more precisely understand the mechanics of the glymphatic system.

The SLYM also appears important to the brain’s defenses. The central nervous system maintains its own native population of immune cells, and the membrane’s integrity prevents outside immune cells from entering. In addition, the membrane appears to host its own population of central nervous system immune cells that use SLYM as an observation point close to the surface of the brain from which to scan passing CSF for signs of infection or inflammation.

Discovery of the SLYM opens the door for further study of its role in brain disease. For example, the researchers note that larger and more diverse concentrations of immune cells congregate on the membrane during inflammation and aging. Furthermore, when the membrane was ruptured during traumatic brain injury, the resulting disruption in the flow of CSF impaired the glymphatic system and allowed non-central nervous system immune cells to enter the brain.

These and similar observations suggest that diseases as diverse as multiple sclerosis, central nervous system infections, and Alzheimer’s might be triggered or worsened by abnormalities in SLYM function. They also suggest that the delivery of drugs and gene therapeutics to the brain may be affected by SLYM, which will need to be considered as new generations of biologic therapies are being developed.

The research appears in the journal Science. Additional coauthors are from the University of Copenhagen.

Support for the study came from the Lundbeck Foundation, Novo Nordisk Foundation, the National Institute of Neurological Disorders and Stroke, the US Army Research Office, the Human Frontier Science Program, the Dr. Miriam and Sheldon G. Adelson Medical Research Foundation, and the Simons Foundation.

Source: University of Rochester

3D imaging tracks cancer radiation in real time

Precise 3D imaging makes it possible to track radiation, used to treat half of all cancer patients, in real time.

By capturing and amplifying tiny sound waves created when X-rays heat tissues in the body, medical professionals can map the radiation dose within the body, giving them new data to guide treatments. It’s a first-of-its-kind view of an interaction doctors have previously been unable to “see.”

Dose accumulation in an imitation patient made of lard over a delivery time of around 19 seconds, as continuously monitored by the iRAI system. (Credit: U. Michigan Optical Imaging Laboratory)

“Once you start delivering radiation, the body is pretty much a black box,” says Xueding Wang, professor of biomedical engineering and professor of radiology who leads the Optical Imaging Laboratory at the University of Michigan.

“We don’t know exactly where the X-rays are hitting inside the body, and we don’t know how much radiation we’re delivering to the target. And each body is different, so making predictions for both aspects is tricky,” says Wang, corresponding author of the study in Nature Biotechnology.

Radiation is used in treatment for hundreds of thousands of cancer patients each year, bombarding an area of the body with high energy waves and particles, usually X-rays. The radiation can kill cancer cells outright or damage them so that they can’t spread.

These benefits are undermined by a lack of precision, as radiation treatment often kills and damages healthy cells in the areas surrounding a tumor. It can also raise the risk of developing new cancers.

With real-time 3D imaging, doctors can more accurately direct the radiation toward cancerous cells and limit the exposure of adjacent tissues. To do that, they simply need to “listen.”

When X-rays are absorbed by tissues in the body, they are turned into thermal energy. That heating causes the tissue to expand rapidly, and that expansion creates a sound wave.
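
The chain described above (absorbed X-rays heat tissue, and heating produces a pressure wave) follows the textbook photoacoustic relation, in which the initial pressure rise is the product of the tissue's Grüneisen parameter, its absorption coefficient, and the delivered energy fluence. Here is a minimal sketch of that relation; the function name and the numbers are illustrative assumptions, not the authors' iRAI dose model:

```python
def initial_pressure(grueneisen: float, absorption_per_m: float,
                     fluence_j_per_m2: float) -> float:
    """Initial photoacoustic pressure rise p0 = Gamma * mu_a * F (in pascals).

    Textbook photoacoustic relation, not the study's reconstruction method:
    the absorbed energy density (mu_a * F) heats the tissue, and the
    dimensionless Grueneisen parameter converts that heating into pressure.
    """
    return grueneisen * absorption_per_m * fluence_j_per_m2


# Illustrative soft-tissue-like values (assumptions, not from the study):
p0 = initial_pressure(grueneisen=0.2, absorption_per_m=5.0, fluence_j_per_m2=10.0)
```

The takeaway is simply that the acoustic signal scales linearly with the locally absorbed energy, which is why mapping the sound amplitude can serve as a proxy for the delivered dose.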

The acoustic wave is weak and usually undetectable by typical ultrasound technology. The new ionizing radiation acoustic imaging system detects the wave with an array of ultrasonic transducers positioned on the patient’s side. The signal is amplified and then transferred into an ultrasound device for image reconstruction.

With the images in-hand, an oncology clinic can alter the level or trajectory of radiation during the process to ensure safer and more effective treatments.

“In the future, we could use the imaging information to compensate for uncertainties that arise from positioning, organ motion, and anatomical variation during radiation therapy,” says first author Wei Zhang, a research investigator in biomedical engineering. “That would allow us to deliver the dose to the cancer tumor with pinpoint accuracy.”

Another benefit of the new technology is it can be easily added to current radiation therapy equipment without drastically changing the processes that clinicians are used to.

“In future applications, this technology can be used to personalize and adapt each radiation treatment to assure normal tissues are kept to a safe dose and that the tumor receives the dose intended,” says Kyle Cuneo, associate professor of radiation oncology at Michigan Medicine.

“This technology would be especially beneficial in situations where the target is adjacent to radiation sensitive organs such as the small bowel or stomach.”

The University of Michigan has applied for patent protection and is seeking partners to help bring the technology to market. The National Cancer Institute and the Michigan Institute for Clinical and Health Research supported the work.

Source: University of Michigan

Coral bleaching makes it hard for fish to spot foes

Mass coral bleaching events make it harder for some species of reef fish to identify competitors, new research reveals.

Scientists studying reefs across five Indo-Pacific regions found that the ability of butterfly fish individuals to identify competitor species and respond appropriately was compromised after widespread loss of coral caused by bleaching.

This change means they make poorer decisions that leave them less able to avoid unnecessary fights, using up precious limited energy. The scientists believe this increases the likelihood of coral loss.

“By recognizing a competitor, individual fish can make decisions about whether to escalate, or retreat from, a contest—conserving valuable energy and avoiding injuries,” says lead author Sally Keith, a senior lecturer in marine biology at Lancaster University.

“These rules of engagement evolved for a particular playing field, but that field is changing. Repeated disturbances, such as bleaching events, alter the abundance and identity of corals—the food source of butterfly fish. It’s not yet clear whether these fish have the capacity to update their rule book fast enough to recalibrate their decisions.”

“The impacts of global change on biodiversity are increasingly obvious,” says coauthor Nate Sanders, a professor in the ecology and evolutionary biology department at the University of Michigan. “This work highlights the importance of studying the behavioral responses of individuals in light of global change.”

For the study in Proceedings of the Royal Society B, the researchers took more than 3,700 observations of 38 species of butterfly fish on reefs before and after coral bleaching events and compared their behaviors.

After coral mortality caused by the bleaching event, signaling between fish of different species was less common, with encounters escalating to chases in more than 90% of cases—up from 72% before the event. Researchers also found the distance of these chases increased following bleaching, with fish expending more energy chasing away potential competitors than they would have done previously.

The researchers believe the environmental disturbances are affecting fish recognition and responses because the bleaching events, in which many corals die, are forcing fish species to change and diversify their diets and territories. Therefore, these large-scale environmental changes are disrupting long-established and co-evolved relationships that allow multiple fish species to coexist.

“We know that biodiversity is being lost—species are vanishing and populations are declining,” Sanders says. “Perhaps by focusing more on how the behavior of individuals responds to global change, we can start to predict how biodiversity might change in the future. And better yet, try to do something about it.”

Additional coauthors are from Lancaster University and the University of Queensland. The Natural Environment Research Council, the Australian Research Council, and the Villum Foundation funded the work.

Source: University of Michigan

Animals that carry seeds help regenerate forests

Animals are crucial to reforestation, research finds.

And yet, the world’s wildlife populations have declined by almost 70% in the last 50 years as humans have destroyed and polluted habitats.

Efforts to restore forests have often focused on trees, but a new study in the journal Philosophical Transactions shows that animals play a key role in the recovery of tree species by carrying a wide variety of seeds into previously deforested areas.

Sergio Estrada-Villegas, a postdoctoral associate at the Yale School of the Environment, led the study with Liza Comita, professor of tropical forest ecology. The project examines a series of regenerating forests in central Panama spanning 20 to 100 years post-abandonment.

“When we talk about forest restoration, people typically think about going out and digging holes and planting seedlings,” Comita says. “That’s actually not a very cost-effective or efficient way to restore natural forests. If you have a nearby preserved intact forest, plus you have your animal seed dispersers around, you can get natural regeneration, which is a less costly and labor-intensive approach.”

The research team analyzed a unique, long-term data set from the forest in Barro Colorado Nature Monument in Panama, which the Smithsonian Tropical Research Institute oversees, to compare what proportion of tree species in forests were dispersed by animals or other methods, like wind or gravity, and how that changes over time as the forest ages. The team focused on the proportion of plants dispersed by four groups of animals: flightless mammals, large birds, small birds, and bats.

Because the area has been intensely studied by biologists at the Smithsonian for about a century, the research team was able to delve into data stretching back decades, including aerial photographs taken in the 1940s and 1950s. The area also presents a unique view into forests where there is very little hunting or logging. The results offer the most detailed data on animal seed dispersal across the longest time frame of natural restoration, according to the study.

The role of flightless animals in seed dispersal across all forest ages, from 20 years to old growth, and the variety of animal species involved were among the most important findings of the study and point to the importance of natural regeneration of forests, Comita and Estrada-Villegas say. In tropical forests, more than 80% of tree species can be dispersed by animals.

The researchers say the findings can serve as a road map for natural regeneration of forests that preserve biodiversity and capture and store carbon at a time when the UN Decade of Restoration is highlighting the need for land conservation, and world leaders are working to mitigate climate change stemming from fossil fuel emissions.

Forests soak up carbon dioxide from the atmosphere and store it in biomass and soils. Tropical forests, in particular, play an important role in regulating global climate and supporting high plant and animal diversity, the researchers note.

Estrada-Villegas, an ecologist who studies both bats and plants, says the study highlights how crucial animals are to healthy forests.

“In these tropical environments, animals are paramount to a speedy recovery of forests,” says Estrada-Villegas.

Coauthors are from the Max Planck Institute for Animal Behavior; the Universidad de los Andes in Bogotá, Colombia; the Smithsonian Tropical Research Institute in Balboa, Panama; and Clemson University.

Source: Yale University

Climate change will make Atlantic tropical storms worse

A warming climate will increase the number of tropical cyclones and their intensity in the North Atlantic, potentially creating more and stronger hurricanes, according to simulations using a high-resolution, global climate model.

“Unfortunately, it’s not great news for people living in coastal regions,” says Christina Patricola, an Iowa State University assistant professor of geological and atmospheric sciences, an affiliate of the US Department of Energy’s Lawrence Berkeley National Laboratory in California, and a study leader.

“Atlantic hurricane seasons will become even more active in the future, and hurricanes will be even more intense,” Patricola says.

The researchers ran climate simulations using the Department of Energy’s Energy Exascale Earth System Model and found that tropical cyclone frequency could increase 66% during active North Atlantic hurricane seasons by the end of this century.

Those seasons are typically characterized by La Niña conditions—unusually cool surface water in the eastern tropical Pacific Ocean—and the positive phase of the Atlantic Meridional Mode—warmer surface temperatures in the northern tropical Atlantic Ocean.

The projected numbers of tropical cyclones could increase by 34% during inactive North Atlantic hurricane seasons. Inactive seasons generally occur during El Niño conditions with warmer surface temperatures in the eastern tropical Pacific Ocean and the negative phase of the Atlantic Meridional Mode with cooler surface temperatures in the northern tropical Atlantic Ocean.

In addition, the simulations project an increase in storm intensity during the active and inactive storm seasons.

“Altogether, the co-occurring increase in (tropical cyclone) number and strength may lead to increased risk to the continental North Atlantic in the future climate,” the researchers write.

Patricola adds: “Anything that can be done to curb greenhouse gas emissions could be helpful to reduce this risk.”

What are North Atlantic tropical cyclones? “Tropical cyclone is a more generic term than hurricane,” Patricola says. “Hurricanes are relatively strong tropical cyclones.”

Exactly, says the National Oceanic and Atmospheric Administration. Tropical cyclone is a general term for a low-pressure system that forms over tropical waters with thunderstorms near the center of its closed, cyclonic winds. When those rotating winds reach 39 mph, the system becomes a named tropical storm. At 74-plus mph, it becomes a hurricane in the Atlantic and East Pacific oceans, or a typhoon in the northern West Pacific.
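
Those NOAA thresholds can be sketched as a simple classifier. The function name is my own, and the "tropical depression" label for systems below named-storm strength is standard NOAA terminology not quoted in the text; the 39 and 74 mph cutoffs come from the paragraph above:

```python
def classify_atlantic_system(max_sustained_wind_mph: float) -> str:
    """Label a tropical low-pressure system by its maximum sustained winds,
    using the NOAA thresholds quoted above (39 mph and 74 mph)."""
    if max_sustained_wind_mph >= 74:
        return "hurricane"          # called a typhoon in the northern West Pacific
    if max_sustained_wind_mph >= 39:
        return "tropical storm"     # strong enough to receive a name
    return "tropical depression"    # standard NOAA term for weaker systems
```

So a system with, say, 50 mph winds is a named tropical storm, and one at 96 mph is a hurricane.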

Patricola and another group of collaborators have published a second research paper about tropical cyclones, also in Geophysical Research Letters. The paper examines a possible explanation for the relatively constant number of tropical cyclones observed globally from year to year.

Could it be that African Easterly Waves, low pressure systems over the Sahel region of North Africa that take moist tropical winds and raise them up into thunderclouds, are a key to that steady production of storms?

Using regional model simulations, the researchers were able to filter out the African Easterly Waves and see what happened.

As it turned out, removing the waves didn’t change the seasonal number of Atlantic tropical cyclones. But tropical cyclones were stronger, peak formation of the storms shifted from September to August, and the formation region shifted from the coast of North Africa to the Gulf of Mexico.

So African Easterly Waves may not help researchers predict the number of Atlantic tropical cyclones every year, but they do appear to affect important storm characteristics, including intensity and possibly where storms make landfall.

Both papers call for more study.

“We are,” Patricola says, “chipping away at the problem of predicting the number of tropical cyclones.”

Source: Iowa State University

To prevent HIV, ease intimate partner violence

Women in Sub-Saharan Africa who experience recent intimate partner violence are three times more likely to contract HIV, according to new research.

“Worldwide, more than one in four women experience intimate partner violence in their lifetime,” says McGill University Professor Mathieu Maheu-Giroux, a Canada Research Chair in Population Health Modeling.

“Sub-Saharan Africa is one of the regions of the world with the highest prevalence of both IPV and HIV. We wanted to examine the effects of intimate partner violence on recent HIV infections and women’s access to HIV care in this region,” he says.

The study, published in The Lancet HIV, shows considerable overlap between violence against women and the HIV epidemics in some of the highest-burden countries. Among women living with HIV, those experiencing intimate partner violence were 9% less likely to achieve viral load suppression—the ultimate step in HIV treatment.

“The 2021 UN General Assembly, attended and supported by the Government of Canada, adopted the Political Declaration on HIV and AIDS with bold new global targets for 2025. This encompasses a commitment to eliminate all forms of sexual and gender-based violence, including IPV, as a key enabler of the HIV epidemic. Improving our understanding of the relationships between IPV and HIV is essential to meet this commitment,” says Maheu-Giroux.

The researchers found that physical or sexual intimate partner violence in the past year was associated with recent HIV acquisition and less frequent viral load suppression. According to the researchers, IPV could also pose barriers for women in getting HIV care and remaining in care while living with the virus.

“Given the high burden of IPV worldwide, including in Canada, the need to stem the mutually reinforcing threats of IPV and HIV on women’s health and well-being is urgent,” says Salome Kuchukhidze, a PhD candidate studying epidemiology and the lead author of the research.

Source: McGill University

‘Semi-sub’ vehicle is tricky to detect

A semi-submersible vehicle may prove that the best way to travel undetected and efficiently in water is not on top, or below, but in between.

The roughly 1.5-foot-long semi-sub prototype, built with off-the-shelf and 3D-printed parts, showed its seaworthiness in water tests, moving quickly with low drag and a low profile.

The researchers detailed the test results in a study in the journal Unmanned Systems.

This vessel type isn’t new. Authorities have discovered crudely made semi-subs being used for illicit purposes in recent years, but the researchers aim to demonstrate how engineer-designed, half-submerged vessels can efficiently serve military, commercial, and research purposes.

“A semi-submersible vehicle is relatively inexpensive to build, difficult to detect, and it can go across oceans,” says Konstantin Matveev, a professor of engineering at Washington State University who led the work.

“It’s not so susceptible to waves in comparison to surface ships since most of the body is underwater, so there are some economic advantages as well.”

Since the semi-sub sails mostly at the waterline, it does not need to be made of materials as strong as a submarine’s, which must withstand the pressure of being underwater for long periods of time. The semi-sub also benefits from having a small platform in contact with the atmosphere, making it easier to receive and transmit data.

For this study, Matveev and coauthor Pascal Spino, a recent Washington State graduate, piloted the semi-sub in Snake River’s Wawawai Bay in Washington state. They tested its stability and ability to maneuver.

The semi-sub reached a top speed of 1.5 meters per second (roughly 3.4 miles per hour), but at higher speeds it rises above the water, creating more of a wake and expending more energy. At lower speeds, it is almost fully immersed and barely makes a ripple.

The researchers also outfitted the semi-sub with sonar and mapped the bottom of a reservoir near Pullman, Washington, to test its ability to collect and transmit data.

While not yet completely autonomous, the semi-sub can be pre-programmed to behave in certain ways, such as running a certain route by itself or responding to particular objects by pursuing them or running away.

While the semi-sub is relatively small at 450 mm long and 100 mm in diameter (about 1.5 feet long and 4 inches across), Matveev says it is possible to build larger semi-subs that carry significant cargo. For instance, they could be used to help refuel ships or stations at sea. They could even be scaled up to rival container ships, and since they experience less drag in the water, they would use less fuel, creating both an environmental and economic advantage.

For now, Matveev’s lab is continuing work on optimizing the shape of semi-submersible vehicle prototypes to fit specific purposes. He is currently collaborating with the US Naval Academy in Annapolis, Maryland to work on the vehicles’ operational capabilities and compare numerical simulations with results from experiments.

Source: Washington State University

2D material may lead to sharper phone photos

A new type of active pixel sensor that uses a novel 2D material may enable ultra-sharp cell phone photos and create a new class of extremely energy-efficient Internet of Things sensors, researchers say.

“When people are looking for a new phone, what are the specs that they are looking for?” says Saptarshi Das, associate professor of engineering science and mechanics at Penn State and lead author of the paper in Nature Materials.

“Quite often, they are looking for a good camera, and what does a good camera mean to most people? Sharp photos with high resolution.”

Most people just snap a photo of a friend, a family gathering, or a sporting event and never think about what happens “behind the scenes” inside the phone. Quite a bit goes on to let you see a photo right after you take it, and much of it is image processing, Das says.

“When you take an image, many of the cameras have some kind of processing that goes on in the phone, and in fact, this sometimes makes the photo look even better than what you are seeing with your eyes. The newest generation of phone cameras integrates image capture with image processing to make this possible, something that was not possible with older generations of cameras.”

However, the great photos from the newest cameras come with a catch: the processing requires a lot of energy.

“There’s an energy cost associated with taking a lot of images,” says Akhil Dodda, a graduate research assistant at Penn State at the time of the study who is now a research staff member at Western Digital, and co-first author of the study.

“If you take 10,000 images, that is fine, but somebody is paying the energy costs for that. If you can bring it down by a hundredfold, then you can take 100 times more images and still spend the same amount of energy. It makes photography more sustainable so that people can take more selfies and other pictures when they are traveling. And this is exactly where innovation in materials comes into the picture.”

Low light phone photos

The innovation in materials outlined in the study revolves around adding in-sensor processing to active pixel sensors to reduce their energy use. The researchers turned to molybdenum disulfide, a novel 2D material, a class of materials only one or a few atoms thick. Molybdenum disulfide is also a semiconductor and sensitive to light, which makes it an ideal candidate for low-energy in-sensor processing of images.

“We found that molybdenum disulfide has a very good photosensitive response,” says Darsith Jayachandran, graduate research assistant in engineering science and mechanics and co-first author of the study. “From there, we tested it for the other properties we were looking for.”

These properties included sensitivity to low light, which is important for the sensor’s dynamic range: the ability to “see” objects in both low light, such as moonlight, and bright light, such as sunlight. The human eye can see stars at night better than most cameras can, thanks to its superior dynamic range.

Molybdenum disulfide also demonstrated strong signal conversion, charge-to-voltage conversion, and data transmission capabilities. This makes the material an ideal candidate for an active pixel sensor that can do both light sensing and in-sensor image processing.

“From there, we put the sensors into an array,” Jayachandran says. “There are 900 pixels in a nine square millimeter array we developed, and each pixel is about 100 micrometers. They are much more sensitive to light than current CMOS sensors, so they do not require any additional circuitry or energy use. So, each pixel requires much less energy to operate, and this would mean a better cell phone camera that uses a lot less battery.”
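The numbers Jayachandran quotes are mutually consistent: assuming a square array and square pixels (an assumption for illustration; the actual chip layout may differ), a nine-square-millimeter array is 3 mm on a side, and a 30 × 30 grid of 900 pixels gives a pitch of 100 micrometers per pixel:

```python
import math

ARRAY_AREA_MM2 = 9.0  # nine square millimeters
PIXEL_COUNT = 900

# Assume a square array with square pixels (illustrative assumption).
side_mm = math.sqrt(ARRAY_AREA_MM2)            # 3.0 mm per side
pixels_per_side = int(math.sqrt(PIXEL_COUNT))  # 30 pixels per side
pixel_pitch_um = side_mm * 1000 / pixels_per_side

print(pixel_pitch_um)  # → 100.0 micrometers, matching the quoted pixel size
```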

Internet of Things benefits

The dynamic range and image processing would enable users to take sharp photos in a variety of adverse conditions for photography, according to Das.

“For example, you could take clearer photos of friends outside at night or on a rainy or foggy day,” Das says. “The camera could do denoising to clear up the fog, and the dynamic range would enable, say, a night photo of a friend with stars in the background.”

Along with enabling a top-rate phone camera in the future, the team envisions that their improved sensor technology could have other applications, including better light sensors for Internet of Things (IoT) and Industry 4.0 uses.

Industry 4.0 is the term for a growing movement that combines traditional industry practices and cutting-edge digital technology such as the Internet of Things, cloud data storage, and artificial intelligence/machine learning. The goal is to improve manufacturing by developing more efficient processes and practices through intelligent automation, and sensors are key.

“Sensors that can see through machines while in operation and identify defects are very important in the IoT,” Dodda says. “Conventional sensors consume a lot of energy so that is a problem, but we developed an extremely energy efficient sensor that enables better machine learning, etc. and saves a lot in energy costs.”

The Department of Defense and the National Science Foundation supported the work.

Source: Penn State
