Multiple spacecraft tell the story of one giant solar storm
This diagram shows the positions of individual spacecraft, as well as Earth and Mars, during the solar outburst on April 17, 2021. The Sun is at the center. The black arrow shows the direction of the initial solar flare. Several spacecraft detected solar energetic particles (SEPs) over 210 degrees around the Sun (blue shaded area). Credit: Solar-MACH

April 17, 2021, was a day like any other day on the sun, until a brilliant flash erupted and an enormous cloud of solar material billowed away from our star. Such outbursts from the sun are not unusual, but this one was unusually widespread, hurling high-speed protons and electrons at velocities nearing the speed of light and striking several spacecraft across the inner solar system.

In fact, it was the first time such high-speed protons and electrons—called solar energetic particles (SEPs)—were observed by spacecraft at five different, well-separated locations between the sun and Earth as well as by spacecraft orbiting Mars. And now these diverse perspectives on the solar storm are revealing that different types of potentially dangerous SEPs can be blasted into space by different solar phenomena and in different directions, causing them to become widespread.

“SEPs can harm our technology, such as satellites, and disrupt GPS,” said Nina Dresing of the Department of Physics and Astronomy, University of Turku in Finland. “Also, humans in space or even on airplanes on polar routes can suffer harmful radiation during strong SEP events.”

Scientists like Dresing are eager to find out where these particles come from exactly—and what propels them to such high speeds—to better learn how to protect people and technology in harm’s way. Dresing led a team of scientists that analyzed what kinds of particles struck each spacecraft and when. The team published its results in the journal Astronomy & Astrophysics.

Currently on its way to Mercury, the BepiColombo spacecraft, a joint mission of ESA (the European Space Agency) and JAXA (Japan Aerospace Exploration Agency), was closest to the blast’s direct firing line and was pounded with the most intense particles. At the same time, NASA’s Parker Solar Probe and ESA’s Solar Orbiter were on opposite sides of the flare, but Parker Solar Probe was closer to the sun, so it took a harder hit than Solar Orbiter did.

Next in line was one of NASA’s two Solar Terrestrial Relations Observatory (STEREO) spacecraft, STEREO-A, followed by the NASA/ESA Solar and Heliospheric Observatory (SOHO) and NASA’s Wind spacecraft, which were closer to Earth and well away from the blast. Orbiting Mars, NASA’s MAVEN and ESA’s Mars Express spacecraft were the last to sense particles from the event.

Altogether, the particles were detected over 210 longitudinal degrees of space (almost two-thirds of the way around the sun)—which is a much wider angle than typically covered by solar outbursts. Plus, each spacecraft recorded a different flood of electrons and protons at its location. The differences in the arrival and characteristics of the particles recorded by the various spacecraft helped the scientists piece together when and under what conditions the SEPs were ejected into space.

These clues suggested to Dresing’s team that the SEPs were not blasted out by a single source all at once but propelled in different directions and at different times potentially by different types of solar eruptions.

“Multiple sources are likely contributing to this event, explaining its wide distribution,” said team member Georgia de Nolfo, a heliophysics research scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Also, it appears that, for this event, protons and electrons may come from different sources.”

The team concluded that the electrons were likely driven into space quickly by the initial flash of light—a solar flare—while the protons were pushed along more slowly, likely by a shock wave from the cloud of solar material, or coronal mass ejection.

“This is not the first time that people have conjectured that electrons and protons have had different sources for their acceleration,” de Nolfo said. “This measurement was unique in that the multiple perspectives enabled scientists to separate the different processes better, to confirm that electrons and protons may originate from different processes.”

In addition to the flare and coronal mass ejection, spacecraft recorded four groups of radio bursts from the sun during the event, which could have been accompanied by four different particle blasts in different directions. This observation could help explain how the particles became so widespread.

“We had different distinct particle injection episodes—which went into significantly different directions—all contributing together to the widespread nature of the event,” Dresing said.

“This event was able to show how important multiple perspectives are in untangling the complexity of the event,” de Nolfo said.

These results show the promise of future NASA heliophysics missions that will use multiple spacecraft to study widespread phenomena, such as the Geospace Dynamics Constellation (GDC), SunRISE, PUNCH, and HelioSwarm. While single spacecraft can reveal conditions locally, multiple spacecraft orbiting in different locations provide deeper scientific insight and offer a more complete picture of what’s happening in space and around our home planet.

It also previews the work that will be done by future missions such as MUSE, IMAP, and ESCAPADE, which will study explosive solar events and the acceleration of particles into the solar system.

Modeling the origins of life: New evidence for an "RNA World"

Hammerhead sequences copied by the lower-fidelity polymerase drift away from their original RNA sequence (top) and lose their function over time. Hammerheads copied by the higher-fidelity polymerase retain function and evolve fitter sequences (bottom).

Charles Darwin described evolution as “descent with modification.” Genetic information in the form of DNA sequences is copied and passed down from one generation to the next. But this process must also be somewhat flexible, allowing slight variations of genes to arise over time and introduce new traits into the population.

But how did all of this begin? On the primordial Earth, long before cells and proteins and DNA, could a similar sort of evolution have taken place on a simpler scale? Scientists in the 1960s, including Salk Fellow Leslie Orgel, proposed that life began with the “RNA World,” a hypothetical era in which small, stringy RNA molecules ruled the early Earth and established the dynamics of Darwinian evolution.

New research at the Salk Institute now provides fresh insights on the origins of life, presenting compelling evidence supporting the RNA World hypothesis. The study, published in Proceedings of the National Academy of Sciences (PNAS), unveils an RNA enzyme that can make accurate copies of other functional RNA strands, while also allowing new variants of the molecule to emerge over time. These remarkable capabilities suggest the earliest forms of evolution may have occurred on a molecular scale in RNA.

The findings also bring scientists one step closer to re-creating RNA-based life in the laboratory. By modeling these primitive environments in the lab, scientists can directly test hypotheses about how life may have started on Earth, or even other planets.

“We’re chasing the dawn of evolution,” says senior author and Salk President Gerald Joyce. “By revealing these novel capabilities of RNA, we’re uncovering the potential origins of life itself, and how simple molecules could have paved the way for the complexity and diversity of life we see today.”

Scientists can use DNA to trace the history of evolution from modern plants and animals all the way back to the earliest single-celled organisms. But what came before that remains unclear. Double-stranded DNA helices are great for storing genetic information in the form of genes. Many of those genes ultimately code for proteins—complex molecular machines that carry out all sorts of functions to keep cells alive.

What makes RNA unique is that these molecules can do a bit of both. They’re made of extended nucleotide sequences, similar to DNA, but they can also act as enzymes to facilitate reactions, much like proteins. So, is it possible that RNA served as the precursor to life as we know it?

Scientists like Joyce have been exploring this idea for years, with a particular focus on RNA polymerase ribozymes—RNA molecules that can make copies of other RNA strands.

Over the last decade, Joyce and his team have been developing RNA polymerase ribozymes in the lab, using a form of directed evolution to produce new versions capable of replicating larger molecules. But most have come with a fatal flaw: they aren’t able to copy the sequences with a high enough accuracy. Over many generations, so many errors are introduced into the sequence that the resulting RNA strands no longer resemble the original sequence and have lost their function entirely.

Until now. The latest RNA polymerase ribozyme developed in the lab includes a number of crucial mutations that allow it to copy a strand of RNA with much higher accuracy.

In these experiments, the RNA strand being copied is a “hammerhead,” a small molecule that cleaves other RNA molecules into pieces. The researchers were surprised to find that not only did the RNA polymerase ribozyme accurately replicate functional hammerheads, but over time, new variations of the hammerheads began to emerge.

These new variants performed similarly, but their mutations made them easier to replicate, which increased their evolutionary fitness and led them to eventually dominate the lab’s hammerhead population.

“We’ve long wondered how simple life was at its beginning and when it gained the ability to start improving itself,” says first author Nikolaos Papastavrou, a research associate in Joyce’s lab.

“This study suggests the dawn of evolution could have been very early and very simple. Something at the level of individual molecules could sustain Darwinian evolution, and that might have been the spark that allowed life to become more complex, going from molecules to cells to multicellular organisms.”

The findings highlight the critical importance of replication fidelity in making evolution possible. The RNA polymerase’s copying accuracy must exceed a critical threshold to maintain heritable information over multiple generations, and this threshold would have risen as the evolving RNAs increased in size and complexity.
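To get an intuitive feel for why copying accuracy matters so much, consider the classic error-threshold argument from quasispecies theory. This is a back-of-the-envelope illustration, not the analysis in the PNAS study: if each nucleotide is copied correctly with probability q, an L-nucleotide sequence survives a round of replication intact with probability q^L, so longer sequences demand ever-higher fidelity.

```python
import math

# Back-of-the-envelope error-threshold illustration (quasispecies-style).
# This is not the analysis from the PNAS paper; all numbers are illustrative.

def perfect_copy_probability(q: float, length: int) -> float:
    """Probability that an entire sequence of `length` nucleotides is copied without error."""
    return q ** length

def max_sustainable_length(q: float, selective_advantage: float = 2.0) -> int:
    """Rough error-threshold estimate: heritable information persists roughly
    while q**L exceeds 1/sigma, i.e. L < ln(sigma) / (-ln(q))."""
    return int(math.log(selective_advantage) / -math.log(q))

ribozyme_length = 50  # roughly the size of a small hammerhead ribozyme
for q in (0.90, 0.97, 0.992):
    p = perfect_copy_probability(q, ribozyme_length)
    print(f"per-base accuracy {q:.3f}: P(perfect {ribozyme_length}-nt copy) = {p:.3f}, "
          f"max sustainable length ~ {max_sustainable_length(q)} nt")
```

In this toy picture, raising the per-base accuracy from 90% to around 99% extends the sustainable sequence length from a handful of nucleotides to something the size of a small ribozyme, which is the qualitative point behind the importance of replication fidelity.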

Joyce’s team is re-creating this process in laboratory test tubes, applying increasing selection pressure on the system to produce better-performing polymerases, with the goal of one day producing an RNA polymerase that can replicate itself. This would mark the beginnings of autonomous RNA life in the laboratory, which the researchers say could be accomplished within the next decade.

The scientists are also interested in what else might occur once this mini “RNA World” has gained more autonomy.

“We’ve seen that selection pressure can improve RNAs with an existing function, but if we let the system evolve for longer with larger populations of RNA molecules, can new functions be invented?” says co-author David Horning, a staff scientist in Joyce’s lab. “We’re excited to answer how early life could ratchet up its own complexity, using the tools developed here at Salk.”

The methods used in the Joyce lab also pave the way for future experiments testing other ideas about the origins of life, including what environmental conditions could have best supported RNA replication, both on Earth and on other planets.

Even inactive smokers are densely colonized by microbial communities

Sea-floor samples for the study were taken with this deep-sea submersible vehicle (Alvin) from inactive as well as active hydrothermal systems in several thousands of meters of water. Credit: Woods Hole Oceanographic Institution, National Deep Submergence Facility, National Science Foundation.

Under certain conditions microbial communities can grow and thrive, even in places that are seemingly uninhabitable. This is the case at inactive hydrothermal vents on the sea floor. An international team, including researchers from MARUM—Center for Marine Environmental Sciences at the University of Bremen, is presently working to accurately quantify how much inorganic carbon can be bound in these environments.

With its cold temperatures, darkness, and nutrient deficiency, the deep sea is generally not a hospitable place. But in the presence of heat and a rich influx of energy-rich fluids, as is the case at active hydrothermal vents, numerous fish, shellfish, and microorganisms are able to settle there. But what happens to these biotic communities when the source of hot fluids is exhausted?

The chimneys form over long time periods when seawater seeps through cracks into the Earth’s crust, is warmed there, then dissolves and takes up minerals on its way back up to the ocean floor. This hot, mineral-rich, and often smoky water seeks the most pervious path through the Earth’s crust and encounters cold, oxygen-rich water at the sea floor.

This results in the precipitation of minerals, which are deposited as chimneys. These hydrothermal vents are energy-rich habitats based on chemosynthesis, where microorganisms form the base of the food webs. Depending on the region, chimneys at hydrothermal seeps contain minerals like copper, zinc, gold, or silver. As a result, there is a growing interest in exploiting inactive smokers in deep-sea mining activities.

When the flow of mineral-rich fluids dries up, the black smokers become inactive. Larger organisms migrate away to the next vent, but the microbial communities have ways to adapt to the new conditions.

“Even forty years after the discovery of the first hydrothermal fields, we constantly learn new things about how these ecosystems work,” says Dr. Florence Schubotz of MARUM, “particularly relating to the amount of CO2 bound up in inactive smokers, but also with regard to the volume of microbial life, its activity, and rates of production.”

Determining how densely inactive smokers are colonized is the central focus of a research project that Schubotz is working on. The work involves sampling at the exact area where the first hydrothermal vents were discovered in the eastern Pacific around four decades ago.

“The initial results indicate that even inactive smokers are important locations for microbial activity and the production of organic carbon on the sea floor. We are just beginning to understand how the carbon cycle functions in the deep sea. It is certain that carbon is fixed at such hotspots.

“But,” according to Schubotz, “we do not yet understand these ecosystems well enough to estimate the magnitudes involved.” Broad areas of the ocean floor have not yet been investigated, and still-unknown hydrothermal systems await discovery.

Every plate-boundary spreading center is a potential colonization area. The samples from the eastern Pacific will provide a good starting point because there is already a good understanding of the extent of microbial communities at this location. The international team has therefore investigated samples from active and inactive smokers and compared them with each other.

The team obtained the samples during three expeditions in 2019 and 2021, in part with the help of the manned submersible research vehicle Alvin, from the East Pacific Rise (9 degrees north), an oceanic ridge at a Pacific plate boundary. Their objective is to better understand the deep-sea ecosystem and the interactions between its various organisms, and, for the first time, to calculate how metabolic rates change from active to inactive systems.

“Without this kind of data,” according to the publication, “our understanding of the element cycles in the inactive-chimney ecosystem and their possible influence on the biochemistry of the deep sea remains incomplete.” The team emphasizes that such investigations are essential before any decisions can be made about deep-sea mining.

The biogeochemistry at the sea floor and the interactions of marine ecosystems with the environment are also some of the core research themes within the Cluster of Excellence, “The Ocean Floor—Earth’s Uncharted Interface.”

The findings are published in the journal Nature Microbiology.

GALILEO: Scientists propose a new method to search for light dark matter

A map of dark matter from 2021, made using weak gravitational lensing data. Credit: Dark Energy Survey. darkenergysurvey.org/des-year-3-cosmology-results-papers/.

New research in Physical Review Letters (PRL) has proposed a novel method to detect light dark matter candidates using laser interferometry to measure the oscillatory electric fields generated by these candidates.

Dark matter is one of the most pressing challenges in modern physics, with its constituent particles being elusive and hard to detect. This has prompted scientists to come up with new and innovative ways to look for these particles.

There are several candidates for dark matter particles, such as WIMPs, light dark matter particles (such as axions), and the hypothetical gravitino. Light dark matter, including bosonic particles like the QCD (quantum chromodynamics) axion, has become a point of interest in recent years.

These particles typically have suppressed interactions with the Standard Model, making them challenging to detect. However, knowing their characteristics, including their wave-like behavior and coherent nature at galactic scales, helps to design more efficient experiments.

In the new PRL study, researchers from the University of Maryland and Johns Hopkins University have proposed Galactic Axion Laser Interferometer Leveraging Electro-Optics or GALILEO, a new approach to detect both axion and dark photon dark matter over a wide mass range.

Lead researcher Reza Ebadi, a graduate student at the Quantum Technology Center (QTC) at the University of Maryland, spoke to Phys.org about the research and the team’s motivation for developing this new approach: “Although the standard model provides successful explanations of phenomena ranging from sub-nuclear distances to the size of the universe, it is not a complete explanation of nature.”

“It fails to account for cosmological observations from which the existence of dark matter is inferred. We aspire to gain insight into the physical theories operating on galactic scales using small-scale lab experiments.”

Axions and axionlike particles

Axions and axionlike particles were initially proposed to solve problems in particle physics, such as the strong charge-parity (CP) problem. This problem arises from the observation that the strong force doesn’t seem to exhibit a particular type of symmetry violation, called CP violation, as much as theory predicts it should.

This framework naturally gives rise to axionlike particles (ALPs), which share similar properties with axions, with both being bosons.

Axions and axionlike particles are predicted to have very low masses, typically ranging from microelectronvolts to millielectronvolts. This makes them suitable candidates for light dark matter, as they can exhibit wave-like behavior at galactic scales.

In addition to their low mass, axions and axionlike particles interact very weakly with ordinary matter, making them difficult to detect using conventional means.

These are some reasons the researchers have chosen to detect these particles in their experimental setup. However, the method hinges on oscillatory electric fields produced by these particles.

In regions with significant dark matter density, axions and ALPs can undergo coherent oscillations. These coherent oscillations can give rise to detectable signals, such as oscillatory electric fields, which the proposed GALILEO experiment aims to measure.
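For wave-like dark matter, the oscillation frequency of these fields is set by the particle’s mass through the standard relation f = mc²/h. The short sketch below is an illustration of that relation, not code from the study; it converts candidate masses in the microelectronvolt-to-millielectronvolt range into the frequencies an experiment would target.

```python
# Convert a light dark matter candidate mass into the expected oscillation
# frequency of its coherent field, using f = m*c^2 / h. Illustrative only.

PLANCK_EV_S = 4.135667696e-15  # Planck constant in eV*s

def oscillation_frequency_hz(mass_ev: float) -> float:
    """Oscillation frequency (Hz) of a coherent bosonic field with rest energy mass_ev (eV)."""
    return mass_ev / PLANCK_EV_S

for mass_ev in (1e-6, 1e-4, 1e-3):  # microelectronvolt to millielectronvolt range
    f = oscillation_frequency_hz(mass_ev)
    print(f"m = {mass_ev:.0e} eV  ->  f ~ {f:.3e} Hz")
```

A 1-microelectronvolt candidate, for example, corresponds to a signal near 240 MHz.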


Projected sensitivities of the GALILEO experiment for axion (Left) and dark photon (Right) dark matter searches. Credit: Physical Review Letters (2024). DOI: 10.1103/PhysRevLett.132.101001

GALILEO

“Light dark matter candidates behave as waves in the solar neighborhood. Such dark matter waves are predicted to induce very weak oscillating electric and magnetic fields because of their minuscule interactions with electromagnetism.”

“We focused on the detection of the electric field rather than the magnetic field, which is the target signal in most current and proposed experiments,” explained Ebadi.

Light dark matter-induced electric fields can be detected using electro-optical materials, where the external electric field modifies the material’s properties, such as refractive index.

GALILEO utilizes an asymmetric Michelson interferometer, a device that can measure the changes in refractive index. One arm of the interferometer contains the electro-optical material.

When a probe laser beam is split and sent through the two arms of the interferometer, the arm containing the electro-optical material introduces a variable refractive index. This change in refractive index affects the phase of the laser beam, resulting in an oscillating signal when the beams are merged back together.

By measuring the differential phase velocity between the two arms of the interferometer, GALILEO can detect the frequency of oscillation induced by light dark matter. This oscillatory signal serves as the signature of the presence of dark matter particles.
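To get a rough sense of scale, a Pockels-type electro-optic material shifts its refractive index by roughly delta_n = 0.5 * n^3 * r * E under an applied field E, and that index shift accumulates into an optical phase shift delta_phi = 2*pi*delta_n*L/lambda over a crystal of length L. The sketch below uses those textbook relations with purely hypothetical numbers; it is not drawn from the GALILEO design.

```python
import math

# Minimal sketch of how an oscillating electric field maps to an optical
# phase shift in an electro-optic crystal (textbook Pockels-effect relations;
# all parameter values below are hypothetical, not GALILEO design values).

def index_shift(n0: float, r_eo: float, e_field: float) -> float:
    """Pockels-effect index change: delta_n = 0.5 * n0**3 * r * E."""
    return 0.5 * n0**3 * r_eo * e_field

def phase_shift(delta_n: float, crystal_length: float, wavelength: float) -> float:
    """Optical phase accumulated over the crystal: 2*pi*delta_n*L/lambda."""
    return 2 * math.pi * delta_n * crystal_length / wavelength

n0 = 2.2          # hypothetical refractive index of the electro-optic crystal
r_eo = 30e-12     # hypothetical electro-optic coefficient (m/V)
e_field = 1e-12   # hypothetical dark-matter-induced field amplitude (V/m)

dn = index_shift(n0, r_eo, e_field)
dphi = phase_shift(dn, crystal_length=0.1, wavelength=1064e-9)
print(f"index shift ~ {dn:.3e}, phase shift ~ {dphi:.3e} rad per pass")
```

The resulting phase shift is extraordinarily small, which is where the sensitivity-boosting strategies described next come in.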

The sensitivity of the method can be increased by incorporating Fabry-Perot cavities (which increase the length of the interferometer arm, allowing for greater precision) and taking repeated independent measurements.

Laser interferometry and implementing GALILEO

The research relies on precision measurements by laser interferometry.

Ebadi explained, “A prime example of how laser interferometers can be used for precision measurements is LIGO, the ground-based gravitational wave detector.”

“Our proposal uses similar technological advancements as LIGO, such as Fabry-Perot cavities or squeezed light to suppress the quantum noise limit. However, unlike LIGO, the proposed GALILEO interferometer is a tabletop-scale device.”

Even though the work is theoretical, the researchers already have plans to implement the experimental program step-by-step.

Importantly, they want to determine the technical parameters required for an optimized experimental setup, which they plan to use for conducting scientific experiments to search for light dark matter.

Additionally, Ebadi highlights the importance of operating high-finesse Fabry-Perot cavities alongside electro-optical material within the cavity, as well as characterizing the noise budget and setup systematics, which are crucial aspects of the experimental process.

“GALILEO has the potential to be a significant component of the bigger mission of exploring the vast theoretically viable space of dark matter candidates,” concluded Ebadi.

A stochastic framework for studying history

There are many things we don’t know about how history unfolds. The process might be impersonal, even inevitable, as some social scientists have suggested; human societies might be doomed to decline. Or, individual actions and environmental conditions might influence our communities’ trajectories. Social scientists have struggled to find a consensus on such fundamental issues.

A new framework by SFI faculty and others suggests a way to unify these perspectives. In a new paper in the Journal of Computer Applications in Archaeology, the multidisciplinary researchers describe how using stochastic models to analyze historical datasets could reveal surprising patterns that have gone unnoticed in previous models.

A stochastic model, which incorporates uncertainty and randomness, would treat historical shifts not as deterministic but instead as probabilistic. Stochastic models have previously been used to study systems in a range of fields, from biology to physics to information theory, but have remained under-explored in the study of history and archaeology.
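As a toy illustration of what treating history stochastically can mean (a minimal sketch, not the framework from the paper), one could describe a society's condition as a state in a Markov chain and let transitions between states happen with fixed probabilities rather than follow a predetermined script:

```python
import random

# Toy Markov-chain sketch of treating historical change as probabilistic
# rather than deterministic. The states and probabilities are invented
# for illustration and are not taken from the paper.

TRANSITIONS = {
    "growth":    {"growth": 0.7, "stability": 0.25, "decline": 0.05},
    "stability": {"growth": 0.2, "stability": 0.6,  "decline": 0.2},
    "decline":   {"growth": 0.1, "stability": 0.3,  "decline": 0.6},
}

def simulate(start: str, steps: int, seed: int = 0) -> list[str]:
    """Sample one possible trajectory of states from the chain."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        options = TRANSITIONS[state]
        state = rng.choices(list(options), weights=list(options.values()))[0]
        path.append(state)
    return path

print(simulate("growth", steps=10))
```

Running the chain many times produces a distribution of possible trajectories rather than a single inevitable path, which is the conceptual shift from deterministic to probabilistic history.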

Taking this approach to studying history not only has the potential to unite previous perspectives; it may also yield new ideas about how historical systems change over time. “Adopting a stochastic process also forces us to be precise and explicit in our thinking about the dynamics underlying any particular historical data set,” says physicist and SFI Professor David Wolpert, senior author on the study.

Stochastic processes let the data speak for itself and can remove potential interpretive biases, says SFI External Professor and co-author Stefani Crabtree (Utah State University), who in her work uses diverse methodologies to model systems in social sciences and ecology.

Where previous approaches begin with a preconceived idea and then find data to support it, a stochastic framework instead can “allow the data itself to identify how human social groups evolve, possibly yielding unanticipated revelations,” she says. “It could lead to the detection of sometimes surprising patterns that would be missed in more traditional analyses.”

The diverse forces that shape the evolution of history are complex, notes biologist and SFI External Professor Manfred Laubichler (Arizona State University), another co-author on the paper. Laubichler’s work focuses on evolution in its many guises, from genes to knowledge systems. Instead of treating historical evolution as a deterministic process, he says the new framework allows for probabilities of different events to evolve.

Using randomness, the new approach suggests a structured way to identify the causes of historical shifts. Those might be individuals in society who take dramatic actions that change the course of history, or they might be natural forces external to society, such as volcanic eruptions, that nonetheless are critical forcing factors. The proposed framework also provides a new way to compare cases from different points in history.

The researchers say their framework could be used to find patterns in archaeological data. (Or, in cases where data is missing from the historical record, it might be used in tandem with machine learning systems.) It might also help elucidate drivers of the Great Acceleration—the exponential growth, ongoing since the beginning of the 1950s, in many areas of earth and social systems. It could help differentiate random occurrences from truly transformative events, says Wolpert.

He notes that the new framework isn’t an end-all solution; rather, by grounding investigations of social dynamics in stochastic models, the researchers hope to unearth new, data-driven tools for finding patterns in historical records.

“Not only does this perspective allow us to unify the analyses of computational history,” Wolpert says, “it also allows us to align how we investigate human history with how it is done in the other historical sciences.”