Nuclear Explosions Had Effects Similar To Those Of Strong Solar Flares

Our Cold War history is now offering scientists a chance to better understand the complex space system that surrounds us. Space weather – which can include changes in Earth’s magnetic environment – is usually triggered by the Sun’s activity, but recently declassified data on high-altitude nuclear explosion tests have provided a new look at the mechanisms that set off perturbations in that magnetic system. Such information can help support NASA’s efforts to protect satellites and astronauts from the natural radiation inherent in space.

From 1958 to 1962, the U.S. and U.S.S.R. ran high-altitude tests with exotic code names like Starfish, Argus and Teak. The tests have long since ended, and the goals at the time were military. Today, however, they can provide crucial information on how humans can affect space. The tests, and other human-induced space weather, are the focus of a comprehensive new study published in Space Science Reviews.

“The tests were a human-generated and extreme example of some of the space weather effects frequently caused by the Sun,” said Phil Erickson, assistant director at MIT’s Haystack Observatory, Westford, Massachusetts, and co-author on the paper. “If we understand what happened in the somewhat controlled and extreme event that was caused by one of these man-made events, we can more easily understand the natural variation in the near-space environment.”

By and large, space weather – which affects the region of near-Earth space where astronauts and satellites travel – is typically driven by external factors. The Sun sends out millions of high-energy particles, the solar wind, which races out across the solar system before encountering Earth and its magnetosphere, a protective magnetic field surrounding the planet. Most of the charged particles are deflected, but some make their way into near-Earth space and can impact our satellites by damaging onboard electronics and disrupting communications or navigation signals. These particles, along with electromagnetic energy that accompanies them, can also cause auroras, while changes in the magnetic field can induce currents that damage power grids.

The Cold War tests, which detonated explosives at heights from 16 to 250 miles above the surface, mimicked some of these natural effects. Upon detonation, a first blast wave expelled an expanding fireball of plasma, a hot gas of electrically charged particles. This created a geomagnetic disturbance, which distorted Earth’s magnetic field lines and induced an electric field on the surface.

Some of the tests even created artificial radiation belts, akin to the natural Van Allen radiation belts, a layer of charged particles held in place by Earth’s magnetic fields. The artificially trapped charged particles remained in significant numbers for weeks, and in one case, years. These particles, natural and artificial, can affect electronics on high-flying satellites—in fact some failed as a result of the tests.

Although the induced radiation belts were physically similar to Earth’s natural radiation belts, their trapped particles had different energies. By comparing the energies of the particles, it is possible to distinguish the fission-generated particles from those naturally occurring in the Van Allen belts.

Other tests mimicked other natural phenomena we see in space. The Teak test, which took place on Aug. 1, 1958, was notable for the artificial aurora that resulted. The test was conducted over Johnston Island in the Pacific Ocean. On the same day, the Apia Observatory in Western Samoa observed a highly unusual aurora; auroras are typically observed only near the poles. The energetic particles released by the test likely followed Earth’s magnetic field lines to the Polynesian island nation, inducing the aurora. Observing how the tests caused auroras can also provide insight into the natural auroral mechanisms.

Later that same year, when the Argus tests were conducted, effects were seen around the world. These tests were conducted at higher altitudes than previous tests, allowing the particles to travel farther around Earth. Sudden geomagnetic storms were observed from Sweden to Arizona, and scientists used the observed times of the events to determine the speed at which the particles from the explosion traveled. They observed two high-speed waves: the first traveled at 1,860 miles per second and the second at less than a fourth that speed. Unlike the artificial radiation belts, these geomagnetic effects were short-lived, lasting only seconds.
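As a rough illustration of the method, the Python sketch below divides a great-circle path between two observing stations by the difference in arrival times to recover a propagation speed. The station coordinates and the timing value are purely hypothetical placeholders, not the 1958 measurements.

```python
import math

EARTH_RADIUS_MILES = 3959.0  # mean Earth radius

def great_circle_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points given in degrees (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

# Hypothetical observing stations, roughly "from Sweden to Arizona" (illustrative only).
path_miles = great_circle_miles(67.9, 20.4, 32.2, -110.9)
arrival_gap_seconds = 2.7  # hypothetical difference in arrival times

speed = path_miles / arrival_gap_seconds
print(f"path ~{path_miles:.0f} miles, inferred wave speed ~{speed:.0f} miles per second")
```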

Atmospheric nuclear testing has long since stopped, and the present space environment remains dominated by natural phenomena. However, considering such historical events allows scientists and engineers to understand the effects of space weather on our infrastructure and technical systems.

Such information adds to a larger body of heliophysics research, which studies our near-Earth space environment in order to better understand the natural causes of space weather. NASA missions such as Magnetospheric Multiscale (MMS), Van Allen Probes and Time History of Events and Macroscale Interactions during Substorms (THEMIS) study Earth’s magnetosphere and the causes of space weather. Other NASA missions, like STEREO, constantly survey the Sun to look for activity that could trigger space weather. These missions help inform scientists about the complex system we live in, and how to protect the satellites we utilize for communication and navigation on a daily basis.

Destruction of a Quantum Monopole Observed

Scientists at Amherst College (USA) and Aalto University (Finland) have made the first experimental observations of the dynamics of isolated monopoles in quantum matter.

The new study provided a surprise: the quantum monopole decays into another analogue of the magnetic monopole. This fundamental understanding of monopole dynamics may help scientists build even closer analogues of magnetic monopoles in the future.

Unlike ordinary magnets, magnetic monopoles are elementary particles that have only a south or a north magnetic pole, not both. They have been predicted theoretically, but no convincing experimental observation has been reported. Physicists are therefore busy looking for analogue objects.

– In 2014, we experimentally realized a Dirac monopole, that is, the monopole of Paul Dirac’s 80-year-old theory in which he originally considered charged quantum particles interacting with a magnetic monopole, says Professor David Hall from Amherst College.

– And in 2015, we created real quantum monopoles, adds Dr. Mikko Möttönen from Aalto University.

Whereas the Dirac monopole experiment simulates the motion of a charged particle in the vicinity of a monopolar magnetic field, the quantum monopole has a point-like structure in its own field resembling that of the magnetic monopole particle itself.

From one quantum monopole to another in less than a second

Now the monopole collaboration led by David Hall and Mikko Möttönen has observed one of these unique magnetic monopole analogues spontaneously turning into another in less than a second.

– Sounds easy but we actually had to improve the apparatus to make it happen, says Mr. Tuomas Ollikainen who is the first author of the new work.

The scientists start with an extremely dilute gas of rubidium atoms chilled near absolute zero, at which temperature it forms a Bose-Einstein condensate. Subsequently, they prepare the system in a non-magnetized state and ramp an external magnetic-field zero point into the condensate thus creating an isolated quantum monopole. Then they hold the zero point still and wait for the system to gradually magnetize along the spatially varying magnetic field. The resulting destruction of the quantum monopole gives birth to a Dirac monopole.

– I was jumping in the air when I saw for the first time that we get a Dirac monopole from the decay. This discovery nicely ties together the monopoles we have been producing over the years, says Dr. Möttönen.

Beyond Nobel physics

The quantum monopole is a so-called topological point defect, that is, a single point in space surrounded by a structure in the non-magnetized state of the condensate that cannot be removed by continuous reshaping. Such structures are related to the 2016 Nobel Prize in Physics which was awarded in part for discoveries of topological phase transitions involving quantum whirlpools, or vortices.

– Vortex lines have been studied experimentally in superfluids for decades; monopoles, on the other hand, have been studied experimentally for just a few years, says Prof. Hall.

Although its topology protects the quantum monopole, it can decay since the whole phase of matter changes from non-magnetized to magnetized.

– No matter how robust an ice sculpture you make, it all flows down the drain when the ice melts, says Mr. Ollikainen.

– For the first time, we observed spontaneously appearing Dirac monopoles and the related vortex lines, says Dr. Möttönen.

Stars Regularly Ripped Apart By Black Holes In Colliding Galaxies

Astronomers based at the University of Sheffield have found evidence that stars are ripped apart by supermassive black holes 100 times more often than previously thought.

Until now, such stellar cannibalism – known as Tidal Disruption Events, or TDEs – had only been found in surveys which observed many thousands of galaxies, leading astronomers to believe they were exceptionally rare: only one event every 10,000 to 100,000 years per galaxy.

However, the pioneering study, conducted by leading scientists from the University’s Department of Physics and Astronomy, recorded a star being destroyed by a supermassive black hole in a survey of just 15 galaxies – an extremely small sample size by astronomy standards.
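A back-of-the-envelope comparison shows why one event in a sample of 15 galaxies implies such a large rate increase. The minimal sketch below assumes, purely for illustration, an effective monitoring window of about a decade per galaxy; under that assumption the implied rate is roughly a hundred times or more the previously accepted one.

```python
# Rough, illustrative TDE rate comparison; the monitoring window is an assumption.
n_galaxies = 15
n_events = 1
monitoring_years = 10.0  # assumed effective monitoring window per galaxy

observed_rate = n_events / (n_galaxies * monitoring_years)  # events per galaxy per year
previous_low, previous_high = 1 / 100_000, 1 / 10_000       # one event per 10,000-100,000 yr

print(f"implied rate: about one TDE per {1 / observed_rate:,.0f} years per galaxy")
print(f"enhancement over earlier estimates: roughly "
      f"{observed_rate / previous_high:.0f}x to {observed_rate / previous_low:.0f}x")
```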

“Each of these 15 galaxies is undergoing a ‘cosmic collision’ with a neighbouring galaxy,” said Dr James Mullaney, Lecturer in Astronomy and co-author of the study.

“Our surprising findings show that the rate of TDEs dramatically increases when galaxies collide. This is likely due to the fact that the collisions lead to large numbers of stars being formed close to the central supermassive black holes in the two galaxies as they merge together.”

The supermassive black holes that lurk in the hearts of all large galaxies can be elusive. This is because they don’t shine in a conventional sense due to their gravity being so strong that nothing can escape, not even light itself. However, the release of energy as stars are ripped apart when they move close to the black holes leads to dramatic flares. The galaxies’ nuclei can then appear as bright as all the billions of stars in a typical galaxy combined. In this way, TDEs can be used to locate otherwise dim black holes and study their strong gravity and how they accrete matter.

“Our team first observed the 15 colliding galaxies in the sample in 2005, during a previous project,” said Rob Spence, University of Sheffield PhD student and co-author of the study.

“However, when we observed the sample again in 2015, we noticed that one galaxy – F01004-2237 – appeared strikingly different. This led us to look at data from the Catalina Sky Survey, which monitors the brightness of objects in the sky over time. We found that in 2010, the brightness of F01004-2237 flared dramatically.”

The particular combination of variability and post-flare spectrum observed in F01004-2237 – which is 1.7 billion light years from Earth – was unlike any known supernova or active galactic nucleus, but characteristic of TDEs.

Clive Tadhunter, Professor of Astrophysics and leader of the study, said: “Based on our results for F01004-2237, we expect that TDE events will become common in our own Milky Way galaxy when it eventually merges with the neighbouring Andromeda galaxy in about 5 billion years.

“Looking towards the centre of the Milky Way at the time of the merger we’d see a flare approximately every 10 to 100 years. The flares would be visible to the naked eye and appear much brighter than any other star or planet in the night sky.”

Earth Probably Began With A Solid Shell

Today’s Earth is a dynamic planet with an outer layer composed of giant plates that grind together, sliding past or dipping beneath one another, giving rise to earthquakes and volcanoes. Other plates separate at undersea mountain ridges, where molten rock spreads out from the centers of major ocean basins.

But new research suggests that this was not always the case. Instead, shortly after Earth formed and began to cool, the planet’s first outer layer was a single, solid but deformable shell. Later, this shell began to fold and crack more widely, giving rise to modern plate tectonics.

The research, described in a paper published February 27, 2017 in the journal Nature, is the latest salvo in a long-standing debate in the geological research community: did plate tectonics start right away—a theory known as uniformitarianism—or did Earth first go through a long phase with a solid shell covering the entire planet? The new results suggest the solid shell model is closest to what really happened.

“Models for how the first continental crust formed generally fall into two groups: those that invoke modern-style plate tectonics and those that do not,” said Michael Brown, a professor of geology at the University of Maryland and a co-author of the study. “Our research supports the latter—a ‘stagnant lid’ forming the planet’s outer shell early in Earth’s history.”

To reach these conclusions, Brown and his colleagues from Curtin University and the Geological Survey of Western Australia studied rocks collected from the East Pilbara Terrane, a large area of ancient granitic crust located in the state of Western Australia. Rocks here are among the oldest known, ranging from 3.5 to about 2.5 billion years of age. (Earth is roughly 4.5 billion years old.) The researchers specifically selected granites with a chemical composition usually associated with volcanic arcs—a telltale sign of plate tectonic activity.

Brown and his colleagues also looked at basalt rocks from the associated Coucal formation. Basalt is the rock produced when volcanoes erupt, but it also forms the ocean floor, as molten basalt erupts at spreading ridges in the center of ocean basins. In modern-day plate tectonics, when ocean floor basalt reaches the continents, it dips—or subducts—beneath the Earth’s surface, where it generates fluids that allow the overlying mantle to melt and eventually create large masses of granite beneath the surface.

Previous research suggested that the Coucal basalts could be the source rocks for the granites in the Pilbara Terrane, because of the similarities in their chemical composition. Brown and his collaborators set out to verify this, but also to test another long-held assumption: could the Coucal basalts have melted to form granite in some way other than subduction of the basalt beneath Earth’s surface? If so, perhaps plate tectonics was not yet happening when the Pilbara granites formed.

To address this question, the researchers performed thermodynamic calculations to determine the phase equilibria of average Coucal basalt. Phase equilibria are precise descriptions of how a substance behaves under various temperature and pressure conditions, including the temperature at which melting begins, the amount of melt produced and its chemical composition.

For example, one of the simplest phase equilibria diagrams describes the behavior of water: at low temperatures and/or high pressures, water forms solid ice, while at high temperatures and/or low pressures, water forms gaseous steam. Phase equilibria get a bit more involved for rocks, which have complex chemical compositions that can take on very different mineral combinations and physical characteristics depending on temperature and pressure.
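To make the water analogy concrete, the short sketch below uses the textbook Clausius-Clapeyron relation to estimate how water’s boiling temperature shifts with pressure. It is only an illustration of a simple one-component phase boundary, not the multi-component thermodynamic modeling applied to the Coucal basalts.

```python
import math

# Clausius-Clapeyron estimate of water's boiling point as a function of pressure.
# Textbook constants; a one-component illustration only.
R = 8.314                    # gas constant, J/(mol K)
L_VAP = 40_700.0             # latent heat of vaporization of water, J/mol (approximate)
T0, P0 = 373.15, 101_325.0   # boiling point (K) at one atmosphere (Pa)

def boiling_point_kelvin(pressure_pa):
    """Integrated Clausius-Clapeyron relation: 1/T = 1/T0 - (R/L) * ln(P/P0)."""
    return 1.0 / (1.0 / T0 - (R / L_VAP) * math.log(pressure_pa / P0))

for p_atm in (0.1, 0.5, 1.0, 2.0):
    t_celsius = boiling_point_kelvin(p_atm * P0) - 273.15
    print(f"{p_atm:>4} atm -> water boils near {t_celsius:6.1f} C")
```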

“If you take a rock off the shelf and melt it, you can get a phase diagram. But you’re stuck with a fixed chemical composition,” Brown said. “With thermodynamic modeling, you can change the composition, pressure and temperature independently. It’s much more flexible and helps us to answer some questions we can’t address with experiments on rocks.”

Using the Coucal basalts and Pilbara granites as a starting point, Brown and his colleagues constructed a series of modeling experiments to reflect what might have transpired in an ancient Earth without plate tectonics. Their results suggest that, indeed, the Pilbara granites could have formed from the Coucal basalts. More to the point, this transformation could have occurred in a pressure and temperature scenario consistent with a “stagnant lid,” or a single shell covering the entire planet.

Plate tectonics substantially affects the temperature and pressure of rocks within Earth’s interior. When a slab of rock subducts under the Earth’s surface, the rock starts off relatively cool and takes time to gain heat. By the time it reaches a higher temperature, the rock has also reached a significant depth, which corresponds to high pressure—in the same way a diver experiences higher pressure at greater water depth.

In contrast, a “stagnant lid” regime would be very hot at relatively shallow depths and low pressures. Geologists refer to this as a “high thermal gradient.”
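A toy calculation makes the contrast concrete. The sketch below compares the depth, and hence the lithostatic pressure, at which a nominal melting temperature is reached under a hot, stagnant-lid-style gradient versus a cool, subduction-like one; the density, gradients, and melting temperature are placeholder values chosen only for illustration, not figures from the study.

```python
# Toy comparison of two thermal regimes; all numbers are placeholder assumptions.
RHO = 2900.0    # assumed average crustal density, kg/m^3
G = 9.81        # gravitational acceleration, m/s^2
T_MELT = 700.0  # nominal temperature (C) at which melting might begin (illustrative)

def lithostatic_pressure_gpa(depth_km):
    """P = rho * g * z, converted to gigapascals."""
    return RHO * G * depth_km * 1e3 / 1e9

def depth_to_reach(temp_c, gradient_c_per_km, surface_temp_c=0.0):
    """Depth (km) at which a linear geotherm reaches the given temperature."""
    return (temp_c - surface_temp_c) / gradient_c_per_km

for label, gradient in (("stagnant lid (hot)", 60.0), ("subduction-like (cool)", 15.0)):
    z = depth_to_reach(T_MELT, gradient)
    print(f"{label:22s}: {T_MELT:.0f} C reached at ~{z:.0f} km, ~{lithostatic_pressure_gpa(z):.2f} GPa")
```

With a hot gradient the nominal melting temperature is reached at a much shallower depth, and therefore at much lower pressure, than in the cool, subduction-like case, which is the signature the researchers looked for in the granite chemistry.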

“Our results suggest the Pilbara granites were produced by melting of the Coucal basalts or similar materials in a high thermal gradient environment,” Brown said. “Additionally, the composition of the Coucal basalts indicates that they, too, came from an earlier generation of source rocks. We conclude that a multi-stage process produced Earth’s first continents in a ‘stagnant lid’ scenario before plate tectonics began.”

The research paper, “Earth’s first stable continents did not form by subduction,” by Tim Johnson, Michael Brown, Nicholas Gardiner, Christopher Kirkland and Hugh Smithies, was published February 27, 2017 in the journal Nature.

First Evidence Of Rocky Planet Formation In Tatooine System

Evidence of planetary debris surrounding a double sun in a ‘Tatooine-like’ system has been found for the first time by a UCL-led team of researchers.

Published today in Nature Astronomy and funded by the Science and Technology Facilities Council and the European Research Council, the study finds the remains of shattered asteroids orbiting a double sun consisting of a white dwarf and a brown dwarf roughly 1000 light-years away in a system called SDSS 1557.

The discovery is remarkable because the debris appears to be rocky and suggests that terrestrial planets like Tatooine – Luke Skywalker’s home world in Star Wars – might exist in the system. To date, all exoplanets discovered in orbit around double stars are gas giants, similar to Jupiter, and are thought to form in the icy regions of their systems.

In contrast to the carbon-rich icy material found in other double star systems, the planetary material identified in the SDSS 1557 system has a high metal content, including silicon and magnesium. These elements were identified as the debris flowed from its orbit onto the surface of the star, polluting it temporarily with at least 10^17 g (roughly 110 billion US tons) of matter, equivalent to an asteroid at least 4 km in size.
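The quoted asteroid size follows from simple geometry: assuming a typical rocky density of about 3 g/cm^3 (an illustrative value, not necessarily the one adopted in the study), a sphere containing 10^17 g of material works out to roughly 4 km across.

```python
import math

mass_g = 1e17        # minimum accreted mass quoted for SDSS 1557
density_g_cm3 = 3.0  # assumed rocky-asteroid density (illustrative)

volume_cm3 = mass_g / density_g_cm3
radius_cm = (3.0 * volume_cm3 / (4.0 * math.pi)) ** (1.0 / 3.0)
diameter_km = 2.0 * radius_cm / 1e5  # 1 km = 1e5 cm

print(f"equivalent asteroid diameter ~{diameter_km:.1f} km")  # roughly 4 km
```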

Lead author, Dr Jay Farihi (UCL Physics & Astronomy), said: “Building rocky planets around two suns is a challenge because the gravity of both stars can push and pull tremendously, preventing bits of rock and dust from sticking together and growing into full-fledged planets. With the discovery of asteroid debris in the SDSS 1557 system, we see clear signatures of rocky planet assembly via large asteroids that formed, helping us understand how rocky exoplanets are made in double star systems.”

In the Solar System, the asteroid belt contains the leftover building blocks of the terrestrial planets Mercury, Venus, Earth, and Mars, so planetary scientists study asteroids to gain a better understanding of how rocky, and potentially habitable, planets are formed. The team used the same approach to study the SDSS 1557 system: any planets within it cannot yet be detected directly, but the debris is spread in a large belt around the double stars, which makes for a much larger target for analysis.

The discovery came as a complete surprise: the team had assumed the dusty white dwarf was a single star, but co-author Dr Steven Parsons (University of Valparaíso and University of Sheffield), an expert in double star (or binary) systems, noticed the tell-tale signs. “We know of thousands of binaries similar to SDSS 1557 but this is the first time we’ve seen asteroid debris and pollution. The brown dwarf was effectively hidden by the dust until we looked with the right instrument”, added Parsons, “but when we observed SDSS 1557 in detail we recognised the brown dwarf’s subtle gravitational pull on the white dwarf.”

The team studied the binary system and the chemical composition of the debris by measuring the absorption of different wavelengths of light or ‘spectra’, using the Gemini Observatory South telescope and the European Southern Observatory Very Large Telescope, both located in Chile.

Co-author Professor Boris Gänsicke (University of Warwick) analysed these data and found they all told a consistent and compelling story. “Any metals we see in the white dwarf will disappear within a few weeks, and sink down into the interior, unless the debris is continuously flowing onto the star. We’ll be looking at SDSS 1557 next with Hubble, to conclusively show the dust is made of rock rather than ice.”

NASA Telescope Reveals Largest Batch Of Earth-Size, Habitable-Zone Planets Around Single Star

NASA’s Spitzer Space Telescope has revealed the first known system of seven Earth-size planets around a single star. Three of these planets are firmly located in the habitable zone, the area around the parent star where a rocky planet is most likely to have liquid water.

The discovery sets a new record for greatest number of habitable-zone planets found around a single star outside our solar system. All of these seven planets could have liquid water — key to life as we know it — under the right atmospheric conditions, but the chances are highest with the three in the habitable zone.

“This discovery could be a significant piece in the puzzle of finding habitable environments, places that are conducive to life,” said Thomas Zurbuchen, associate administrator of the agency’s Science Mission Directorate in Washington. “Answering the question ‘are we alone’ is a top science priority and finding so many planets like these for the first time in the habitable zone is a remarkable step forward toward that goal.”

At about 40 light-years (235 trillion miles) from Earth, the system of planets is relatively close to us, in the constellation Aquarius. Because they are located outside of our solar system, these planets are scientifically known as exoplanets.
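The mileage figure is simply the definition of a light-year applied to a distance of 40 light-years, as this quick check shows.

```python
SPEED_OF_LIGHT_MILES_PER_S = 186_282
SECONDS_PER_YEAR = 365.25 * 24 * 3600

miles_per_light_year = SPEED_OF_LIGHT_MILES_PER_S * SECONDS_PER_YEAR
print(f"40 light-years ~ {40 * miles_per_light_year / 1e12:.0f} trillion miles")  # ~235
```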

This exoplanet system is called TRAPPIST-1, named for The Transiting Planets and Planetesimals Small Telescope (TRAPPIST) in Chile. In May 2016, researchers using TRAPPIST announced they had discovered three planets in the system. Assisted by several ground-based telescopes, including the European Southern Observatory’s Very Large Telescope, Spitzer confirmed the existence of two of these planets and discovered five additional ones, increasing the number of known planets in the system to seven.

The new results were published Wednesday in the journal Nature, and announced at a news briefing at NASA Headquarters in Washington.

Using Spitzer data, the team precisely measured the sizes of the seven planets and developed first estimates of the masses of six of them, allowing their density to be estimated.

Based on their densities, all of the TRAPPIST-1 planets are likely to be rocky. Further observations will not only help determine whether they are rich in water, but also possibly reveal whether any could have liquid water on their surfaces. The mass of the seventh and farthest exoplanet has not yet been estimated — scientists believe it could be an icy, “snowball-like” world, but further observations are needed.
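The density argument itself is straightforward: bulk density follows from the estimated mass and the measured radius as rho = M / ((4/3) * pi * R^3). A minimal sketch applies the formula to Earth as a sanity check rather than to the TRAPPIST-1 values themselves, which appear in the Nature paper.

```python
import math

def bulk_density_g_cm3(mass_kg, radius_m):
    """Mean density rho = M / ((4/3) * pi * R^3), returned in g/cm^3."""
    volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
    return (mass_kg / volume_m3) / 1000.0  # kg/m^3 -> g/cm^3

# Earth values used only as a sanity check of the formula.
EARTH_MASS_KG = 5.972e24
EARTH_RADIUS_M = 6.371e6
print(f"Earth: ~{bulk_density_g_cm3(EARTH_MASS_KG, EARTH_RADIUS_M):.2f} g/cm^3")  # ~5.5, i.e. rocky
# Bulk densities well below ~3 g/cm^3 would instead point toward ice- or volatile-rich worlds.
```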

“The seven wonders of TRAPPIST-1 are the first Earth-size planets that have been found orbiting this kind of star,” said Michael Gillon, lead author of the paper and the principal investigator of the TRAPPIST exoplanet survey at the University of Liege, Belgium. “It is also the best target yet for studying the atmospheres of potentially habitable, Earth-size worlds.”

In contrast to our sun, the TRAPPIST-1 star — classified as an ultra-cool dwarf — is so cool that liquid water could survive on planets orbiting very close to it, closer than is possible on planets in our solar system. All seven of the TRAPPIST-1 planetary orbits are closer to their host star than Mercury is to our sun. The planets also are very close to each other. If a person were standing on the surface of one of the planets, they could gaze up and potentially see geological features or clouds of neighboring worlds, which would sometimes appear larger than the moon in Earth’s sky.

The planets may also be tidally locked to their star, which means the same side of the planet always faces the star, so each side is in either perpetual day or perpetual night. This could mean they have weather patterns totally unlike those on Earth, such as strong winds blowing from the day side to the night side, and extreme temperature changes.

Spitzer, an infrared telescope that trails Earth as it orbits the sun, was well-suited for studying TRAPPIST-1 because the star glows brightest in infrared light, whose wavelengths are longer than the eye can see. In the fall of 2016, Spitzer observed TRAPPIST-1 nearly continuously for 500 hours. Spitzer is uniquely positioned in its orbit to observe enough crossings, or transits, of the planets in front of the host star to reveal the complex architecture of the system. Engineers optimized Spitzer’s ability to observe transiting planets during Spitzer’s “warm mission,” which began after the spacecraft’s coolant ran out as planned after the first five years of operations.

“This is the most exciting result I have seen in the 14 years of Spitzer operations,” said Sean Carey, manager of NASA’s Spitzer Science Center at Caltech/IPAC in Pasadena, California. “Spitzer will follow up in the fall to further refine our understanding of these planets so that the James Webb Space Telescope can follow up. More observations of the system are sure to reveal more secrets.”

Following up on the Spitzer discovery, NASA’s Hubble Space Telescope has initiated the screening of four of the planets, including the three inside the habitable zone. These observations aim to assess the presence of puffy, hydrogen-dominated atmospheres, typical of gaseous worlds like Neptune, around these planets.

In May 2016, the Hubble team observed the two innermost planets, and found no evidence for such puffy atmospheres. This strengthened the case that the planets closest to the star are rocky in nature.

“The TRAPPIST-1 system provides one of the best opportunities in the next decade to study the atmospheres around Earth-size planets,” said Nikole Lewis, co-leader of the Hubble study and astronomer at the Space Telescope Science Institute in Baltimore.

NASA’s planet-hunting Kepler space telescope also is studying the TRAPPIST-1 system, making measurements of the star’s minuscule changes in brightness due to transiting planets. Operating as the K2 mission, the spacecraft’s observations will allow astronomers to refine the properties of the known planets, as well as search for additional planets in the system. The K2 observations conclude in early March and will be made available on the public archive.

Spitzer, Hubble, and Kepler will help astronomers plan for follow-up studies using NASA’s upcoming James Webb Space Telescope, launching in 2018. With much greater sensitivity, Webb will be able to detect the chemical fingerprints of water, methane, oxygen, ozone, and other components of a planet’s atmosphere. Webb also will analyze planets’ temperatures and surface pressures — key factors in assessing their habitability.

‘Quartz’ Crystals At Earth’s Core Power Its Magnetic Field

The Earth’s core consists mostly of a huge ball of liquid metal lying some 3,000 km beneath its surface, surrounded by a mantle of hot rock. Notably, at such great depths, both the core and mantle are subject to extremely high pressures and temperatures. Furthermore, research indicates that the slow creeping flow of hot buoyant rocks — moving several centimeters per year — carries heat away from the core to the surface, resulting in a very gradual cooling of the core over geological time. However, the degree to which the Earth’s core has cooled since its formation is an area of intense debate amongst Earth scientists.

In 2013 Kei Hirose, now Director of the Earth-Life Science Institute (ELSI) at the Tokyo Institute of Technology (Tokyo Tech), reported that the Earth’s core may have cooled by as much as 1000 degrees Celsius since its formation 4.5 billion years ago. This large amount of cooling would be necessary to sustain the geomagnetic field, unless there was another, as yet undiscovered, source of energy. These results were a major surprise to the deep Earth community, and created what Peter Olson of Johns Hopkins University referred to as “the New Core Heat Paradox” in an article published in Science.

Core cooling and energy sources for the geomagnetic field were not the only difficult issues faced by the team. Another unresolved matter was uncertainty about the chemical composition of the core. “The core is mostly iron and some nickel, but also contains about 10% of light alloys such as silicon, oxygen, sulfur, carbon, hydrogen, and other compounds,” said Hirose, lead author of the new study to be published in the journal Nature. “We think that many alloys are simultaneously present, but we don’t know the proportion of each candidate element.”

Now, in this latest research carried out in Hirose’s lab at ELSI, the scientists used precision-cut diamonds to squeeze tiny dust-sized samples to the same pressures that exist at the Earth’s core. The high temperatures of the Earth’s interior were created by heating the samples with a laser beam. By performing experiments with a range of probable alloy compositions under a variety of conditions, Hirose and colleagues are trying to identify the unique behavior of different alloy combinations that match the distinct environment that exists at the Earth’s core.

The search for the right alloys began to yield useful results when Hirose and his collaborators started mixing more than one alloy. “In the past, most research on iron alloys in the core has focused only on the iron and a single alloy,” says Hirose. “But in these experiments we decided to combine two different alloys containing silicon and oxygen, which we strongly believe exist in the core.”

The researchers were surprised to find that when they examined the samples in an electron microscope, the small amounts of silicon and oxygen in the starting sample had combined together to form silicon dioxide crystals — the same composition as the mineral quartz found at the surface of the Earth.

“This result proved important for understanding the energetics and evolution of the core,” says John Hernlund of ELSI, a co-author of the study. “We were excited because our calculations showed that crystallization of silicon dioxide crystals from the core could provide an immense new energy source for powering the Earth’s magnetic field.” The additional boost it provides is more than enough to resolve Olson’s paradox.

The team has also explored the implications of these results for the formation of the Earth and conditions in the early Solar System. Crystallization changes the composition of the core by removing dissolved silicon and oxygen gradually over time. Eventually the process of crystallization will stop when the core runs out of its ancient inventory of either silicon or oxygen.

“Even if you have silicon present, you can’t make silicon dioxide crystals without also having some oxygen available,” says ELSI scientist George Helffrich, who modeled the crystallization process for this study. “But this gives us clues about the original concentration of oxygen and silicon in the core, because only some silicon:oxygen ratios are compatible with this model.”