A New Filter To Better Map The Dark Universe

The earliest known light in our universe, known as the cosmic microwave background, was emitted about 380,000 years after the Big Bang. The patterning of this relic light holds many important clues to the development and distribution of large-scale structures such as galaxies and galaxy clusters.

Distortions in the cosmic microwave background (CMB), caused by a phenomenon known as lensing, can further illuminate the structure of the universe and can even tell us things about the mysterious, unseen universe — including dark energy, which makes up about 68 percent of the universe and accounts for its accelerating expansion, and dark matter, which accounts for about 27 percent of the universe.

Set a stemmed wine glass on a surface, and you can see how lensing effects can simultaneously magnify, squeeze, and stretch the view of the surface beneath it. In lensing of the CMB, gravity effects from large objects like galaxies and galaxy clusters bend the CMB light in different ways. These lensing effects can be subtle (known as weak lensing) for distant and small galaxies, and computer programs can identify them because they disrupt the regular CMB patterning.

There are some known issues with the accuracy of lensing measurements, though, and particularly with temperature-based measurements of the CMB and associated lensing effects.

While lensing can be a powerful tool for studying the invisible universe, and could even potentially help us sort out the properties of ghostly subatomic particles like neutrinos, the universe is an inherently messy place.

And like bugs on a car’s windshield during a long drive, the gas and dust swirling in other galaxies, among other factors, can obscure our view and lead to faulty readings of the CMB lensing.

There are some filtering tools that help researchers to limit or mask some of these effects, but these known obstructions continue to be a major problem in the many studies that rely on temperature-based measurements.

The effects of this interference with temperature-based CMB studies can lead to erroneous lensing measurements, said Emmanuel Schaan, a postdoctoral researcher and Owen Chamberlain Postdoctoral Fellow in the Physics Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

“You can be wrong and not know it,” Schaan said. “The existing methods don’t work perfectly — they are really limiting.”

To address this problem, Schaan teamed up with Simone Ferraro, a Divisional Fellow in Berkeley Lab’s Physics Division, to develop a way to improve the clarity and accuracy of CMB lensing measurements by separately accounting for different types of lensing effects.

“Lensing can magnify or demagnify things. It also distorts them along a certain axis so they are stretched in one direction,” Schaan said.

The researchers found that a certain lensing signature called shearing, which causes this stretching in one direction, seems largely immune to the foreground “noise” effects that otherwise interfere with the CMB lensing data. The lensing effect known as magnification, meanwhile, is prone to errors introduced by foreground noise. Their study, published May 8 in the journal Physical Review Letters, notes a “dramatic reduction” in this error margin when focusing solely on shearing effects.
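
In the standard weak-lensing language (textbook notation, not taken from the paper itself), these two effects are the convergence and the shear: a small patch of the CMB is distorted by the matrix

\[
A = \begin{pmatrix} 1 - \kappa - \gamma_1 & -\gamma_2 \\ -\gamma_2 & 1 - \kappa + \gamma_1 \end{pmatrix},
\]

where the convergence \(\kappa\) magnifies or demagnifies the patch and the shear components \(\gamma_1\) and \(\gamma_2\) stretch it along an axis. The shear-only approach described below discards \(\kappa\) and keeps \(\gamma\).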

The sources of the lensing, which are large objects that stand between us and the CMB light, are typically galaxy groups and clusters that have a roughly spherical profile in temperature maps, Ferraro noted. The latest study found that the emission of various forms of light from these “foreground” objects appears to mimic only the magnification effects in lensing, not the shear effects.

“So we said, ‘Let’s rely only on the shear and we’ll be immune to foreground effects,'” Ferraro said. “When you have many of these galaxies that are mostly spherical, and you average them, they only contaminate the magnification part of the measurement. For shear, all of the errors are basically gone.”

He added, “It reduces the noise, allowing us to get better maps. And we’re more certain that these maps are correct,” even when the measurements involve very distant galaxies as foreground lensing objects.
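
The cancellation Ferraro describes can be checked with a toy calculation. The minimal Python sketch below (an illustration of the symmetry argument, not the authors' analysis code) shows that a circularly symmetric foreground blob has a nonzero monopole (m = 0, magnification-like) moment but a vanishing quadrupole (m = 2, shear-like) moment:

```python
# Toy check: a circularly symmetric "foreground" profile contributes to the
# monopole (m=0) moment of a map, but its quadrupole (m=2) moment cancels.
import numpy as np

n = 256
y, x = np.indices((n, n)) - n / 2.0
r = np.hypot(x, y)
theta = np.arctan2(y, x)

blob = np.exp(-r**2 / (2 * 20.0**2))    # symmetric Gaussian blob

m0 = blob.sum()                          # magnification-like moment: nonzero
m2 = (blob * np.exp(2j * theta)).sum()   # shear-like moment: ~0 by symmetry

print(f"m=0 moment: {m0:.3e}")
print(f"|m=2| moment (should be ~0): {abs(m2):.3e}")
```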

The new method could benefit a range of sky-surveying experiments, the study notes, including the POLARBEAR-2 and Simons Array experiments, which have Berkeley Lab and UC Berkeley participants; the Advanced Atacama Cosmology Telescope (AdvACT) project; and the South Pole Telescope’s third-generation camera (SPT-3G). It could also aid the Simons Observatory and the proposed next-generation, multilocation CMB experiment known as CMB-S4 — Berkeley Lab scientists are involved in the planning for both of these efforts.

The method could also enhance the science yield from future galaxy surveys like the Berkeley Lab-led Dark Energy Spectroscopic Instrument (DESI) project under construction near Tucson, Arizona, and the Large Synoptic Survey Telescope (LSST) project under construction in Chile, through joint analyses of data from these sky surveys and the CMB lensing data.

Increasingly large datasets from astrophysics experiments have led to more coordination in comparing data across experiments to provide more meaningful results. “These days, the synergies between CMB and galaxy surveys are a big deal,” Ferraro said.

In this study, researchers relied on simulated full-sky CMB data. They used resources at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) to test their method on each of the four foreground sources of noise that can contaminate CMB lensing measurements: infrared, radiofrequency, thermal, and electron-interaction effects.

The study notes that cosmic infrared background noise and noise from the interaction of CMB light particles (photons) with high-energy electrons have been the most problematic sources to address using standard filtering tools in CMB measurements. Some existing and future CMB experiments seek to lessen these effects by taking precise measurements of the polarization, or orientation, of the CMB light signature rather than its temperature.

“We couldn’t have done this project without a computing cluster like NERSC,” Schaan said. NERSC has also proved useful in serving up other universe simulations to help prepare for upcoming experiments like DESI.

The method developed by Schaan and Ferraro is already being implemented in the analysis of current experiments’ data. One possible application is to develop more detailed visualizations of dark matter filaments and nodes that appear to connect matter in the universe via a complex and changing cosmic web.

Dark Matter Detector Observes Rarest Event Ever Recorded

How do you observe a process that takes more than one trillion times longer than the age of the universe? The XENON Collaboration research team did it with an instrument built to find the most elusive particle in the universe—dark matter. In a paper to be published tomorrow in the journal Nature, researchers announce that they have observed the radioactive decay of xenon-124, which has a half-life of 1.8 × 10²² years.
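
The comparison in the opening sentence is easy to verify with back-of-envelope arithmetic (using the commonly quoted 13.8-billion-year age of the universe):

```python
# Quick arithmetic check: half-life of xenon-124 vs. the age of the universe.
half_life_yr = 1.8e22        # measured half-life, in years
age_universe_yr = 1.38e10    # approximate age of the universe, in years

print(f"ratio: {half_life_yr / age_universe_yr:.1e}")  # ~1.3e12, over a trillion
```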

“We actually saw this decay happen. It’s the longest, slowest process that has ever been directly observed, and our dark matter detector was sensitive enough to measure it,” said Ethan Brown, an assistant professor of physics at Rensselaer, and co-author of the study. “It’s amazing to have witnessed this process, and it says that our detector can measure the rarest thing ever recorded.”

The XENON Collaboration runs XENON1T, a 1,300-kilogram vat of super-pure liquid xenon shielded from cosmic rays in a cryostat submerged in water, 1,500 meters beneath the Gran Sasso mountains of Italy. The researchers search for dark matter (which is five times more abundant than ordinary matter, but seldom interacts with ordinary matter) by recording tiny flashes of light created when particles interact with xenon inside the detector. And while XENON1T was built to capture the interaction between a dark matter particle and the nucleus of a xenon atom, the detector actually picks up signals from any interactions with the xenon.

The evidence for xenon decay was produced as a proton inside the nucleus of a xenon atom converted into a neutron. In most elements subject to decay, that happens when one electron is pulled into the nucleus. But a proton in a xenon atom must absorb two electrons to convert into a neutron, an event called “double-electron capture.”
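
Written out in standard nuclear notation (the daughter nucleus, tellurium-124, is a well-established fact about this decay, though it is not named in the text above), the observed process is two-neutrino double-electron capture:

\[
{}^{124}\mathrm{Xe} + 2e^{-} \;\rightarrow\; {}^{124}\mathrm{Te} + 2\nu_{e}
\]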

Double-electron capture only happens when two of the electrons are right next to the nucleus at just the right time, Brown said, which is “a rare thing multiplied by another rare thing, making it ultra-rare.”

When the ultra-rare happened, and a double-electron capture occurred inside the detector, instruments picked up the signal of electrons in the atom re-arranging to fill in for the two that were absorbed into the nucleus.

“Electrons in double-capture are removed from the innermost shell around the nucleus, and that creates room in that shell,” said Brown. “The remaining electrons collapse to the ground state, and we saw this collapse process in our detector.”

The achievement is the first time scientists have measured the half-life of this xenon isotope based on a direct observation of its radioactive decay.

“This is a fascinating finding that advances the frontiers of knowledge about the most fundamental characteristics of matter,” said Curt Breneman, dean of the School of Science. “Dr. Brown’s work in calibrating the detector and ensuring that the xenon is scrubbed to the highest possible standard of purity was critical to making this important observation.”

The XENON Collaboration includes more than 160 scientists from Europe, the United States, and the Middle East, and, since 2002, has operated three successively more sensitive liquid xenon detectors in the Gran Sasso National Laboratory in Italy. XENON1T, the largest detector of its type ever built, acquired data from 2016 until December 2018, when it was switched off. Scientists are currently upgrading the experiment for the new XENONnT phase, which will feature an active detector mass three times larger than XENON1T. Together with a reduced background level, this will boost the detector’s sensitivity by an order of magnitude.

BREAKING NEWS: Scientists Set to Unveil First Picture of a Black Hole

On Wednesday, astronomers across the globe will hold “six major press conferences” simultaneously to announce the first results of the Event Horizon Telescope (EHT), which was designed precisely for that purpose.

Of all the forces or objects in the Universe that we cannot see – including dark energy and dark matter – none has frustrated human curiosity so much as the invisible digestive systems that swallow stars like so many specks of dust.

“More than 50 years ago, scientists saw that there was something very bright at the center of our galaxy,” says Paul McNamara, an astrophysicist at the European Space Agency and an expert on black holes.

“It has a gravitational pull strong enough to make stars orbit around it very quickly – as fast as 20 years.”

To put that in perspective, our Solar System takes about 230 million years to circle the center of the Milky Way.
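
Those stellar orbits are also what let astronomers weigh the central object. As a rough illustration, Kepler's third law in solar units gives the mass from a star's period and orbit size; the figures below for the well-studied star S2 (a period of about 16 years and a semi-major axis of about 970 astronomical units) are drawn from other published work, not from this article:

```python
# Back-of-envelope mass estimate via Kepler's third law in solar units:
# M (solar masses) ~ a^3 / P^2, with a in AU and P in years.
a_au = 970.0       # assumed semi-major axis of the star S2's orbit, in AU
period_yr = 16.0   # assumed orbital period of S2, in years

mass_solar = a_au**3 / period_yr**2
print(f"central mass ~ {mass_solar:.1e} solar masses")   # ~3.6e6
```

A few million solar masses packed into so small a region is what identifies the object at the Milky Way's center as a supermassive black hole.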

Eventually, astronomers speculated that these bright spots were in fact “black holes” – a term coined by American physicist John Archibald Wheeler in the mid-1960s – surrounded by a swirling band of white-hot gas and plasma.

Dark Matter Is Not Made Up Of Tiny Black Holes

An international team of researchers has put a theory proposed by the late Stephen Hawking to its most rigorous test to date, and their results have ruled out the possibility that primordial black holes smaller than a tenth of a millimeter make up most of dark matter. Details of their study have been published in this week’s Nature Astronomy.

Scientists know that 85 per cent of the matter in the Universe is made up of dark matter. Its gravitational force prevents stars in our Milky Way from flying apart. However, attempts to detect such dark matter particles using underground experiments, or accelerator experiments including the world’s largest accelerator, the Large Hadron Collider, have failed so far.

This has led scientists to consider Hawking’s 1974 theory of the existence of primordial black holes, born shortly after the Big Bang, and his speculation that they could make up a large fraction of the elusive dark matter scientists are trying to discover today.

An international team of researchers, led by Kavli Institute for the Physics and Mathematics of the Universe Principal Investigator Masahiro Takada, PhD candidate Hiroko Niikura, Professor Naoki Yasuda, and including researchers from Japan, India and the US, have used the gravitational lensing effect to look for primordial black holes between Earth and the Andromeda galaxy. Gravitational lensing, an effect first suggested by Albert Einstein, manifests itself as the bending of light rays coming from a distant object, such as a star, due to the gravitational effect of an intervening massive object, such as a primordial black hole. In extreme cases, such light bending causes the background star to appear much brighter than it otherwise would.

However, gravitational lensing events are very rare, because they require a star in the Andromeda galaxy, a primordial black hole acting as the gravitational lens, and an observer on Earth to be exactly in line with one another. So to maximize the chances of capturing an event, the researchers used the Hyper Suprime-Cam digital camera on the Subaru telescope in Hawaii, which can capture the whole image of the Andromeda galaxy in one shot. Taking into account how fast primordial black holes are expected to move in interstellar space, the team took multiple images to be able to catch the flicker of a star as it brightens for a period of a few minutes to hours due to gravitational lensing.
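
The flicker the team was hunting follows the standard point-lens microlensing law (textbook material, not taken from the team's papers): a source at angular separation u from the lens, in units of the Einstein radius, is magnified by a factor A(u) = (u² + 2) / (u √(u² + 4)). A minimal sketch of the resulting light curve, with illustrative event parameters:

```python
# Minimal point-lens microlensing light curve (Paczynski curve).
# The event parameters below are illustrative, not from the study.
import numpy as np

def magnification(u):
    """Point-lens magnification for impact parameter u (Einstein radii)."""
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

u0 = 0.1          # assumed minimum impact parameter
t_e_hours = 1.0   # assumed Einstein-crossing timescale, in hours

for t in np.linspace(-3, 3, 7):                 # hours from peak
    u = np.sqrt(u0**2 + (t / t_e_hours)**2)
    print(f"t = {t:+.1f} h  ->  A = {magnification(u):.2f}")
```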

From 190 consecutive images of the Andromeda galaxy taken over seven hours during one clear night, the team scoured the data for potential gravitational lensing events. If dark matter consists of primordial black holes of a given mass, in this case masses lighter than the moon, the researchers expected to find about 1,000 events. But after careful analyses, they could identify only one candidate. The team’s results showed that primordial black holes in this mass range can contribute no more than 0.1 per cent of all dark matter mass, making it unlikely that the theory is true.
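
The logic behind that bound can be sketched with simple counting (an illustration of the scaling, not the paper's full statistical analysis): the expected number of events is proportional to the fraction of dark matter in primordial black holes, so seeing roughly one event where about 1,000 were predicted pins that fraction near one part in a thousand.

```python
# Illustrative scaling, not the team's actual likelihood analysis.
expected_if_all_dark_matter = 1000.0   # events if PBHs were all of dark matter
observed = 1                           # candidate events actually found

fraction = observed / expected_if_all_dark_matter
print(f"allowed PBH fraction ~ {fraction:.1%}")   # ~0.1%
```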

The researchers are now planning to further develop their analysis of the Andromeda galaxy. One question they will investigate is whether the binary black holes discovered by the gravitational wave detector LIGO are in fact primordial black holes.

Matter-Antimatter Asymmetry In Charmed Quarks

Physicists in the College of Arts and Sciences at Syracuse University have confirmed that matter and antimatter versions of particles containing charmed quarks decay differently.

Distinguished Professor Sheldon Stone says the findings are a first, although matter-antimatter asymmetry has been observed before in particles with strange quarks or beauty quarks.

He and members of the College’s High-Energy Physics (HEP) research group have measured, for the first time and with 99.999-percent certainty, a difference in the way D0 mesons and anti-D0 mesons transform into more stable byproducts.

Mesons are subatomic particles composed of one quark and one antiquark, bound together by strong interactions.

“There have been many attempts to measure matter-antimatter asymmetry, but, until now, no one has succeeded,” says Stone, who collaborates on the Large Hadron Collider beauty (LHCb) experiment at the CERN laboratory in Geneva, Switzerland. “It’s a milestone in antimatter research.”

The findings may also indicate new physics beyond the Standard Model, which describes how fundamental particles interact with one another. “Till then, we need to await theoretical attempts to explain the observation in less esoteric means,” he adds.

Every particle of matter has a corresponding antiparticle, identical in every way, but with an opposite charge. Precision studies of hydrogen and antihydrogen atoms, for example, reveal similarities to beyond the billionth decimal place.

When matter and antimatter particles come into contact, they annihilate each other in a burst of energy — similar to what happened in the Big Bang, some 14 billion years ago.

“That’s why there is so little naturally occurring antimatter in the Universe around us,” says Stone, a Fellow of the American Physical Society, which has awarded him this year’s W.K.H. Panofsky Prize in Experimental Particle Physics.

The question on Stone’s mind involves the equal-but-opposite nature of matter and antimatter. “If the same amount of matter and antimatter exploded into existence at the birth of the Universe, there should have been nothing left behind but pure energy. Obviously, that didn’t happen,” he says, with a whiff of understatement.

Thus, Stone and his LHCb colleagues have been searching for subtle differences between matter and antimatter to understand why matter is so prevalent.

The answer may lie at CERN, where scientists create antimatter by smashing protons together in the Large Hadron Collider (LHC), the world’s biggest, most powerful particle accelerator. The more energy the LHC produces, the more massive the particles — and antiparticles — formed during its collisions.

It is in the debris of these collisions that scientists such as Ivan Polyakov, a postdoc in Syracuse’s HEP group, hunt for particle ingredients.

“We don’t see antimatter in our world, so we have to artificially produce it,” he says. “The data from these collisions enables us to map the decay and transformation of unstable particles into more stable byproducts.”

HEP is renowned for its pioneering research into quarks — elementary particles that are the building blocks of matter. There are six types, or flavors, of quarks, but scientists usually talk about them in pairs: up/down, charm/strange and top/bottom. Each flavor has a distinct mass and a fractional electric charge.

In addition to the beauty quark (the “b” in “LHCb”), HEP is interested in the charmed quark. Despite its relatively high mass, a charmed quark lives a fleeting existence before decaying into something more stable.

Recently, HEP studied two versions of the same particle. One version contained a charmed quark and an antimatter version of an up quark, called the anti-up quark. The other version had an anti-charm quark and an up quark.

Using LHC data, they identified tens of millions of both versions of the particle and counted the number of times each decayed into new byproducts.

“The ratio of the two possible outcomes should have been identical for both sets of particles, but we found that the ratios differed by about a tenth of a percent,” Stone says. “This proves that charmed matter and antimatter particles are not totally interchangeable.”

Adds Polyakov, “Particles might look the same on the outside, but they behave differently on the inside. That is the puzzle of antimatter.”
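
The quantity behind Stone's “tenth of a percent” is, schematically, a decay asymmetry of the form A = (N − N̄) / (N + N̄), where N and N̄ count decays of the particle and its antiparticle into a given final state. A minimal sketch with made-up counts (the real LHCb analysis must also control production and detection asymmetries):

```python
# Hypothetical decay counts, for illustration only -- not LHCb's yields.
n_particle     = 10_000_000   # D0 decays to a chosen final state
n_antiparticle = 9_985_000    # anti-D0 decays to the same final state

asymmetry = (n_particle - n_antiparticle) / (n_particle + n_antiparticle)
print(f"raw asymmetry ~ {asymmetry:.2%}")  # ~0.08%, about a tenth of a percent
```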

The idea that matter and antimatter behave differently is not new. Previous studies of particles with strange quarks and bottom quarks have confirmed as much.

What makes this study unique, Stone concludes, is that it is the first time anyone has witnessed particles with charmed quarks being asymmetrical: “It’s one for the history books.”

HEP’s work is supported by the National Science Foundation.

Astronomers Discover 83 Supermassive Black Holes In The Early Universe

Astronomers from Japan, Taiwan and Princeton University have discovered 83 quasars powered by supermassive black holes in the distant universe, from a time when the universe was less than 10 percent of its present age.

“It is remarkable that such massive dense objects were able to form so soon after the Big Bang,” said Michael Strauss, a professor of astrophysical sciences at Princeton University who is one of the co-authors of the study. “Understanding how black holes can form in the early universe, and just how common they are, is a challenge for our cosmological models.”

This finding increases the number of black holes known at that epoch considerably, and reveals, for the first time, how common they are early in the universe’s history. In addition, it provides new insight into the effect of black holes on the physical state of gas in the early universe in its first billion years. The research appears in a series of five papers published in The Astrophysical Journal and the Publications of the Astronomical Society of Japan.

Supermassive black holes, found at the centers of galaxies, can be millions or even billions of times more massive than the sun. While they are prevalent today, it is unclear when they first formed, and how many existed in the distant early universe. A supermassive black hole becomes visible when gas accretes onto it, causing it to shine as a “quasar.” Previous studies have been sensitive only to the very rare, most luminous quasars, and thus the most massive black holes. The new discoveries probe the population of fainter quasars, powered by black holes with masses comparable to most black holes seen in the present-day universe.

The research team used data taken with a cutting-edge instrument, “Hyper Suprime-Cam” (HSC), mounted on the Subaru Telescope of the National Astronomical Observatory of Japan, which is located on the summit of Maunakea in Hawaii. HSC has a gigantic field of view — 1.77 square degrees, or seven times the area of the full moon — on one of the largest telescopes in the world. The HSC team is surveying the sky over the course of 300 nights of telescope time, spread over five years.

The team selected distant quasar candidates from the sensitive HSC survey data. They then carried out an intensive observational campaign to obtain spectra of those candidates, using three telescopes: the Subaru Telescope; the Gran Telescopio Canarias on the island of La Palma in the Canaries, Spain; and the Gemini South Telescope in Chile. The survey has revealed 83 previously unknown very distant quasars. Together with 17 quasars already known in the survey region, the researchers found that there is roughly one supermassive black hole per cubic giga-light-year — in other words, if you chunked the universe into imaginary cubes that are a billion light-years on a side, each would hold one supermassive black hole.
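
For scale, that density converts to roughly 35 such black holes per cubic gigaparsec, the unit more common in the research literature (a quick, purely illustrative conversion, since one gigaparsec is about 3.262 giga-light-years):

```python
# Unit conversion for "one supermassive black hole per cubic giga-light-year".
GLY_PER_GPC = 3.262   # giga-light-years per gigaparsec

density_per_gly3 = 1.0
density_per_gpc3 = density_per_gly3 * GLY_PER_GPC**3
print(f"~{density_per_gpc3:.0f} per cubic gigaparsec")   # ~35
```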

The quasars in this study are about 13 billion light-years away from Earth; in other words, we are seeing them as they existed 13 billion years ago. As the Big Bang took place 13.8 billion years ago, we are effectively looking back in time, seeing these quasars and supermassive black holes as they appeared only about 800 million years after the creation of the (known) universe.

It is widely accepted that the hydrogen in the universe was once neutral, but was “reionized” — split into its component protons and electrons — around the time when the first generation of stars, galaxies and supermassive black holes were born, in the first few hundred million years after the Big Bang. This is a milestone of cosmic history, but astronomers still don’t know what provided the incredible amount of energy required to cause the reionization. A compelling hypothesis suggests that there were many more quasars in the early universe than detected previously, and it is their integrated radiation that reionized the universe.

“However, the number of quasars we observed shows that this is not the case,” explained Robert Lupton, a 1985 Princeton Ph.D. alumnus who is a senior research scientist in astrophysical sciences. “The number of quasars seen is significantly less than needed to explain the reionization.” Reionization was therefore caused by another energy source, most likely numerous galaxies that started to form in the young universe.

The present study was made possible by the world-leading survey ability of Subaru and HSC. “The quasars we discovered will be an interesting subject for further follow-up observations with current and future facilities,” said Yoshiki Matsuoka, a former Princeton postdoctoral researcher now at Ehime University in Japan, who led the study. “We will also learn about the formation and early evolution of supermassive black holes, by comparing the measured number density and luminosity distribution with predictions from theoretical models.”

Based on the results achieved so far, the team is looking forward to finding yet more distant black holes and discovering when the first supermassive black hole appeared in the universe.

Explaining A Universe Composed Of Matter

The universe contains a massive imbalance between matter and antimatter. Antimatter and matter are actually identical except for their opposite charges, yet there is hardly any antimatter in the observable universe, including the stars and other galaxies. In theory, there should be equal amounts of each; in reality, the observable universe is almost entirely matter.

“We’re here because there’s more matter than antimatter in the universe,” says Professor Jens Oluf Andersen at the Norwegian University of Science and Technology’s (NTNU) Department of Physics. This great imbalance between matter and antimatter is the reason all tangible matter, including life forms, exists, but scientists don’t understand why the imbalance exists.

Physics uses a standard model to explain and understand how the world is connected. The standard model is a theory that describes all the particles scientists are familiar with. It accounts for quarks, electrons, the Higgs boson particle and how they all interact with each other. But the standard model cannot explain the fact that the world consists almost exclusively of matter. So there must be something we don’t yet understand.

When antimatter and matter meet, they annihilate, and the result is light and nothing else. Given equal amounts of matter and antimatter, nothing would remain once the reaction was completed. As long as we don’t know why more matter exists, we can’t know why the building blocks of anything else exist, either. “This is one of the biggest unsolved problems in physics,” says Andersen.
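
The textbook example of such annihilation (standard physics, not specific to this article) is an electron meeting a positron, with the rest mass of the pair converting entirely into two photons:

\[
e^{+} + e^{-} \;\rightarrow\; \gamma + \gamma, \qquad E_{\gamma} = m_{e}c^{2} \approx 511\ \mathrm{keV}.
\]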

Researchers call this the “baryon asymmetry” problem. Baryons are subatomic particles, including protons and neutrons. All baryons have a corresponding antibaryon, which is mysteriously rare. The standard model of physics explains several aspects of the forces of nature. It explains how atoms become molecules, and it explains the particles that make up atoms.

“The standard model of physics includes all the particles we know about. The newest particle, the Higgs boson, was discovered in 2012 at CERN,” says Andersen. With this discovery, an important piece fell into place. But not the final one. The standard model works perfectly to explain large parts of the universe, so researchers are intrigued when something doesn’t fit. Baryon asymmetry belongs in this category.

Physicists do have their theories as to why there is more matter, and thus why we undeniably exist. “One theory is that it’s been this way since the Big Bang,” says Andersen. In other words, the imbalance between matter and antimatter is a basic precondition that has existed more or less from the beginning.

Quarks are among nature’s smallest building blocks. An early surplus of quarks relative to antiquarks was propagated as larger units formed. But Andersen doesn’t care for this explanation. “We’re still not happy with that idea, because it doesn’t tell us much,” he says.

So why was this imbalance present from the beginning? Why did quarks initially outnumber antiquarks? “In principle, it’s possible to generate asymmetry within the standard model of physics—that is, the difference between the amount of matter and antimatter. But we run into two problems,” says Andersen.

First of all, scientists have to go way back in time, to just after the Big Bang when everything started—we’re talking about 10 picoseconds, or 10⁻¹¹ seconds, after the Big Bang.

The second problem is that temperatures have to be around 10¹⁵ kelvins, a million billion degrees. That’s scorching—consider that the sun’s surface is only about 5,700 degrees. Even under those conditions, the standard model cannot account for the baryonic matter we observe. “It can’t work. In the standard model, we don’t have enough matter,” Andersen says. “The problem is that the jump in the expectation value of the Higgs field is too small,” he adds for the benefit of those with only a minimal grasp of physics.
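
The 10¹⁵-degree figure is just the electroweak energy scale written as a temperature. Converting a representative scale of roughly 100 GeV with the Boltzmann constant (a standard-physics conversion; the 100 GeV value is an assumption for illustration, not a number from the group's paper) lands close to 10¹⁵ kelvins:

```python
# Convert an assumed electroweak energy scale (~100 GeV) into a temperature.
K_B_EV_PER_K = 8.617e-5   # Boltzmann constant, in eV per kelvin

energy_ev = 100e9         # assumed scale: ~100 GeV
temperature_k = energy_ev / K_B_EV_PER_K
print(f"T ~ {temperature_k:.1e} K")   # ~1.2e15 K
```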

“It’s probably not just our imagination that’s imposing limits, but lots of possibilities exist,” says Andersen. These possibilities therefore need to work together with the standard model. “What we’re really looking for is an extension of the standard model. Something that fits into it.”

Neither he nor other physicists doubt that the standard model is right. The model is continuously tested at CERN and other particle accelerators. It’s just that the model isn’t yet complete. Andersen and his colleagues are investigating various possibilities for the model to fit with the imbalance between matter and antimatter. The latest results were recently published in Physical Review Letters.

“Actually, we’re talking about phase transitions,” says Andersen. His group is considering processes of change in matter, like water turning into steam or ice under changing conditions. They’re also considering whether matter came about as a result of an electroweak phase transition (EWPT) and formed a surplus of baryons just after the Big Bang. The electroweak phase transition occurs by the formation of bubbles. The new phase expands, a bit like water bubbles, and takes over the entire universe.

Andersen and his colleagues tested the so-called “two Higgs doublet” model (2HDM), one of the simplest extensions of the standard model. They searched for possible areas where the right conditions are present to create matter. “Several scenarios exist for how the baryon asymmetry was created. We studied the electroweak phase transition using the 2HDM model. This phase transition takes place in the early stage of our universe,” says Andersen.

The process is comparable to boiling water. When water reaches 100 degrees Celsius, gas bubbles form and rise. The bubbles contain water vapour, the gas phase, while the surrounding water is the liquid phase. When the early universe went through an analogous transition between phases as it expanded and cooled, a surplus of quarks over antiquarks was produced, generating the baryon asymmetry.

Last but not least, the researchers are also doing mathematics. For the models to work in concert, the parameters, the numerical values that enter them, have to be chosen so that both models are right at the same time. The work is therefore about finding these parameters. In the most recent article in Physical Review Letters, Andersen and his colleagues narrowed down the mathematical region in which matter can be created while remaining consistent with both models, substantially reducing the range of possibilities.
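
In practice, this kind of search amounts to scanning a grid of model parameters and keeping only the points that pass every constraint at once. The skeleton below illustrates that workflow; every parameter name, range, and test in it is a hypothetical placeholder, not the group's actual computation:

```python
# Generic parameter-scan skeleton. All names, ranges, and criteria below are
# hypothetical placeholders, not the published 2HDM analysis.
import itertools

def passes_collider_constraints(m_h2, tan_beta):
    """Placeholder check against accelerator data (hypothetical window)."""
    return 200.0 < m_h2 < 1000.0

def strong_enough_phase_transition(m_h2, tan_beta):
    """Placeholder criterion for producing the baryon asymmetry."""
    return tan_beta < 10.0 and m_h2 < 600.0

masses = range(100, 1100, 100)                 # hypothetical grid, in GeV
tan_betas = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]   # hypothetical grid

viable = [(m, tb) for m, tb in itertools.product(masses, tan_betas)
          if passes_collider_constraints(m, tb)
          and strong_enough_phase_transition(m, tb)]

print(f"{len(viable)} viable points out of {len(masses) * len(tan_betas)}")
```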

“For the new model (2HDM) to match what we already know from CERN, for example, the parameters in the model can’t be just anything. On the other hand, to be able to produce enough baryon asymmetry, the parameters also have to be within a certain range. So that’s why we’re trying to narrow the parameter range. But that’s still a long way off,” says Andersen. In any case, the researchers have made a bit of headway on the road to understanding why we and everything else are here.