Scientists Close In on the True Mass of the Milky Way

It’s a problem of galactic complexity, but researchers are getting closer to accurately measuring the mass of the Milky Way Galaxy.

In the latest of a series of papers that could have broader implications for the field of astronomy, McMaster astrophysicist Gwendolyn Eadie, working with her PhD supervisor William Harris and with a Queen’s University statistician, Aaron Springford, has refined Eadie and Harris’s own method for measuring the mass of the galaxy that is home to our solar system.

The short answer, using the refined method, is between 4.0 × 10^11 and 5.8 × 10^11 solar masses. In simpler terms, that’s about the mass of our Sun, multiplied by 400 to 580 billion. The Sun, for the record, has a mass of two nonillion (that’s 2 followed by 30 zeroes) kilograms, or 330,000 times the mass of Earth. This Galactic mass estimate includes matter out to 125 kiloparsecs from the center of the Galaxy (125 kiloparsecs is almost 4 × 10^18 kilometers). When the mass estimate is extended out to 300 kpc, the mass is approximately 9 × 10^11 solar masses.
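To make those orders of magnitude concrete, here is a quick back-of-the-envelope check of the quoted figures in everyday units (a minimal sketch using standard conversion constants; it simply restates the numbers above and is not part of the study):

```python
# Sanity check of the figures quoted above, using standard conversion constants.
M_SUN_KG = 1.989e30      # mass of the Sun in kilograms (~2 x 10^30)
KPC_IN_KM = 3.086e16     # one kiloparsec in kilometers

m_low, m_high = 4.0e11, 5.8e11                      # mass estimate in solar masses
print(f"lower bound: {m_low * M_SUN_KG:.1e} kg")    # ~8.0e41 kg
print(f"upper bound: {m_high * M_SUN_KG:.1e} kg")   # ~1.2e42 kg

r_kpc = 125.0                                       # radius the estimate covers
print(f"125 kpc = {r_kpc * KPC_IN_KM:.1e} km")      # ~3.9e18 km, i.e. almost 4e18
```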

Measuring the mass of our home galaxy, or any galaxy, is particularly difficult. A galaxy includes not just stars, planets, moons, gases, dust and other objects and material, but also a big helping of dark matter, a mysterious and invisible form of matter that is not yet fully understood and has not been directly detected in the lab.
Astronomers and cosmologists, however, can infer the presence of dark matter through its gravitational influence on visible objects.

Eadie, a PhD candidate in Physics and Astronomy at McMaster University, has been studying the mass of the Milky Way and its dark-matter component since she started graduate school. She uses the velocities and positions of globular star clusters that orbit the Milky Way. The orbits of globular clusters are determined by the galaxy’s gravity, which is dictated by its massive dark matter component.
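The underlying physics can be illustrated with a much cruder estimator than the one the researchers actually use: a tracer on a roughly circular orbit of radius r moving at speed v implies an enclosed mass of about v²r/G. The sketch below shows only that toy relation, with hypothetical numbers; it is not the team’s Bayesian method.

```python
# Toy enclosed-mass estimator for a circular orbit: M ~ v^2 * r / G.
# This is NOT the hierarchical Bayesian analysis described in the article;
# it only shows why cluster positions and velocities constrain the Galaxy's mass.
G = 4.30e-6  # gravitational constant in kpc * (km/s)^2 per solar mass

def enclosed_mass(v_kms: float, r_kpc: float) -> float:
    """Rough mass (in solar masses) enclosed within radius r_kpc."""
    return v_kms ** 2 * r_kpc / G

# Hypothetical cluster: 150 km/s at 100 kpc from the Galactic center.
print(f"{enclosed_mass(150.0, 100.0):.1e} solar masses")  # ~5e11
```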

Previously, Eadie had developed a technique for using globular cluster (GC) velocities even when the data were incomplete.

The total velocity of a GC must be measured in two directions: one along our line-of-sight, and one across the plane of the sky, called the proper motion. Researchers have not yet measured the proper motions of all the GCs around the Milky Way. Eadie, however, had previously developed a way to use these velocities that are only partially known, in addition to the velocities that are fully known, to estimate the mass of the galaxy.
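In practice, the two components are combined in quadrature, with the proper motion converted to a tangential speed using the cluster’s distance. Below is a minimal sketch of that standard kinematic bookkeeping (the factor 4.74 converts milliarcseconds per year times kiloparsecs into kilometers per second; the example numbers are hypothetical):

```python
import math

# Combine a line-of-sight velocity with the tangential velocity implied by a
# proper motion. Standard kinematics, not specific to this study.
def total_velocity(v_los_kms: float, pm_mas_per_yr: float, dist_kpc: float) -> float:
    v_tan = 4.74 * pm_mas_per_yr * dist_kpc  # tangential speed in km/s
    return math.hypot(v_los_kms, v_tan)      # quadrature sum of the two components

# Hypothetical globular cluster: 120 km/s along the line of sight,
# proper motion of 2 mas/yr at a distance of 10 kpc.
print(f"{total_velocity(120.0, 2.0, 10.0):.0f} km/s")  # ~153 km/s
```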

Now, Eadie has used a statistical method called a hierarchical Bayesian analysis that includes not only complete and incomplete data, but also incorporates measurement uncertainties in an extremely complex but more complete statistical formula. To make the newest calculation, the authors took into account that data are merely measurements of the positions and velocities of the globular clusters and not necessarily the true values. They now treat the true positions and velocities as parameters in the model (which meant adding 572 new parameters to the existing method).
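The structure of such a model can be sketched in miniature: each observed velocity is treated as a noisy measurement of an unknown true velocity, the true velocities are drawn from a population whose spread stands in for the physical parameter of interest, and the measurement errors enter the likelihood explicitly. The toy example below (plain random-walk Metropolis, one hyper-parameter, 20 latent velocities) only illustrates that layering; the actual analysis has 572 latent parameters, a physically motivated mass model, and a far more careful sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 "true" velocities drawn from a population of width 150 km/s,
# each observed with a 30 km/s measurement error.
sigma_true = 150.0
v_true = rng.normal(0.0, sigma_true, size=20)
meas_err = np.full(20, 30.0)
v_obs = v_true + rng.normal(0.0, meas_err)

def log_posterior(params):
    """params = [sigma, v_1, ..., v_20]: hyper-parameter plus latent true velocities."""
    sigma, v = params[0], params[1:]
    if sigma <= 0:
        return -np.inf
    log_prior = -np.log(sigma)                                   # weak 1/sigma prior
    log_pop = np.sum(-0.5 * (v / sigma) ** 2 - np.log(sigma))    # population level
    log_like = np.sum(-0.5 * ((v_obs - v) / meas_err) ** 2)      # measurement level
    return log_prior + log_pop + log_like

# Random-walk Metropolis over all 21 parameters.
params = np.concatenate([[100.0], v_obs])
sigma_samples = []
for _ in range(20000):
    proposal = params + rng.normal(0.0, 5.0, size=params.size)
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(params):
        params = proposal
    sigma_samples.append(params[0])
print(f"posterior width ~ {np.mean(sigma_samples[5000:]):.0f} km/s (true value: 150)")
```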

Bayesian statistical methods are not new, but their application to astronomy is still in its early stages, and Eadie believes their capacity to accommodate uncertainty while still producing meaningful results opens many new opportunities in the field.

“As the era of Big Data approaches, I think it is important that we think carefully about the statistical methods we use in data analysis, especially in astronomy, where the data may be incomplete and have varying degrees of uncertainty,” she says.

Bayesian hierarchies have been useful in other fields but are just starting to be applied in astronomy, Eadie explained.

Team of Researchers Catalogs Tens of Thousands of Galaxies Beyond Our Milky Way

A team of researchers has compiled a special catalog to help astronomers figure out the true distances to tens of thousands of galaxies beyond our own Milky Way.

The catalog, called NED-D, is a critical resource, not only for studying these galaxies, but also for determining the distances to billions of other galaxies strewn throughout the universe. As the catalog continues to grow, astronomers can increasingly rely on it for ever-greater precision in calculating both how big the universe is and how fast it is expanding. NED-D is part of the NASA/IPAC Extragalactic Database (NED), an online repository containing information on more than 100 million galaxies.

“We’re thrilled to present this catalog of distances to galaxies as a valuable resource to the astronomical community,” said Ian Steer, NED team member, curator of NED-D, and lead author of a new report about the database appearing in The Astronomical Journal. “Learning a cosmic object’s distance is key to understanding its properties.”

Steer and colleagues presented the paper this week at the 229th meeting of the American Astronomical Society in Grapevine, Texas.

Since other galaxies are extremely far away, there’s no tape measure long enough to measure their distances from us. Instead, astronomers rely on extremely bright objects, such as Type Ia supernovae and pulsating stars called Cepheid variables, as indicators of distance. To calculate how far away a distant galaxy is, scientists use known mathematical relationships between distance and other properties of objects, such as their total emitted energy. More objects useful for these calculations have emerged in recent years. NED-D has revealed that there are now more than six dozen different indicators used to estimate such distances.
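The best-known version of that reasoning is the distance modulus for “standard candles”: if an object’s intrinsic brightness (absolute magnitude) is known, its observed brightness (apparent magnitude) fixes its distance. A minimal sketch, with a hypothetical supernova as the example:

```python
# Distance modulus: m - M = 5*log10(d_pc) - 5, so d_pc = 10**((m - M + 5) / 5).
def distance_parsecs(apparent_mag: float, absolute_mag: float) -> float:
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Hypothetical Type Ia supernova (absolute magnitude ~ -19.3) observed at m = 15.7.
d_pc = distance_parsecs(15.7, -19.3)
print(f"{d_pc / 1e6:.0f} Mpc")  # ~100 megaparsecs
```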

NED-D began as a small database pulled together in 2005 by Steer. He began serving at NED the following year to build out the database, poring over the scores of astronomical studies posted online daily, identifying newly calculated distance estimates as well as fresh analyses of older data.

From its humble origins a little over a decade ago, NED-D now hosts upwards of 166,000 distance estimates for more than 77,000 galaxies, along with estimates for some ultra-distant supernovae and energetic gamma ray bursts. To date, NED-D has been cited by researchers in hundreds of studies.

Besides providing a one-stop tabulation of the ever-increasing distance estimates published in the astronomical literature, NED-D—as well as the broader NED—can serve as “discovery engines.” By pooling tremendous amounts of searchable data, the information repositories can allow scientists to identify novel, exotic phenomena that otherwise would get lost in a deluge of observations. An example is the discovery of “super luminous” spiral galaxies by NED team members, reported last year, which were identified among nearly a million individual galaxies in the NED database.

“NED and its associated databases, including NED-D, are in the process of transforming from data look-up services to legitimate discovery engines for science,” said Steer. “Using NED today, astronomers can sift through mountains of ‘big data’ and discover additional new and amazing perspectives on our universe.”

Number of Known Black Holes Expected to Double in Two Years

Researchers from the University of Waterloo have developed a method that will detect roughly 10 black holes per year – enough to double the number currently known within two years – and that will likely unlock the history of black holes in a little more than a decade.

Avery Broderick, a professor in the Department of Physics and Astronomy at the University of Waterloo, and Mansour Karami, a PhD student also from the Faculty of Science, worked with colleagues in the United States and Iran to come up with the method that has implications for the emerging field of gravitational wave astronomy and the way in which we search for black holes and other dark objects in space. It was published this week in The Astrophysical Journal.

“Within the next 10 years, there will be sufficient accumulated data on enough black holes that researchers can statistically analyze their properties as a population,” said Broderick, also an associate faculty member at the Perimeter Institute for Theoretical Physics. “This information will allow us to study stellar mass black holes at various stages that often extend billions of years.”

Black holes absorb all light and matter and emit zero radiation, making them impossible to image, let alone detect against the black background of space. Although very little is known about the inner workings of black holes, we do know they play an integral part in the lifecycle of stars and regulate the growth of galaxies. The first direct proof of their existence was announced earlier this year by the Laser Interferometer Gravitational-Wave Observatory (LIGO) when it detected gravitational waves from the collision of two black holes merging into one.

“We don’t yet know how rare these events are and how many black holes are generally distributed across the galaxy,” said Broderick. “For the first time we’ll be placing all the amazing dynamical physics that LIGO sees into a larger astronomical context.”

Broderick and his colleagues propose a bolder approach to detecting and studying black holes, not as single entities, but in large numbers as a system by combining two standard astrophysical tools in use today: microlensing and radio wave interferometry.

Gravitational microlensing occurs when a dark object such as a black hole passes between us and another light source, such as a star. The star’s light bends around the object’s gravitational field to reach Earth, making the background star appear much brighter, not darker as in an eclipse. Even the largest telescopes that observe microlensing events in visible light have a limited resolution, telling astronomers very little about the object that passed by. Instead of using visible light, Broderick and his team propose using radio waves to take multiple snapshots of the microlensing event in real time.
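For a point lens and a point source, the brightening follows a textbook relation that depends only on the lens-source separation in units of the Einstein radius. The sketch below is that standard relation, not the team’s proposed radio-interferometric analysis:

```python
import math

# Point-source, point-lens microlensing magnification.
# u is the lens-source separation in units of the Einstein radius.
def magnification(u: float) -> float:
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

for u in (1.0, 0.5, 0.1):
    print(f"u = {u}: background star appears {magnification(u):.1f}x brighter")
```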

“When you look at the same event using a radio telescope – interferometry – you can actually resolve more than one image. That’s what gives us the power to extract all kinds of parameters, like the object’s mass, distance and velocity,” said Karami, a doctoral student in astrophysics at Waterloo.

Taking a series of radio images over time and turning them into a movie of the event will allow them to extract another level of information about the black hole itself.

Cosmic Dust Found in City Rooftop Gutters

A small team of researchers from Imperial College London, the Natural History Museum in London, Project Stardust in Norway and Université Libre de Bruxelles in Belgium has found samples of cosmic dust in the gutters of buildings in three major cities. In their paper published in the journal Geology, the team describes how they found cosmic dust particles in the samples, what the particles look like, and what they may reveal about the origins of the solar system.

Until now, researchers looking for space dust have usually had to travel to the Antarctic – it was thought that the tiny particles, believed to be leftover remnants of the formation of the solar system, would be too difficult to find in places with a proliferation of other dust types, particularly areas where people live. John Larson, an amateur space scientist with Project Stardust, approached researchers at Imperial College, suggesting that space dust might be found on rooftops. The team traveled to Oslo, Berlin and Paris and obtained 300 kilograms of dirt samples from rain gutters on rooftops. Back in the lab, they used magnets to pull possible cosmic dust grains from the muck, and they report finding and identifying approximately 500 particles.

The team also reports that the dust grains they found were larger than those typically found in Antarctica – approximately 0.3 millimeters across, as opposed to the customary average of 0.01 millimeters. They also noted that the grains had fewer feather-like crystals than those found in Antarctica. They suggest the differences are likely due to age – the Antarctic grains are typically much older, which means the planets would have been positioned slightly differently when those grains fell to Earth.

Those differences indicate that dust particles falling through the atmosphere in more recent times would have been traveling much faster, owing to a difference in trajectory – up to 12 kilometers per second, the fastest ever recorded for space dust. Such differences may illuminate the movement of the planets relative to one another over time, helping researchers understand the history of the solar system.

UPDATE: New Study Suggests Cosmic Ray Origins Now Include ‘Dark Matter’

Observing the constant rain of cosmic rays hitting Earth can provide information on the “magnetic weather” in other parts of the Galaxy. A new high-precision measurement of two cosmic-ray elements, boron and carbon, supports a specific model of the magnetic turbulence that deflects cosmic rays on their journey through the Galaxy.

The data, which come from the Alpha Magnetic Spectrometer (AMS) aboard the International Space Station, appear to rule out alternative models for cosmic-ray propagation, strengthening the case for a new primary cosmic-ray source that emits positrons – with pulsars and dark matter among the candidates.

The majority of cosmic rays are particles or nuclei produced in supernovae or other astrophysical sources. However, as these so-called primary cosmic rays travel through the Galaxy to Earth, they collide with gas atoms in the interstellar medium. The collisions produce a secondary class of cosmic rays with masses and energies that differ from primary cosmic rays.

To investigate the relationship of the two classes, astrophysicists often look at the ratio of the number of detections of two nuclei, such as boron and carbon. For the most part, carbon cosmic rays have a primary origin, whereas boron is almost exclusively created in secondary processes. A relatively high boron-to-carbon (B/C) ratio in a certain energy range implies that the relevant cosmic rays are traversing a lot of gas before reaching us. “The B/C ratio tells you how cosmic rays propagate through space,” says AMS principal investigator Samuel Ting of MIT.

Previous measurements of the B/C ratio have had large errors of 15% or more, especially at high energy, mainly because of the brief data collection time available for balloon-based detectors. But the AMS has been operating on the Space Station for five years, and over this time it has collected more than 80 billion cosmic rays. The AMS detectors measure the charges of these cosmic rays, allowing the elements to be identified. The collaboration has detected over ten million carbon and boron nuclei, with energies per nucleon ranging from a few hundred MeV up to a few TeV.

The B/C ratio decreases with energy because higher-energy cosmic rays tend to take a more direct path to us (and therefore experience fewer collisions producing boron). By contrast, lower-energy cosmic rays are diverted more strongly by magnetic fields, so they bounce around like pinballs among magnetic turbulence regions in the Galaxy. Several theories have been proposed to describe the size and spacing of these turbulent regions, and these theories lead to predictions for the energy dependence of the B/C ratio. However, previous B/C observations have not been precise enough to favor one theory over another. The AMS data show very clearly that the B/C ratio is proportional to the energy raised to the -1/3 power. This result matches a prediction based on a theory of magnetohydrodynamics developed in 1941 by the Russian mathematician Andrey Kolmogorov.
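The -1/3 index is just the slope of a straight line when the B/C ratio is plotted against energy on logarithmic axes. Below is a minimal sketch of how such an index is extracted, using synthetic stand-in data rather than the AMS measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic B/C data following the assumed E^(-1/3) behavior, with a few
# percent of scatter. These are stand-in values, not AMS measurements.
energy = np.logspace(1, 3, 30)                          # arbitrary energy grid
bc_obs = 0.3 * energy ** (-1.0 / 3.0) * (1.0 + rng.normal(0.0, 0.03, 30))

# Fit a straight line in log-log space; the slope is the spectral index.
slope, intercept = np.polyfit(np.log10(energy), np.log10(bc_obs), 1)
print(f"fitted spectral index: {slope:.2f} (Kolmogorov prediction: -1/3)")
```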

These results conflict with models that predict that the B/C ratio should exhibit some more complex energy dependence, such as kinks in the B/C spectrum at specific energies. Theorists proposed these models to explain anomalous observations – by AMS and other experiments – that showed an increase in the number of positrons (anti-electrons) reaching Earth relative to electrons at high energy. The idea was that these “excess” positrons are – like boron – produced in collisions between cosmic rays and interstellar gas. But such a scenario would require that cosmic rays encounter additional scattering sites, not just magnetically turbulent regions. By ruling out these models, the AMS results support the alternative explanation – a new primary cosmic ray source that emits positrons. Candidates include pulsars and dark matter, but a lot of mystery still surrounds the unexplained positron data.

Igor Moskalenko from Stanford University is very surprised at the close match between the data and the Kolmogorov model. He expected that the ratio would deviate from a single power law in a way that might provide clues to the origin of the excess positrons. “This is a dramatic result that should lead to much better understanding of interstellar magnetohydrodynamic turbulence and propagation of cosmic rays,” he says. “On the other hand, it is very much unexpected in that it makes recent discoveries in astrophysics of cosmic rays even more puzzling.”

Violent Collisions Behind Superluminous Supernovae

In a unique study, an international team of researchers, including members from the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU), simulated the violent collisions between supernova ejecta and the surrounding gas that is expelled before the explosion – collisions that give off extreme brightness.

Many supernovae discovered in the last decade have peak luminosities one to two orders of magnitude higher than those of normal supernovae of known types. These stellar explosions are called Superluminous Supernovae (SLSNe).

Some of them show hydrogen in their spectra, while others lack it; the latter are called Type I, or hydrogen-poor, SLSNe-I. SLSNe-I challenge the theory of stellar evolution, since even normal supernovae are not yet completely understood from first principles.

The team – led by Sternberg Astronomical Institute researcher Elena Sorokina, a guest investigator at Kavli IPMU, together with Kavli IPMU Principal Investigator Ken’ichi Nomoto, Scientific Associate Sergei Blinnikov and Project Researcher Alexey Tolstov – developed a model that can explain a wide range of observed SLSNe-I light curves in a scenario that requires much less energy than other proposed models.

The models with the minimum energy budget involve multiple ejections of mass from the presupernova star. Mass loss and the buildup of envelopes around massive stars are generic features of stellar evolution. Normally, those envelopes are rather dilute and do not significantly change the light produced by the majority of supernovae.

In some cases, large amounts of mass are expelled just a few years before the final explosion, so the “clouds” around such supernovae may be quite dense. The shock waves produced when the supernova ejecta collide with those dense shells can supply enough radiated power to make the supernova much brighter than a “naked” supernova without pre-ejected surrounding material.
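How much light such a collision can supply is often gauged with a standard order-of-magnitude estimate: the luminosity is roughly the kinetic-energy flux of the ejecta sweeping up the shell, L ~ 0.5 (Mdot / v_wind) v_shock³, for a shell built by pre-explosion mass loss at rate Mdot with wind speed v_wind. The numbers below are hypothetical, and this back-of-the-envelope figure is not the team’s radiation-hydrodynamics modeling:

```python
# Order-of-magnitude luminosity from ejecta-shell interaction (hypothetical numbers).
M_SUN_G = 1.989e33   # solar mass in grams
YEAR_S = 3.156e7     # one year in seconds

mdot = 0.1 * M_SUN_G / YEAR_S   # assumed mass-loss rate: 0.1 solar masses/year, in g/s
v_wind = 1.0e7                  # assumed wind speed: 100 km/s, in cm/s
v_shock = 1.0e9                 # assumed shock speed: 10,000 km/s, in cm/s
efficiency = 0.3                # assumed fraction of kinetic energy radiated

luminosity = efficiency * 0.5 * (mdot / v_wind) * v_shock ** 3
print(f"L ~ {luminosity:.1e} erg/s")  # ~1e44 erg/s, in the superluminous range
```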

This class of models is referred to as “interacting” supernovae. The authors show that the interacting scenario can explain both fast- and slowly-fading SLSNe-I, so this wide range of intriguingly bright objects may in reality be almost ordinary supernovae placed in extraordinary surroundings.

Another extraordinary feature is the chemical composition expected for the circumstellar “clouds.” Normally, a stellar wind consists mostly of hydrogen, because the thermonuclear reactions happen in the center of a star while the outer layers remain hydrogen-rich.

In the case of SLSNe-I, the situation must be different. The progenitor star must lose its hydrogen and a large part of its helium well before the explosion, so that a few months to a few years before the explosion it ejects mostly carbon and oxygen, and then explodes inside that dense carbon-oxygen (CO) cloud. Only this composition can explain the spectral and photometric features of the observed hydrogen-poor SLSNe in the interacting scenario.

It is a challenge for stellar evolution theory to explain the origin of such hydrogen- and helium-poor progenitors and the very intensive mass loss of CO material just before the final explosion of the star. These results have been published in a paper accepted by The Astrophysical Journal.