New Confirmation That Galactic Cosmic Rays Have Increased in Intensity

Further confirmation supports my research indicating that sources outside our solar system act in sync with our interplanetary cycles. The Sun-Earth connection, with its 11- and 22-year cycles, responds in concert with larger galactic cycles of 500, 1,000, 5,000, 44,000, and 100,000 years (a centrennium), and beyond into megaannus (1,000,000-year) cycles.


Only the most recent research has been able to identify such events, thanks to almost magical new hardware (satellites, telescopes, spacecraft) and, of course, the software that goes with it. You might remember an article I wrote almost two years ago, reporting what my sources directly connected to international space agencies had told me. It went something like this: “New information is coming in so fast, and is challenging our known formulas, templates, equations, etc., that we had to shut it down (figuratively) and begin our unsettling task of creating a new paradigm.”

[Image: galactic cosmic ray chart]

As our brilliant, yet mostly isolated, scientific disciplines have begun to slowly unwind data reaching memory sizes beyond terabytes, beyond petabytes, beyond exabytes, beyond zettabytes, and now filling yottabytes, new insights are slowly untangling. We can now see a direct connection to cyclical patterns far beyond our solar system's borders and into our home galaxy, the Milky Way.

Memory Scale: 1 yottabyte = 1,024 zettabytes = 1,048,576 exabytes = 1,073,741,824 petabytes = 1,099,511,627,776 terabytes = 1,125,899,906,842,624 gigabytes.
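These binary (base-1024) steps can be verified in a few lines of Python; the unit list and helper function here are just for illustration:

```python
# Binary (base-1024) memory scale: each prefix step multiplies by 1024.
PREFIXES = ["gigabytes", "terabytes", "petabytes", "exabytes", "zettabytes", "yottabytes"]

def yottabyte_in(unit: str) -> int:
    """Express 1 yottabyte in a smaller binary unit."""
    steps = PREFIXES.index("yottabytes") - PREFIXES.index(unit)
    return 1024 ** steps

print(yottabyte_in("zettabytes"))  # 1024
print(yottabyte_in("exabytes"))    # 1048576
print(yottabyte_in("terabytes"))   # 1099511627776
print(yottabyte_in("gigabytes"))   # 1125899906842624
```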


New neutron monitor measurements from the University of Oulu Cosmic Ray Station show that the intensification of cosmic rays is making itself felt not only over the poles, but also over lower latitudes, where Earth's magnetic field provides a greater degree of protection against deep space radiation.

Earth’s magnetic field is currently weakening more rapidly than previously believed. Data from the Swarm satellites have shown the field is starting to weaken faster than in the past. Previously, researchers estimated the field was weakening by about 5 percent per century, but the new data revealed the field is actually weakening at 5 percent per decade, or 10 times faster than thought.
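The difference between the two rates is easy to quantify. A quick check (a sketch; the 5 percent figures are the rounded rates quoted above) compounds each rate over one century:

```python
# Compare the two reported decay rates of Earth's magnetic field,
# compounded over one century: the older estimate (~5% loss per
# century) versus the Swarm-era estimate (~5% loss per decade).
def remaining_fraction(loss_per_period: float, periods: float) -> float:
    """Fraction of field strength left after compounding the loss rate."""
    return (1.0 - loss_per_period) ** periods

old_estimate = remaining_fraction(0.05, 1)    # 5% per century, 1 century
new_estimate = remaining_fraction(0.05, 10)   # 5% per decade, 10 decades

print(f"old rate: {old_estimate:.3f} of the field left after 100 yr")  # 0.950
print(f"new rate: {new_estimate:.3f} of the field left after 100 yr")  # 0.599
```

At the faster rate, roughly 40 percent of the field's strength would be lost in a century rather than 5 percent.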

[Image: cosmic ray stream through the solar system]

New Equation:
Increased Charged Particles + Decreased Magnetic Field → Increased Outer Core Convection → Increased Mantle Plumes → Increased Earthquakes and Volcanoes → Cooling of Mantle and Outer Core → Return of Outer Core Convection (Mitch Battros – July 2012)


A recent study using neutron monitor measurements from the University of Oulu Cosmic Ray Station shows that an increased number of cosmic rays are now hitting lower latitudes, likely due to a weakened magnetic field. This is cause for alert, as radiation measurements have increased, which could have a long-lasting effect on airline flight ceilings.

More on this research coming this week…





Scientists Decode Brain Signals Nearly At Speed Of Perception

Using electrodes implanted in the temporal lobes of awake patients, scientists have decoded brain signals at nearly the speed of perception. Further, analysis of patients’ neural responses to two categories of visual stimuli — images of faces and houses — enabled the scientists to subsequently predict which images the patients were viewing, and when, with better than 95 percent accuracy.

The research is published in PLOS Computational Biology.


University of Washington computational neuroscientist Rajesh Rao and UW Medicine neurosurgeon Jeff Ojemann, working with their student Kai Miller and with colleagues in Southern California and New York, conducted the study.

“We were trying to understand, first, how the human brain perceives objects in the temporal lobe, and second, how one could use a computer to extract and predict what someone is seeing in real time?” explained Rao. He is a UW professor of computer science and engineering, and he directs the National Science Foundation’s Center for Sensorimotor Engineering, headquartered at UW.

“Clinically, you could think of our result as a proof of concept toward building a communication mechanism for patients who are paralyzed or have had a stroke and are completely locked-in,” he said.

The study involved seven epilepsy patients receiving care at Harborview Medical Center in Seattle. Each was experiencing epileptic seizures not relieved by medication, Ojemann said, so each had undergone surgery in which their brains’ temporal lobes were implanted — temporarily, for about a week — with electrodes to try to locate the seizures’ focal points.

“They were going to get the electrodes no matter what; we were just giving them additional tasks to do during their hospital stay while they are otherwise just waiting around,” Ojemann said.

Temporal lobes process sensory input and are a common site of epileptic seizures. Situated behind mammals’ eyes and ears, the lobes are also involved in Alzheimer’s and dementias and appear somewhat more vulnerable than other brain structures to head traumas, he said.

In the experiment, the electrodes from multiple temporal-lobe locations were connected to powerful computational software that extracted two characteristic properties of the brain signal: “event-related potentials” and “broadband spectral changes.”

Rao characterized the former as likely arising from “hundreds of thousands of neurons being co-activated when an image is first presented,” and the latter as “continued processing after the initial wave of information.”

The subjects, watching a computer monitor, were shown a random sequence of pictures — brief (400 millisecond) flashes of images of human faces and houses, interspersed with blank gray screens. Their task was to watch for an image of an upside-down house.

“We got different responses from different (electrode) locations; some were sensitive to faces and some were sensitive to houses,” Rao said.

The computational software sampled and digitized the brain signals 1,000 times per second to extract their characteristics. The software also analyzed the data to determine which combination of electrode locations and signal types correlated best with what each subject actually saw.

In that way it yielded highly predictive information.

By training an algorithm on the subjects’ responses to the (known) first two-thirds of the images, the researchers could examine the brain signals representing the final third of the images, whose labels were unknown to them, and predict with 96 percent accuracy whether and when (within 20 milliseconds) the subjects were seeing a house, a face or a gray screen.

This accuracy was attained only when event-related potentials and broadband changes were combined for prediction, which suggests they carry complementary information.
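The study's actual decoding pipeline is far more sophisticated; purely as an illustrative sketch of the train-on-two-thirds, test-on-the-rest scheme, here is a hypothetical nearest-centroid classifier on synthetic two-feature trials (the features standing in for the event-related-potential and broadband measures, the three classes for face, house, and gray screen):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic trials: two features per trial (stand-ins for ERP amplitude
# and broadband power); classes 0/1/2 stand in for face/house/gray.
def make_trials(n_per_class):
    centers = np.array([[1.0, 0.2], [0.2, 1.0], [0.0, 0.0]])
    X = np.concatenate([c + 0.15 * rng.standard_normal((n_per_class, 2)) for c in centers])
    y = np.repeat(np.arange(3), n_per_class)
    idx = rng.permutation(len(y))          # shuffle trial order
    return X[idx], y[idx]

X, y = make_trials(90)
split = (2 * len(y)) // 3                  # train on the first two-thirds
X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

# Nearest-centroid decoding: label each held-out trial by the closest class mean.
centroids = np.stack([X_tr[y_tr == k].mean(axis=0) for k in range(3)])
pred = np.argmin(((X_te[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
accuracy = (pred == y_te).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

On well-separated synthetic features like these, such a classifier decodes nearly every held-out trial; the real work in the study lay in finding which electrode locations and signal types carried that separation.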

“Traditionally scientists have looked at single neurons,” Rao said. “Our study gives a more global picture, at the level of very large networks of neurons, of how a person who is awake and paying attention perceives a complex visual object.”

The scientists’ technique, he said, is a steppingstone for brain mapping, in that it could be used to identify in real time which locations of the brain are sensitive to types of information.

Lead author of the study is Kai Miller, a neurosurgery resident and physicist at Stanford University who obtained his M.D. and Ph.D. at the UW. Other collaborators were Dora Hermes, a Stanford postdoctoral fellow in neuroscience, and Gerwin Schalk, a neuroscientist at the Wadsworth Institute in New York.

“The computational tools that we developed can be applied to studies of motor function, studies of epilepsy, studies of memory. The math behind it, as applied to the biological, is fundamental to learning,” Ojemann said.

Bringing Time And Space Together For Universal Symmetry

New research from Griffith University’s Centre for Quantum Dynamics is broadening perspectives on time and space.


In a paper published in the journal Proceedings of the Royal Society A, Associate Professor Joan Vaccaro challenges the long-held presumption that time evolution — the incessant unfolding of the universe over time — is an elemental part of Nature.

In the paper, entitled Quantum asymmetry between time and space, she suggests there may be a deeper origin due to a difference between the two directions of time: to the future and to the past.

“If you want to know where the universe came from and where it’s going, you need to know about time,” says Associate Professor Vaccaro.

“Experiments on subatomic particles over the past 50 years show that Nature doesn’t treat both directions of time equally.

“In particular, subatomic particles called K and B mesons behave slightly differently depending on the direction of time.

“When this subtle behaviour is included in a model of the universe, what we see is the universe changing from being fixed at one moment in time to continuously evolving.

“In other words, the subtle behaviour appears to be responsible for making the universe move forwards in time.

“Understanding how time evolution comes about in this way opens up a whole new view on the fundamental nature of time itself.

“It may even help us to better understand bizarre ideas such as travelling back in time.”

According to the paper, an asymmetry exists between time and space in the sense that physical systems inevitably evolve over time whereas there is no corresponding ubiquitous translation over space.

This asymmetry, long presumed to be elemental, is represented by equations of motion and conservation laws that operate differently over time and space.

However, Associate Professor Vaccaro used a “sum-over-paths formalism” to demonstrate the possibility of a time and space symmetry, meaning the conventional view of time evolution would need to be revisited.

“In the connection between time and space, space is easier to understand because it’s simply there. But time is forever forcing us towards the future,” says Associate Professor Vaccaro.

“Yet while we are indeed moving forward in time, there is also always some movement backwards, a kind of jiggling effect, and it is this movement I want to measure using these K and B mesons.”

Associate Professor Vaccaro says the research provides a solution to the origin of dynamics, an issue that has long perplexed science.

Monstrous Cloud Boomerangs Back To Our Galaxy

Hubble Space Telescope astronomers are finding that the old adage “what goes up must come down” even applies to an immense cloud of hydrogen gas outside our Milky Way galaxy. The invisible cloud is plummeting toward our galaxy at nearly 700,000 miles per hour.


Though hundreds of enormous, high-velocity gas clouds whiz around the outskirts of our galaxy, this so-called “Smith Cloud” is unique because its trajectory is well known. New Hubble observations suggest it was launched from the outer regions of the galactic disk, around 70 million years ago. The cloud was discovered in the early 1960s by doctoral astronomy student Gail Smith, who detected the radio waves emitted by its hydrogen.

The cloud is on a return collision course and is expected to plow into the Milky Way’s disk in about 30 million years. When it does, astronomers believe it will ignite a spectacular burst of star formation, perhaps providing enough gas to make 2 million suns.

“The cloud is an example of how the galaxy is changing with time,” explained team leader Andrew Fox of the Space Telescope Science Institute in Baltimore, Maryland. “It’s telling us that the Milky Way is a bubbling, very active place where gas can be thrown out of one part of the disk and then return back down into another.”

“Our galaxy is recycling its gas through clouds, the Smith Cloud being one example, and will form stars in different places than before. Hubble’s measurements of the Smith Cloud are helping us to visualize how active the disks of galaxies are,” Fox said.

Astronomers have measured this comet-shaped region of gas to be 11,000 light-years long and 2,500 light-years across. If the cloud could be seen in visible light, it would span the sky with an apparent diameter 30 times greater than the size of the full moon.

Astronomers long thought that the Smith Cloud might be a failed, starless galaxy, or gas falling into the Milky Way from intergalactic space. If either of these scenarios proved true, the cloud would contain mainly hydrogen and helium, not the heavier elements made by stars. But if it came from within the galaxy, it would contain more of the elements found within our sun.

The team used Hubble to measure the Smith Cloud’s chemical composition for the first time, to determine where it came from. They observed the ultraviolet light from the bright cores of three active galaxies that reside billions of light-years beyond the cloud. Using Hubble’s Cosmic Origins Spectrograph, they measured how this light filters through the cloud.

In particular, they looked for sulfur in the cloud which can absorb ultraviolet light. “By measuring sulfur, you can learn how enriched in sulfur atoms the cloud is compared to the sun,” Fox explained. Sulfur is a good gauge of how many heavier elements reside in the cloud.

The astronomers found that the Smith Cloud is as rich in sulfur as the Milky Way’s outer disk, a region about 40,000 light-years from the galaxy’s center (about 15,000 light-years farther out than our sun and solar system). This means that the Smith Cloud was enriched by material from stars. This would not happen if it were pristine hydrogen from outside the galaxy, or if it were the remnant of a failed galaxy devoid of stars. Instead, the cloud appears to have been ejected from within the Milky Way and is now boomeranging back.

Though this settles the mystery of the Smith Cloud’s origin, it raises new questions: How did the cloud get to where it is now? What calamitous event could have catapulted it from the Milky Way’s disk, and how did it remain intact? Could it be a region of dark matter — an invisible form of matter — that passed through the disk and captured Milky Way gas? The answers may be found in future research.

‘Lifespan Machine’ Probes Cause Of Aging

Aging is one of the most mysterious processes in biology. We don’t know, scientifically speaking, what exactly it is. We do know for sure when it ends, but to make matters even more inscrutable, the timing of death is determined by factors that are in many cases statistically random.


Researchers in the lab of Walter Fontana, Harvard Medical School professor of systems biology, have found patterns in this randomness that provide clues into the biological basis of aging.

The research team, led by Novartis Fellow Nicholas Stroustrup, found a surprising statistical regularity in how a variety of genetic and environmental factors affect the life span of the Caenorhabditis elegans worm. Their findings suggest that aging does not have a single discrete molecular cause but is rather a systemic process involving many components within a complex biological network. Perturb any node in the system, and you affect the whole thing.

The study, published Jan. 27 in Nature, offers an alternative to research that seeks to identify a specific master aging mechanism, such as protein homeostasis or DNA damage.

“There are many important molecular changes that occur with age, but it might not make sense to call all of them ’causes of aging,’ per se,” said Stroustrup, first author on the paper.

Off the shelf

In order to study life span dynamics at the population level, Stroustrup constructed the Lifespan Machine, a device comprising 50 off-the-shelf flatbed scanners purchased from an office supplies store. Each scanner has been retooled to record 16 petri dishes every hour, totaling 800 dishes and 30,000 worms. The scanners capture images at 3,200 dots per inch, which is a resolution high enough to detect movements of eight micrometers, or about 12 percent of the width of an average worm.
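The resolution arithmetic can be checked directly: at 3,200 dots per inch, one pixel covers about 8 micrometers, and the 12 percent figure implies an average worm width of roughly 66 micrometers (a back-of-the-envelope sketch using the numbers quoted above):

```python
# Verify the scanner-resolution arithmetic: 3,200 dots per inch,
# with 1 inch = 25,400 micrometers.
MICRONS_PER_INCH = 25_400
dpi = 3_200

pixel_um = MICRONS_PER_INCH / dpi
print(f"pixel size: {pixel_um:.2f} um")               # ~7.94 um, i.e. ~8 um

# If ~8 um is ~12% of an average worm's width, the implied width is:
worm_width_um = pixel_um / 0.12
print(f"implied worm width: {worm_width_um:.0f} um")  # ~66 um
```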

Stroustrup subjected the worms to interventions as diverse as temperature changes, oxidative stress, changes in diet and genetic manipulations that altered, for example, insulin growth factor signaling. The Lifespan Machine recorded how long it took the worms to die under each condition. Stroustrup then aggregated the data, generated life span distribution curves for each intervention and compared results.

The life span distributions provided considerably more information than just changes in average life span. The research team measured variations arising in ostensibly identical individuals, looking at how many worms died young versus how many made it to old age under each condition. This comprehensive view was important for capturing the dynamics and randomness in the aging process.

Clear as a bell curve

In one sense, the findings were not surprising: different circumstances produced different life spans. Turning up the heat caused the worms to die quickly, and turning it up higher only increased that rate. Pictured as bell-shaped distributions, certain interventions produced a thinner, high-peaked bell, while others resulted in a lower, more drawn-out bell.

Despite these obvious differences, the researchers found an unexpected uniformity among the curves, observing what statisticians call “temporal scaling.” Stated for the rest of us, if you were to take all of the bell-shaped curves and expand or contract them along the X-axes (which in this study represented time), they would become statistically indistinguishable. Simply compressing the protracted bell would produce a high-peaked bell, or vice versa. The two bells have, in a rigorous sense, the same shape.
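Temporal scaling can be illustrated with synthetic data: two bell-shaped "life span" samples that differ only by a stretch of the time axis collapse onto the same curve once each is divided by its own mean. A minimal sketch (the gamma distributions here are an assumption for illustration, not the study's model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic life span samples whose distributions differ only by a
# stretch of the time axis: same shape, time scales differing by 3x.
short_lived = rng.gamma(shape=5.0, scale=1.0, size=5000)   # high, narrow bell
long_lived = rng.gamma(shape=5.0, scale=3.0, size=5000)    # drawn-out bell

def rescale(x):
    """Divide each life span by the population mean, removing the time scale."""
    return np.sort(x / x.mean())

# Kolmogorov-Smirnov-style distance between the two rescaled samples:
# the maximum gap between their empirical CDFs on a common grid.
a, b = rescale(short_lived), rescale(long_lived)
grid = np.linspace(0, 3, 300)

def cdf(sample):
    return np.searchsorted(sample, grid) / len(sample)

ks_distance = np.abs(cdf(a) - cdf(b)).max()
print(f"KS distance after rescaling: {ks_distance:.3f}")  # small => same shape
```

After rescaling, the two samples are statistically indistinguishable, which is the "same shape, different stretch" property the researchers observed across interventions.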

The various interventions seemed to affect the duration of life in the same way across all individuals in the same population, regardless of whether chance or randomness had a short or long life in store for them. No matter which genetic process or environmental factor the researchers targeted, all molecular causes of death seemed to be affected at once and to the same extent.

“Life span is a whole-organism property,” said Fontana, “and it is profoundly difficult to study it molecularly in real time. But by discovering this kind of statistical regularity about the endpoint of aging, we have learned something about the aging process that determines that endpoint.”

Most important, said Fontana, this regularity suggests that there is profound interdependence in the physiology of an organism, and changes in one physiological aspect affect all others to determine life span.

The researchers believe that their discovery will influence how scientists study, and even define, aging — for people as well as worms. The researchers now plan to study in more detail how broad statistical regularities can emerge from the action of diverse molecular mechanisms, seeking to determine exactly how alteration of one mechanism can affect all others.

Explosive Underwater Volcanoes Were A Major Feature Of ‘Snowball Earth’

Around 720-640 million years ago, much of the Earth’s surface was covered in ice during a glaciation that lasted millions of years. Explosive underwater volcanoes were a major feature of this ‘Snowball Earth’, according to new research led by the University of Southampton.


Many aspects of this extreme glaciation remain uncertain, but it is widely thought that the breakup of the supercontinent Rodinia resulted in increased river discharge into the ocean. This changed ocean chemistry and reduced atmospheric CO2 levels, which increased global ice coverage and propelled Earth into severe icehouse conditions.

Because the land surface was then largely covered in ice, continental weathering effectively ceased. This locked the planet into a ‘Snowball Earth’ state until carbon dioxide released from ongoing volcanic activity warmed the atmosphere sufficiently to rapidly melt the ice cover. This model does not, however, explain one of the most puzzling features of this rapid deglaciation; namely the global formation of hundreds of metres thick deposits known as ‘cap carbonates’, in warm waters after Snowball Earth events.

The Southampton-led research, published in Nature Geoscience, now offers an explanation for these major changes in ocean chemistry.

Lead author of the study Dr Tom Gernon, Lecturer in Earth Science at the University of Southampton, said: “When volcanic material is deposited in the oceans it undergoes very rapid and profound chemical alteration that impacts the biogeochemistry of the oceans. We find that many geological and geochemical phenomena associated with Snowball Earth are consistent with extensive submarine volcanism along shallow mid-ocean ridges.”

During the breakup of Rodinia, tens of thousands of kilometres of mid-ocean ridge were formed over tens of millions of years. The lava erupted explosively in shallow waters producing large volumes of a glassy pyroclastic rock called hyaloclastite. As these deposits piled up on the sea floor, rapid chemical changes released massive amounts of calcium, magnesium and phosphorus into the ocean.

Dr Gernon explained: “We calculated that, over the course of a Snowball glaciation, this chemical build-up is sufficient to explain the thick cap carbonates formed at the end of the Snowball event.

“This process also helps explain the unusually high oceanic phosphorus levels, thought to be the catalyst for the origin of animal life on Earth.”