CERN Scientists Say The LHC Has Confirmed Two New Particles, And Possibly Discovered a Third


They are known as bottom baryons.

The Large Hadron Collider is at it again, showing us new wonders in the world of particle physics. Scientists working on the Large Hadron Collider beauty (LHCb) collaboration have observed two new particles that have never been seen before – and seen evidence of a third.


The two new particles, predicted by the standard quark model, are baryons – the same family of particles as the protons used in LHC particle acceleration experiments.

Baryons make up most of the visible matter in the Universe, including protons and neutrons. They are composite particles consisting of three fundamental particles called quarks, which come in six ‘flavours’, or types: up, down, charm, strange, top, and bottom.

Protons consist of two up quarks and one down quark, while neutrons consist of one up quark and two down quarks, for instance. But the two new particles discovered have a slightly different composition.

Named Σb(6097)+ and Σb(6097)−, they consist of two up quarks and one bottom quark, and two down quarks and one bottom quark, respectively.

These particles are known as bottom baryons, and they are related to four particles previously observed at Fermilab. However, the new observations mark the first time scientists have detected these higher-mass counterparts; they are about six times more massive than a proton.
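The quark recipes above fix each baryon's electric charge, since the constituent quark charges simply add. A minimal sketch of that bookkeeping (the quark charge table is standard physics; the code itself is purely illustrative):

```python
from fractions import Fraction

# Electric charge of each quark flavour, in units of the elementary charge e.
QUARK_CHARGE = {
    "up": Fraction(2, 3), "charm": Fraction(2, 3), "top": Fraction(2, 3),
    "down": Fraction(-1, 3), "strange": Fraction(-1, 3), "bottom": Fraction(-1, 3),
}

# Quark content of the baryons discussed in the article.
BARYONS = {
    "proton": ("up", "up", "down"),
    "neutron": ("up", "down", "down"),
    "Sigma_b(6097)+": ("up", "up", "bottom"),
    "Sigma_b(6097)-": ("down", "down", "bottom"),
}

def charge(quarks):
    """Sum the constituent quark charges; baryon charges always come out whole."""
    return sum(QUARK_CHARGE[q] for q in quarks)

for name, quarks in BARYONS.items():
    print(f"{name}: charge {int(charge(quarks)):+d} e")
```

Running this recovers the familiar charges: +1 for the proton, 0 for the neutron, and +1 and −1 for the two new bottom baryons, matching the superscripts in their names.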

So what’s the third particle candidate we mentioned earlier?

The researchers think it might be a strange type of composite particle called a tetraquark. Tetraquarks are an exotic kind of meson; whereas a normal meson consists of a quark and an antiquark, a tetraquark is composed of four quarks – two quarks and two antiquarks, to be more accurate.

Observational evidence of tetraquarks has been pretty elusive to date, and that is also the case here. Evidence of the candidate particle, called Zc(4100) and thought to contain a charm quark and a charm antiquark, was detected in the decay of heavier B mesons.

But the detection had a significance of just over 3 standard deviations, while the usual threshold to claim the discovery of a new particle is 5 standard deviations. It will take future observations to either confirm or rule out the existence of Zc(4100).

The new bottom baryons, you’ll be pleased to know, blew that threshold out of the water: Σb(6097)+ and Σb(6097)− had significances of 12.7 and 12.6 standard deviations respectively.
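The significance figures quoted here can be read as the probability that background noise alone would fluctuate that far. A small sketch of the conventional one-sided conversion from standard deviations to a p-value:

```python
import math

def sigma_to_pvalue(n_sigma: float) -> float:
    """One-sided tail probability of a standard normal beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# 3 sigma: roughly 1-in-740 odds of a background fluke; 5 sigma: ~1 in 3.5 million.
for s in (3.0, 5.0, 12.7):
    print(f"{s:>5} sigma -> p = {sigma_to_pvalue(s):.3g}")
```

At 12.7 standard deviations the probability of a fluke is so vanishingly small that the discovery claim is effectively beyond doubt.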


Particle Physicists Turn to AI to Cope with CERN’s Collision Deluge


Can a competition with cash rewards improve techniques for tracking the Large Hadron Collider’s messy particle trajectories?

A visualization of complex sprays of subatomic particles, produced from colliding proton beams in CERN’s CMS detector at the Large Hadron Collider near Geneva, Switzerland in mid-April of 2018. Credit: CERN

Physicists at the world’s leading atom smasher are calling for help. In the next decade, they plan to produce up to 20 times more particle collisions in the Large Hadron Collider (LHC) than they do now, but current detector systems aren’t fit for the coming deluge. So this week, a group of LHC physicists has teamed up with computer scientists to launch a competition to spur the development of artificial-intelligence techniques that can quickly sort through the debris of these collisions. Researchers hope these will help the experiment’s ultimate goal of revealing fundamental insights into the laws of nature.

At the LHC at CERN, Europe’s particle-physics laboratory near Geneva, two bunches of protons collide head-on inside each of the machine’s detectors 40 million times a second. Every proton collision can produce thousands of new particles, which radiate from a collision point at the centre of each cathedral-sized detector. Millions of silicon sensors are arranged in onion-like layers and light up each time a particle crosses them, producing one pixel of information every time. Collisions are recorded only when they produce potentially interesting by-products. When they are, the detector takes a snapshot that might include hundreds of thousands of pixels from the piled-up debris of up to 20 different pairs of protons. (Because particles move at or close to the speed of light, a detector cannot record a full movie of their motion.)
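The figures in this paragraph hint at why the detectors cannot simply record everything. A back-of-envelope estimate, assuming illustrative (not official) values for snapshot size and pixel encoding:

```python
# Back-of-envelope estimate of why triggering/filtering is essential.
# Only the 40 MHz bunch-crossing rate comes from the article; the
# per-snapshot figures below are rough illustrative assumptions.
BUNCH_CROSSING_RATE_HZ = 40e6   # 40 million crossings per second
PIXELS_PER_SNAPSHOT = 200_000   # "hundreds of thousands of pixels" (assumed)
BYTES_PER_PIXEL = 4             # assumed encoding

raw_rate_bytes = BUNCH_CROSSING_RATE_HZ * PIXELS_PER_SNAPSHOT * BYTES_PER_PIXEL
print(f"Unfiltered data rate: {raw_rate_bytes / 1e12:.0f} TB/s")
```

Even with these rough inputs, the unfiltered stream would be tens of terabytes per second, which is why collisions are recorded only when they look potentially interesting.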

From this mess, the LHC’s computers reconstruct tens of thousands of tracks in real time, before moving on to the next snapshot. “The name of the game is connecting the dots,” says Jean-Roch Vlimant, a physicist at the California Institute of Technology in Pasadena who is a member of the collaboration that operates the CMS detector at the LHC.

After future planned upgrades, each snapshot is expected to include particle debris from 200 proton collisions. Physicists currently use pattern-recognition algorithms to reconstruct the particles’ tracks. Although these techniques would be able to work out the paths even after the upgrades, “the problem is, they are too slow”, says Cécile Germain, a computer scientist at the University of Paris South in Orsay. Without major investment in new detector technologies, LHC physicists estimate that the collision rates will exceed the current capabilities by at least a factor of 10.

Researchers suspect that machine-learning algorithms could reconstruct the tracks much more quickly. To help find the best solution, Vlimant and other LHC physicists teamed up with computer scientists including Germain to launch the TrackML challenge. For the next three months, data scientists will be able to download 400 gigabytes of simulated particle-collision data—the pixels produced by an idealized detector—and train their algorithms to reconstruct the tracks.

Participants will be evaluated on the accuracy with which they do this. The top three performers of this phase, hosted by the Google-owned company Kaggle, will receive cash prizes of US$12,000, $8,000 and $5,000. A second competition will then evaluate algorithms on the basis of speed as well as accuracy, Vlimant says.
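A toy version of this accuracy-based evaluation might score a submission by the fraction of detector hits assigned to the correct track. The real TrackML metric additionally weights hits by their physics importance; this simplified sketch only illustrates the idea:

```python
def fraction_correct(predicted, truth):
    """Fraction of hits whose predicted track label matches the ground truth.

    Both arguments map hit IDs to track IDs. (The real TrackML score also
    weights each hit by its importance; that detail is omitted here.)
    """
    matched = sum(1 for hit, track in predicted.items() if truth.get(hit) == track)
    return matched / len(truth)

truth     = {1: "A", 2: "A", 3: "B", 4: "B"}
predicted = {1: "A", 2: "A", 3: "B", 4: "A"}  # one hit assigned to the wrong track
print(fraction_correct(predicted, truth))  # 0.75
```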

Prize appeal

Such competitions have a long tradition in data science, and many young researchers take part to build up their CVs. “Getting well ranked in challenges is extremely important,” says Germain. Perhaps the most famous of these contests was the 2009 Netflix Prize. The entertainment company offered US$1 million to whoever worked out the best way to predict what films its users would like to watch, going on their previous ratings. TrackML isn’t the first challenge in particle physics, either: in 2014, teams competed to ‘discover’ the Higgs boson in a set of simulated data (the LHC discovered the Higgs, long predicted by theory, in 2012). Other science-themed challenges have involved data on anything from plankton to galaxies.

From the computer-science point of view, the Higgs challenge was an ordinary classification problem, says Tim Salimans, one of the top performers in that race (after the challenge, Salimans went on to get a job at the non-profit effort OpenAI in San Francisco, California). But the fact that it was about LHC physics added to its lustre, he says. That may help to explain the challenge’s popularity: nearly 1,800 teams took part, and many researchers credit the contest for having dramatically increased the interaction between the physics and computer-science communities.

TrackML is “incomparably more difficult”, says Germain. In the Higgs case, the reconstructed tracks were part of the input, and contestants had to do another layer of analysis to ‘find’ the particle. In the new problem, she says, you have to find something like 10,000 arcs of ellipse among 100,000 points. She thinks the winning technique might end up resembling those used by the program AlphaGo, which made history in 2016 when it beat a human champion at the complex game of Go. In particular, it might use reinforcement learning, in which an algorithm learns by trial and error on the basis of ‘rewards’ received after each attempt.
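As a rough illustration of the "connect the dots" problem Vlimant describes, here is a deliberately crude greedy track-finder on a toy event with straight-line tracks. Real tracks curve in the magnetic field, and real algorithms must cope with noise and shared hits; everything here is invented for illustration:

```python
import math

# Toy event: three straight-line tracks from the origin crossing five
# detector layers. (Real tracks curve; straight lines keep this short.)
LAYER_RADII = (1, 2, 3, 4, 5)
TRACK_ANGLES = (0.3, 1.2, 2.5)  # invented track directions, in radians

hits = [(i, r * math.cos(a), r * math.sin(a))
        for a in TRACK_ANGLES for i, r in enumerate(LAYER_RADII)]

def follow_tracks(hits, num_layers):
    """Greedy 'connect the dots': seed on the innermost layer, then
    step outward, always taking the nearest hit on the next layer."""
    by_layer = [[(x, y) for (l, x, y) in hits if l == i] for i in range(num_layers)]
    tracks = []
    for seed in by_layer[0]:
        track, current = [seed], seed
        for layer in by_layer[1:]:
            current = min(layer, key=lambda h: (h[0] - current[0]) ** 2
                                             + (h[1] - current[1]) ** 2)
            track.append(current)
        tracks.append(track)
    return tracks

tracks = follow_tracks(hits, len(LAYER_RADII))
print(f"reconstructed {len(tracks)} tracks of {len(tracks[0])} hits each")
```

On this clean toy event the greedy follower recovers all three tracks; the hard part of TrackML is that the same idea breaks down with 10,000 curved, overlapping tracks and detector noise.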

Vlimant and other physicists are also beginning to consider more untested technologies, such as neuromorphic computing and quantum computing. “It’s not clear where we’re going,” says Vlimant, “but it looks like we have a good path.”


CERN May Have Evidence of a Quasiparticle We’ve Been Hunting For Decades


Meet the elusive odderon.

The Large Hadron Collider (LHC) is the particle accelerator that just keeps on giving, and recent experiments at the site suggest we’ve got the first evidence for a mysterious subatomic quasiparticle that, until now, was only a hypothesis.

Quasiparticles aren’t technically particles, but they act like them in some respects, and the newly recorded reactions point to a particular quasiparticle called the odderon.

It already has a name because physicists have been on its theoretical trail for the past 40 years.

They still haven’t seen the elusive odderon itself, but researchers have now observed certain effects that hint the quasiparticle really is there.

That would in turn give us new information to feed into the Standard Model of particle physics, the guidebook that all the building blocks of physical matter are thought to follow.

“This doesn’t break the Standard Model, but there are very opaque regions of the Standard Model, and this work shines a light on one of those opaque regions,” says one of the team, particle physicist Timothy Raben from the University of Kansas.

“These ideas date back to the 70s, but even at that time it quickly became evident we weren’t close technologically to being able to see the odderon, so while there are several decades of predictions, the odderon has not been seen.”

The reactions studied in this case involve quarks, or electrically charged subatomic particles, and gluons, which act as exchange particles between quarks and enable them to stick together to form protons and neutrons.

In proton collisions where the protons remain intact, scientists had until now only seen this happen when an even number of gluons is exchanged between the protons. The new research reports, for the first time, these reactions happening with an odd number of gluons.

And it’s the way the protons deviate rather than break that’s important for this particular area of investigation. It was this phenomenon that first led to the idea of a quasiparticle called an odderon, proposed to explain collisions where the protons survived.

“The odderon is one of the possible ways by which protons can interact without breaking, whose manifestations have never been observed … this could be the first evidence of that,” Simone Giani, spokesperson for the TOTEM experiment, of which this work is a part, told Ryan F. Mandelbaum at Gizmodo.

It’s a pretty complex idea to wrap your head around, so the researchers have used a vehicle metaphor to explain what’s going on.

“The protons interact like two big semi-trucks that are transporting cars, the kind you see on the highway,” explains Raben.

“If those trucks crashed together, after the crash you’d still have the trucks, but the cars would now be outside, no longer aboard the trucks – and also new cars are produced. Energy is transformed into matter.”

“Until now, most models were thinking there was a pair of gluons – always an even number… We found measurements that are incompatible with this traditional model of assuming an even number of gluons.”

What all of that theoretical physics and subatomic analysis means is that we may have seen evidence of the odderon at work – with the odderon being the total contribution produced from the exchange of an odd number of gluons.

The experiments involved a team of over 100 physicists colliding billions of proton pairs every second in the LHC. At its peak, protons were collided at a record energy of 13 teraelectronvolts (TeV).
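A 13 TeV collision energy means 6.5 TeV per proton, and a quick calculation shows just how close to the speed of light that puts them (the standard proton rest energy is assumed):

```python
import math

PROTON_REST_ENERGY_GEV = 0.938  # standard proton rest energy
BEAM_ENERGY_GEV = 6500.0        # 6.5 TeV per beam gives 13 TeV per collision

gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV  # Lorentz factor
beta = math.sqrt(1.0 - 1.0 / gamma**2)            # speed as a fraction of c
print(f"gamma ~ {gamma:.0f}, v/c ~ {beta:.10f}")
```

Each proton carries roughly 7,000 times its rest energy, putting its speed within about one part in a hundred million of the speed of light.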

By comparing these high energy tests with results gleaned from other tests run on less powerful hardware, the researchers could reach a new level of accuracy in their proton collision measurements, and that may have revealed the odderon.

Ultimately, this kind of super-high-energy experiment can feed into all kinds of areas of research, including medicine, water purification, and cosmic-ray measurement.

We’re still waiting for confirmation that this legendary quasiparticle has in fact been found – or at least that its effects have – and the papers have been submitted for publication in peer-reviewed journals.

But it’s definitely a super-exciting time for physicists.

“We expect big results in the coming months or years,” says one of the researchers, Christophe Royon from the University of Kansas.

The research is currently undergoing peer review, but you can read the studies on the arXiv.org and CERN pre-print servers.

The Real Science of the God Particle in Netflix’s ‘The Cloverfield Paradox’


Even if you’re not a particle physics buff, you may have noticed that the plot of Netflix’s surprise Super Bowl Sunday release, The Cloverfield Paradox, relies heavily on a huge physics discovery that was in the news a few years ago: the Higgs boson.


Also known as the “God particle” — which happened to be the working title of the new J.J. Abrams film — the Higgs boson was first observed directly by scientists in 2012.

Gratuitous spoilers for The Cloverfield Paradox ahead.

In the midst of an energy crisis in the year 2028, scientists are struggling to use a massive space-based particle accelerator to help efficiently produce energy. When they finally get it to accelerate particles, they suddenly find themselves on the opposite side of the sun from the Earth. Chaos ensues: Worms explode out of a guy. Someone’s arm rematerializes on the other side of the ship with a mind of its own. Standard body horror nonsense.

Long story short, we’re led to believe that this botched experiment is what brought monsters to Earth in the first Cloverfield film — which, given the crazy science that goes on at the European Organization for Nuclear Research (CERN), is not totally absurd.

In ‘The Cloverfield Paradox,’ we’re led to believe that a particle accelerator experiment gone wrong in 2028 messed up the multiverse and caused a monster attack in 2008.

Any good science fiction story has some basis in reality, and it’s clear that The Cloverfield Paradox drew heavily on conspiracy theories that sprang up around CERN and its efforts to find direct evidence of the Higgs boson using a 27-kilometer-circumference accelerator, the Large Hadron Collider.

The particle’s discovery was a big deal because it was the only one of the 17 particles predicted by the Standard Model of particle physics that had never been observed. The associated Higgs field is what gives elementary particles their mass.

But it wasn’t the particle itself that conspiracy theorists and skeptics worried about. It was the way physicists had to observe it.

Doing so involved building the LHC, an extraordinarily large real-life physics experiment that housed two side-by-side high-energy particle beams traveling in opposite directions at close to the speed of light. The hope was that accelerated protons or lead ions in the beams would collide, throwing off a bunch of extremely rare, short-lived particles, one of which might be the Higgs boson. In 2012, scientists finally observed it. The nickname “God particle” dates back to physicist Leon Lederman’s 1993 book; reportedly, “Goddamn particle” – as in “so Goddamn hard to find” – was considered too rude to print.

Critics and skeptics argued that colliding particles at close to the speed of light increased the potential to accidentally create micro black holes and possibly even larger black holes, leading to wild speculation like that in Cloverfield Paradox.

Ah yes, the elusive Hands Bosarm particle.

This has never happened in real life, of course, and there’s also strong evidence that it couldn’t happen. Check out this excerpt from an interaction between astrophysicist Neil deGrasse Tyson and science skeptic Anthony Liversidge that Gizmodo reported on in 2011:

NDT: To catch everybody up on this, there’s a concern that if you make a pocket of energy that high, it might create a black hole that would then consume the Earth. So I don’t know what papers your fellow read, but there’s a simple calculation you can do. Earth is actually bombarded by high-energy particles that we call cosmic rays, from the depths of space, moving at a fraction of the speed of light, with energies that far exceed those in the particle accelerator. So it seems to me that if making a pocket of high energy would put Earth at risk of black holes, then we and every other physical object in the universe would have become a black hole eons ago, because these cosmic rays, scattered across the universe, are hitting every object that’s out there. Whatever your friend’s concerns were, they are unfounded.
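Tyson's cosmic-ray argument can be made quantitative: a cosmic-ray proton striking a stationary proton yields an effective collision energy of roughly √(2·E·m_p). A sketch using the textbook value for the highest-energy cosmic ray ever observed (about 3×10²⁰ eV):

```python
import math

# Centre-of-mass energy when a cosmic-ray proton of energy E hits a proton
# at rest: sqrt(s) ~ sqrt(2 * E * m_p), valid when E is much larger than m_p.
PROTON_REST_ENERGY_EV = 0.938e9
E_COSMIC_EV = 3e20   # highest-energy cosmic ray observed, ~3e20 eV
LHC_CM_EV = 13e12    # LHC collision energy, 13 TeV

sqrt_s = math.sqrt(2 * E_COSMIC_EV * PROTON_REST_ENERGY_EV)
print(f"cosmic-ray collision energy ~ {sqrt_s / 1e12:.0f} TeV, "
      f"about {sqrt_s / LHC_CM_EV:.0f}x the LHC")
```

Nature's own collisions with Earth's atmosphere thus reach centre-of-mass energies dozens of times higher than anything the LHC produces, which is the core of Tyson's point.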

Liversidge may be on the fringe with his argument, but he isn’t alone. As Inverse previously reported, Vanderbilt University physicist Tom Weiler, Ph.D., has hypothesized that a particle created alongside the Higgs boson, called the Higgs singlet, could travel through time via an as-yet-undiscovered fifth dimension. If Weiler’s hypothesis is correct, then interdimensional travel, as depicted in Cloverfield Paradox, could be possible, though his model really only accounts for the Higgs singlet’s ability to time travel.

In ‘The Cloverfield Paradox,’ a particle accelerator plays a central role.

The reason the Cloverfield Paradox scientists were trying to fire up a particle accelerator in space is just as speculative. While particle accelerators take a massive amount of energy to bring their beams to near light speed, some physicists argue that under certain conditions an accelerator could actually be used to produce energy: using superconductors, they argued, it would be possible for a particle accelerator to breed plutonium for use in nuclear reactors. So in a sense, the science of the movie is kind of based on maybe possibly real science.

That being said, this space horror film takes extreme liberties, even where it’s based on real science. Even on the extreme off-chance that any of the hypotheses outlined in this article turned out to be true, the tiny potential side effects of particle accelerators are nothing like what we see in The Cloverfield Paradox.

This Is The Smartest Kid In The World And He Thinks CERN Destroyed Our Universe


Our universe is a miracle beyond our comprehension. The more we advance through science and begin to unravel the mysteries of the world, the more confused and entangled in them we become.

No human can be said to know all the secrets of the universe, not even our most knowledgeable scientists. Science is not about facts; facts are easy to learn. Science is about exploring and questioning established facts and building new ones.

One such kid, Max Laughlin, is said to be much smarter than the average 13-year-old (or 30-year-old, for that matter) and has been called the smartest kid on planet Earth. Before his 13th birthday, he had reportedly invented a device capable of giving free energy to everyone in the world (once the logistics of production could be taken care of).

He has been discussing and debating the multiverse theory and alternate realities for quite a while now, with some of the biggest brains in the business. He is one of the theorists who believe that when CERN ran the Large Hadron Collider, it led to the permanent destruction of our universe as it existed, and that we are now living in a parallel one, the closest to our own in the space-time continuum.

Multiverse

The multiverse is the theory that our reality is not the only one in existence. In the beginning, when the universe began to take shape, it started spiraling outwards and kept forming parallel universes right next to each other. Down the line, through infinity, an uncountable number of parallel universes have formed. And we inhabit just one of them.

How it happened

When CERN set off the supercollider, it destroyed a single electron. That immediately set off a chain reaction which annihilated our entire universe. We were shifted to the next closest universe to our own, but we didn’t make the shift unscathed. Many were not able to accompany us and were left behind and forgotten. And the new universe we now inhabit, though similar to our own, is not exactly the same. Here is the proof.

The Mandela effect

The Mandela effect is the phenomenon that best supports this theory. Not everyone remembers Nelson Mandela’s death the same way. There are also many pop culture references and real-world events that we swear we remember differently from what the records show. These little glitches are held up as proof that the reality we remember is different from the one we now inhabit.


Ancient Particle Accelerator Discovered on Mars


New images of the surface of Mars taken by NASA’s Mars Reconnaissance Orbiter probe have revealed the presence of the largest particle accelerator ever built.

The search for water, or even signs of life, on the planet Mars has been ongoing for some time. But with today’s announcement by CERN and NASA scientists, the exploration of the red planet has revealed a major new discovery. New images of the surface of Mars taken by NASA’s Mars Reconnaissance Orbiter probe, analysed by an interdisciplinary team of experts from the fields of geology, archaeology and particle physics, have revealed the presence of the largest particle accelerator ever built. The team has shown that Olympus Mons, previously thought to be the largest volcanic formation in the solar system, is in fact the remains of an ancient particle accelerator thought to have operated several million years ago.

A landslide stretching over several kilometres, spotted by the probe’s high-resolution camera, sparked the scientists’ attention. This apparently recent event revealed a number of structures which intrigued the scientists, as their shapes clearly resembled those of superconducting accelerating cavities such as those used in the Large Hadron Collider (LHC). With a circumference of almost 2,000 kilometres, this particle accelerator would have been around 75 times bigger than the LHC, and millions of times more powerful. However, it is not yet known which type of particles might have been accelerated in such a machine.

This major discovery could also help to explain the Egyptian pyramids, one of archaeology’s oldest mysteries. Heavily eroded structures resembling pyramids also appear on the images in the immediate vicinity of Olympus Mons. In addition, ancient Egyptian hieroglyphs, the meaning of which was previously a mystery, seem to corroborate these observations, leading scientists to believe that the pyramids might have served as giant antennae. The pyramids on Earth might therefore have allowed the accelerator to be controlled remotely. “The accelerator control room was probably under the pyramids,” said Friedrich Spader, CERN’s Head of Technical Design.

This particle accelerator – a veritable “stargate” – is thought to have served as a portal into the solar system for a highly technologically advanced civilisation with the aim of colonisation. “The papyrus that was recently deciphered indicates that the powerful magnetic field and the movement of the particles in the accelerator were such that they would create a portal through spacetime,” said Fadela Emmerich, the leader of the team of scientists. “It’s a phenomenon that is completely new to CERN and we can’t wait to study it!” Such a technology could revolutionise space travel and open the way for intergalactic exploration.

Olympus Mons was until now considered to be the biggest volcano in the solar system, with its most recent lava flows estimated to be about 2 million years old. Scientists believe that this dating is quite accurate, on the basis of the latest measurements carried out by NASA’s Mars Odyssey probe. “This would mean that the particle accelerator was last used around 2 million years ago,” suggested Eilert O’Neil, the geologist who led this aspect of the research.

The powerful synchrotron radiation emitted by the particle accelerator generated an intense heat, which explains the volcanic structure and the presence of lava flows. “We have also suspected for a long time that a large quantity of water must have existed on the surface of Mars. We can only assume that this water was used at the time to cool the machines,” revealed Friedrich Spader.

“We’re probably talking about forgotten technologies and a highly advanced ancient civilisation,” said Eilert O’Neil. “Maybe even our own distant ancestors.”

Source: http://home.cern/

The Missing Universe: CERN Has Started Searching for “Dark Photons”


IN BRIEF
  • Dark matter seems to outweigh visible matter roughly six to one, making up about 27% of the universe.
  • Physicists from CERN now believe there’s a fifth universal force that rules the behavior of dark matter, and is transmitted by a particle called the dark photon.

THE FIFTH FORCE

The universe is shrouded in mystery: roughly 27 percent of its total mass-energy is “dark” matter. Dark matter does not interact with photons or any other electromagnetic waves, so it’s invisible to our eyes and to every kind of telescope. We only know that it’s there because astronomers observe its gravitational pull on everything else.

A working theory is that – in addition to the four fundamental forces that drive the universe: gravity, electromagnetism, and the strong and weak nuclear forces – there’s a fifth force that rules the behavior of dark matter. Physicists from CERN now hypothesize that this force is transmitted by a particle called the dark photon.

“To use a metaphor, an otherwise impossible dialogue between two people not speaking the same language (visible and dark matter) can be enabled by a mediator (the dark photon), who understands one language and speaks the other one,” explained Sergei Gninenko of CERN.

The research facility is now launching the NA64 experiment to search for this particle. The experiment fires a beam of electrons of known initial energy at a detector. Interactions between the electrons and atoms in the detector produce visible photons. If dark photons exist, they will escape the detector unseen, carrying away part of the initial electron energy; by the law of conservation of energy, that missing energy betrays their presence.
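At heart, this is an energy-bookkeeping exercise. A minimal sketch of the missing-energy logic, with illustrative numbers and threshold rather than NA64’s actual parameters:

```python
# Toy sketch of a missing-energy search: the beam energy is known, so any
# event where the detector records far less energy is a candidate for an
# invisible particle carrying the difference away. Numbers are illustrative.
BEAM_ENERGY = 100.0  # GeV, known initial electron energy

def missing_energy(visible):
    """Energy unaccounted for in the detector, by conservation of energy."""
    return BEAM_ENERGY - visible

def is_candidate(visible, threshold=50.0):
    """Flag events with a large energy deficit as dark-photon candidates."""
    return missing_energy(visible) > threshold

visible_per_event = [99.8, 100.1, 42.0, 99.5, 12.3]  # GeV seen by detector
candidates = [v for v in visible_per_event if is_candidate(v)]
print(candidates)  # [42.0, 12.3]
```

Real analyses must also rule out mundane energy losses (escaping neutrinos, detector gaps) before claiming a signal.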

THE COMPLEX UNIVERSE

There’s a lot of work to be done by physicists to prove that dark photons exist. Results of the experiment must be replicable, and if the dark photon is found, another round of research will be needed to establish its relation to dark matter.

CERN is an organization of physicists and engineers who probe the universe in pursuit of understanding its fundamental structure. Discoveries from these studies could validate, or overturn, much of what we currently know.

While dark matter may seem very far away from us and our daily lives, understanding all these mysteries is another step toward understanding ourselves and this complex universe we live in.

A Revolutionary Test Uncovered a Key Fact About the Nature of Antimatter


IN BRIEF
  • CERN scientists were finally able to measure the frequency of light needed to move antihydrogen atoms from a ground state to an excited state.
  • The results confirmed an important symmetry that’s a key part of the standard model of particle physics.

ALL ABOUT BALANCE

Antimatter is a particularly difficult aspect of our world to study since it has a funny way of annihilating when it comes into contact with matter. This makes it impossible to properly study these particles by any conventional means of measurement. Scientists had to devise a way of using magnetic fields to “trap” antiparticles in order to get a better look. Progress was made in 2010 when CERN’s ALPHA collaboration was able to trap antihydrogen and proceed with some unprecedented study.

As reported in Nature, CERN scientists were finally able to measure the frequency of light needed to move antihydrogen from its ground state to an excited state. This was accomplished by cooling the antiatoms and firing a laser at them to induce the transition, then measuring that transition spectroscopically. The results confirmed an important symmetry that’s a key part of the standard model of particle physics.
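The frequency in question can be estimated from textbook hydrogen constants. A back-of-envelope check using the simple Bohr picture (not the measured value, which carries far more digits):

```python
# Back-of-envelope estimate of the (anti)hydrogen 1S-2S transition frequency
# from the Bohr model; fine-structure and QED corrections are ignored.
PLANCK = 4.135667696e-15  # eV*s, Planck's constant
RYDBERG = 13.605693       # eV, hydrogen ground-state binding energy

transition_energy = RYDBERG * (1 - 1 / 2**2)  # 1S -> 2S, about 10.2 eV
frequency = transition_energy / PLANCK        # Hz

print(frequency)  # ~2.47e15 Hz, close to the ~2466 THz measured for hydrogen
```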

Image credit: Maximilien Brice/CERN
TEST AND RETEST

According to Nature, “Charge-parity-time (CPT) symmetry predicts that energy levels in antimatter and matter should be the same. Even the tiniest violation of this rule would require a serious rethink of the standard model of particle physics.” The standard model is safe for now, as the frequency measured for the antihydrogen transition from the ground state to the excited state matched the frequency observed in hydrogen.

The ALPHA team’s work is far from finished. They plan to throw a variety of laser beams at these trapped antiparticles to further test the fidelity of the symmetry found in the original study.

The mystery of why there is more regular matter than antimatter in the universe is further complicated by this discovery, since it reinforces their similarity. As Gizmodo puts it, “This would be easier to explain if matter and antimatter were less similar.” Still, we have finally made some measurable progress in the study of antimatter. With continued innovation in both knowledge and technology, we may someday solve this mystery.

Antihydrogen spectroscopy achieved


Trapped antihydrogen

The spectrum of the hydrogen atom has played a central part in fundamental physics in the past 200 years. Historical examples of its significance include the wavelength measurements of absorption lines in the solar spectrum by Fraunhofer, the identification of transition lines by Balmer, Lyman et al., the empirical description of allowed wavelengths by Rydberg, the quantum model of Bohr, the capability of quantum electrodynamics to precisely predict transition frequencies, and modern measurements of the 1S–2S transition by Hänsch [1] to a precision of a few parts in 10¹⁵. Recently, we have achieved the technological advances that allow us to focus on antihydrogen—the antimatter equivalent of hydrogen [2,3,4]. The Standard Model predicts that there should have been equal amounts of matter and antimatter in the primordial Universe after the Big Bang, but today’s Universe is observed to consist almost entirely of ordinary matter. This motivates physicists to carefully study antimatter, to see if there is a small asymmetry in the laws of physics that govern the two types of matter. In particular, the CPT (charge conjugation, parity reversal, time reversal) theorem, a cornerstone of the Standard Model, requires that hydrogen and antihydrogen have the same spectrum. Here we report the observation of the 1S–2S transition in magnetically trapped atoms of antihydrogen in the ALPHA-2 apparatus at CERN. We determine that the frequency of the transition, driven by two photons from a laser at 243 nm, is consistent with that expected for hydrogen in the same environment. This laser excitation of a quantum state of an atom of antimatter represents a highly precise measurement performed on an anti-atom. Our result is consistent with CPT invariance at a relative precision of ~2 × 10⁻¹⁰.

For the first time, researchers have probed the energy difference between two states of the antimatter atom.

The best known research at CERN centers on collisions of particles accelerated to higher and higher energies. But for the past 30 years, the lab has also hosted several research teams working to decelerate antiprotons, combine them with positrons, and cool and trap the resulting atoms of antihydrogen. A main goal of that research is to perform precision spectroscopic measurements that might reveal differences between matter and antimatter—and help to explain why the universe contains so much more of the former than the latter. (See the Quick Study by Gerald Gabrielse, Physics Today, March 2010, page 68.) Now CERN’s ALPHA collaboration has achieved the first spectroscopic success: observing the transition between antihydrogen’s 1S and 2S states.

The standard technique for atomic spectroscopy—exciting atoms with a laser and detecting the photons they emit—is unsuitable for antihydrogen. First, the coils and electrodes required to magnetically trap the antihydrogen, as shown here, leave little room for optical detectors. Second, the researchers trap only 14 antihydrogen atoms at a time, on average, so the optical signals would be undetectably weak.

Happily, antimatter offers an alternative spectroscopic method that works well for small numbers of atoms. When an antihydrogen atom is excited out of its 1S (or ground) state, it can be ionized by absorbing just one more photon. The bare antiproton, no longer confined by the magnetic field, quickly collides with the wall of the trap and annihilates, producing an easily detectable signal.
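Why one extra photon suffices can be read off the Bohr-model binding energies, which CPT symmetry requires to be the same for antihydrogen. A sketch with rounded textbook constants:

```python
# Bohr-model binding energies explain the one-more-photon ionization trick:
# a 243 nm photon is too weak to ionize the ground state, but easily ionizes
# an atom already excited to n = 2. Fine-structure corrections are ignored.
RYDBERG = 13.6  # eV, ground-state binding energy
HC = 1239.84    # eV*nm, Planck's constant times the speed of light

def binding_energy(n):
    """Energy needed to ionize a hydrogen-like atom from level n."""
    return RYDBERG / n**2

photon = HC / 243.0  # eV per 243 nm photon, about 5.1

print(photon > binding_energy(1))  # False: can't ionize the 1S ground state
print(photon > binding_energy(2))  # True: ionizes the excited 2S state
```

This is also why the transition itself takes two 243 nm photons: together they carry about 10.2 eV, the 1S–2S energy difference.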

When the researchers tuned their excitation laser to the exact frequency that would excite atoms of hydrogen, about half of the antihydrogen atoms were lost from the trap during each 10-minute trial. When they detuned the laser by just 200 kHz—about 200 parts per trillion—all the antihydrogen remained in the trap. By repeating the experiment for many more laser frequencies, the ALPHA team hopes to get a detailed measurement of the transition line shape. But that will have to wait until the experiment resumes in May 2017.
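The “about 200 parts per trillion” figure can be sanity-checked against the 243 nm laser frequency (this assumes the 200 kHz offset is taken at the laser frequency; the paper’s exact convention may differ):

```python
# Rough check of the detuning precision quoted above (assumption: the
# 200 kHz offset is measured at the 243 nm laser frequency).
C = 299_792_458.0        # m/s, speed of light
laser_freq = C / 243e-9  # Hz, about 1.23e15 for 243 nm light
detuning = 200e3         # Hz, the 200 kHz offset mentioned above

fraction = detuning / laser_freq
print(fraction)  # ~1.6e-10, on the order of 200 parts per trillion
```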
