CERN May Have Evidence of a Quasiparticle We’ve Been Hunting For Decades


Meet the elusive odderon.

The Large Hadron Collider (LHC) is the particle accelerator that just keeps on giving, and recent experiments at the site suggest we’ve got the first evidence for a mysterious subatomic quasiparticle that, until now, was only a hypothesis.

Quasiparticles aren’t technically particles, but they act like them in some respects, and the newly recorded reactions point to a particular quasiparticle called the odderon.

It already has a name because physicists have been on its theoretical trail for the past 40 years.

They still haven’t seen the elusive odderon itself, but researchers have now observed certain effects that hint the quasiparticle really is there.

That would in turn give us new information to feed into the Standard Model of particle physics, the guidebook that all the building blocks of physical matter are thought to follow.

“This doesn’t break the Standard Model, but there are very opaque regions of the Standard Model, and this work shines a light on one of those opaque regions,” says one of the team, particle physicist Timothy Raben from the University of Kansas.

“These ideas date back to the 70s, but even at that time it quickly became evident we weren’t close technologically to being able to see the odderon, so while there are several decades of predictions, the odderon has not been seen.”

The reactions studied in this case involve quarks, or electrically charged subatomic particles, and gluons, which act as exchange particles between quarks and enable them to stick together to form protons and neutrons.

In proton collisions where the protons remain intact, up until now scientists have only seen this happen when an even number of gluons is exchanged between the protons. The new research reports, for the first time, signs of these reactions happening with an odd number of gluons.

And it’s the way the protons deviate rather than break that’s important for this particular area of investigation. It was this phenomenon that first led to the idea of a quasiparticle called the odderon, to explain collisions where the protons survived.

“The odderon is one of the possible ways by which protons can interact without breaking, whose manifestations have never been observed … this could be the first evidence of that,” Simone Giani, spokesperson at the TOTEM experiment of which this is a part, told Ryan F. Mandelbaum at Gizmodo.

It’s a pretty complex idea to wrap your head around, so the researchers have used a vehicle metaphor to explain what’s going on.

“The protons interact like two big semi-trucks that are transporting cars, the kind you see on the highway,” explains Raben.

“If those trucks crashed together, after the crash you’d still have the trucks, but the cars would now be outside, no longer aboard the trucks – and also new cars are produced. Energy is transformed into matter.”

“Until now, most models were thinking there was a pair of gluons – always an even number… We found measurements that are incompatible with this traditional model of assuming an even number of gluons.”

What all of that theoretical physics and subatomic analysis means is that we may have seen evidence of the odderon at work – with the odderon being the total contribution produced from the exchange of an odd number of gluons.

The experiments involved a team of over 100 physicists colliding billions of proton pairs every second in the LHC. At its peak, protons were being collided at an energy of 13 teraelectronvolts (TeV), a new record.

By comparing these high energy tests with results gleaned from other tests run on less powerful hardware, the researchers could reach a new level of accuracy in their proton collision measurements, and that may have revealed the odderon.

Ultimately, this kind of super-high-energy experiment can feed into all kinds of areas of research, including medicine, water purification, and cosmic ray measurement.

We’re still waiting for confirmation that this legendary quasiparticle has in fact been found – or at least that its effects have – and the papers have been submitted for publication in peer-reviewed journals.

But it’s definitely a super-exciting time for physicists.

“We expect big results in the coming months or years,” says one of the researchers, Christophe Royon from the University of Kansas.

The research is currently undergoing peer review, but you can read the studies on the arXiv.org and CERN pre-print servers.


The Real Science of the God Particle in Netflix’s ‘The Cloverfield Paradox’


Even if you’re not a particle physics buff, you may have noticed that the plot of Netflix’s surprise Super Bowl Sunday release, The Cloverfield Paradox, relies heavily on a huge physics discovery that was in the news a few years ago: the Higgs boson.


Also known as the “God particle” — which happened to be the working title of the new J.J. Abrams film — the Higgs boson was first observed directly by scientists in 2012.

Gratuitous spoilers for The Cloverfield Paradox ahead.

In the midst of an energy crisis in the year 2028, scientists are struggling to use a massive space-based particle accelerator to help efficiently produce energy. When they finally get it to accelerate particles, they suddenly find themselves on the opposite side of the sun from the Earth. Chaos ensues: Worms explode out of a guy. Someone’s arm rematerializes on the other side of the ship with a mind of its own. Standard body horror nonsense.

Long story short, we’re led to believe that this botched experiment is what brought monsters to Earth in the first Cloverfield film — which, given the crazy science that goes on at the European Organization for Nuclear Research (CERN), is not totally absurd.

In ‘The Cloverfield Paradox,’ we’re led to believe that a particle accelerator experiment gone wrong in 2028 messed up the multiverse and caused a monster attack in 2008.

Any good science fiction story has some basis in reality, and it’s clear that The Cloverfield Paradox drew heavily on conspiracy theories that sprang up around CERN and its efforts to find direct evidence of the Higgs boson using a 27-kilometer-circumference accelerator, the Large Hadron Collider.

The particle’s discovery was a big deal because it was the only one of the 17 particles predicted by the Standard Model of particle physics that had never been observed. The Higgs boson is an excitation of the Higgs field, which gives elementary particles their mass.

But it wasn’t the particle itself that conspiracy theorists and skeptics worried about. It was the way physicists had to observe it.

Doing so involved building the LHC, an extraordinarily large real-life physics experiment that housed two side-by-side high-energy particle beams traveling in opposite directions at close to the speed of light. The hope was that accelerated protons or lead ions in the beams would collide, throwing off a bunch of extremely rare, short-lived particles, one of which might be the Higgs boson. In 2012, scientists finally observed it. The nickname “God particle” reportedly stuck because “Goddamn particle” — as in “so Goddamn hard to find” — was considered too rude to print.

Critics and skeptics argued that colliding particles at close to the speed of light increased the potential to accidentally create micro black holes and possibly even larger black holes, leading to wild speculation like that in Cloverfield Paradox.

Ah yes, the elusive Hands Bosarm particle.

This has never happened in real life, of course, and there’s also strong evidence that it couldn’t happen. Check out this excerpt from an interaction between astrophysicist Neil deGrasse Tyson and science skeptic Anthony Liversidge that Gizmodo reported on in 2011:

NDT: To catch everybody up on this, there’s a concern that if you make a pocket of energy that high, it might create a black hole that would then consume the Earth. So I don’t know what papers your fellow read, but there’s a simple calculation you can do. Earth is actually bombarded by high energy particles that we call cosmic rays, from the depths of space moving at a fraction of the speed of light, energies that far exceed those in the particle accelerator. So it seems to me that if making a pocket of high energy would put Earth at risk of black holes, then we and every other physical object in the universe would have become a black hole eons ago because these cosmic rays are scattered across the universe are hitting every object that’s out there. Whatever your friend’s concerns are were unfounded.

Liversidge may be on the fringe with his argument, but he isn’t alone. As Inverse previously reported, Vanderbilt University physicist Tom Weiler, Ph.D., has hypothesized that a particle created alongside the Higgs Boson, called the Higgs singlet, could travel through time through an as-yet-undiscovered fifth dimension. If Weiler’s hypothesis is correct, then it seems possible that interdimensional travel, as depicted in Cloverfield Paradox, could be possible, though his model really only accounts for the Higgs singlet particle’s ability to time travel.

In ‘The Cloverfield Paradox,’ a particle accelerator plays a central role.

The reason the Cloverfield Paradox scientists were trying to fire up a particle accelerator in space is just as speculative. While particle accelerators take a massive amount of energy to accelerate their beams to near light speed, some physicists argue that under certain conditions, a particle accelerator could actually produce energy. Using superconductors, they argued, it would be possible for a particle accelerator to actually produce plutonium that could be used in nuclear reactors. So in a sense, the science of the movie is kind of based on maybe possibly real science.

That being said, this space horror film takes extreme liberties, even where it’s based on real science. Even on the extreme off-chance that any of the hypotheses outlined in this article turned out to be true, the tiny potential side effects of particle accelerators are nothing like what we see in The Cloverfield Paradox.

This Is The Smartest Kid In The World And He Thinks CERN Destroyed Our Universe


Our universe is a miracle beyond our comprehension. The more we advance through science and begin to unravel the mysteries of the world, the more confused and tangled up in them we become.

No human can be said to know all the secrets of the universe, not even our most knowledgeable scientists. Science is not about facts; facts are easy to learn. Science is about exploring and questioning established facts and discovering new ones.

One such kid, Max Laughlin, is definitely much smarter than the average 13-year-old (or 30-year-old, for that matter) and has been called the smartest kid on the planet. Before his 13th birthday, he had invented a device capable of giving free energy to everyone in the world (once the logistics of production could be taken care of).

He has been discussing and debating the multiverse theory and alternate realities for quite a while now, and with the biggest brains in the business. He is one of the many theorists who are of the opinion that when CERN used the Large Hadron Collider, it led to the permanent destruction of our universe as it existed. We are now living in a parallel one, the one that was closest to our own in the space-time continuum.

Multiverse

The multiverse is the theory that our reality is not the only one that exists in our space-time continuum. In the beginning, when the universe began to take shape, it started spiraling outwards from the very next instant and kept forming parallel universes right next to each other. Down the line, through infinity, an uncountable number of parallel universes has formed. We inhabit just one of them.

How it happened

When CERN set off the supercollider, it destroyed a single electron. That immediately set off a chain reaction which annihilated our entire universe. We were shifted to the next closest universe to our own, but we didn’t make the shift unscathed. Many were not able to accompany us and were left behind and forgotten. And the new universe we now inhabit, though similar to our own, is not exactly the same. Here is the proof.

The Mandela effect

The Mandela effect is the phenomenon which best supports this theory. Not everyone remembers Nelson Mandela’s death the same way. There are also many pop culture references and real-world events that people swear they remember differently from what the records show. These little glitches are proof that the reality we remember is different from the one we now inhabit.


Ancient particle accelerator discovered on Mars.


New images of the surface of Mars taken by NASA’s Mars Reconnaissance Orbiter probe have revealed the presence of the largest particle accelerator ever built.

The search for water, or even signs of life, on the planet Mars has been ongoing for some time. But with today’s announcement by CERN and NASA scientists, the exploration of the red planet has revealed a major new discovery. New images of the surface of Mars taken by NASA’s Mars Reconnaissance Orbiter probe, analysed by an interdisciplinary team of experts from the fields of geology, archaeology and particle physics, have revealed the presence of the largest particle accelerator ever built. The team has shown that Olympus Mons, previously thought to be the largest volcanic formation in the solar system, is in fact the remains of an ancient particle accelerator thought to have operated several million years ago.

A landslide stretching over several kilometres, spotted by the probe’s high-resolution camera, caught the scientists’ attention. This apparently recent event revealed a number of structures, which intrigued the scientists, as their shapes clearly resembled those of superconducting accelerating cavities such as those used in the Large Hadron Collider (LHC). With a circumference of almost 2000 kilometres, this particle accelerator would have been around 75 times bigger than the LHC, and millions of times more powerful. However, it is not yet known which type of particles might have been accelerated in such a machine.

This major discovery could also help to explain the Egyptian pyramids, one of archaeology’s oldest mysteries. Heavily eroded structures resembling pyramids also appear on the images in the immediate vicinity of Olympus Mons. In addition, ancient Egyptian hieroglyphs, the meaning of which was previously a mystery, seem to corroborate these observations, leading scientists to believe that the pyramids might have served as giant antennae. The pyramids on Earth might therefore have allowed the accelerator to be controlled remotely. “The accelerator control room was probably under the pyramids,” said Friedrich Spader, CERN’s Head of Technical Design.

This particle accelerator – a veritable “stargate” – is thought to have served as a portal into the solar system for a highly technologically advanced civilisation with the aim of colonisation. “The papyrus that was recently deciphered indicates that the powerful magnetic field and the movement of the particles in the accelerator were such that they would create a portal through spacetime,” said Fadela Emmerich, the leader of the team of scientists. “It’s a phenomenon that is completely new to CERN and we can’t wait to study it!” Such a technology could revolutionise space travel and open the way for intergalactic exploration.

Olympus Mons was until now considered to be the biggest volcano in the solar system, with its most recent lava flows estimated to be about 2 million years old. Scientists believe that this dating is quite accurate, on the basis of the latest measurements carried out by NASA’s Mars Odyssey probe. “This would mean that the particle accelerator was last used around 2 million years ago,” suggested Eilert O’Neil, the geologist who led this aspect of the research.

The powerful synchrotron radiation emitted by the particle accelerator generated an intense heat, which explains the volcanic structure and the presence of lava flows. “We have also suspected for a long time that a large quantity of water must have existed on the surface of Mars. We can only assume that this water was used at the time to cool the machines,” revealed Friedrich Spader.

“We’re probably talking about forgotten technologies and a highly advanced ancient civilisation,” said Eilert O’Neil. “Maybe even our own distant ancestors.”

Source: http://home.cern/

The Missing Universe: CERN Has Started Searching for “Dark Photons”


IN BRIEF
  • Dark matter seems to outweigh visible matter roughly six to one, making up about 27% of the universe.
  • Physicists from CERN now believe there’s a fifth universal force that rules the behavior of dark matter, and is transmitted by a particle called the dark photon.

THE FIFTH FORCE

The universe is shrouded in mystery—a shroud so dark, in fact, that about 27 percent of its mass-energy content is “dark” matter. Dark matter does not interact with photons or electromagnetic waves, so it’s invisible to our eyes and to every kind of telescope. Basically, it’s unseen material surrounding and threading through galaxies, and we only know that it’s there because astronomers observe its gravitational pull on everything else.

A working theory is that – in addition to the four fundamental forces that drive the universe: gravity, electromagnetism, and strong and weak nuclear forces – there’s a fifth force that rules the behavior of dark matter. Physicists from CERN now believe that this force is transmitted by a particle called the dark photon.

“To use a metaphor, an otherwise impossible dialogue between two people not speaking the same language (visible and dark matter) can be enabled by a mediator (the dark photon), who understands one language and speaks the other one,” explained Sergei Gninenko of CERN.

The research facility is now launching the NA64 experiment to search for this particle. The equipment directs a beam of electrons of known initial energy at a detector. Interactions between the electrons and atoms in the detector produce visible photons. If dark photons exist, they will escape the detector unseen, carrying away part of the initial electron energy; by the law of conservation of energy, that missing energy would be the telltale signature.
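
To make the missing-energy idea concrete, here is a minimal sketch in Python of how such a search works in principle; the beam energy, threshold, and event values are invented for illustration and are not NA64’s actual numbers or software.

```python
# Illustrative sketch of a missing-energy search (toy numbers, not NA64 data).
# If a dark photon were produced and escaped unseen, the detector would record
# noticeably less energy than the incoming electron carried.

BEAM_ENERGY_GEV = 100.0      # known initial electron energy (assumed value)
MISSING_FRACTION_CUT = 0.5   # flag events missing more than half the energy

def missing_energy(detected_energy_gev: float) -> float:
    """Energy unaccounted for, by conservation of energy."""
    return BEAM_ENERGY_GEV - detected_energy_gev

# Toy events: total visible energy recorded by the detector, in GeV.
detected = [99.2, 98.7, 41.3, 99.9, 97.5]

candidates = [
    (i, missing_energy(e))
    for i, e in enumerate(detected)
    if missing_energy(e) > MISSING_FRACTION_CUT * BEAM_ENERGY_GEV
]
print(candidates)   # one toy event (index 2) with ~58.7 GeV gone missing
```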

THE COMPLEX UNIVERSE

Physicists have a lot of work to do to prove that dark photons exist. Results of the experiment must be replicable, and if the scientists do find the particle, another round of research will be needed to establish its relation to dark matter.

CERN is an organization of physicists and engineers that probe the universe in pursuit of understanding its fundamental structure. Discoveries from these studies could validate or totally destroy everything we currently know.

While dark matter may seem very far away from us and our daily lives, understanding all these mysteries is another step toward understanding ourselves and this complex universe we live in.

A Revolutionary Test Uncovered a Key Fact About the Nature of Antimatter


IN BRIEF
  • CERN scientists were able to finally measure the frequency of light needed to move the antiparticle from a ground state to an excited state.
  • The results confirmed an important symmetry that’s a key part of the standard model of particle physics.

ALL ABOUT BALANCE

Antimatter is a particularly difficult aspect of our world to study since it has a funny way of annihilating when it comes into contact with matter. This makes it impossible to properly study these particles by any conventional means of measurement. Scientists had to devise a way of using magnetic fields to “trap” antiparticles in order to get a better look. Progress was made in 2010 when CERN’s ALPHA collaboration was able to trap antihydrogen and proceed with some unprecedented study.

As reported in Nature, CERN scientists were finally able to measure the frequency of light needed to move antihydrogen from its ground state to an excited state. This was accomplished by cooling the trapped anti-atoms and firing a laser at them to induce the transition, then measuring that transition using spectroscopy. The results confirmed an important symmetry that’s a key part of the standard model of particle physics.

Image credit: Maximilien Brice/CERN
TEST AND RETEST

According to Nature, “Charge-parity-time (CPT) symmetry predicts that energy levels in antimatter and matter should be the same. Even the tiniest violation of this rule would require a serious rethink of the standard model of particle physics.” The standard model is safe for now as the frequency measured in the antihydrogen shift from grounded to excited matched the shift observed in hydrogen.

The ALPHA team’s work is far from finished. They plan to throw a variety of laser beams at these trapped antiparticles to further test the fidelity of the symmetry found in the original study.

The mystery of how there is more regular matter than antimatter in the universe is further complicated by this discovery since it reinforces their similarity. As Gizmodo puts it, “This would be easier to explain if matter and antimatter were less similar.” Still, we have finally made some measurable progress in the study of antimatter. It is only a matter of continued innovation in both knowledge and technology to ensure that we someday solve this mystery.

Antihydrogen spectroscopy achieved


 

Trapped antihydrogen

 

The spectrum of the hydrogen atom has played a central part in fundamental physics in the past 200 years. Historical examples of its significance include the wavelength measurements of absorption lines in the solar spectrum by Fraunhofer, the identification of transition lines by Balmer, Lyman et al., the empirical description of allowed wavelengths by Rydberg, the quantum model of Bohr, the capability of quantum electrodynamics to precisely predict transition frequencies, and modern measurements of the 1S–2S transition by Hänsch to a precision of a few parts in 10^15. Recently, we have achieved the technological advances to allow us to focus on antihydrogen—the antimatter equivalent of hydrogen. The Standard Model predicts that there should have been equal amounts of matter and antimatter in the primordial Universe after the Big Bang, but today’s Universe is observed to consist almost entirely of ordinary matter. This motivates physicists to carefully study antimatter, to see if there is a small asymmetry in the laws of physics that govern the two types of matter. In particular, the CPT (charge conjugation, parity reversal, time reversal) theorem, a cornerstone of the Standard Model, requires that hydrogen and antihydrogen have the same spectrum. Here we report the observation of the 1S–2S transition in magnetically trapped atoms of antihydrogen in the ALPHA-2 apparatus at CERN. We determine that the frequency of the transition, driven by two photons from a laser at 243 nm, is consistent with that expected for hydrogen in the same environment. This laser excitation of a quantum state of an atom of antimatter represents a highly precise measurement performed on an anti-atom. Our result is consistent with CPT invariance at a relative precision of ~2 × 10^-10.

For the first time, researchers have probed the energy difference between two states of the antimatter atom.

The best known research at CERN centers on collisions of particles accelerated to higher and higher energies. But for the past 30 years, the lab has also hosted several research teams working to decelerate antiprotons, combine them with positrons, and cool and trap the resulting atoms of antihydrogen. A main goal of that research is to perform precision spectroscopic measurements that might reveal differences between matter and antimatter—and help to explain why the universe contains so much more of the former than the latter. (See the Quick Study by Gerald Gabrielse, Physics Today, March 2010, page 68.) Now CERN’s ALPHA collaboration has achieved the first spectroscopic success: observing the transition between antihydrogen’s 1S and 2S states.

The standard technique for atomic spectroscopy—exciting atoms with a laser and detecting the photons they emit—is unsuitable for antihydrogen. First, the coils and electrodes required to magnetically trap the antihydrogen, as shown here, leave little room for optical detectors. Second, the researchers trap only 14 antihydrogen atoms at a time, on average, so the optical signals would be undetectably weak.

Happily, antimatter offers an alternative spectroscopic method that works well for small numbers of atoms. When an antihydrogen atom is excited out of its 1S (or ground) state, it can be ionized by absorbing just one more photon. The bare antiproton, no longer confined by the magnetic field, quickly collides with the wall of the trap and annihilates, producing an easily detectable signal.

When the researchers tuned their excitation laser to the exact frequency that would excite atoms of hydrogen, about half of the antihydrogen atoms were lost from the trap during each 10-minute trial. When they detuned the laser by just 200 kHz—about 200 parts per trillion—all the antihydrogen remained in the trap. By repeating the experiment for many more laser frequencies, the ALPHA team hopes to get a detailed measurement of the transition line shape. But that will have to wait until the experiment resumes in May 2017.
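
The quoted numbers are easy to sanity-check with a little arithmetic. The sketch below (plain Python, rounded constants, purely illustrative) recovers the two-photon 1S–2S transition frequency from the 243 nm laser and shows that a 200 kHz detuning is a fractional change at the 10^-10 level, the "parts per trillion" scale described above.

```python
# Back-of-the-envelope check of the numbers quoted in the text (illustrative only).
C = 2.998e8                 # speed of light, m/s
LASER_WAVELENGTH = 243e-9   # metres

laser_freq = C / LASER_WAVELENGTH    # ~1.23e15 Hz
transition_freq = 2 * laser_freq     # two photons together drive the 1S-2S transition
detuning = 200e3                     # the 200 kHz offset at which the antihydrogen stayed put

print(f"laser frequency     ~ {laser_freq:.3e} Hz")
print(f"1S-2S frequency     ~ {transition_freq:.3e} Hz")       # ~2.47e15 Hz
print(f"fractional detuning ~ {detuning / laser_freq:.1e}")    # ~1.6e-10, i.e. roughly a
                                                               # couple hundred parts per trillion
```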

 

A ‘New Physics’? Scientists May Have Glimpsed a World Beyond the Standard Model


Physicists are using the LHC to probe for elementary particles that may exist beyond the Standard Model. By doing so, they may discover (and may have already discovered) a “new physics” that has a real chance to resolve some of the greatest mysteries in science.
MESONS, FERMIONS, LEPTONS, AND BOSONS

The Standard Model, which emerged in the 1970s, is a theoretical foundation that explains the world and matter at the very smallest levels of reality: elementary particles so minute they boggle the imagination and defy easy understanding.  It has been a pretty successful description so far, but like most old foundations, it’s beginning to show signs of cracks and disrepair.

Of course, it’s not so much that the standard model is wrong; rather, there may be a deeper kind of physics, a dark sector that we haven’t been able to reach yet.

In other words, there are hints of something greater and even more fundamental shining through those cracks like glinting rays of sunshine.  And a team of physicists from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN), working with the LHC particle accelerator at CERN, think they may be on the track of what that “something” is.

Briefly, the Standard Model divides matter and the forces of the universe into several categories of elementary particles. Pay attention now, reader, because this will go quickly. Bosons transmit force: the photon carries the electromagnetic force; eight species of gluon carry the strong nuclear force (which binds atomic nuclei together); and the W+, W- and Z0 bosons oversee the weak nuclear force (responsible for radioactive decay). Matter comprises fermions, which come in two kinds, quarks and leptons; there are six species of quarks and six of leptons (which include electrons and neutrinos), each with its own antiparticle, 12 antiparticles in all. The Higgs field, whose quantum is the Higgs boson, provides mass for all of them, save the gluons and photons.

Got that?  Good.
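
To keep those categories straight, here is a tiny illustrative Python data structure; the grouping and labels simply mirror the paragraph above and are not taken from any physics library.

```python
# Illustrative only: a minimal, simplified inventory of the Standard Model's
# elementary particles, mirroring the categories described above.
STANDARD_MODEL = {
    "force carriers (bosons)": {
        "photon": "electromagnetism",
        "gluons (8 species)": "strong nuclear force",
        "W+, W-, Z0": "weak nuclear force",
        "Higgs boson": "quantum of the field that gives the others mass",
    },
    "matter (fermions)": {
        "quarks": ["up", "down", "charm", "strange", "top", "bottom"],
        "leptons": ["electron", "muon", "tau",
                    "electron neutrino", "muon neutrino", "tau neutrino"],
    },
}

# Each quark and lepton also has an antiparticle: 12 antifermions in total.
if __name__ == "__main__":
    fermions = STANDARD_MODEL["matter (fermions)"]
    print(sum(len(v) for v in fermions.values()), "matter fermions")  # -> 12
```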

But here’s the problem—the Standard Model, in common with other theories explaining the universe (such as Quantum Mechanics and General Relativity), is not quite as comprehensive as we’d like it to be.  It fails to explain some of the most interesting and pressing questions confronting physics.

For instance, it doesn’t account for the division of fermions into different families, or why matter achieved the upper hand over antimatter in the early universe.  And if dark matter is indeed an actual form of “matter,” it is not explained by our current understanding of elementary particles.  Perhaps most importantly, gravity (that most mysterious and fundamental of forces) is utterly unaccounted for by the Standard Model.

Highly complicated, graphical analysis of the decay of a “beauty” meson into a kaon and two muons. Credit: CERN
THE BEAUTY MESON

The Large Hadron Collider has turned its considerable particle-smashing heft to the task of seeking out new elementary particles beyond the Standard Model; but it’s possible they exist just outside the energy limit of the LHC.  If this is the case, then the only way to discover their presence will be to discern their “shadow,” as it were—the influence they exert upon other particles at lower energies.

And one way this might work is if they cause “mesons”—unstable, short-lived combinations of a quark and antiquark—to decay in unusual and unexpected ways.

This is what the team believes it may have found. A few years back, the LHCb experiment, which probes the mysteries of matter and antimatter, detected anomalous readings in the decay of a B meson or “beauty” meson—a meson consisting of a light quark and a heavy beauty antiquark.  It was necessary to rig up a more accurate method of determining the parameters by which the beauty quark decayed in order to test its deviation from the Standard Model; the Polish team devised a means to determine the parameters independently.

According to Dr. Marcin Chrzaszcz of IFJ PAN, one of the authors of the new research, “[m]y approach can be likened to determining the year when a family portrait was taken. Rather than looking at the whole picture, it is better to analyze each person individually and from that perspective try to work out the year the portrait was taken.”

By more accurately determining the degree of deviation from the Standard Model, scientists will be able to ascertain whether the anomaly really represents the influence of unknown elementary particles beyond the Model, or whether it is merely some hitherto undiscovered property which the Model does account for.

For now, physicists hypothesize that there is something called a “Z-prime” (Z’) boson that could mediate the anomalous decays of B mesons.  The LHC is gearing up now for new, higher-energy collisions. Perhaps, at last, physicists will discover the new particles, and the new physics, they’ve been searching for.

Breakthrough ‘Madala Boson’ Could Unlock the Mysteries of Dark Matter


The Higgs boson helped us understand known matter, but scientists at the High Energy Physics Group (HEP) of the University of the Witwatersrand in Johannesburg believe they have the data needed to discover a new boson, called the Madala boson. Its discovery may help us learn more about what dark matter is and how it interacts with the universe.

DISCOVERING THE MADALA BOSON

The discovery of the Higgs boson in 2012 at the European Organization for Nuclear Research (CERN) contributed heaps to our understanding of modern physics. But the Higgs boson only explains the mass of matter we can see, touch, and smell, and known matter makes up only about 4% of the Universe’s mass and energy. Scientists now predict a new boson which interacts with dark matter, the stuff that makes up about 27% of our universe.

Using the same data that led to the Higgs discovery, the bright minds at the High Energy Physics Group (HEP) of the University of the Witwatersrand in Johannesburg have come up with the Madala hypothesis, which they believe will help them discover the new Madala boson.

The Madala boson team isn’t lacking in scientific minds: around 35 students and researchers brainstorm and help interpret data from the experiments. They also have support from Wits University, including theorists Prof. Alan Cornell and Dr. Mukesh Kumar, and Prof. Elias Sideras-Haddad’s assistance with detector instrumentation.

Image credit: Taylor L; McCauley T/CERN

DARK MATTER MATTERS

Our understanding of physics keeps evolving. Professor Bruce Mellado, team leader of the HEP group at Wits, says we are now at a point similar to when Einstein formulated relativity and when quantum mechanics came to light. Classical physics was found lacking because it failed to make sense of plenty of phenomena. When the Higgs boson was discovered, the Standard Model of physics was completed, but we have still only scratched the surface: modern physics still can’t explain other phenomena, including dark matter.

Discovering the Madala boson would put us in a good position to learn more about our universe, and perhaps there are even more particles to be found beyond it. The future of modern physics has never been brighter.

The Search for New Physics at CERN



Anomalous collision events observed in Run 2 of CERN’s Large Hadron Collider in 2015 may point to new physics beyond the Standard Model. Image courtesy of CERN.

 

In February, I spent a few days at CERN, the European Organization for Nuclear Research. I spoke with physicists who devote themselves to understanding the stuff the universe is made of. These were the people who collaborated with thousands of other physicists to find the Higgs boson. Now they are looking for physics that lies beyond the Standard Model—the theory of nature that has dominated the field for more than forty years.

Some of our conversations focused on a puzzling anomaly that was announced at CERN in December 2015. The physicists I spoke with were calm, perplexed, and excited. Calm, because they thought the anomaly might well disappear as evidence accumulates over the coming months. Perplexed, because they did not know what to make of what they had observed. Excited, because they hoped they would find something new and strange that would upset their understanding of nature.

It was depressing to come back to the current political season, with its breathtaking blend of authoritarian bluster, racist garbage, and sheer bullshit. True, politics is not particle physics. Physicists do not declare a discovery until they think the chance of a mistake is less than one in three million. In politics, we do not have that luxury: we need to act. Still, we have a lot to learn from the fallibilist humility that defines fundamental physics: you make your ideas clear, acknowledge your ignorance, and hope for a surprising refutation of settled expectations.

With this piece, Matt Buckley, a theoretical physicist and professor at Rutgers University, begins a series of eight articles that will lead us into these waters. The series starts from the anomaly announced in December and will end with another announcement from CERN in summer 2016, which will tell us whether the anomaly is nature’s signal or just statistical noise. Buckley will lay the foundations for understanding the current calm and perplexed excitement. He will give us a feel for the physics and a sense of how physicists think and work. Politics may not be particle physics, but we have much to learn from this remarkable practice of submitting our beliefs to the discipline of evidence and argument. Physicists will ask questions; nature will talk back; and we will get to watch it unfold in real time.

— Joshua Cohen, co-editor


 

December 15, 2015, was a very big day at CERN. Certainly it was the biggest since July 4, 2012, when scientists announced that they had found the Higgs boson. This discovery—like the recent discovery of gravitational waves from the merger of black holes more than a billion years ago—was one of the most important scientific events of the last forty years. Named for Peter Higgs, the British physicist who predicted it in 1964, the particle was the final missing piece in the Standard Model, physicists’ fundamental theory of visible matter. Though physicists at CERN were pretty confident that they would find the Higgs, actually finding it was a remarkable achievement: a stunning feat of science, engineering, and human collaboration. The particle was produced by colliding protons at enormous energies in CERN’s Large Hadron Collider (LHC). (Protons are one type of hadron, a special type of subatomic particle.) After it was produced, the Higgs was identified by two big detectors, ATLAS (A Toroidal Large Hadron Collider Apparatus) and CMS (Compact Muon Solenoid).

Following the Higgs announcement, CERN was shut down for upgrades for more than two years. It was fired up again in early 2015, and on December 15, ATLAS and CMS announced the initial findings from this new run. The results were the first taste of the physics of energy scales vastly higher than anything we have ever seen—nearly twice as high as the energies that yielded the Higgs. Buried deep in the talks in December, among the outcomes of many other results from ATLAS and CMS, were a couple of innocuous slides with a few tentative suggestions of something we call “new physics.” Nothing solid, nothing that could rise to the level of discovery, but their mere existence had sparked rumors for weeks before the official announcements.

These hints were enough to set wheels in motion. Immediately after the December presentation, all around the world, a thousand physicists ran out of their offices to find someone to start arguing with about the results. Within a day, the first papers hit the arXiv, a public repository of physics and mathematics research, where papers can be posted for comment before peer review. By the end of the month at least 150 papers had appeared. The questions on everyone’s mind then are still on everyone’s mind now. What did the results mean? Are we seeing experimental error? Or random noise, like a blast of static on the radio? Or was this the first sign of physics beyond the Standard Model?

The Standard Model has been the consensus theory in physics for roughly forty years. It tells us that the visible world is made of a small set of fundamental particles, and it tells us how those particles work and interact. If the hints in the CMS and ATLAS data are vindicated—and that’s a big “if”—then particle physics would enter a new era. Our fundamental theory of the natural world would need to change, and no one knows quite how. This chance, however slight, has driven the extraordinary response among physicists to the announcement of the ATLAS and CMS results. Everyone wants to be part of what would be the biggest news in our field—perhaps the biggest scientific news—in nearly a half century.

So what is this Standard Model? What exactly are these experiments? And what did they see that got so many physicists so worked up?

I will explore these questions in a series of articles. CERN scientists will clarify the findings over the next several months and announce new results this summer. As the research proceeds, I will set the stage for the announcement. The idea is to provide you with a basic understanding of the results when they come in—and a sense of why we are all so excited.

• • •

The puzzling new data were gathered at the LHC, a twenty-seven kilometer ring more than a hundred meters underground that curves from CERN headquarters in Geneva, Switzerland, and into France before completing its circuit. When the LHC is in full swing, two beams of protons speed around the ring at 99.9999991 percent of the speed of light. They are kept on their curved path by titanic magnetic fields generated by nearly 10,000 superconducting magnets.

The two beams circulate in opposite directions, just over seven inches apart. But at four points along their underground course, the beams are bent just a bit more and made to cross. The protons come in bunches, roughly 100 billion in each bunch. When the beams cross, which happens 40 million times every second, virtually all of the protons miss their counterparts on the opposite beam and continue back onto the ring to complete another circuit in about 0.00009 seconds. But a precious few—maybe fifteen or twenty pairs—collide.
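
Those beam numbers are easy to sanity-check. The short Python sketch below uses rounded, illustrative values (a 27 km ring, the speed of light, 40 million crossings per second, roughly twenty collisions per crossing) to recover the lap time and the total collision rate.

```python
# Rough sanity check of the beam numbers quoted above (illustrative, rounded values).
C = 3.0e8                      # speed of light, m/s (the protons travel at ~0.999999991 c)
RING_CIRCUMFERENCE = 27_000.0  # metres (the LHC ring is roughly 27 km around)

lap_time = RING_CIRCUMFERENCE / C
print(f"one lap takes ~{lap_time:.1e} s")   # ~9e-05 s, i.e. about 0.00009 seconds

crossings_per_second = 40e6        # bunches cross 40 million times per second
collisions_per_crossing = 20       # roughly "fifteen or twenty pairs" collide each crossing
print(f"~{crossings_per_second * collisions_per_crossing:.0e} collisions per second")
# -> ~8e+08: hundreds of millions of proton-proton collisions every second
```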

Was this the first sign of physics beyond the Standard Model?

If two cars collide head-on, their kinetic energy—their energy of motion—has to go somewhere: to sound waves, to compression of structures in the frame, to propelling pieces of the cars long distances at high speeds. The same is true for the LHC’s protons, except, proportionally, they carry vastly more energy than cars on a highway. In the LHC’s first run (2010–2012), each proton carried enough kinetic energy to make—in principle—over 4000 new protons. (Remember Einstein’s famous equation E = mc2, which means that energy and mass are interconvertible, and that the speed of light squared is the rate of conversion.) In the LHC’s second run in 2015, this value had been increased; each proton carried more than 6500 times the energy needed to make a single proton. (In the units used by particle physicists, a proton “weighs” a bit less than a gigaelectronvolt, abbreviated GeV. The LHC beams now consist of protons with 6500 GeV of energy.) This extra burst of energy, made possible by even stronger superconducting magnets, is what makes this new run of the LHC so exciting.
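
As a quick worked example of that energy-to-mass bookkeeping (using only the rounded values quoted above, purely for illustration), dividing each run's per-proton energy by the proton's rest-mass energy of a bit less than 1 GeV recovers the "how many protons could you make" numbers in the text.

```python
# E = mc^2 in particle-physics units: mass and energy are both quoted in GeV,
# so "how many protons could this energy make" is a simple ratio (illustrative).
PROTON_MASS_GEV = 0.938        # a proton "weighs" a bit less than 1 GeV

run1_energy_gev = 4000.0       # per-proton beam energy in the 2010-2012 run
run2_energy_gev = 6500.0       # per-proton beam energy in the 2015 run

print(run1_energy_gev / PROTON_MASS_GEV)   # ~4264: "over 4000 new protons"
print(run2_energy_gev / PROTON_MASS_GEV)   # ~6930: "more than 6500 times" a proton's mass
```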

To understand what happens when protons collide, bear in mind that even though protons are really small, they are not elementary particles—the most fundamental particles in nature. Rather, protons are made of a roiling ocean of particles called quarks and their antimatter counterparts called antiquarks (always with three more quarks than antiquarks), held together by still more particles called gluons. Often a collision of protons just results in a rearrangement of those constituents: the huge kinetic energy is redirected to throw quarks, antiquarks, and gluons off in random directions, much like pieces of a car flying off in a head-on wreck or the insides of a watch when it is smashed against a wall.

Sometimes, however, the kinetic energy from the collisions is channeled into the production of completely new particles. These new particles aren’t constituents of protons, but they can be summoned into existence—for a very, very short time—by the colossal amount of energy available when protons collide. In a later article, I’ll explain in more detail about how to picture this process of producing fast-fading particles. For now, just know that this is why we built the LHC: to give us a chance to see what new stuff comes out when you focus enough energy into a tiny region.

• • •

One thing that can (and did) come out of these collisions is the Higgs boson. Officially, the LHC’s principal aim was to produce and detect the Higgs. Its discovery significantly advanced our understanding of the Standard Model.

Yet although the Standard Model is the most precise and accurate theory ever devised in science, from nearly the moment of its discovery it was known to be incomplete: a hitherto unseen particle with unique properties had to exist. Otherwise the whole theory of visible matter falls apart. Searching for the Higgs was made harder by the fact that the Standard Model does not predict its mass, so we did not know exactly where to look. Since the 1970s we’ve been searching, in one way or another, for the Higgs: experiments such as the Large Electron Positron collider (at CERN) and the Tevatron (at Fermilab, thirty miles west of Chicago) did not find it. (It turns out that they lacked the ability to pump enough energy into the collisions to create the Higgs in large numbers, though we had no way of knowing that at the time.) The Superconducting Super Collider, planned for construction in Texas, would have reached energies vastly greater than the LHC in the late 1990s. But in 1993, after half the ring was built and much political conflict, the project was cancelled. The LHC was the machine that was going to find the Higgs: there were “no-lose” theorems to prove that it could be done, and find it the LHC did.

Of course, throwing two protons together at vast energies is only half the battle. It isn’t enough to create the Higgs; you also have to detect it, which is hard because the particle lasts only for 0.0000000000000000000001 seconds before decaying into other particles. The only way to find it is indirect: by detecting the less massive particles that it decays into. To do this, at each point where the proton beams cross, massive detectors were constructed. ATLAS and CMS are two of them; think of them as very big and very complicated cameras. (ATLAS is 45 meters long, 25 meters in diameter, weighs 7000 tons, and has 10 million components.) These two detectors have very different designs and are run by separate teams of physicists, some 3000 for CMS and 5000 for ATLAS. This redundancy serves as a quality control check: each experiment gathers its own data, analyzes it independently, and each team has a burning desire to beat their counterparts to the punch.

ATLAS and CMS are designed to detect and measure the properties of as many different particles coming out of a proton-proton collision as possible, and to decide whether the collision meets some minimal, if vague, standard of “likely to contain interesting physics.” Most collisions do not pass that test, but for the few that do, the detector relays the information to the outside world, where it is saved for later analysis.

This whole effort is a massive engineering and physics challenge: particles do not come with handy ID cards telling you what they are and where they’re going. Instead, the ATLAS and CMS experiments must track how energy appears to be deposited in various materials that make up their detectors (starting with lots of silicon pixels close to the collision point), and then carefully unravel this pattern to determine what particles went where. The combination of the LHC and its four experiments (ATLAS, CMS, and two more specialized detectors, ALICE and LHCb) are among the most complicated artifacts ever constructed by humanity, and it is all custom-made, designed and constructed by thousands of physicists in labs around the world.

Once the information about what happened in a collision—the “event,” as it is called—has been safely stored “on tape” (a terminological throwback to the days when data-storage was on magnetic tape, rather than solid-state hard drives), the analysis can begin. Though the Higgs boson was the principal reason for the LHC’s construction, it was never the only target. The Standard Model is known to have problems that cry out for solutions. For various reasons, which I will explore in later articles, we suspect that some of these will be solved by physics that becomes manifest at energies not too far above the mass of the Higgs boson itself. There is no no-lose theorem here, however, and the possibilities for what could be lurking at high energies are not quite endless, but certainly immense.

Thus, the experimental physicists working on ATLAS and CMS look for as many things as they can think of. Part of the job of a theoretical physicist is to think up more things to look for, all for the sake of figuring out what the world is ultimately made of and how those ultimate constituents work. One great fear is that there is something new at the LHC and we’re just not clever enough to figure out where to look for it. The other great fear is that there is nothing new to be found. All we can do is cast as wide a net as we can, and be diligent.

• • •

The searches for new physics that sparked all the rumors prior to December 15 were the “diphoton” searches done separately by ATLAS and CMS experimentalists. These are very simple searches, as far as these things go. They require none of the baroque tricks that theorists like myself had spent many years developing (though of course they still require the dedicated work of many trained scientists to complete).

Every event at the LHC contains photons (particles of light), thousands of them in fact, with about the energy of the photons coming from an X-ray machine. The experimentalists were interested not in those everyday photons but the ones that contain tens or hundreds of billions of times more energy. And not just any photons: pairs of photons (diphotons) that show up at the same time in a layer of the detector that is very good at photon detection. The hope was that these diphotons would be the signal of something new. But, as with many types of particles at the LHC, pairs of extremely energetic photons can be made in lots of different ways, including interactions from the physics of the Standard Model as well as interactions that could arise from a new particle. So seeing a bunch of diphoton events isn’t necessarily interesting; the experimentalists needed some way to distinguish diphotons that might have come from new physics from those that originated in well-understood physics. In particular, they needed a way to pick out pairs of photons that come from some new particle that was produced in a collision of LHC protons and then promptly fell apart (“decayed”), emitting two particles of light.

One fear is that there’s something new to be discovered and we’re just not clever enough to find it.

To do this, they relied on a useful property of particles—useful, but a little complicated. One of the things that defines a particle is its mass, and one of the properties of mass is that no matter how you are moving relative to a particle, or how the particle is moving relative to you, you will always measure the mass to be the same value. (In other words, mass is what physicists call a “relativistic invariant.”) A proton always has a mass equivalent to a bit less than 1 GeV in energy, for example. A massless photon is always seen as having zero mass.

Suppose now that you create a new particle in a collision between protons, and suppose that the new particle decays eventually into two photons. Those two photons inherit some information about their parent. While an individual photon is massless, you can take two photons and combine their energy and momentum to construct something called the “invariant mass” of the pair. This invariant mass, denoted mγγ (pronounced “m-gamma-gamma”), is the same as the mass of the parent particle. So if you can make many such new particles and capture the light from their decay in the detector, you will see many pairs of photons, all with the same invariant mass. For example, if you make some Higgs particles, and they decay eventually into photon pairs, the invariant mass of all the pairs will be the same as the mass of the Higgs.
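
For two massless photons, the invariant mass follows from their energies and the opening angle between them. Here is a minimal Python sketch of that standard formula; the 62.5 GeV photon energies are an invented example, chosen so that a parent with the Higgs mass is reconstructed.

```python
import math

def diphoton_invariant_mass(e1_gev: float, e2_gev: float, opening_angle_rad: float) -> float:
    """Invariant mass of two massless photons:
    m_gg = sqrt(2 * E1 * E2 * (1 - cos(theta)))."""
    return math.sqrt(2.0 * e1_gev * e2_gev * (1.0 - math.cos(opening_angle_rad)))

# Toy example: a 125 GeV parent decaying at rest gives two back-to-back
# 62.5 GeV photons, and the pair's invariant mass reconstructs the parent.
print(diphoton_invariant_mass(62.5, 62.5, math.pi))   # -> 125.0 (GeV)
```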

For pairs of photons that are just produced from some random smashing of a proton against another proton, this quantity, the invariant mass of the two photons, will not always be exactly the same, as the photons can be produced in all sorts of ways, not just from the decay of a single type of particle with a unique mass. In general, in such collisions mγγ will be smoothly distributed, roughly tracking how much energy went in to making the pair of photons; the bigger the invariant mass of the photon pair, the less often we will see it.

So when you create a new particle (say, a Higgs) that sometimes decays to create diphoton events, you will see an elevated number of those events, all with the same invariant mass. If you plot the number of events with a given invariant mass, this results in a “bump,” which will stick out like a sore thumb over the “background” of diphotons created in other ways. For this reason, such searches at the LHC are called “bump-hunts.”
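
A bump-hunt is, at bottom, just a histogram of invariant masses. The toy Python/NumPy sketch below invents a smoothly falling background plus a cluster of events near 750 GeV (all shapes and counts are made up for illustration, not real ATLAS or CMS data) and prints the counts per bin so the bump stands out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data only: a smoothly falling "background" of diphoton invariant masses
# plus an invented "signal" cluster near 750 GeV.
background = rng.exponential(scale=200.0, size=5000) + 150.0   # GeV
signal = rng.normal(loc=750.0, scale=5.0, size=150)            # GeV
masses = np.concatenate([background, signal])

# Count events in 50 GeV bins of the high-mass tail and draw crude text bars.
counts, edges = np.histogram(masses, bins=np.arange(400, 1050, 50))
for lo, n in zip(edges[:-1], counts):
    print(f"{lo:4.0f}-{lo + 50:4.0f} GeV: {int(n):4d} {'#' * (int(n) // 10)}")
# The bins around 750 GeV stand out above the smooth fall-off: a "bump".
```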

Here is an example. The Higgs is one particle that can decay into a pair of photons; one of the ways that the Higgs was discovered was through exactly this sort of bump-hunt. The Higgs has a mass equivalent to 125 GeV of energy. (Particle physicists use the same unit for mass and energy, again because of E = mc2.) So, when ATLAS and CMS collected events, they started to see an excess of diphoton events around an mγγ of 125 GeV. Of course, they didn’t know ahead of time that the Higgs mass was 125 GeV: they learned it only by seeing this bump and working their way back from there.

You can see the progression of the Higgs search in a handy gif the ATLAS experiment created afterwards. In the animation, the number of diphoton events collected is increasing as time goes by. The horizontal axis is the invariant mass mγγ of the diphotons measured by ATLAS, and the vertical axis is the number of events with diphotons that have a particular invariant mass.

Animation showing ATLAS evidence for the Higgs boson during Run 1 of the LHC. Image courtesy of ATLAS/CERN.

In this animation, there is a nice smoothly falling “background” of diphoton events. These are coming from well-known Standard Model processes. Then, at an invariant mass of 125 GeV, you start seeing a visible excess creep out over the background, once enough data has been collected. That’s the Higgs, as it was discovered at ATLAS: it was created in collisions of protons and then decayed into a pair of photons, which remembered enough of their parent to show up on this plot as a bump at a particular invariant mass.

• • •

As the Higgs discovery demonstrates, diphotons can be used to discover new physics. But it isn’t the only way to do so, of course. There are literally hundreds of possible ways to look for something new at the LHC. But diphotons are a “clean” way to look: the particles in the collision are relatively straightforward for the experimentalists to see in their detectors (they show up as pairs in what is called the “electromagnetic calorimeter”). They are also relatively straightforward to analyze; the invariant mass gives a simple way to see new physics jump out at you from the data. All this sort of search needs, then, is a new particle that can decay into a pair of photons.

This finally brings us back to the results of the ATLAS and CMS searches announced on December 15. Here is the ATLAS data:

Diphoton ATLAS data from Run 2 of the LHC. Image courtesy of ATLAS/CERN.

And here is the CMS data:

Diphoton CMS data from Run 2 of the LHC. Image courtesy of CMS/CERN.

These plots, like the Higgs plots above, show the number of events with diphotons that have a particular invariant mass. The difference is that these new plots come from the second run of the LHC, which collides protons at higher energies. That means the experimentalists can hope to make more massive new particles and can look at higher invariant masses of diphoton pairs than they could before.

There are a bunch of things going on with these plots, so let me draw your eye to particular features. First, in both cases, the experimentalists have drawn a background line (red for ATLAS, blue for CMS). This is their best estimate for the number of events that they should see at a given invariant mass, according to the Standard Model. Notice that the actual number of events at any given invariant mass doesn’t always match the prediction. Such departures are especially pronounced at the far right side of the plots, where the invariant mass is largest. Here, the number of predicted diphoton pairs is very small, less than one per range of invariant mass. Since you can’t actually see a fraction of an event—it either happened or it didn’t—the predicted and observed numbers of events are expected to deviate, and the way experimentalists compare what they expected to see and what they actually saw takes this small number of events into account.

Away from the right-side tail, however, the background prediction and actual number of events are in pretty good agreement, though they are not exactly the same. That’s as it should be. A background estimate is exactly that: an estimate. If you flip a coin a hundred times, you expect it to come up heads about fifty times. But you’re not going to be very surprised if, after a hundred flips, you have sixty heads. That’s not quite the expectation, but some percentage of the time you’re going to get fluctuations away from that value. As long as those fluctuations aren’t too far away from your expectation, you’re not too bothered. Of course, if you flip a coin a hundred times and see ninety-four heads, you might start to suspect that your coin isn’t a fair coin. Critically, though, you couldn’t say for certain that the coin is biased, no matter how frequently and extremely it departs from the average value you expect. All you can say for sure is that it’s extremely unlikely that the coin is fair (and you could put a specific number on how unlikely it is).
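
The coin-flip intuition can be made precise with a couple of lines of arithmetic. The Python sketch below (standard library only) computes how often a fair coin yields at least 60 heads, and at least 94 heads, in 100 flips.

```python
from math import comb

def prob_at_least(k: int, n: int = 100, p: float = 0.5) -> float:
    """Probability of at least k heads in n flips of a fair coin."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(prob_at_least(60))   # ~0.028: a 60-head run is unusual but not shocking
print(prob_at_least(94))   # ~1e-21: overwhelming evidence the coin is not fair
```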

The background lines in the CERN plots are like the expectation of seeing fifty heads in a hundred flips of a coin. Deviations around that expected value are consistent with the Standard Model, as long as those deviations are small compared to the number of diphotons collected. But if you see a large fluctuation away from that expectation, you have some reason to believe that there’s something else going on (new physics, or a biased coin in the analogy). I’m simplifying a little, of course, since the background expectation for ATLAS and CMS has to be determined using the data itself. This is unlike the fair coin example, where the background expectation is something you can figure out ahead of time.

As you read the ATLAS plot from left to right (increasing invariant mass along the horizontal axis), you see the actual number of events detected (the black crosses). They more or less track the background expectation, though there are some small deviations both up and down. But then, at an invariant mass of 750 GeV, there is a rather large bump: there are about twenty events (photon pairs) around this invariant mass. Now twenty is not a lot in absolute terms, but the Standard Model tells us that the number should be closer to eight. So we have a huge upward fluctuation from the background. That can happen simply due to random chance, just as you can flip a fair coin a hundred times and get seventy-five heads. But if that happened with a coin, you would start looking at it pretty closely, wondering if something was going on.

We can put a number on how unlikely it is. Remember that the Higgs was found through a diphoton bump. How likely was it that the bump was due to chance? Not very likely: 1 in 3,488,560.

Similar calculations show that the ATLAS bump at 750 GeV could occur by chance only once in 6285 tries. That sounds pretty convincing. But recall that we had no special reason for looking at 750 GeV. We would have been equally happy with a big fluctuation at any invariant mass: 740, 730, 720, and so on. In effect, we “tried” many times to see a fluctuation, and we have to take that promiscuity into account. This is called the “look-elsewhere” effect.

Think of the look-elsewhere effect using the following analogy. You have a 1 in 365.25 chance of sharing the same birthday as any random individual. Imagine that you meet exactly one person, and you ask what their birthday is, and it is the same as yours! You’d be right to be very surprised. However, imagine that you run into a hundred people, and you ask all of them what their birthday is. Suddenly, the chance that someone shares a birthday with you is about 25 percent. You’ve looked in a lot of places, and so getting one result that’s somewhat rare becomes a lot more likely. Seek and ye shall find—or have a pretty good chance of finding, anyway.
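If you want to check the birthday arithmetic, here is a short sketch (treating every birthday as an independent 1-in-365.25 shot, as in the text). The same “one try versus many tries” logic is what the look-elsewhere correction captures, at least in spirit.

```python
p_match = 1 / 365.25   # chance a given stranger shares your birthday

for n_people in (1, 100):
    p_at_least_one = 1 - (1 - p_match) ** n_people
    print(f"{n_people:3d} people: P(someone shares your birthday) = {p_at_least_one:.3f}")

# Roughly the same idea drives the look-elsewhere effect: a small local
# probability, "tried" in many independent mass windows, becomes a much
# larger global probability, approximately 1 - (1 - p_local)**N.
```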

Applying this “look-elsewhere” reasoning to the ATLAS search drops the odds of the bump being the result of random fluctuation from 1 in 6285 to around 1 in 44. That’s really not very low: in physics such numbers are routinely filed under “maybe interesting, but wait and see.” ATLAS does hundreds of searches, and a fluctuation of that order of magnitude will show up in roughly 1 in 22 of them, so a few bumps like this are always knocking around in the results. We’re used to that.

But now look at the CMS plot. Instead of scanning your eye from left to right along increasing invariant mass, jump straight to 750 GeV. ATLAS already told you to look there, after all. And what’s that? At 750 GeV in the CMS data, you find a few more events than there “should” be, according to the Standard Model. Maybe ten events, while you were expecting four.

By itself, this result should happen due to random chance about once in 214 tries. If CMS were the only experiment to report such results, you’d have to apply the look-elsewhere effect again, and the significance of this result would drop precipitously. But ATLAS already told us this particular region was interesting; we no longer need to look elsewhere. And since the CMS and ATLAS experiments are completely separate from one another, we’ve basically tried twice to test the physics at this particular invariant mass, and in both cases we’ve come up with something that is really unlikely.
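For a feel for where odds like these come from, here is a toy Poisson calculation using the rough counts quoted above (roughly twenty events seen versus eight expected at ATLAS, ten versus four at CMS). The experiments’ quoted figures of 1 in 6285 and 1 in 214 come from full fits that fold in background uncertainties and the expected shape of a bump, so this naive version only captures the flavor, not the exact numbers.

```python
from math import exp, factorial

def poisson_tail(n_obs, mu):
    """Probability of recording n_obs or more events when mu are expected."""
    return 1.0 - sum(exp(-mu) * mu**k / factorial(k) for k in range(n_obs))

# Back-of-the-envelope versions of the two excesses described in the text.
print(f"ATLAS-like (20 seen, 8 expected): about 1 in {1 / poisson_tail(20, 8):,.0f}")
print(f"CMS-like   (10 seen, 4 expected): about 1 in {1 / poisson_tail(10, 4):,.0f}")
```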

That is what makes this result so interesting: seeing something unusual not once, but twice, in results from groups that have different experiments, different ways of collecting data, and different ways of analyzing their data, and that don’t share their results with each other ahead of time. To some degree, one experiment provides an independent validation of the other.

But even this convergence of independent evidence is not definitive. This kind of coincidence has happened before, only for the results to prove insignificant upon closer inspection. Two experiments can see unusual results in the same spot, and then, when more data comes in, everything can revert back to the expectation. After all, the LHC experiments look for new physics in a lot of places, so unlikely things can happen. Therefore, the safest bet, based on past experience, is that these results will also disappear once we get more data from another LHC run. Perhaps we just stopped the LHC collisions at a time when both experiments had a few more diphotons than normal. Physicists who say this have an annoying track record of being correct.

The flip side is that this particular result—as far as I can tell—is the most surprising result to come out of the LHC other than the discovery of the Higgs itself. In short, other than the one time we discovered a new particle, the combined CMS and ATLAS results are the most unlikely thing we have seen if the Standard Model is the only physics around.

That’s enough to get a lot of people sitting up and paying very close attention.

• • •

If the CMS and ATLAS hints pan out, we’ll have discovered something completely and utterly new. That hasn’t happened in a very long time in collider physics.

In fact, we can go even further. Based on this first round of data—and you have to keep in mind that the results are very preliminary—we can say that if these diphoton bumps are real, they are the sign not only of new physics, but of new physics of a kind we were absolutely not expecting. We know they would point beyond the Standard Model, but we do not know the direction they would point us in.

In the years leading up to the construction of the LHC and then the discovery of the Higgs, theoretical physicists had plenty of time to think about what should be lurking above the energies we had tested with other colliders. There were, of course, many ideas. But several gained popular currency thanks to the “elegant” ways they solved the problems we were wrestling with. (Mathematicians and physicists speak often of searching for “elegant” theories. The mathematician G.H. Hardy went so far as to say “there is no permanent place in the world for ugly mathematics.”) These theories tend to predict lots of different particles, with many different signals that would appear at the LHC, given the energies that are now being attained. Simple versions of these theoretical ideas have become benchmarks against which we test the ability of the LHC to find new physics. Interestingly, even though these benchmark theories were chosen specifically because they could result in so many interesting signatures at the LHC, they do not predict the signal we’re seeing in the ATLAS and CMS diphotons. The excess at 750 GeV is a little too weird to be explainable by these elegant ideas we’ve spent several decades honing.

That is why it is such tremendous fun to try to explain these new results. Without new physics, we theorists had to rely on aesthetic criteria—a certain degree of elegance, an economy of design—to guide our intuition about the “right” idea to pursue. But the universe never promised us a rose garden. If the elegant idea can’t explain the data, so much for theoretical aesthetics. The demand for beauty goes out the window in the face of experimental hints, and you’re free to consider ideas that are ugly but get the job done (though everyone will have their own personal measure of what counts as too ugly to consider pursuing). So there’s a flurry of activity, and optimistically, someone will stumble on the right idea, which might turn out to have aesthetic appeal all its own.

In the end, we can only say one thing for sure: we need more data.

From a personal perspective, it is incredibly frustrating not to know what to make of these new results. The time physicists spend working on this is time we are not working on other ideas, ideas that we can be sure will continue to be important even after the LHC turns back on in a few months. To spend your time pondering something which could be flat out untrue is scary from a career perspective and dispiriting from the perspective of wanting to know how the universe works. To be not just wrong (which is part of the job), but irrelevantly wrong, wrong in a way that teaches you nothing? That’s not what any of us want.

On the other hand, this work is deeply exciting. It is what science is all about—at least the part of science that most scientists dream of (the remaining bits being some mixture of hard work, coffee, grant writing, more coffee, conferences, and gently banging your head against a wall).

For someone like me, in my mid-thirties in particle physics, this is brand new: we’ve never had a truly unbelievable surprise. It was great to find the Higgs boson, but it fit perfectly into the received theories, ideas that I studied in graduate school. We don’t know whether this diphoton excess is the first step into the undiscovered country beyond the Standard Model. But ATLAS and CMS are breaking new ground, reaching energies that have never before been probed by humanity.

In this series, I will try to guide you through this exciting work. By the time I get to the end, I hope to be able to tell you what is going on with those photons we saw at the LHC. The safe bet is that I will have to report that this result, like many others, is not interesting. But maybe not. I don’t know, and neither do you; that’s the beauty of it.