What No New Particles Means for Physics


Physicists are confronting their “nightmare scenario.” What does the absence of new particles suggest about how nature works?

Physicists at the Large Hadron Collider (LHC) in Europe have explored the properties of nature at higher energies than ever before, and they have found something profound: nothing new.

It’s perhaps the one thing that no one predicted 30 years ago when the project was first conceived.

The infamous “diphoton bump” that arose in data plots in December has disappeared, indicating that it was a fleeting statistical fluctuation rather than a revolutionary new fundamental particle. And in fact, the machine’s collisions have so far conjured up no particles at all beyond those catalogued in the long-reigning but incomplete “Standard Model” of particle physics. In the collision debris, physicists have found no particles that could constitute dark matter, no siblings or cousins of the Higgs boson, no sign of extra dimensions, no leptoquarks — and above all, none of the desperately sought supersymmetry particles that would round out equations and satisfy “naturalness,” a deep principle about how the laws of nature ought to work.

“It’s striking that we’ve thought about these things for 30 years and we have not made one correct prediction that they have seen,” said Nima Arkani-Hamed, a professor of physics at the Institute for Advanced Study in Princeton, N.J.

The news has emerged at the International Conference on High Energy Physics in Chicago over the past few days in presentations by the ATLAS and CMS experiments, whose cathedral-like detectors sit at 6 and 12 o’clock on the LHC’s 17-mile ring. Both teams, each with over 3,000 members, have been working feverishly for the past three months analyzing a glut of data from a machine that is finally running at full throttle after being upgraded to nearly double its previous operating energy. It now collides protons with 13 trillion electron volts (TeV) of energy — more than 13,000 times the protons’ individual masses — providing enough raw material to beget gargantuan elementary particles, should any exist.

The Large Hadron Collider collides protons at high energies, and the debris is recorded by two main detectors, CMS and ATLAS. In December 2015, both detectors picked up a small excess in the number of pairs of photons with a combined energy of 750 GeV produced during 13-TeV collisions, when compared to Standard Model predictions. Physicists hoped that this “diphoton bump” resulted from a new elementary particle momentarily forming and then decaying into two photons. Four times more data has been collected at the LHC in 2016, and the diphoton bump has gone away. This indicates that the excess seen last year was merely a statistical fluctuation. (Note that expectations based on the Standard Model have changed slightly in 2016 because of different accelerator and detector conditions.)

Lucy Reading-Ikkanda for Quanta Magazine

So far, none have materialized. Especially heartbreaking for many is the loss of the diphoton bump, an excess of pairs of photons that cropped up in last year’s teaser batch of 13-TeV data and whose possible origins were explored in some 500 papers by theorists. Rumors about the bump’s disappearance in this year’s data began leaking in June, triggering a community-wide “diphoton hangover.”

“It would have single-handedly pointed to a very exciting future for particle experiments,” said Raman Sundrum, a theoretical physicist at the University of Maryland. “Its absence puts us back to where we were.”

The lack of new physics deepens a crisis that started in 2012 during the LHC’s first run, when it became clear that its 8-TeV collisions would not generate any new physics beyond the Standard Model. (The Higgs boson, discovered that year, was the Standard Model’s final puzzle piece, rather than an extension of it.) A white-knight particle could still show up later this year or next year, or, as statistics accrue over a longer time scale, subtle surprises in the behavior of the known particles could indirectly hint at new physics. But theorists are increasingly bracing themselves for their “nightmare scenario,” in which the LHC offers no path at all toward a more complete theory of nature.

Some theorists argue that the time has already come for the whole field to start reckoning with the message of the null results. The absence of new particles almost certainly means that the laws of physics are not natural in the way physicists long assumed they are. “Naturalness is so well-motivated,” Sundrum said, “that its actual absence is a major discovery.”

Missing Pieces

The main reason physicists felt sure that the Standard Model could not be the whole story is that its linchpin, the Higgs boson, has a highly unnatural-seeming mass. In the equations of the Standard Model, the Higgs is coupled to many other particles. This coupling endows those particles with mass, allowing them in turn to drive the value of the Higgs mass to and fro, like competitors in a tug-of-war. Some of the competitors are extremely strong — hypothetical particles associated with gravity might contribute (or deduct) as much as 10 million billion TeV to the Higgs mass — yet somehow its mass ends up as 0.125 TeV, as if the competitors in the tug-of-war finish in a near-perfect tie. This seems absurd — unless there is some reasonable explanation for why the competing teams are so evenly matched.
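To get a feel for the degree of cancellation involved, here is a back-of-envelope sketch in Python. The 10^16-TeV figure is only the order of magnitude quoted above, not a precise value:

```python
# Back-of-envelope version of the tug-of-war described above. The size
# of one side's pull ("10 million billion TeV" = 1e16 TeV) is only an
# order-of-magnitude figure taken from the text.
pull = 1e16       # TeV: one team's contribution to the Higgs mass
observed = 0.125  # TeV: the Higgs mass actually measured
print(f"required cancellation: 1 part in {pull / observed:.1e}")
# -> 1 part in ~8.0e+16: the opposing contributions must match to
#    roughly 17 significant digits for the observed mass to emerge.
```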

Supersymmetry, as theorists realized in the early 1980s, does the trick. It says that for every “fermion” that exists in nature — a particle of matter, such as an electron or quark, that adds to the Higgs mass — there is a supersymmetric “boson,” or force-carrying particle, that subtracts from the Higgs mass. This way, every participant in the tug-of-war game has a rival of equal strength, and the Higgs is naturally stabilized. Theorists devised alternative proposals for how naturalness might be achieved, but supersymmetry had additional arguments in its favor: It caused the strengths of the three quantum forces to exactly converge at high energies, suggesting they were unified at the beginning of the universe. And it supplied an inert, stable particle of just the right mass to be dark matter.

“We had figured it all out,” said Maria Spiropulu, a particle physicist at the California Institute of Technology and a member of CMS. “If you ask people of my generation, we were almost taught that supersymmetry is there even if we haven’t discovered it. We believed it.”

Hence the surprise when the supersymmetric partners of the known particles didn’t show up — first at the Large Electron-Positron Collider in the 1990s, then at the Tevatron in the 1990s and early 2000s, and now at the LHC. As the colliders have searched ever-higher energies, the gap has widened between the known particles and their hypothetical superpartners, which must be much heavier in order to have avoided detection. Ultimately, supersymmetry becomes so “broken” that the effects of the particles and their superpartners on the Higgs mass no longer cancel out, and supersymmetry fails as a solution to the naturalness problem. Some experts argue that we’ve passed that point already. Others, allowing for more freedom in how certain factors are arranged, say it is happening right now, with ATLAS and CMS excluding the stop quark — the hypothetical superpartner of the 0.173-TeV top quark — up to a mass of 1 TeV. That’s already a nearly sixfold imbalance between the top and the stop in the Higgs tug-of-war. Even if a stop heavier than 1 TeV exists, it would be pulling too hard on the Higgs to solve the problem it was invented to address.

The Standard Model (infographic: Lucy Reading-Ikkanda for Quanta Magazine)

“I think 1 TeV is a psychological limit,” said Albert de Roeck, a senior research scientist at CERN, the laboratory that houses the LHC, and a professor at the University of Antwerp in Belgium.

Some will say that enough is enough, but for others there are still loopholes to cling to. Among the myriad supersymmetric extensions of the Standard Model, there are more complicated versions in which stop quarks heavier than 1 TeV conspire with additional supersymmetric particles to counterbalance the top quark, tuning the Higgs mass. The theory has so many variants, or individual “models,” that killing it outright is almost impossible. Joe Incandela, a physicist at the University of California, Santa Barbara, who announced the discovery of the Higgs boson on behalf of the CMS collaboration in 2012, and who now leads one of the stop-quark searches, said, “If you see something, you can make a model-independent statement that you see something. Seeing nothing is a little more complicated.”

Particles can hide in nooks and crannies. If, for example, the stop quark and the lightest neutralino (supersymmetry’s candidate for dark matter) happen to have nearly the same mass, they might have stayed hidden so far. The reason for this is that, when a stop quark is created in a collision and decays, producing a neutralino, very little energy will be freed up to take the form of motion. “When the stop decays, there’s a dark-matter particle just kind of sitting there,” explained Kyle Cranmer of New York University, a member of ATLAS. “You don’t see it. So in those regions it’s very difficult to look for.” In that case, a stop quark with a mass as low as 0.6 TeV could still be hiding in the data.

Experimentalists will strive to close these loopholes in the coming years, or to dig out the hidden particles. Meanwhile, theorists who are ready to move on face the fact that they have no signposts from nature about which way to go. “It’s a very muddled and uncertain situation,” Arkani-Hamed said.

New Hope

Many particle theorists now acknowledge a long-looming possibility: that the mass of the Higgs boson is simply unnatural — its small value resulting from an accidental, fine-tuned cancellation in a cosmic game of tug-of-war — and that we observe such a peculiar property because our lives depend on it. In this scenario, there are many, many universes, each shaped by different chance combinations of effects. Out of all these universes, only the ones with accidentally lightweight Higgs bosons will allow atoms to form and thus give rise to living beings. But this “anthropic” argument is widely disliked for being seemingly untestable.

In the past two years, some theoretical physicists have started to devise totally new natural explanations for the Higgs mass that avoid the fatalism of anthropic reasoning and do not rely on new particles showing up at the LHC. Last week at CERN, while their experimental colleagues elsewhere in the building busily crunched data in search of such particles, theorists held a workshop to discuss nascent ideas such as the relaxion hypothesis — which supposes that the Higgs mass, rather than being shaped by symmetry, was sculpted dynamically by the birth of the cosmos — and possible ways to test these ideas. Nathaniel Craig of the University of California, Santa Barbara, who works on an idea called “neutral naturalness,” said in a phone call from the CERN workshop, “Now that everyone is past their diphoton hangover, we’re going back to these questions that are really aimed at coping with the lack of apparent new physics at the LHC.”

Arkani-Hamed, who, along with several colleagues, recently proposed another new approach called “Nnaturalness,” said, “There are many theorists, myself included, who feel that we’re in a totally unique time, where the questions on the table are the really huge, structural ones, not the details of the next particle. We’re very lucky to get to live in a period like this — even if there may not be major, verified progress in our lifetimes.”

As theorists return to their blackboards, the 6,000 experimentalists with CMS and ATLAS are reveling in their exploration of a previously uncharted realm. “Nightmare, what does it mean?” said Spiropulu, referring to theorists’ angst about the nightmare scenario. “We are exploring nature. Maybe we don’t have time to think about nightmares like that, because we are being flooded in data and we are extremely excited.”

There’s still hope that new physics will show up. But discovering nothing, in Spiropulu’s view, is a discovery all the same — especially when it heralds the death of cherished ideas. “Experimentalists have no religion,” she said.

Some theorists agree. Talk of disappointment is “crazy talk,” Arkani-Hamed said. “It’s actually nature! We’re learning the answer! These 6,000 people are busting their butts and you’re pouting like a little kid because you didn’t get the lollipop you wanted?”


Scientists Have Recreated Tiny Drops of Quark Soup From The Very Early Universe


You probably don’t stop to think about this often, but right after the birth of the Universe, matter was not in the form we recognise today.

Instead, scientists think it was very much in a state of soup, ‘quark soup’ to be precise – a state also known as quark-gluon plasma.


Now, researchers say they’ve managed to shape ultra-hot, ultra-small liquid droplets of this soup in the lab – possibly allowing us to peer back to the very first microseconds of existence.

Expanding drops of quark-gluon plasmas were created in three geometric shapes – circles, ellipses and triangles – by using a massive particle collider to smash together protons and neutrons at such high speeds and temperatures that they break up.

One of the key findings from the experiment is that these tiny quark-gluon plasma drops behave like fluids, even at the smallest scales. That’s something scientists had originally thought was impossible, but have now seen more and more evidence for.

“Our experimental result has brought us much closer to answering the question of what is the smallest amount of early Universe matter that can exist,” says one of the team, Jamie Nagle from the University of Colorado Boulder.

So what exactly is quark-gluon plasma? It’s a liquid-like state, but one that exists at searing temperatures, where conditions are too hot to form atoms. As the name suggests, it’s made up from quarks and gluons, elementary particles that combine to make up protons and neutrons.

We’re talking in the region of 4 trillion degrees Celsius (or nearly 7 trillion degrees Fahrenheit) – about 250,000 times hotter than the core of the Sun.
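That comparison is easy to sanity-check, taking a commonly cited solar-core temperature of roughly 15.7 million kelvin (at these scales, degrees Celsius and kelvin are interchangeable):

```python
# Quick check of the "250,000 times hotter" comparison above.
t_plasma = 4e12      # K: quark-gluon plasma temperature quoted in the text
t_sun_core = 1.57e7  # K: a commonly cited solar-core temperature
print(f"plasma / solar core ~ {t_plasma / t_sun_core:,.0f}")
# -> ~254,777, consistent with the article's "about 250,000 times"
```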

While these particles eventually cooled enough to form regular matter that makes up our world today, the properties of this plasma remain fascinating for scientists – properties which tell the story of the very beginnings of the Universe and everything in it.

Since the turn of the century, experiments at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in New York have suggested that quark-gluon plasmas might behave like near-perfect fluids.

Atomic collisions (left) producing the three plasma shapes (right). (PHENIX, Nature 2018)

“If you could have a bottle of this liquid on your desk, and you were to tip it over and have it flow around an obstacle, it would do so with almost no friction,” says one of the researchers, Paul Romatschke from CU Boulder.

That’s what this latest experiment was designed to test: whether quark soup droplets would keep their shape even at the tiniest sizes, fitting with the liquid hypothesis.

The resulting circles, ellipses and triangles are the strongest evidence yet that this is indeed the case, even when only a couple of protons are colliding.

The new research backs up previous studies carried out at the RHIC on quark-gluon plasmas. Specialist labs have been producing them for several years in a variety of conditions, but they blink in and out of existence incredibly quickly, and can be challenging to analyse.

There’s plenty more to explore about this earliest form of matter, but we’re already well on our way to learning exactly how quark soup became actual matter. Eventually, we could unlock more secrets about how the Universe is expanding.

Next, the team at Brookhaven National Laboratory is busy setting up experiments to test the properties of quark-gluon plasmas at even smaller scales.

“How small can a system be and still exhibit collective behaviour?” says one of the researchers, Victoria Greene from Vanderbilt University in Tennessee.

“The idea that you can replicate the Big Bang in miniature in the laboratory and understand that the early state of the Universe was this perfect liquid state could inform cosmological models.”

CERN Scientists Say The LHC Has Confirmed Two New Particles, And Possibly Discovered a Third


They are known as bottom baryons.

The Large Hadron Collider is at it again, showing us new wonders in the world of particle physics. Scientists working on the Large Hadron Collider beauty (LHCb) collaboration have observed two new particles that have never been seen before – and seen evidence of a third.


The two new particles, predicted by the standard quark model, are baryons – the same family of particles as the protons used in LHC particle acceleration experiments.

Baryons are what most of the Universe is made up of, including protons and neutrons – composite particles consisting of three fundamental particles called quarks, which have different ‘flavours’, or types: up, down, top, bottom, charm, and strange.

Protons consist of two up quarks and one down quark, while neutrons consist of one up quark and two down quarks, for instance. But the two new particles discovered have a slightly different composition.

Named Σb(6097)+ and Σb(6097)−, they consist of two up quarks and one bottom quark, and two down quarks and one bottom quark, respectively.

These particles are known as bottom baryons, and they are related to four particles previously observed at Fermilab. However, the new observations mark the first time scientists have detected these higher-mass counterparts; they are about six times more massive than a proton.

So what’s the third particle candidate we mentioned earlier?

The researchers think it might be a strange type of composite particle called a tetraquark. These are an exotic kind of meson; an ordinary meson contains two quarks (strictly, a quark and an antiquark), but a tetraquark is composed of four: two quarks and two antiquarks.

Observational evidence of tetraquarks has been pretty elusive to date, and that is also the case here. Evidence of the candidate particle, called Zc(4100), which would contain a charm quark and a charm antiquark, was detected in the decay of heavier B mesons.

But the detection had a significance of just over 3 standard deviations, well short of the 5-standard-deviation threshold usually required to claim the discovery of a new particle. It will take future observations to either confirm or disprove the existence of Zc(4100).
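For readers unfamiliar with the sigma convention, those two thresholds translate into probabilities via the Gaussian tail. A minimal sketch, using the one-sided convention common in particle physics:

```python
# Convert an n-sigma excess into a one-sided Gaussian tail probability.
from math import erfc, sqrt

def p_value(n_sigma):
    """Probability of a fluctuation at least this large by chance."""
    return 0.5 * erfc(n_sigma / sqrt(2))

for n in (3, 5):
    print(f"{n} sigma -> p ~ {p_value(n):.2e}")
# 3 sigma -> p ~ 1.35e-03  (about 1 in 740: interesting, not conclusive)
# 5 sigma -> p ~ 2.87e-07  (about 1 in 3.5 million: discovery threshold)
```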

The new bottom baryons, you’ll be pleased to know, blew that threshold out of the water: Σb(6097)+ and Σb(6097)− had significances of 12.7 and 12.6 standard deviations respectively.

Particle Physicists Turn to AI to Cope with CERN’s Collision Deluge


Can a competition with cash rewards improve techniques for tracking the Large Hadron Collider’s messy particle trajectories?

A visualization of complex sprays of subatomic particles, produced from colliding proton beams in CERN’s CMS detector at the Large Hadron Collider near Geneva, Switzerland in mid-April of 2018. Credit: CERN

Physicists at the world’s leading atom smasher are calling for help. In the next decade, they plan to produce up to 20 times more particle collisions in the Large Hadron Collider (LHC) than they do now, but current detector systems aren’t fit for the coming deluge. So this week, a group of LHC physicists has teamed up with computer scientists to launch a competition to spur the development of artificial-intelligence techniques that can quickly sort through the debris of these collisions. Researchers hope these will help the experiment’s ultimate goal of revealing fundamental insights into the laws of nature.

At the LHC at CERN, Europe’s particle-physics laboratory near Geneva, two bunches of protons collide head-on inside each of the machine’s detectors 40 million times a second. Every proton collision can produce thousands of new particles, which radiate from a collision point at the centre of each cathedral-sized detector. Millions of silicon sensors are arranged in onion-like layers and light up each time a particle crosses them, producing one pixel of information every time. Collisions are recorded only when they produce potentially interesting by-products. When they are, the detector takes a snapshot that might include hundreds of thousands of pixels from the piled-up debris of up to 20 different pairs of protons. (Because particles move at or close to the speed of light, a detector cannot record a full movie of their motion.)

From this mess, the LHC’s computers reconstruct tens of thousands of tracks in real time, before moving on to the next snapshot. “The name of the game is connecting the dots,” says Jean-Roch Vlimant, a physicist at the California Institute of Technology in Pasadena who is a member of the collaboration that operates the CMS detector at the LHC.

After future planned upgrades, each snapshot is expected to include particle debris from 200 proton collisions. Physicists currently use pattern-recognition algorithms to reconstruct the particles’ tracks. Although these techniques would be able to work out the paths even after the upgrades, “the problem is, they are too slow”, says Cécile Germain, a computer scientist at the University of Paris South in Orsay. Without major investment in new detector technologies, LHC physicists estimate that the collision rates will exceed the current capabilities by at least a factor of 10.

Researchers suspect that machine-learning algorithms could reconstruct the tracks much more quickly. To help find the best solution, Vlimant and other LHC physicists teamed up with computer scientists including Germain to launch the TrackML challenge. For the next three months, data scientists will be able to download 400 gigabytes of simulated particle-collision data—the pixels produced by an idealized detector—and train their algorithms to reconstruct the tracks.

Participants will be evaluated on the accuracy with which they do this. The top three performers of this phase, which is hosted by the Google-owned company Kaggle, will receive cash prizes of US$12,000, $8,000 and $5,000. A second competition will then evaluate algorithms on the basis of speed as well as accuracy, Vlimant says.

Prize appeal

Such competitions have a long tradition in data science, and many young researchers take part to build up their CVs. “Getting well ranked in challenges is extremely important,” says Germain. Perhaps the most famous of these contests was the 2009 Netflix Prize. The entertainment company offered US$1 million to whoever worked out the best way to predict what films its users would like to watch, going on their previous ratings. TrackML isn’t the first challenge in particle physics, either: in 2014, teams competed to ‘discover’ the Higgs boson in a set of simulated data (the LHC discovered the Higgs, long predicted by theory, in 2012). Other science-themed challenges have involved data on anything from plankton to galaxies.

From the computer-science point of view, the Higgs challenge was an ordinary classification problem, says Tim Salimans, one of the top performers in that race (after the challenge, Salimans went on to get a job at the non-profit effort OpenAI in San Francisco, California). But the fact that it was about LHC physics added to its lustre, he says. That may help to explain the challenge’s popularity: nearly 1,800 teams took part, and many researchers credit the contest for having dramatically increased the interaction between the physics and computer-science communities.

TrackML is “incomparably more difficult”, says Germain. In the Higgs case, the reconstructed tracks were part of the input, and contestants had to do another layer of analysis to ‘find’ the particle. In the new problem, she says, you have to find, among some 100,000 points, something like 10,000 arcs of ellipses. She thinks the winning technique might end up resembling those used by the program AlphaGo, which made history in 2016 when it beat a human champion at the complex game of Go. In particular, they might use reinforcement learning, in which an algorithm learns by trial and error on the basis of ‘rewards’ that it receives after each attempt.
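To make the “connecting the dots” problem concrete, here is a deliberately simplified toy in Python. It is not drawn from the actual TrackML data or tooling, and every name and number in it is invented: it scatters hits along a few circular tracks that all pass through a common collision point, then recovers the track parameters with a Hough transform, the kind of classical pattern-recognition technique the challenge hopes machine learning will outperform.

```python
# Toy track finding: hits lie on circles through the origin (the
# collision point), as for charged particles in a magnetic field.
# A circle of radius r centred at r*(cos phi0, sin phi0) passes through
# the origin, and a hit (x, y) lies on it iff
#     x^2 + y^2 = 2 * r * (x*cos(phi0) + y*sin(phi0)).
# Each hit therefore "votes" for all (phi0, r) pairs consistent with it.
import numpy as np

rng = np.random.default_rng(0)

def track_hits(r, phi0, n=25, noise=0.02):
    """Noisy hits along a circle through the origin."""
    theta = phi0 + np.pi + rng.uniform(0.1, 1.5, n)  # arc positions
    x = r * (np.cos(phi0) + np.cos(theta)) + rng.normal(0, noise, n)
    y = r * (np.sin(phi0) + np.sin(theta)) + rng.normal(0, noise, n)
    return x, y

# Three synthetic tracks plus 100 random noise hits.
true_tracks = [(1.0, 0.3), (1.5, 2.0), (0.7, 4.5)]
pts = [track_hits(r, p) for r, p in true_tracks]
pts.append((rng.uniform(-2, 2, 100), rng.uniform(-2, 2, 100)))
x = np.concatenate([p[0] for p in pts])
y = np.concatenate([p[1] for p in pts])

# Fill the Hough accumulator over a grid of (phi0, r) bins.
phis = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r_edges = np.linspace(0.2, 2.5, 61)
acc = np.zeros((len(phis), len(r_edges) - 1))
for xi, yi in zip(x, y):
    d = xi * np.cos(phis) + yi * np.sin(phis)
    ok = d > 1e-6                                   # keep positive radii
    r = (xi**2 + yi**2) / (2 * d[ok])
    cols = np.searchsorted(r_edges, r) - 1
    good = (cols >= 0) & (cols < acc.shape[1])
    acc[np.flatnonzero(ok)[good], cols[good]] += 1

# The strongest accumulator peaks should sit near the true (phi0, r) pairs.
for _ in range(len(true_tracks)):
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    print(f"found track: phi0 ~ {phis[i]:.2f}, r ~ {r_edges[j]:.2f}")
    acc[max(0, i - 5):i + 6, max(0, j - 3):j + 4] = 0  # suppress this peak
```

Real detectors work in three dimensions with vastly messier data and far tighter time budgets, which is why the full problem calls for faster, smarter algorithms.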

Vlimant and other physicists are also beginning to consider more untested technologies, such as neuromorphic computing and quantum computing. “It’s not clear where we’re going,” says Vlimant, “but it looks like we have a good path.”


CERN May Have Evidence of a Quasiparticle We’ve Been Hunting For Decades


Meet the elusive odderon.

The Large Hadron Collider (LHC) is the particle accelerator that just keeps on giving, and recent experiments at the site suggest we’ve got the first evidence for a mysterious subatomic quasiparticle that, until now, was only a hypothesis.

Quasiparticles aren’t technically particles, but they act like them in some respects, and the newly recorded reactions point to a particular quasiparticle called the odderon.

It already has a name because physicists have been on its theoretical trail for the past 40 years.

They still haven’t seen the elusive odderon itself, but researchers have now observed certain effects that hint the quasiparticle really is there.

That would in turn give us new information to feed into the Standard Model of particle physics, the guidebook that all the building blocks of physical matter are thought to follow.

“This doesn’t break the Standard Model, but there are very opaque regions of the Standard Model, and this work shines a light on one of those opaque regions,” says one of the team, particle physicist Timothy Raben from the University of Kansas.

“These ideas date back to the 70s, but even at that time it quickly became evident we weren’t close technologically to being able to see the odderon, so while there are several decades of predictions, the odderon has not been seen.”

The reactions studied in this case involve quarks, or electrically charged subatomic particles, and gluons, which act as exchange particles between quarks and enable them to stick together to form protons and neutrons.

In proton collisions where the protons remain intact, up until now scientists have only seen this happen when an even number of gluons are exchanged between different protons. The new research notes, for the first time, these reactions happening with an odd number of gluons.

And it’s the way the protons deviate rather than break that’s important for this particular area of investigation. It was this phenomenon that first led to the idea of a quasiparticle called an odderon, to explain collisions where protons survived.

“The odderon is one of the possible ways by which protons can interact without breaking, whose manifestations have never been observed … this could be the first evidence of that,” Simone Giani, spokesperson for the TOTEM experiment, of which this work is a part, told Ryan F. Mandelbaum at Gizmodo.

It’s a pretty complex idea to wrap your head around, so the researchers have used a vehicle metaphor to explain what’s going on.

“The protons interact like two big semi-trucks that are transporting cars, the kind you see on the highway,” explains Raben.

“If those trucks crashed together, after the crash you’d still have the trucks, but the cars would now be outside, no longer aboard the trucks – and also new cars are produced. Energy is transformed into matter.”

“Until now, most models were thinking there was a pair of gluons – always an even number… We found measurements that are incompatible with this traditional model of assuming an even number of gluons.”

What all of that theoretical physics and subatomic analysis means is that we may have seen evidence of the odderon at work – with the odderon being the total contribution produced from the exchange of an odd number of gluons.

The experiments involved a team of over 100 physicists, colliding billions of proton pairs together every second in the LHC. At its peak, data was being collected at 13 teraelectronvolts (TeV), a new record.

By comparing these high energy tests with results gleaned from other tests run on less powerful hardware, the researchers could reach a new level of accuracy in their proton collision measurements, and that may have revealed the odderon.

Ultimately this kind of super-high energy experiment can feed into all kinds of areas of research, including medicine, water purification, and cosmic ray measurement.

We’re still waiting for confirmation that this legendary quasiparticle has in fact been found – or at least that its effects have – and the papers have been submitted for publication in peer-reviewed journals.

But it’s definitely a super-exciting time for physicists.

“We expect big results in the coming months or years,” says one of the researchers, Christophe Royon from the University of Kansas.

The research is currently undergoing peer review, but you can read the studies on the arXiv.org and CERN pre-print servers.

Neutrinos Suggest Solution to Mystery of Universe’s Existence


Updated results from a Japanese neutrino experiment continue to reveal an inconsistency in the way that matter and antimatter behave.

A neutrino passing through the Super-Kamiokande experiment creates a telltale light pattern on the detector walls.

From above, you might mistake the hole in the ground for a gigantic elevator shaft. Instead, it leads to an experiment that might reveal why matter didn’t disappear in a puff of radiation shortly after the Big Bang.

I’m at the Japan Proton Accelerator Research Complex, or J-PARC — a remote and well-guarded government facility in Tokai, about an hour’s train ride north of Tokyo. The experiment here, called T2K (for Tokai-to-Kamioka), produces a beam of the subatomic particles called neutrinos. The beam travels through 295 kilometers of rock to the Super-Kamiokande (Super-K) detector, a gigantic pit buried 1 kilometer underground and filled with 50,000 tons (about 13 million gallons) of ultrapure water. During the journey, some of the neutrinos will morph from one “flavor” into another.

In this ongoing experiment, the first results of which were reported last year, scientists at T2K are studying the way these neutrinos flip in an effort to explain the predominance of matter over antimatter in the universe. During my visit, physicists explained to me that an additional year’s worth of data was in, and that the results are encouraging.

The Big Bang should have produced matter and antimatter in equal amounts, which would then have annihilated each other completely, leaving a universe empty of matter. Clearly that didn’t happen, and researchers don’t know why. “There must be some particle reactions that happen differently for matter and antimatter,” said Morgan Wascko, a physicist at Imperial College London. Antimatter might decay in a way that differs from how matter decays, for example. If so, it would violate an idea called charge-parity (CP) symmetry, which states that the laws of physics shouldn’t change if matter particles swap places with their antiparticles (charge) while viewed in a mirror (parity). The symmetry holds for most particles, though not all. (The subatomic particles known as quarks violate CP symmetry, but the deviations are so small that they can’t explain why matter so dramatically outnumbers antimatter in the universe.)

Last year, the T2K collaboration announced the first evidence that neutrinos might break CP symmetry, thus potentially explaining why the universe is filled with matter. “If there is CP violation in the neutrino sector, then this could easily account for the matter-antimatter difference,” said Adrian Bevan, a particle physicist at Queen Mary University of London.

Researchers check for CP violations by studying differences between the behavior of matter and antimatter. In the case of neutrinos, the T2K scientists explore how neutrinos and antineutrinos oscillate, or change, as the particles make their way to the Super-K detector. In 2016, 32 muon neutrinos changed to electron neutrinos on their way to Super-K. When the researchers sent muon antineutrinos, only four became electron antineutrinos.

That result got the community excited — although most physicists were quick to point out that with such a small sample size, there was still a 10 percent chance that the difference was merely a random fluctuation. (By comparison, the 2012 Higgs boson discovery had less than a 1-in-1 million probability that the signal was due to chance.)

This year, researchers collected nearly twice the amount of neutrino data as last year. Super-K captured 89 electron neutrinos, significantly more than the 67 it should have found if there was no CP violation. And the experiment spotted only seven electron antineutrinos, two fewer than expected.
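As a rough illustration of why those counts are suggestive but not yet decisive, one can ask how often plain Poisson counting fluctuations would produce them. This is only a back-of-envelope sketch, not the collaboration's actual likelihood analysis, and it uses just the expected counts quoted above:

```python
# Naive Poisson probabilities for the counts quoted in the text,
# under the assumption of no CP violation.
from scipy.stats import poisson

p_excess = poisson.sf(88, 67)    # P(N >= 89) when 67 are expected
p_deficit = poisson.cdf(7, 9)    # P(N <= 7)  when 9 are expected
print(f"neutrino excess:      p ~ {p_excess:.3f}")
print(f"antineutrino deficit: p ~ {p_deficit:.3f}")
# These naive numbers ignore backgrounds, systematic uncertainties and
# the fit over oscillation parameters; the full analysis is what yields
# the roughly 1-in-20 chance of a fluke mentioned below.
```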

 

Researchers aren’t claiming a discovery just yet. Because there are still so few data points, “there’s still a 1-in-20 chance it’s just a statistical fluke and there isn’t even any violation of CP symmetry,” said Phillip Litchfield, a physicist at Imperial College London. For the results to become truly significant, he added, the experiment needs to get down to about a 3-in-1000 chance, which researchers hope to reach by the mid-2020s.

But the improvement on last year’s data, while modest, is “in a very interesting direction,” said Tom Browder, a physicist at the University of Hawaii. The hints of new physics haven’t yet gone away, as we might expect them to do if the initial results were due to chance. Results are also trickling in from another experiment, the 810-kilometer-long NOvA at the Fermi National Accelerator Laboratory outside Chicago. Last year it released its first set of neutrino data, with antineutrino results expected next summer. And although these first CP-violation results will also not be statistically significant, if the NOvA and T2K experiments agree, “the consistency of all these early hints” will be intriguing, said Mark Messier, a physicist at Indiana University.

A planned upgrade of the Super-K detector might give the researchers a boost. Next summer, the detector will be drained for the first time in over a decade, then filled again with ultrapure water. This water will be mixed with gadolinium sulfate, a type of salt that should make the instrument much more sensitive to electron antineutrinos. “The gadolinium doping will make the electron antineutrino interaction easily detectable,” said Browder. That is, the salt will help the researchers to separate antineutrino interactions from neutrino interactions, improving their ability to search for CP violations.

“Right now, we are probably willing to bet that CP is violated in the neutrino sector, but we won’t be shocked if it is not,” said André de Gouvêa, a physicist at Northwestern University. Wascko is a bit more optimistic. “The 2017 T2K result has not yet clarified our understanding of CP violation, but it shows great promise for our ability to measure it precisely in the future,” he said. “And perhaps the future is not as far away as we might have thought last year.”

The Large Hadron Collider Just Detected a New Particle That’s Heavier Than a Proton


The Large Hadron Collider has once again done what it does best – smash bits of matter together and find new particles in the carnage.

This time physicists have come across a real charmer. It’s four times heavier than a proton and could help challenge some ideas about how this kind of matter sticks together.

We’ve seen a lot of interesting new particles from CERN’s Large Hadron Collider “beauty” (LHCb) collaboration, which is a little sister to the ATLAS and CMS experiments that brought us the famous Higgs boson a few years back.

The experiments run in CERN’s colliders all involve accelerating matter and then bringing it to a quick stop. The resulting burst of energy results in a shower of particles with different properties, most of which we’re pretty familiar with.

Running these experiments over and over again and doing the maths on the sizes and behaviours of the particles as they form and interact with one another can occasionally provide something different.

We can now officially add a new kind of baryon to the zoo of particles, one that was already predicted to exist but never before seen.

The two baryons you’re no doubt most familiar with are the ones that make up an atom’s nucleus, called protons and neutrons.

Baryons are effectively triplets of smaller particles called quarks, which are elementary particles, meaning they aren’t made up of anything smaller themselves.

Quarks come in a variety of flavours, oddly called up, down, top, bottom, charm, and strange. It’s combinations of these that give us different baryons. Current models predict there are a bunch of ways quarks can make baryons, with some more common than others.

Protons consist of two ups and a down quark, while neutrons are two downs and an up. These quarks stick together under what’s called the strong nuclear force, which is caused by the swapping of particles called gluons. Never let it be said that physicists lack a sense of humour.

This new baryon – made when two charm quarks and a single up quark bind together – was given the less whimsical name Ξcc++ (Xi cc++), so they can’t all be winners.

Quarks have different masses, and charm is a beefy one. That makes this baryon a touch on the heavy side, which is good news for particle physicists.

“Finding a doubly heavy-quark baryon is of great interest as it will provide a unique tool to further probe quantum chromodynamics, the theory that describes the strong interaction, one of the four fundamental forces,” said Giovanni Passaleva, the spokesperson for the LHCb collaboration.

Seeing how this particle keeps itself together compared to the predictions made by current models will help give the going theories a good shake.

Being made of two heavy quarks should give Ξcc++ a slightly different structure to protons and neutrons.

“In contrast to other baryons, in which the three quarks perform an elaborate dance around each other, a doubly heavy baryon is expected to act like a planetary system, where the two heavy quarks play the role of heavy stars orbiting one around the other, with the lighter quark orbiting around this binary system,” says former collaboration spokesperson Guy Wilkinson.

If you’re wondering where this baryon has been hiding all this time, like many particles it doesn’t hang around very long. It wasn’t seen directly, but was recognised by the particles it broke into.

The LHCb experiment is a champion at spotting these kinds of decay products, as well as making heavy quarks.

The discovery has a high statistical significance of 7 sigma. Physicists break out the champagne at 5 sigma, so we can be pretty confident Ξcc++ was produced.

If you’re playing Standard Model bingo, that’s one more to cross off your list.

There’s still a lot we don’t know about the proton.


Questions loom about the iconic particle’s size, spin and decay


PROTON PUZZLES  Hidden secrets of the humble particle could have physicists rethinking some standard notions about matter and the universe.

Nuclear physicist Evangeline Downie hadn’t planned to study one of the thorniest puzzles of the proton.

But when opportunity knocked, Downie couldn’t say no. “It’s the proton,” she exclaims. The mysteries that still swirl around this jewel of the subatomic realm were too tantalizing to resist. The plentiful particles make up much of the visible matter in the universe. “We’re made of them, and we don’t understand them fully,” she says.

Many physicists delving deep into the heart of matter in recent decades have been lured to the more exotic and unfamiliar subatomic particles: mesons, neutrinos and the famous Higgs boson — not the humble proton.

Protons have issues

Three proton conundrums have scientists designing new experiments. Agreement eludes researchers on proton size, spin and stability.

Radius

Current status: Two kinds of measurements of the proton’s radius disagree.

Why do we care? Testing theories of how particles interact requires a precise radius measurement. If the discrepancy persists, it may mean that new, undiscovered particles exist.

Spin

Current status: Scientists can’t account for the sources of the proton’s known spin.

Why do we care? Understanding the spin would satisfy fundamental scientific curiosity about how the proton works.

Life span

Current status: Despite decades of searching, no one has ever seen a proton decay.

Why do we care? Proton decay would be a sign that three of nature’s forces — weak, strong and electromagnetic — were united early in the universe.

But rather than chasing the rarest of the rare, scientists like Downie are painstakingly scrutinizing the proton itself with ever-higher precision. In the process, some of these proton enthusiasts have stumbled upon problems in areas of physics that scientists thought they had figured out.

Surprisingly, some of the particle’s most basic characteristics are not fully pinned down. The latest measurements of its radius disagree with one another by a wide margin, for example, a fact that captivated Downie. Likewise, scientists can’t yet explain the source of the proton’s spin, a basic quantum property. And some physicists have a deep but unconfirmed suspicion that the seemingly eternal particles don’t live forever — protons may decay. Such a decay is predicted by theories that unite disparate forces of nature under one grand umbrella. But decay has not yet been witnessed.

Like the base of a pyramid, the physics of the proton serves as a foundation for much of what scientists know about the behavior of matter. To understand the intricacies of the universe, says Downie, of George Washington University in Washington, D.C., “we have to start with, in a sense, the simplest system.”

Sizing things up

For most of the universe’s history, protons have been VIPs — very important particles. They formed just millionths of a second after the Big Bang, once the cosmos cooled enough for the positively charged particles to take shape. But protons didn’t step into the spotlight until about 100 years ago, when Ernest Rutherford bombarded nitrogen with radioactively produced particles, breaking up the nuclei and releasing protons.

A single proton in concert with a single electron makes up hydrogen — the most plentiful element in the universe. One or more protons are present in the nucleus of every atom. Each element has a unique number of protons, signified by an element’s atomic number. In the core of the sun, fusing protons generate heat and light needed for life to flourish. Lone protons are also found as cosmic rays, whizzing through space at breakneck speeds, colliding with Earth’s atmosphere and producing showers of other particles, such as electrons, muons and neutrinos.

In short, protons are everywhere. Even minor tweaks to scientists’ understanding of the minuscule particle, therefore, could have far-reaching implications. So any nagging questions, however small in scale, can get proton researchers riled up.

A disagreement of a few percent in measurements of the proton’s radius has attracted intense interest, for example. Until several years ago, scientists agreed: The proton’s radius was about 0.88 femtometers, or 0.88 millionths of a billionth of a meter — about a trillionth the width of a poppy seed.

Ladder of matter

Protons make up a large part of the universe’s visible matter and play an essential role in atomic nuclei. But the building block is still revealing surprises.

Core components

Atoms are made of protons (red) and neutrons (blue), surrounded by a cloud of electrons. The proton number determines the element.

Going inside

Protons have two “up” quarks and one “down” quark. Neutrons have two downs and one up.

Deeper dive

But protons and neutrons contain much more. Quark-antiquark pairs constantly form and annihilate around the three persistent quarks. Gluons (yellow) hold the quarks together via the strong nuclear force. Quarks have a property called “color charge” — shown here as red, green and blue — which is related to the strong force.

But that neat picture was upended in the span of a few hours, in May 2010, at the Precision Physics of Simple Atomic Systems conference in Les Houches, France. Two teams of scientists presented new, more precise measurements, unveiling what they thought would be the definitive size of the proton. Instead the figures disagreed by about 4 percent (SN: 7/31/10, p. 7). “We both expected that we would get the same number, so we were both surprised,” says physicist Jan Bernauer of MIT.

By itself, a slight revision of the proton’s radius wouldn’t upend physics. But despite extensive efforts, the groups can’t explain why they get different numbers. As researchers have eliminated simple explanations for the impasse, they’ve begun wondering if the mismatch could be the first hint of a breakdown that could shatter accepted tenets of physics.

The two groups each used different methods to size up the proton. In an experiment at the MAMI particle accelerator in Mainz, Germany, Bernauer and colleagues estimated the proton’s girth by measuring how much electrons’ trajectories were deflected when fired at protons. That test found the expected radius of about 0.88 femtometers (SN Online: 12/17/10).

But a team led by physicist Randolf Pohl of the Max Planck Institute of Quantum Optics in Garching, Germany, used a new, more precise method. The researchers created muonic hydrogen, a proton that is accompanied not by an electron but by a heftier cousin — a muon.

In an experiment at the Paul Scherrer Institute in Villigen, Switzerland, Pohl and collaborators used lasers to bump the muons to higher energy levels. The amount of energy required depends on the size of the proton. Because the more massive muon hugs closer to the proton than electrons do, the energy levels of muonic hydrogen are more sensitive to the proton’s size than ordinary hydrogen, allowing for measurements 10 times as precise as electron-scattering measurements.

Pohl’s results suggested a smaller proton radius, about 0.841 femtometers, a stark difference from the other measurement. Follow-up measurements of muonic deuterium — which has a proton and a neutron in its nucleus — also revealed a smaller than expected size, he and collaborators reported last year in Science. Physicists have racked their brains to explain why the two measurements don’t agree. Experimental error could be to blame, but no one can pinpoint its source. And the theoretical physics used to calculate the radius from the experimental data seems solid.
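The quoted sensitivity advantage follows from textbook hydrogen-atom scaling: the orbit size (the Bohr radius) shrinks in proportion to the orbiting particle's reduced mass, and the finite-size energy shift grows roughly as the cube of that mass. A minimal check using standard mass ratios, plus the size of the disagreement itself:

```python
# How much closer the muon sits to the proton than an electron does.
m_e, m_mu, m_p = 1.0, 206.77, 1836.15   # masses in units of the electron mass

def reduced(m, M):
    """Reduced mass of a two-body bound system."""
    return m * M / (m + M)

scale = reduced(m_mu, m_p) / reduced(m_e, m_p)
print(f"muonic orbit is ~{scale:.0f}x smaller")              # ~186
print(f"finite-size energy shift ~{scale**3:.1e}x larger")   # ~6.4e+06

# The two radius measurements quoted above disagree by about 4 percent:
r_scatter, r_muonic = 0.88, 0.841       # femtometers
print(f"disagreement: {100 * (r_scatter - r_muonic) / r_scatter:.1f}%")  # 4.4%
```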

Now, more outlandish possibilities are being tossed around. An unexpected new particle that interacts with muons but not electrons could explain the difference (SN: 2/23/13, p. 8). That would be revolutionary: Physicists believe that electrons and muons should behave identically in particle interactions. “It’s a very sacred principle in theoretical physics,” says John Negele, a theoretical particle physicist at MIT. “If there’s unambiguous evidence that it’s been broken, that’s really a fundamental discovery.”

But established physics theories die hard. Shaking the foundations of physics, Pohl says, is “what I dream of, but I think that’s not going to happen.” Instead, he suspects, the discrepancy is more likely to be explained through minor tweaks to the experiments or the theory.

The alluring mystery of the proton radius reeled Downie in. During conversations in the lab with some fellow physicists, she learned of an upcoming experiment that could help settle the issue. The experiment’s founders were looking for collaborators, and Downie leaped at the chance. The Muon Proton Scattering Experiment, or MUSE, to take place at the Paul Scherrer Institute beginning in 2018, will scatter both electrons and muons off of protons and compare the results. It offers a way to test whether the two particles behave differently, says Downie, who is now a spokesperson for MUSE.

A host of other experiments are in progress or planning stages. Scientists with the Proton Radius Experiment, or PRad, located at Jefferson Lab in Newport News, Va., hope to improve on Bernauer and colleagues’ electron-scattering measurements. PRad researchers are analyzing their data and should have a new number for the proton radius soon.

But for now, the proton’s identity crisis, at least regarding its size, remains. That poses problems for ultrasensitive tests of one of physicists’ most essential theories. Quantum electrodynamics, or QED, the theory that unites quantum mechanics and Albert Einstein’s special theory of relativity, describes the physics of electromagnetism on small scales. Using this theory, scientists can calculate the properties of quantum systems, such as hydrogen atoms, in exquisite detail — and so far the predictions match reality. But such calculations require some input — including the proton’s radius. Therefore, to subject the theory to even more stringent tests, gauging the proton’s size is a must-do task.

At the Max Planck Institute of Quantum Optics, researchers use lasers to study proton size.

Spin doctors

Even if scientists eventually sort out the proton’s size snags, there’s much left to understand. Dig deep into the proton’s guts, and the seemingly simple particle becomes a kaleidoscope of complexity. Rattling around inside each proton is a trio of particles called quarks: one negatively charged “down” quark and two positively charged “up” quarks. Neutrons, on the flip side, comprise two down quarks and one up quark.

Yet even the quark-trio picture is too simplistic. In addition to the three quarks that are always present, a chaotic swarm of transient particles churns within the proton. Evanescent throngs of additional quarks and their antimatter partners, antiquarks, continually swirl into existence, then annihilate each other. Gluons, the particle “glue” that holds the proton together, careen between particles. Gluons are the messengers of the strong nuclear force, an interaction that causes quarks to fervently attract one another.

A new spin

Scientists thought that a proton’s spin was due to the three main quarks (left, arrows indicate direction of a quark’s spin). Instead, gluons (yellow) and ephemeral pairs of quarks and antiquarks contribute through their spin and motion (gray arrows at right).

As a result of this chaos, the properties of protons — and neutrons as well — are difficult to get a handle on. One property, spin, has taken decades of careful investigation, and it’s still not sorted out. Quantum particles almost seem to be whirling at blistering speed, like the Earth rotating about its axis. This spin produces angular momentum — a quality of a rotating object that, for example, keeps a top revolving until friction slows it. The spin also makes protons behave like tiny magnets, because a rotating electric charge produces a magnetic field. This property is the key to the medical imaging procedure called magnetic resonance imaging, or MRI.

But, like nearly everything quantum, there’s some weirdness mixed in: There’s no actual spinning going on. Because fundamental particles like quarks don’t have a finite physical size — as far as scientists know — they can’t twirl. Despite the lack of spinning, the particles still behave like they have a spin, which can take on only certain values: integer multiples of ½.

Quarks have a spin of ½, and gluons a spin of 1. These spins combine to help yield the proton’s total spin. In addition, just as the Earth is both spinning about its own axis and orbiting the sun, quarks and gluons may also circle about the proton’s center, producing additional angular momentum that can contribute to the proton’s total spin.

Somehow, the spin and orbital motion of quarks and gluons within the proton combine to produce its spin of ½. Originally, physicists expected that the explanation would be simple. The only particles that mattered, they thought, were the proton’s three main quarks, each with a spin of ½. If two of those spins were oriented in opposite directions, they could cancel one another out to produce a total spin of ½. But experiments beginning in the 1980s showed that “this picture was very far from true,” says theoretical high-energy physicist Juan Rojo of Vrije Universiteit Amsterdam. Surprisingly, only a small fraction of the spin seemed to be coming from the quarks, befuddling scientists with what became known as the “spin crisis” (SN: 9/6/97, p. 158). Neutron spin was likewise enigmatic.
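Physicists keep track of these pieces with a standard bit of bookkeeping. In one conventional decomposition (symbols as usually defined: ΔΣ for the quarks’ combined spin, ΔG for the gluons’ spin, and L_q and L_g for the quarks’ and gluons’ orbital motion), the proton’s spin of ½ is written as:

    1/2 = (1/2)ΔΣ + ΔG + L_q + L_g

The “spin crisis” just described amounts to the discovery that ΔΣ, once expected to account for nearly everything, turned out to be surprisingly small.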

Scientists’ next hunch was that gluons contribute to the proton’s spin. “Verifying this hypothesis was very difficult,” Rojo says. It required experimental studies at the Relativistic Heavy Ion Collider, RHIC, a particle accelerator at Brookhaven National Laboratory in Upton, N.Y.

In these experiments, scientists collided protons that were polarized: The two protons’ spins were either aligned or pointed in opposite directions. Researchers counted the products of those collisions and compared the results for aligned and opposing spins. The results revealed how much of the spin comes from gluons. According to an analysis by Rojo and colleagues, published in Nuclear Physics B in 2014, gluons make up about 35 percent of the proton’s spin. Since the quarks make up about 25 percent, that leaves another 40 percent still unaccounted for.

“We have absolutely no idea how the entire spin is made up,” says nuclear physicist Elke-Caroline Aschenauer of Brookhaven. “We maybe have understood a small fraction of it.” That’s because each quark or gluon carries a certain fraction of the proton’s energy, and the lowest energy quarks and gluons cannot be spotted at RHIC. A proposed collider, called the Electron-Ion Collider (location to be determined), could help scientists investigate the neglected territory.

The Electron-Ion Collider could also allow scientists to map the still-unmeasured orbital motion of quarks and gluons, which may contribute to the proton’s spin as well.

The PHENIX experiment at Brookhaven National Laboratory uses a giant detector to investigate spin.

An unruly force

Experimental physicists get little help from theoretical physics when attempting to unravel the proton’s spin and its other perplexities. “The proton is not something you can calculate from first principles,” Aschenauer says. Quantum chromodynamics, or QCD — the theory of the quark-corralling strong force transmitted by gluons — is an unruly beast. It is so complex that scientists can’t directly solve the theory’s equations.

The difficulty lies with the behavior of the strong force. As long as quarks and their companions stick relatively close, they are happy and can mill about the proton at will. But absence makes the heart grow fonder: The farther apart the quarks get, the more insistently the strong force pulls them back together, containing them within the proton. This behavior explains why no one has found a single quark in isolation. It also makes the proton’s properties especially difficult to calculate. Without accurate theoretical calculations, scientists can’t predict what the proton’s radius should be, or how the spin should be divvied up.
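A common phenomenological way to picture this, not a first-principles result, is to model the energy between a quark and an antiquark with the so-called Cornell potential; the coefficients below are typical textbook values, quoted for illustration:

    V(r) ≈ -(4/3) α_s ħc / r + σ r,   with string tension σ ≈ 1 GeV per femtometer

At small separations r, the Coulomb-like first term dominates and the quarks move almost freely; at large r, the linear term grows without limit, so pulling the pair apart costs ever more energy, and it eventually becomes cheaper to create a new quark-antiquark pair than to liberate a lone quark.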

Cosmic time

Protons have been around since the early moments of the universe. If certain theories are correct, the universe may eventually be devoid of protons.

t = 0: Big Bang.
t < 10^-6 seconds: Quarks and gluons roam freely.
t = 10^-6 seconds: Protons and neutrons form.
t = 10 seconds: Protons and neutrons begin to form atomic nuclei.
t = 13.8 billion years: In today’s universe, atoms have formed into stars, planets and intelligent life.
t = 10^34 years or later: A substantial portion of protons may have decayed.

To simplify the math of the proton, physicists use a technique called lattice QCD, in which they imagine that the world is made of a grid of points in space and time (SN: 8/7/04, p. 90). A quark can sit at one point or another in the grid, but not in the spaces in between. Time, likewise, proceeds in jumps. In such a situation, QCD becomes more manageable, though calculations still require powerful supercomputers.
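As a toy illustration only of what “a grid of points in space and time” means (real lattice QCD tracks SU(3) matrices on the links between sites and samples configurations by Monte Carlo on supercomputers; every name and value below is made up for the sketch):

    import numpy as np

    # Toy lattice: spacetime is replaced by a finite 4-D grid of sites.
    T, L = 16, 8                # temporal and spatial extents, in lattice units
    a = 0.1                     # lattice spacing between neighboring sites

    # A toy complex field: one value per site of the (T, L, L, L) grid.
    rng = np.random.default_rng(seed=0)
    phi = rng.normal(size=(T, L, L, L)) + 1j * rng.normal(size=(T, L, L, L))

    # Derivatives become finite differences between neighboring sites;
    # np.roll wraps around the edges, giving periodic boundary conditions.
    dphi_dt = (np.roll(phi, -1, axis=0) - phi) / a

    print(phi.shape, float(abs(dphi_dt).mean()))

The payoff of the discretization is exactly this: fields become finite arrays and derivatives become differences, which a computer can crunch.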

Lattice QCD calculations of the proton’s spin are making progress, but there’s still plenty of uncertainty. In 2015, theoretical particle and nuclear physicist Keh-Fei Liu and colleagues calculated the spin contributions from the gluons, the quarks’ spin and the quarks’ orbital angular momentum, reporting the results in Physical Review D. By their calculation, about half of the spin comes from the quarks’ orbital motion within the proton, about a quarter from the quarks’ spin, and the last quarter or so from the gluons. The numbers don’t exactly match the experimental measurements, but that’s understandable — the lattice QCD numbers are still fuzzy. The calculation relies on various approximations, so it “is not cast in stone,” says Liu, of the University of Kentucky in Lexington.

Death of a proton

Although protons seem to live forever, scientists have long questioned that immortality. Some popular theories predict that protons decay, disintegrating into other particles over long timescales. Yet despite extensive searches, no hint of this demise has materialized.

A class of ideas known as grand unified theories predicts that protons eventually succumb. These theories unite three of the forces of nature, creating a single framework that could explain electromagnetism, the strong nuclear force and the weak nuclear force, which is responsible for certain types of radioactive decay. (Nature’s fourth force, gravity, is not yet incorporated into these models.) Under such unified theories, the three forces reach equal strengths at extremely high energies. Such energetic conditions were present in the early universe — well before protons formed — just a trillionth of a trillionth of a trillionth of a second after the Big Bang. As the cosmos cooled, those forces would have separated into three different facets that scientists now observe.

A proton’s last moments

If theories that unite fundamental forces are correct, protons should decay, with average lifetimes longer than the age of the universe. Scientists watch giant tanks of water for the telltale signatures of proton death. One possible type of decay is described below.


1. A proton, made of three quarks, awaits its fate.
2. In an extremely rare event, two of its quarks unite, producing a new particle.
3. The new particle, an X boson, persists for a brief instant.
4. The X boson releases a positron and an antiquark. The antiquark pairs with the remaining quark to form a pion; the proton is no more.
5. Finally, the pion decays into two photons, which can be detected, along with the positron.
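In symbols, the net effect of the chain above, and the channel most often searched for, is:

    p → e⁺ + π⁰,   followed by   π⁰ → γ + γ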

“We have a lot of circumstantial evidence that something like unification must be happening,” says theoretical high-energy physicist Kaladi Babu of Oklahoma State University in Stillwater. Beyond the appeal of uniting the forces, grand unified theories could explain some curious coincidences of physics, such as the fact that the proton’s electric charge precisely balances the electron’s charge. Another bonus is that the particles in grand unified theories fill out a family tree, with quarks becoming the kin of electrons, for example.

Under these theories, a decaying proton would disintegrate into other particles, such as a positron (the antimatter version of an electron) and a particle called a pion, composed of a quark and an antiquark, which itself eventually decays. If such a grand unified theory is correct and protons do decay, the process must be extremely rare — protons must live a very long time, on average, before they break down. If most protons decayed rapidly, atoms wouldn’t stick around long either, and the matter that makes up stars, planets — even human bodies — would be falling apart left and right.

Protons have existed for 13.8 billion years, since just after the Big Bang. So they must live exceedingly long lives, on average. But the particles could perish at even longer timescales. If they do, scientists should be able to monitor many particles at once to see a few protons bite the dust ahead of the curve (SN: 12/15/79, p. 405). But searches for decaying protons have so far come up empty.

Still, the search continues. To hunt for decaying protons, scientists go deep underground, for example, to a mine in Hida, Japan. There, at the Super-Kamiokande experiment (SN: 2/18/17, p. 24), they monitor a giant tank of water — 50,000 metric tons’ worth — waiting for a single proton to wink out of existence. After watching that water tank for nearly two decades, the scientists reported in the Jan. 1 Physical Review D that protons must live longer than 1.6 × 10^34 years on average, assuming they decay predominantly into a positron and a pion.
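A back-of-envelope count shows why one tank can probe lifetimes so much longer than the age of the universe. The arithmetic below is illustrative only; the published limit also folds in detection efficiency, background rates and statistical confidence:

    # Roughly how many protons sit in 50,000 metric tons of water?
    AVOGADRO = 6.022e23
    grams = 50_000 * 1e6             # 50,000 metric tons, in grams
    moles = grams / 18.0             # molar mass of H2O is about 18 g/mol
    protons = moles * AVOGADRO * 10  # 10 protons per molecule: 2 H nuclei + 8 in O

    years = 20                       # roughly two decades of watching
    exposure = protons * years       # proton-years observed with no decay seen

    print(f"protons: {protons:.1e}")    # ~1.7e34 protons
    print(f"exposure: {exposure:.1e}")  # ~3e35 proton-years

Seeing zero decays over some 10^35 proton-years is what lets the collaboration exclude average lifetimes below about 10^34 years.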

Experimental limits on the proton lifetime “are sort of painting the theorists into a corner,” says Ed Kearns of Boston University, who searches for proton decay with Super-K. If a new theory predicts a proton lifetime shorter than what Super-K has measured, it’s wrong. Physicists must go back to the drawing board until they come up with a theory that agrees with Super-K’s proton-decay drought.

Many grand unified theories that remain standing in the wake of Super-K’s measurements incorporate supersymmetry, the idea that each known particle has another, more massive partner. In such theories, those new particles are additional pieces in the puzzle, fitting into an even larger family tree of interconnected particles. But theories that rely on supersymmetry may be in trouble. “We would have preferred to see supersymmetry at the Large Hadron Collider by now,” Babu says, referring to the particle accelerator located at the European particle physics lab, CERN, in Geneva, which has consistently come up empty in supersymmetry searches since it turned on in 2009 (SN: 10/1/16, p. 12).


Persnickety protons

Scientists might solve some of their proton dilemmas with new data — for example, by spotting a proton decaying into a positron and two photons, as in the simulated data from the Super-Kamiokande detector below. But plenty more questions await exploration.

ED KEARNS/SUPER-KAMIOKANDE COLLABORATION

Why are quarks confined within the proton? Scientists observe that quarks don’t live on their own, but no one has been able to mathematically demonstrate that they can’t.

How are the quarks and gluons arranged inside the proton? Gluons might be more common on the proton’s outskirts than in its center, for example.

Each quark and gluon carries a certain amount of the proton’s energy. How is that energy divvied up?

Aside from their electric charges, protons and antiprotons appear the same. Do they differ on some level not yet measured?

But supersymmetric particles could simply be too massive for the LHC to find. And some grand unified theories that don’t require supersymmetry still remain viable. Versions of these theories predict proton lifetimes within reach of an upcoming generation of experiments. Scientists plan to follow up Super-K with Hyper-K, with an even bigger tank of water. And DUNE, the Deep Underground Neutrino Experiment, planned for installation in a former gold mine in Lead, S.D., will use liquid argon to detect protons decaying into particles that the water detectors might miss.

If protons do decay, the universe will become frail in its old age. According to Super-K, sometime well after its 10^34th birthday, the cosmos will become a barren sea of light. Stars, planets and life will disappear. If seemingly dependable protons give in, it could spell the death of the universe as we know it.

Although protons may eventually become extinct, proton research isn’t going out of style anytime soon. Even if scientists resolve the dilemmas of radius, spin and lifetime, more questions will pile up — it’s part of the labyrinthine task of studying quantum particles that multiply in complexity the closer scientists look. These deeper studies are worthwhile, says Downie. The inscrutable proton is “the most fundamental building block of everything, and until we understand that, we can’t say we understand anything else.”

Source: www.sciencenews.org

“Indistinguishable Photons” Could Unleash Quantum Computing.


Researchers have discovered an entirely new way of generating “indistinguishable photons,” the hard-to-create particles of light that quantum computers need in order to carry and process information. It’s a crucial step in the development of quantum technology, and, naturally, it ties back to artificial intelligence. The research was published Tuesday in the journal Applied Physics Letters.

Quantum Computing

The phrase “indistinguishable photons” refers to photons that cannot be told apart from one another, and they’re a vital resource for photon-based quantum computers. A regular computer processes and stores information in binary, meaning the bits are always either 0 or 1. A quantum computer, though, harnesses principles of quantum mechanics so that the bits (cutely named “qubits”) can also be 0 and 1 simultaneously.
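In the standard notation of quantum mechanics, a qubit’s state is written as a weighted combination of both classical values:

    |ψ⟩ = α|0⟩ + β|1⟩,   with |α|² + |β|² = 1

Measuring the qubit gives 0 with probability |α|² and 1 with probability |β|²; until that measurement, both amplitudes are genuinely present at once.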

Scientists at the University of Tsukuba and Japan’s National Institute for Materials Science forged a new path for creating indistinguishable photons by testing the nitrogen impurity centers within III-V compound semiconductors. Those nitrogen impurities create a bound state known as an “isoelectronic trap,” which generates photons of the same energy, and therefore indistinguishable ones.

This research marks the first time anyone has used the nitrogen luminescence centers found within certain semiconductors to create the phenomenon. There are already a few established sources of identical photons, most notably semiconductor quantum dots (an entirely separate rabbit hole, should you wish to fall down it). But the new method is potentially faster and yields more uniform photons: generating them with quantum dots requires huge numbers of dots, and more dots mean more variability in charge, so the photons aren’t always identical enough.

“[I]ndistinguishable photons are very important for quantum information technology such as quantum teleportation and linear optical quantum computation,” says first author Michio Ikezawa. “Our goal is to be able to provide many photon sources that generate indistinguishable photons in an integrated form in a semiconductor chip.”

Why Quantum Computing Is So Important

“[Q]uantum computer scientists believe quantum computers can solve problems that are intractable for conventional computers. That is, it’s not that quantum computers are like regular computers, but smaller and faster. Rather, quantum computers work according to principles entirely different than conventional computers, and using those principles can solve problems whose solution will never be feasible on a conventional computer,” explained quantum physicist Michael Nielsen in a 2008 blog post.

Even someone like Nielsen, who has worked in quantum computing for more than a decade, struggles to produce an adequate explanation of how quantum computers actually work, and that is, really, the whole point. If they functioned in a way that our brains could readily recognize and interpret, they would just be advanced versions of binary computers, rather than something in a class of their own.

Quantum processing could one day drive the deep learning behind A.I., and the more smoothly we can facilitate the creation of indistinguishable photons, the more capable such systems could become. Because that processing occurs at the single-particle level, it requires photons that are essentially all the same.

How It Could Affect A.I.

It’s too early to say for sure what tangible impacts this will ultimately have on A.I., because the method itself still needs to be refined. The researchers observed that while the degree of indistinguishability they obtained was high, it wasn’t as high as it could be. One of the next steps will be to get a more comprehensive look at the mechanisms that are likely to account for the interference and develop a way to compensate for them. But if and when that research is successful, we could potentially be looking at the new standard for creating identical photons, one which overtakes and improves upon the use of quantum dots to power our next developments in A.I.

“While atoms have long been the gold standard for emitting such indistinguishable photons because of their high stability, there is a race among solid-state emitters such as quantum dots, nitrogen-vacancy centers in diamond, and other color centers to determine the leading candidate for integration with future quantum computers and quantum networks,” Gurudev Dutt, a quantum physicist at the University of Pittsburgh who was not involved in the research, tells Inverse. “This work demonstrates that the [nitrogen centers in semiconductors are] starting to emerge as an important competitor in this arena.”

Source: http://www.inverse.com
