Reality Doesn’t Exist Until We Measure It, Quantum Experiment Confirms

Australian scientists have recreated a famous experiment and confirmed quantum physics’s bizarre predictions about the nature of reality, by proving that reality doesn’t actually exist until we measure it – at least, not on the very small scale.

That all sounds a little mind-meltingly complex, but the experiment poses a pretty simple question: if you have an object that can either act like a particle or a wave, at what point does that object ‘decide’?

Our general logic would assume that the object is either wave-like or particle-like by its very nature, and our measurements will have nothing to do with the answer. But quantum theory predicts that the result all depends on how the object is measured at the end of its journey. And that’s exactly what a team from the Australian National University has now found.

“It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it,” lead researcher and physicist Andrew Truscott said in a press release.

Known as John Wheeler’s delayed-choice thought experiment, it was first proposed back in 1978 using light beams bounced by mirrors, but at the time, the technology needed to carry it out was practically impossible to build. Now, almost 40 years later, the Australian team has managed to recreate the experiment using helium atoms scattered by laser light.

“Quantum physics predictions about interference seem odd enough when applied to light, which seems more like a wave, but to have done the experiment with atoms, which are complicated things that have mass and interact with electric fields and so on, adds to the weirdness,” said Roman Khakimov, a PhD student who worked on the experiment.

To successfully recreate the experiment, the team trapped a bunch of helium atoms in a suspended state known as a Bose-Einstein condensate, and then ejected them all until there was only a single atom left.

This chosen atom was then dropped through a pair of laser beams, which made a grating pattern that acted as a crossroads that would scatter the path of the atom, much like a solid grating would scatter light.

They then randomly added a second grating that recombined the paths, but only after the atom had already passed the first grating.

When this second grating was added, it led to constructive or destructive interference, which is what you’d expect if the atom had travelled both paths, like a wave would. But when the second grating was not added, no interference was observed, as if the atom chose only one path.
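The logic of those two cases can be sketched numerically. The following is a minimal toy model, not the ANU team's actual analysis: it assumes equal amplitudes for the two paths and an adjustable relative phase, and simply compares adding amplitudes (second grating present) with adding probabilities (second grating absent).

```python
import numpy as np

def detection_probability(phase, recombine):
    """Probability of detecting the atom at one output port (toy model)."""
    # Equal amplitudes for the two paths after the first grating
    a1 = 1 / np.sqrt(2)
    a2 = np.exp(1j * phase) / np.sqrt(2)
    if recombine:
        # Second grating mixes the paths: amplitudes add, then interfere
        return abs((a1 + a2) / np.sqrt(2)) ** 2
    # No second grating: the paths stay distinguishable, so probabilities
    # (not amplitudes) add and the phase drops out -- no interference
    return (abs(a1) ** 2 + abs(a2) ** 2) / 2

phases = np.linspace(0, 2 * np.pi, 9)
with_grating = [detection_probability(p, True) for p in phases]
without_grating = [detection_probability(p, False) for p in phases]

print(with_grating)     # varies from 1 down to 0 and back: interference fringes
print(without_grating)  # flat 0.5 everywhere: no interference
```

With the recombining grating the detection probability is (1 + cos φ)/2, sweeping through bright and dark fringes; without it, the probability is a featureless 0.5, exactly the wave-like versus particle-like contrast described above.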

The fact that this second grating was only added after the atom passed through the first crossroads suggests that the atom hadn’t yet determined its nature before being measured a second time.

So if you believe that the atom did take a particular path or paths at the first crossroad, this means that a future measurement was affecting the atom’s path, explained Truscott. “The atoms did not travel from A to B. It was only when they were measured at the end of the journey that their wave-like or particle-like behaviour was brought into existence,” he said.

Although this all sounds incredibly weird, it’s actually just a validation for the quantum theory that already governs the world of the very small. Using this theory, we’ve managed to develop things like LEDs, lasers and computer chips, but up until now, it’s been hard to confirm that it actually works with a lovely, pure demonstration such as this one.

Source: Nature Physics.


China Has Built The World’s 1st Quantum Computer, And It’s 24,000 Times Faster Than Competitors.

China has managed to build the world’s first ever quantum computer, which clocks in at a whopping 24,000 times faster than its international counterparts.

A 128 qubit quantum chip constructed by D-Wave Systems


Created by researchers at the University of Science and Technology of China in Hefei, Anhui province, the quantum computer is a highly advanced machine, capable of performing multiple complex calculations simultaneously, like predicting the movement and behaviour of subatomic particles.

The basic idea behind quantum computers is that they perform calculations by keeping the system’s memory in a quantum state. So what exactly does that mean? Traditional computers store data in bits, which are represented by a 1 or 0, the foundation of binary. Quantum computers, however, seek to apply the quantum phenomena of superposition and entanglement to the equation. In this case, each quantum bit (or qubit) can store a 1, 0 or any superposition of the two.

The short of it is that traditional computers process one operation at a time, whereas quantum computers can look at multiple states of data and calculate their outcomes simultaneously.
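A qubit's state can be sketched on a classical machine as a pair of complex amplitudes. The snippet below is an illustrative simulation only (the function name and numbers are invented for this example): it shows that a qubit in superposition yields random 0/1 outcomes whose frequencies follow the squared amplitudes, while a qubit prepared as a definite 0 or 1 always gives that value.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def measure(alpha, beta, shots=10_000):
    """Sample measurement outcomes for the state alpha|0> + beta|1>.

    Outcome 0 occurs with probability |alpha|^2, outcome 1 with |beta|^2.
    Returns the observed frequency of outcome 1 over `shots` repetitions.
    """
    p0 = abs(alpha) ** 2
    outcomes = rng.random(shots) >= p0   # True -> outcome 1
    return outcomes.sum() / shots

# Equal superposition (|0> + |1>)/sqrt(2): roughly half the shots give 1
freq = measure(1 / np.sqrt(2), 1 / np.sqrt(2))
print(round(freq, 2))

# A definite |0> state never yields outcome 1
print(measure(1, 0))  # 0.0
```

The point of the sketch is the contrast: a classical bit holds one value, while the qubit's amplitudes only "collapse" to a value at measurement time.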

While the theory behind quantum computing has been around for decades, actually building the architecture required has proven to be a challenge. So far, researchers have only been able to replicate the technology with very small amounts of memory, less than what’s in your smartphone.

That’s why China’s development is so important: it marks the first single-photon-based quantum computing machine that goes beyond the earliest classical computers. Despite using a fraction of the memory hardware, the Hefei quantum computer is already 10 to 100 times faster than ENIAC, the first ever digital computer, built in 1946. The advancement paves the way for scaled-up quantum computers in the future that will far outclass any supercomputers currently in existence.

“Indistinguishable Photons” Could Unleash Quantum Computing.

Researchers have discovered an entirely new way of generating “indistinguishable photons,” the hard-to-create particles of light needed to power quantum computers. It’s a crucial step in our development of quantum technology, and naturally, ties back to artificial intelligence. The research was published Tuesday in the journal Applied Physics Letters.

Quantum Computing

The phrase “indistinguishable photons” refers to them being indistinguishable from each other, and they’re a vital resource for quantum computers. A regular computer processes and stores information in binary, meaning the bits are always either 0 or 1. A quantum computer, though, harnesses principles of quantum mechanics so that the bits (cutely named “qubits”) can also be 0 and 1 simultaneously.

Scientists at the University of Tsukuba and Japan’s National Institute for Materials Science forged a new path for creating indistinguishable photons by testing the nitrogen impurity centers within III-V compound semiconductors. The elements within those centers facilitate a state of energy called an “isoelectronic trap,” which generates photons that contain the same energy — and are therefore indistinguishable.

This research marks the first time anyone’s used the nitrogen luminescence centers found within certain semiconductors to create the phenomenon. There are already a few established sources for generating identical photons, namely semiconductor quantum dots (an entirely separate rabbit hole you may fall down if you are so inclined). But this new method is potentially faster and more conducive to the photons’ homogeneity, which isn’t always precise enough when created through the requisite huge numbers of quantum dots (more dots = more variability in charge).

“[I]ndistinguishable photons are very important for quantum information technology such as quantum teleportation and linear optical quantum computation,” says first author Michio Ikezawa. “Our goal is to be able to provide many photon sources that generate indistinguishable photons in an integrated form in a semiconductor chip.”

Why Quantum Computing Is So Important

“[Q]uantum computer scientists believe quantum computers can solve problems that are intractable for conventional computers. That is, it’s not that quantum computers are like regular computers, but smaller and faster. Rather, quantum computers work according to principles entirely different than conventional computers, and using those principles can solve problems whose solution will never be feasible on a conventional computer,” explained quantum physicist Michael Nielsen in a great 2008 blog post.

Even someone like Nielsen, who has worked in quantum computing for more than a decade, struggles to produce an adequate explanation of how quantum computers actually work — which is, really, the whole point. If they functioned in a way that our brains could recognize and interpret, they would just be advanced versions of binary computers, rather than something in a class of their own.

Quantum processing could drive the deep learning behind A.I., and the more smoothly we can facilitate the creation of indistinguishable photons, the better-functioning and more advanced our A.I. could become. Since that processing occurs at the single-particle level, it requires photons that are essentially all the same.

How It Could Affect A.I.

It’s too early to say for sure what tangible impacts this will ultimately have on A.I., because the method itself still needs to be refined. The researchers observed that while the degree of indistinguishability they obtained was high, it wasn’t as high as it could be. One of the next steps will be to get a more comprehensive look at the mechanisms that are likely to account for the interference and develop a way to compensate for them. But if and when that research is successful, we could potentially be looking at the new standard for creating identical photons, one which overtakes and improves upon the use of quantum dots to power our next developments in A.I.

“While atoms have long been the gold standard for emitting such indistinguishable photons because of their high stability, there is a race among solid-state emitters such as quantum dots, nitrogen-vacancy centers in diamond, and other color centers to determine the leading candidate for integration with future quantum computers and quantum networks,” Gurudev Dutt, a quantum physicist at the University of Pittsburgh who was not involved in the research, tells Inverse. “This work demonstrates that the [nitrogen centers in semiconductors are] starting to emerge as an important competitor in this arena.”


How Quantum Technology Is Making Computers Millions Of Times More Powerful

When the first digital computer was built in the 1940s, it revolutionized the world of data calculation. When the first programming language was introduced in the 1950s, it transformed digital computers from impractical behemoths to real-world tools. And when the keyboard and mouse were added in the 1960s, it brought computers out of industry facilities and into people’s homes. There’s a technology that has the potential to change the world in an even bigger way than any of those breakthroughs. Welcome to the future of quantum computing.


1s and 0s (And Everything In Between)

Every computer you’ve ever encountered works on the principles of a Turing machine: it manipulates little pieces of information, called bits, that exist as either a 0 or a 1, a system known as binary. The fundamental difference in a quantum computer is that it’s not limited to those two options. Its “quantum bits,” or qubits, can exist as 0, 1, or a superposition of 0 and 1—that is, both 0 and 1 and all points in between. It’s only once you measure them that they “decide” on a value. That’s what’s so groundbreaking about quantum computing: conventional computers can only work on one computation at a time; the fastest just have ways of making multiple components work on separate tasks simultaneously. But the magic of superposition gives quantum computers the ability to work on many computations at once. With that kind of power, just imagine what humanity could accomplish!

But that’s not all that makes quantum computing so impressive—there’s also the phenomenon of entanglement. Qubits don’t exist in a vacuum. Generally, systems of multiple qubits are entangled, so that they each take on the properties of the others. Take an entangled system of two qubits, for example. Once you measure one qubit, it “chooses” one value. But because of its relationship, or correlation, to the other qubit, that value instantly tells you the value of the other qubit—you don’t even need to measure it. When you add more qubits to the system, those correlations get more complicated. According to Plus magazine, “As you increase the number of qubits, the number of those correlations grows exponentially: for n qubits there are 2^n correlations. This number quickly explodes: to describe a system of 300 qubits you’d already need more numbers than there are atoms in the visible Universe.” But that’s just the point—because those numbers are beyond what we could ever record with a conventional computer, the hope is that quantum computers could crunch unfathomably large amounts of information that conventional computers could never dream about.
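Both claims above can be checked with a few lines of code. This is a hedged illustration, not a quantum simulation of any real device: it verifies the 2^n explosion directly, then samples joint measurements of a two-qubit Bell state to show the perfect correlation described in the text.

```python
import numpy as np

# The state of n qubits needs 2**n complex amplitudes to describe
# classically; at n = 300 that already exceeds the ~10**80 atoms
# commonly estimated to be in the visible universe.
print(2 ** 300 > 10 ** 80)  # True

# Entanglement: the Bell state (|00> + |11>)/sqrt(2) puts amplitude
# only on the outcomes 00 and 11, so measuring one qubit instantly
# tells you the value of the other.
rng = np.random.default_rng(seed=1)
amps = np.array([1, 0, 0, 1]) / np.sqrt(2)   # amplitudes for 00, 01, 10, 11
probs = np.abs(amps) ** 2                    # [0.5, 0, 0, 0.5]

samples = rng.choice(4, size=1000, p=probs)  # joint measurement outcomes
first_bits, second_bits = samples // 2, samples % 2
print(np.all(first_bits == second_bits))     # True: the bits always agree
```

Note what the sampling shows: each individual outcome is random, yet the two bits never disagree, which is exactly the "measure one, know the other" correlation the article describes.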

D-Wave 2000Q quantum computer

The First Steps Of Quantum Computing

In the future, quantum computers could revolutionize everything from human genomics to artificial intelligence—and over just a few decades, the technology has already gotten to the point where that’s a very real possibility. In 1998, researchers successfully analyzed information from a single qubit, and in 2000, scientists at Los Alamos National Laboratory unveiled the first 7-qubit quantum computer. Less than two decades later, D-Wave’s 1,000-qubit quantum computers are being used by the likes of Google, NASA, and Lockheed Martin, and a 2,000-qubit quantum computer is being unveiled. Feel like replacing your old laptop? That quantum computer will run you a cool $15 million—a small price to pay for a millionfold improvement.

Watch And Learn: Our Favorite Content About The Future Of Computers

Quantum Computers Explained

  1. Transistors can either block or open the way for bits of information to pass. (00:54)
  2. Four classical bits can be in one of 16 different configurations at a time; four qubits can be in all 16 combinations at once. (03:44)
  3. Quantum computers could better simulate the quantum world, possibly leading to insights in medicine and other fields. (06:12)

The Double-Slit Experiment Cracked Reality Wide Open

The double-slit experiment seems simple enough: cut two slits in a sheet of metal and send light through them, first as a constant wave, then in individual particles. What happens, though, is anything but simple. In fact, it’s what started science down the bizarre road of quantum mechanics.

You Got Particles In My Waves

In the early 1800s, the majority of scientists believed that light was made up of particles, not waves. English scientist Thomas Young had a hunch that the particle theory wasn’t the end of the story, and set out to prove that light was a wave. He knew that waves interacted in predictable ways, and if he could demonstrate those interactions with light, he would have proven that light was indeed a wave. So he set up an experiment: he cut two slits in a sheet of metal and shone light through them onto a screen.

If light was indeed made of particles, the particles that hit the sheet would bounce off and those that passed through the slits would create the image of two slits on the screen, sort of like spraying paint on a stencil. But if light was a wave, it would do something very different: once they passed through the slits, the light waves would spread out and interact with one another. Where the waves met crest-to-crest, they’d strengthen each other and leave a brighter spot on the screen. Where they met crest-to-trough, they would cancel each other out, leaving a dark spot on the screen. That would produce what’s called an “interference pattern” of one very bright slit shape surrounded by “echoes” of gradually darker slit shapes on either side. Sure enough, that’s what happened. Light traveled in waves. All’s well that ends well, right?

A light wave passing through two slits interacts with itself to create an interference pattern on a screen.

Wait, That Can’t Be Right

Around the turn of the 20th century, a few scientists began to refine this idea. Max Planck suggested that light and other types of radiation come in discrete amounts—it’s “quantized”—and Albert Einstein proposed the idea of the photon, a “quantum” of light that behaves like a particle. In other words, light was both a particle and a wave. So back to the double-slit experiment: remember when we said that if light was a particle, it would create a sort of spray-paint stencil pattern instead of an interference pattern? By using a special tool, you actually can send light particles through the slits one by one. But when scientists did this, something strange happened.

The interference pattern still showed up.

This suggests something very, very weird is going on: the photons seem to “know” where they would go if they were in a wave. It’s as if a theater audience showed up without seat assignments, but each person still knew the exact seat to choose in order to fill the theater correctly. As Popular Mechanics puts it, this means that “all the possible paths of these particles can interfere with each other, even though only one of the possible paths actually happens.” All realities exist at once (a concept known as superposition) until the final result occurs.

Weirder still, when scientists placed detectors at each slit to determine which slit each photon was passing through, the interference pattern disappeared. That suggests that the very act of observing the photons “collapses” those many realities into one. Mind blowing, right? It is for scientists too, which is why quantum mechanics is one of the most hotly debated areas of modern science.

Watch And Learn: The Most Mind-Melting Videos about Quantum Physics

The Quantum Experiment That Broke Reality

It changed science forever.

According To Quantum Mechanics, Reality Might Not Exist Without An Observer

If a tree falls in the forest and there’s no one around to hear it, does it make a sound? The obvious answer is yes—a tree falling makes a sound whether or not we hear it—but certain experts in quantum mechanics argue that without an observer, all possible realities exist. That means that the tree both falls and doesn’t fall, makes a sound and is silent, and all other possibilities therein. This was the crux of the debate between Niels Bohr and Albert Einstein. Learn more about it in the video below.

Quantum Entanglement And The Bohr-Einstein Debate

Does reality exist when we’re not watching?

The Double Slit Experiment

Learn about one of the most famous experiments in quantum physics.


An Illustrated Lesson In Quantum Entanglement

Delve into this heavy topic with some light animation.


Atomic Spins Evade Heisenberg Uncertainty Principle.

New measurements revise the limits of quantum fuzziness.

Many seemingly unrelated scientific techniques, from NMR spectroscopy to medical MRI and timekeeping using atomic clocks, rely on measuring atomic spin – the intrinsic angular momentum of an atom’s nucleus and electrons. The limit on how accurate these measurements can be is set by the inherent fuzziness of quantum mechanics. However, physicists in Spain have demonstrated that this limit is much less severe than previously believed, measuring two crucial quantities simultaneously with unprecedented precision.

Central to the limits of quantum mechanics is the Heisenberg uncertainty principle, which states that it is not possible to know a particle’s position and momentum with absolute accuracy: the more precisely you measure one quantity, the less you know about the other. This is because to measure a particle’s position you have to disturb its momentum, by hitting it with another particle and observing how the momentum of this second particle changes. A similar principle applies to measuring a particle’s spin angular momentum, which involves observing how the polarisation of incident light is changed by the interaction with the particle. To infer the spin precession rate, you need to measure the spin angle, as well as its overall amplitude, repeatedly – but every measurement disturbs the spin slightly, creating a minimum possible uncertainty.

The alternative approach suggested by Morgan Mitchell’s group at the Institute of Photonic Sciences in Barcelona could circumvent this problem. The spin angle, they say, is in fact two angles: the azimuthal angle (like longitude on the Earth’s surface) and the polar angle (like latitude). To measure the precession rate, you need only the azimuthal angle. Therefore, by loading as much uncertainty as possible into the polar angle, you can measure the two quantities you need – the azimuthal angle and amplitude of the spin – and therefore measure the spin precession rate much more accurately than previously thought possible. ‘There are experiments that people are doing now that people expect to be limited by the Heisenberg uncertainty principle which in fact are not,’ says Mitchell.
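The key idea that only the azimuthal angle carries the precession information can be caricatured in a few lines. This toy model is entirely invented for illustration (the rate, times, and noise level are made up): the azimuthal angle drifts linearly at the precession rate, so repeated angle measurements plus a straight-line fit recover that rate.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

true_rate = 2.5                  # precession rate in rad/s (illustrative)
times = np.linspace(0, 1, 50)    # measurement times in seconds

# Measured azimuthal angles: linear drift at the precession rate,
# plus small measurement noise standing in for quantum back-action
angles = true_rate * times + rng.normal(0, 0.01, times.size)

# A least-squares line fit of angle vs time recovers the rate
fitted_rate, _ = np.polyfit(times, angles, 1)
print(round(fitted_rate, 2))
```

In the real experiment the cleverness lies in where the measurement disturbance goes: by pushing it into the polar angle, which this fit never uses, the azimuthal drift can be tracked more precisely than a naive reading of the uncertainty principle would suggest.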

Actually achieving this in practice, however, proved extremely difficult. The team cooled down a cloud of atoms to a few microkelvin, applied a magnetic field to produce spin motion and illuminated the cloud with a laser to measure the orientation of the atomic spins. ‘Not all the technologies we used for the experiment existed when we started,’ says Giorgio Colangelo, another member of the research team. ‘We had to design and develop a particular detector that was fast enough and with very low noise. We also had to improve a lot the way we were preparing the atoms and find a way to efficiently use all the dynamic range we had in the detector.’ The researchers hope that atomic timekeeping and nitrogen-vacancy magnetometry, which uses the precession of nitrogen defects in diamonds to measure magnetic fields, may benefit from the techniques unveiled here in the next few years. ‘We really hope that, in the long term, magnetic resonance techniques such as NMR and MRI may benefit, but right now they are limited by some other effects,’ says Colangelo.

Eugene Polzik of the University of Copenhagen in Denmark is impressed: ‘It sets a new and clever way of measuring certain magnetic field disturbances using an ensemble of quantum spins,’ he says. ‘It would be easy for me to look at this and say “Oh, yes, right: it doesn’t contradict quantum mechanics,” but to figure out how to achieve this, to understand how relevant it is and under what circumstances it is relevant – this is an excellent and elegant development.’


G Colangelo et al, Nature, 2017, DOI: 10.1038/nature21434

This strange light particle behaviour challenges our understanding of quantum theory.

It’s even spookier than we predicted.

 Scientists investigating how light particles (or photons) experience entanglement on the quantum scale have discovered something entirely unexpected, and it challenges long-held assumptions about the initial moments of what Einstein referred to as “spooky action at a distance”.

When the team created entangled pairs of photons, these particles didn’t originate in the same place and break away as predicted – they emerged from entirely different points in space, which means quantum theory might have to account for a whole lot more randomness than we thought.

 “Until now, it has been assumed that such paired photons come from the same location,” says one of the researchers, David Andrews from the University of East Anglia in the UK.

“Now, the identification of a new delocalised mechanism shows that each photon pair can be emitted from spatially separated points, introducing a new positional uncertainty of a fundamental quantum origin.”

The team figured this out by performing a very simple entanglement experiment called spontaneous parametric down-conversion (SPDC), which involves firing photon beams through a crystal such as barium borate, to generate entangled pairs of light particles.

As Spooky Action at a Distance author George Musser explains:

“If you set up the crystal properly, the amplification is so powerful that it turns the noise into a proper light beam. A single incoming beam (typically blue or ultraviolet) can thus conjure up two beams (typically red). This process occurs particle by particle: each blue photon splits into two red ones.”


When a single photon splits into two – which happens to only around one in a billion photons – the pair experiences quantum entanglement, a phenomenon where two particles interact in such a way that they become deeply linked, and essentially ‘share’ an existence.

This means that what happens to one particle will directly and instantly affect what happens to the other – even if its partner is many light-years away.

It was assumed that when single photons split into entangled pairs, they emerge from the same point in the crystal, sharing properties such as energy, momentum, and polarisation at speeds of at least 10,000 times the speed of light.
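The energy bookkeeping of that splitting is simple to check. The sketch below assumes the common textbook case of a 405 nm pump photon splitting into two degenerate 810 nm daughters (these wavelengths are typical illustrative values, not the ones used in this study): each daughter photon carries exactly half the pump photon's energy.

```python
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon of the given wavelength: E = h*c/lambda."""
    return h * c / wavelength_m

pump = photon_energy(405e-9)      # blue/violet pump photon
signal = photon_energy(810e-9)    # red daughter photon
idler = photon_energy(810e-9)     # red daughter photon

# Energy conservation: the two daughters together carry the pump's energy
print(abs(pump - (signal + idler)) < 1e-25)  # True
```

Momentum is conserved in the same way, which is why the assumption that the pair shares a single birthplace in the crystal seemed so natural before this result.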

But what Andrews and his team found was that these split pairs could actually appear in entirely different parts of the crystal.

“The paired photons can emerge with separations in their origin of hundredths of a micron – despite being entangled,” he told Michael Franco at New Atlas.

“[I]t is as if they were not even born close together in terms of atomic dimensions.”

The question now is, how do we know where those different positions will be?

The researchers suspect that the positions are influenced by individual variations in the photons, and the next step will be to independently confirm this behaviour, and establish a method to predict where the photons could crop up.

There are a lot of questions now up in the air, but one thing’s for sure – photons have a whole lot more mystery to them than we gave them credit for.

As Andrews says in a press statement: “Everything has a certain quantum ‘fuzziness’ to it, and photons are not the hard little bullets of light that are popularly imagined.”

Quantum Equation Suggests The Big Bang Never Occurred – The Universe Has No Beginning

When it comes to the science regarding the true nature of our reality, you won’t find a shortage of theories, or a shortage of criticisms of each theory. We are like a race with amnesia, searching for an answer that most probably exists but has yet to be discovered. How did the universe begin?


According to new research, there might not have been a big bang. Instead, the universe might have existed forever. The new model was derived by applying quantum correction terms to the equations of Einstein’s theory of general relativity.

“The Big Bang singularity is the most serious problem of general relativity because the laws of physics appear to break down there.”  – Ahmed Farag Ali, Benha University, Co-Author of the study. (source)

The big bang theory postulates that everything in existence resulted from a single event that launched the creation of the entire universe and that everything in existence today was once part of a single infinitely dense point, also known as the “singularity.”

Here is a good picture representing what the big bang theory is referring to.


So the big bang, again, postulates that the universe started out as an infinitely small point in space called a singularity, then exploded and created space where there was no space before, and that it is continually expanding. One big question regarding that expansion is: how did it happen? As you can see in the picture, who is that guy?

According to Nassim Haramein, the Director of Research for the Resonance Project:

“‘For every action there is an equal and opposite reaction’ is one of the most foundational and proven concepts in all of physics. Therefore, if the universe is expanding, then ‘the guy’ (or whatever ‘he’ is) who is blowing up that balloon has to have some huge lungs that are contracting to be able to blow it up. This is a concept that Nassim Haramein began exploring when creating an alternative unified field theory to explain the universe.” (source)

This is one out of many criticisms regarding the big bang theory. There are many considerations to be pondered. Can something come from nothing? What about quantum mechanics and the possibility that there is no moment of time at which the universe did not exist?

Again, so many considerations to be pondered.

According to the source:

“The scientists propose that this fluid might be composed of gravitons—hypothetical massless particles that mediate the force of gravity. If they exist, gravitons are thought to play a key role in a theory of quantum gravity. In a related paper, Das and another collaborator, Rajat Bhaduri of McMaster University, Canada, have lent further credence to this model. They show that gravitons can form a Bose-Einstein condensate (named after Einstein and another Indian physicist, Satyendranath Bose) at temperatures that were present in the universe at all epochs.” (source)

The theory also suggests (obviously) that there are no singularities or dark matter, and that the universe is filled with a “quantum fluid.” These scientists are suggesting that this quantum fluid is filled with gravitons.

As you can see, when quantum mechanics is thrown into the equation, things appear to be far different. Again, this new theory suggests that the universe could have always existed, and that there never was what we perceive as “the beginning.” Perhaps there was simply an event that we perceive as the beginning, one that arose not from nothing but from something. Again, who is that guy blowing on the balloon in the picture? There is something there that has yet to be discovered.

“As far as we can see, since different points in the universe never actually converged in the past, it did not have a beginning. It lasted forever. It will also not have an end; in other words, there is no singularity. The universe could have lasted forever. It could have gone through cycles of being small and big, or it could have been created much earlier.” – Saurya Das at the University of Lethbridge in Alberta, Canada, Co-Author of the study. (source)

What We Know Is Often Just Theory

To conclude, it’s clear that we do not yet have a solid explanation regarding what happened during the Big Bang, or if it even happened at all. This new theory is combining general relativity with quantum mechanics, and at the end of the day these are all just theories.

Not to mention the fact that theories regarding multiple dimensions, multiple universes and more have to be considered. When looking for the starting point of creation, our own universe might not even be the place to start. It might be hard given the fact that we cannot yet perceive other factors that have played a part in the makeup of what we call reality. What is even harder is the fact that quantum physics is showing that the true nature and makeup of the universe is not a physical material thing!

We just don’t know yet, and there are still new findings in modern day physics that delve into non-materialistic science that many mainstream materialistic scientists have yet to grasp and acknowledge.

I’ll leave you with a quote that might give you something to think about:

“A fundamental conclusion of the new physics also acknowledges that the observer creates the reality. As observers, we are personally involved with the creation of our own reality. Physicists are being forced to admit that the universe is a ‘mental’ construction. Pioneering physicist Sir James Jeans wrote: ‘The stream of knowledge is heading toward a non-mechanical reality; the universe begins to look more like a great thought than like a great machine. Mind no longer appears to be an accidental intruder into the realm of matter; we ought rather hail it as the creator and governor of the realm of matter.’” (R. C. Henry, “The Mental Universe,” Nature 436:29, 2005)

“Despite the unrivaled empirical success of quantum theory, the very suggestion that it may be literally true as a description of nature is still greeted with cynicism, incomprehension and even anger.” (T. Folger, “Quantum Shmantum”; Discover 22:37-43, 2001)


A Deeper Look into Quantum Mechanics


Winfried Hensinger is the director of the Sussex Centre for Quantum Technologies in England, and he has spent his career studying the ins and outs of quantum mechanics and just what it can do for us. When Hensinger first started in the field, quantum computing was still very much a theory, but the landscape has changed: several projects are now within reach of creating a universal quantum computer. Now that scientists are taking quantum computing more seriously, it won’t be long before the field begins to explode and applications we never even imagined become available.

Quantum computing works with information stored in qubits, which can have a value of 1, 0, or any quantum superposition of the two states.  The notion behind quantum superposition is that a quantum object can occupy more than one state until it is measured.  Because quantum objects are used in this kind of computing, a given set of quantum values can represent far more data than binary ever could, because the data is not limited to 1s and 0s.
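To see why a qubit register holds so much more than its binary counterpart, here is a minimal sketch in plain Python (no quantum library assumed): an n-qubit register is described by 2**n complex amplitudes, one per basis state, and the squared magnitudes of those amplitudes sum to 1.

```python
import math

# A classical n-bit register holds ONE of 2**n values; an n-qubit register
# is described by 2**n complex amplitudes, all at once.
def equal_superposition(n):
    """State vector for n qubits in an equal superposition of all 2**n basis states."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)   # equal weight on every basis state
    return [amp] * dim

def measure_probabilities(state):
    """Born rule: the probability of each measurement outcome is |amplitude|**2."""
    return [abs(a) ** 2 for a in state]

state = equal_superposition(3)           # 3 qubits -> 8 amplitudes
probs = measure_probabilities(state)
print(len(state))                        # 8
print(round(sum(probs), 10))             # 1.0
```

Doubling the register from 3 to 4 qubits doubles the number of amplitudes to 16, which is the exponential head start quantum hardware has over binary storage.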

Currently, researchers are still battling it out to create a successful quantum computer, but they have a way to go.  Systems have been constructed that have access to a few qubits and are good for testing hardware configurations or running some algorithms, but they are very expensive and still very basic.  When Hensinger was asked about the current changes within quantum computing, he simply replied, “It used to be a physics problem.  Now, it’s an engineering problem.”

Two possibilities researchers have put forward for the foundation of quantum computing are superconducting qubits and trapped ions. Superconducting qubits rely on supercooled electrical circuits and could bring many advantages to manufacturing at mass scale.  The trapped-ion method copes well with environmental factors but has trouble controlling large numbers of charged atoms within a vacuum.  Hensinger supports both implementations and believes each will produce a quantum computer.  In his research, the trapped-ion method was slightly ahead of the competition.

However, Hensinger has also created his own method with the help of his team at Sussex, one that focuses on the individually controlled voltages needed within the quantum circuit. He says, “With this concept, it becomes much easier to build a quantum computer.  This is one of the reasons why I’m very optimistic about trapped ions.”  Hensinger and his colleagues also chose to work with trapped ions because the method works at room temperature, unlike superconducting qubits.

IBM, on the other hand, has chosen superconducting qubits as the basis for its quantum work.  Its quantum computer consists of a five-qubit processor contained within a printed circuit board.  The refrigerated system contains control wires that transmit microwave signals to the chip; outgoing signals pass through various amplifiers and passive microwave components before being interpreted by a classical computer, so the system’s qubit state can be read from outside the refrigerator.  All of this takes up more than 100 square feet within IBM’s lab, largely because of the significant cooling required.


Jerry Chow, the manager of the Experimental Quantum Computing team at IBM, says that the reason IBM uses superconducting qubits has more to do with the group’s previous research using the technique.  As Chow explains, “I think superconducting qubits are really attractive because they’re micro-fabricated.  You can make them on a computer chip, on a silicon wafer, pattern them with the standard lithography techniques using transistor processes, and in that sense have a pretty straightforward route toward scaling.”

Two beryllium ions trapped 40 micrometers apart from the square gold chip in the center form the heart of this ‘trapped ion’ quantum computer. (Photo: Y. Colombe/NIST)
NASA’s 512-qubit Vesuvius processor is cooled to 20 millikelvin, more than 100 times colder than interstellar space. (Photo: NASA Ames/John Hardman)

So, one thing we know for certain is that when it comes to quantum computing, superconducting qubits and trapped ions have emerged as the two techniques to take note of. Quantum computing will develop further over the next few years, and it’s in everyone’s best interests if large-scale quantum computers aren’t tied down to just one possible solution.  Hensinger for one is definitely in support of both ideas and notes, “It’s healthy to have different groups trying different things.”  At the moment, it’s still hard to say exactly what quantum computing will be used for in the future, but algorithms are constantly being developed to explore what quantum hardware could be capable of.

A quantum algorithm is a recipe, usually written in a mathematical format, for solving a particular problem.  But because quantum computing does not work in the same way as classical computing, algorithms written for binary logic are useless, so new ones need to be created, and that is what Krysta Svore and her team at Microsoft’s Quantum Architectures and Computation Group focus on.  She states, “We have a programming language that we have developed explicitly for quantum computing.  Our language and our tools are called LIQUi|>.  LIQUi|> allows us to express these quantum algorithms, then it goes through a set of optimizations, compilations, and basically rewrites the language instructions into device-specific instructions.”

Svore and her team at the Quantum Architectures and Computation Group have access to a simulated quantum computer running on a classical system.  This allows them to debug existing quantum algorithms, as well as design and test new ones, and it helps the hardware team see how quantum computers could be used in practice.  IBM, however, has taken things one step further.  As well as having a successful simulation to work on, it has launched the IBM Quantum Experience, an online interface that allows students and enthusiasts to experiment themselves with a five-qubit system and run their own algorithms and experiments from the cloud-based platform.
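The simulators described above boil down to straightforward linear algebra. As an illustration (a toy sketch in plain Python, not the code either group actually runs), a single qubit is a two-entry state vector and a gate is a 2x2 matrix multiplied into it; here the Hadamard gate turns a definite |0⟩ into an even superposition:

```python
import math

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix into a single-qubit state vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]            # qubit prepared in |0>
state = apply_gate(H, state)  # now an equal superposition
probs = [abs(a) ** 2 for a in state]
print(probs)                  # each outcome is ~50% likely
```

Real simulators do exactly this, just with 2**n-entry vectors and clever optimizations, which is why classical simulation becomes infeasible beyond a few dozen qubits.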

IBM’s five-qubit processor uses a lattice architecture that scales to create larger, more powerful quantum computers. (Photo: IBM)
One of the most famous applications in the world of quantum computing comes in the form of Shor’s algorithm.  Ryan O’Donnell of the Carnegie Mellon School of Computer Science in Pittsburgh said, “In 1997, Shor showed an algorithm for a quantum computer that would be able to factor such numbers very efficiently”, referring to numbers with thousands of digits.  Ever since then it has become a kind of measuring stick for the advancement of the whole field.  Another current application of quantum hardware is pushing research in other areas of science further.
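The clever part of Shor’s algorithm is quantum; the rest is classical number theory. The quantum computer’s only job is to find the period r of f(x) = a^x mod N quickly; once r is known, the factors fall out via greatest common divisors. The toy Python sketch below brute-forces the period classically (feasible only for tiny N, which is exactly the point) to show that final classical step:

```python
from math import gcd

def find_period(a, N):
    """Brute-force the period r of a**x mod N -- the step a quantum
    computer performs efficiently even for thousand-digit N."""
    x, val = 1, a % N
    while val != 1:
        x += 1
        val = (val * a) % N
    return x

def shor_classical_demo(N, a):
    """Given the period, recover factors of N (the classical post-processing)."""
    r = find_period(a, N)
    # The trick fails if r is odd or a**(r/2) == -1 mod N; pick another a.
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None
    half = pow(a, r // 2)
    return gcd(half - 1, N), gcd(half + 1, N)

print(shor_classical_demo(15, 7))   # (3, 5)
```

For N = 15 and a = 7 the period is 4, so gcd(7² − 1, 15) = 3 and gcd(7² + 1, 15) = 5. Classically, finding the period for a thousand-digit N is as hard as factoring itself, which is why the quantum speed-up matters.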

Although quantum computing is going to become more common over the next few years, it’s not suddenly going to become the next mainstream technology found in every office and home.  But the underlying technology, in one form or another, may well be.  Quantum computing will develop over the next ten years, although its exact progress may not be immediately obvious to the general public, since at the moment the promise of quantum computing is much further ahead than where researchers actually are with it.  Eventually, though, it will revolutionize computing.

