What began as an invention to assist military and scientific research has grown into one of the dominant industries of modern times. Be it large-scale data management or simply social networking, computers touch more of our lives by the minute. And despite the enormous processing power computer manufacturers offer today, our hunger for higher speeds and greater computing capability remains unabated. All we know for sure is that technology has always found a way to keep pace with our needs. Or does it?

Gordon Moore, cofounder of Intel, observed that the density of transistors on microprocessor chips doubles roughly every 18 to 24 months. On very loose grounds this suggests that our thirst for speed is taken care of, but the extrapolation ignores several physical limits. If Moore's law were to keep holding, then by around 2030 the features on a microprocessor would need to be measured on an atomic scale!

This would naturally pose multiple problems, the primary one being that at such scales the structural integrity of transistor gates becomes questionable. Current leakage and heat dissipation are also inevitable when several billion transistors are packed into a small area and switched on and off billions of times a second. This is one of the many reasons why processor clock speeds have stagnated around the same value for the last ten years. The natural response is to search for alternatives to silicon-based computing, and this is what eventually drove engineers toward one of the most promising candidates: quantum computing.

The concept of a quantum computer isn’t entirely new. In 1981, Paul Benioff applied certain principles of quantum theory to computation, proposing a quantum computer along the lines of a quantum Turing machine. This machine differs from the conventional Turing machine in that it operates on a quantum analogue of data, where a symbol can be a superposition of both 0 and 1 rather than strictly one or the other. While a normal Turing machine performs one calculation at a time, a quantum Turing machine can, in a sense, explore several at once.

Classical computers employ a Boolean system of data representation, where data is expressed in binary units called bits. Quantum computers, on the other hand, aren’t limited to two states alone. They rely on the quantum nature of atoms and photons to store data in the form of their quantum states, represented by quantum bits, or qubits.

Qubits are realized as atoms, ions, photons or electrons, together with their control devices, working in concert as computer memory and processor. The information stored in a qubit depends on its quantum state, which is a linear combination (superposition) of the states 0 and 1 rather than a single definite value. In other words, a qubit can be 0, 1, or a superposition of the two. Because a quantum computer can hold these multiple states simultaneously, it has the potential, for certain problems, to vastly outperform today's supercomputers.
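The superposition picture can be sketched numerically. The toy snippet below (an illustration only, not how a real quantum computer is programmed) represents a qubit as a pair of complex amplitudes and samples measurements according to the Born rule; all names here are illustrative.

```python
import random

# A qubit state is a pair of amplitudes (a, b) for the basis states 0 and 1,
# normalized so that |a|^2 + |b|^2 = 1. Measuring the qubit yields 0 with
# probability |a|^2 and 1 otherwise, destroying the superposition.
def measure(amplitudes):
    a, _b = amplitudes
    p0 = abs(a) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: each measurement gives 0 or 1 with 50% probability.
plus = (2 ** -0.5, 2 ** -0.5)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(plus)] += 1
print(counts)  # roughly 5000 each
```

Each run produces a different split, but the frequencies settle near 50/50, which is all the state itself lets us predict.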

The superposition of qubits gives quantum computers an interesting feature, an inherent parallelism that lets them attack certain tasks in a way no classical computer can. A quantum computer can, in effect, operate on millions of input combinations at once while a conventional personal computer works through them one at a time. On these lines, a 30-qubit quantum computer has been estimated to match the processing power of a conventional machine roughly 10,000 times faster than a typical desktop!
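Where those numbers come from is simple arithmetic: describing the joint state of n qubits classically takes 2**n amplitudes, so the state space explodes exponentially. A quick back-of-the-envelope sketch:

```python
# Number of amplitudes needed to describe an n-qubit state classically.
def state_space(n_qubits):
    return 2 ** n_qubits

print(state_space(30))  # 1073741824 -- over a billion amplitudes

# At 16 bytes per complex amplitude, a 30-qubit state already needs 16 GiB
# of classical memory just to write the state down:
print(state_space(30) * 16 / 2 ** 30, "GiB")  # 16.0 GiB
```

Add a few dozen more qubits and no classical machine on Earth can even store the state, which is the intuition behind the parallelism claims above.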

Another important feature of quantum computers is that they rely on quantum entanglement to read out and process qubits, because qubits cannot be inspected directly the way conventional bits can. Consider an atom that exists in some superposition state n. When observed directly, it collapses to either 0 or 1, losing its quantum nature and the information it carried in the state n.

Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated, or interact, in such a way that the quantum state of each particle cannot be described independently of the others. For example, an atom left alone may spin in any direction. When entangled with an atom spinning in a specific direction, however, it adopts a spin in the opposite direction along a parallel axis. This interdependence lets us infer the quantum states of all the entangled members of a group by studying just one of them, so the value of the associated qubit can be determined without observing it directly.

The next step towards realizing this quantum dream is to complement the internet with a suitable quantum analogue. While organizations have made impressive attempts at prototyping quantum computers, the problem of transferring data between two quantum computers had computer scientists at an impasse for a very long time.

One feasible solution is to devise a quantum emitter that lets you fabricate a qubit which can store quantum information and be read without losing its quantum state. For reliable data transfer, qubits must not be disturbed by stray electric or magnetic fields, and they must retain their superposed state throughout an operation so that no data is lost while being processed. A procedure is also needed for generating qubits under controlled conditions, rather than relying on chance. Recent research suggests that this isn’t far from achievable reality.

Normally, quantum emitters take the form of random defects in a material. Not only are such specimens hard to locate, but they seldom work as expected. To tackle this, researchers at Sandia, in association with Harvard, used an ion beam on diamond crystals to replace carbon atoms with silicon ones, reliably producing working quantum emitters at chosen sites. Separately, researchers at NCSU developed a method for synthesizing microscopic diamonds in specialized crystalline structures that help stabilize calculations in quantum computers.

A fully functional quantum computer could have far-reaching effects on our everyday lives. A single such machine could perform what 10,000 conventional computers would, while taking up only a fraction of the space one does. Quantum computing could solve problems considered intractable today, opening a whole new realm of applications. IBM has taken it a step further by working towards a truer AI, one that can check its own errors and correct itself without human intervention. Quantum computers could in turn be used to design other quantum computers and to study quantum theory itself in greater detail.

Until now, the best we have managed is a computer that can manipulate 16 qubits, while a more practical quantum computer would need to manipulate several dozen at once. One possible way past this impasse is to shift focus from building ever-larger quantum computers to integrating several smaller ones.

Rest assured, it will likely be another decade before functional quantum computers start becoming mainstream.