The age of Quantum Computing

Abhi Avasthi
5 min read · Jun 9, 2022

In 1980, physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things a classical computer could not feasibly do.

Quantum computing represents a new paradigm in computation that utilizes the fundamental principles of quantum mechanics to perform calculations. A quantum computer, following the laws of quantum physics, gains enormous processing power through its ability to be in multiple states at once and to perform tasks using all possible permutations simultaneously.

Classical computing (our current computing technology) relies, at its most fundamental level, on principles expressed by Boolean algebra. Data must be processed in an exclusive binary state (a 0 or a 1, a bit) at any point in time. While the time each transistor or capacitor needs to hold a 0 or a 1 before switching states is now measured in billionths of a second, there is still a limit to how quickly these devices can switch. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold at which the classical laws of physics apply.
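To make the Boolean foundation above concrete, here is a minimal one-bit full adder in Python, built only from AND, XOR, and OR on bits that are always exactly 0 or 1. This is an illustrative sketch of the principle, not a model of any particular hardware:

```python
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """One-bit full adder built from Boolean AND/XOR/OR, the kind of
    circuit classical hardware composes billions of times per second.
    Returns (sum bit, carry-out bit)."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

print(full_adder(1, 1, 0))  # 1 + 1 = binary 10: sum bit 0, carry 1
```

Chaining such adders bit by bit is how classical processors add whole words, which is why switching speed and circuit size set the performance ceiling described above.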

Beyond this, the quantum world takes over. In a quantum computer, elementary particles such as electrons or photons can be used, with either their charge or polarisation representing a 0 and/or a 1. Each such particle is known as a quantum bit, or qubit, and the nature and behaviour of these particles form the basis of quantum computing.
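The qubit idea above can be sketched numerically. In a minimal model (pure Python, no quantum library assumed), a qubit is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# A qubit is a unit vector (a, b) over the complex numbers:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0, |b|^2 of measuring 1.

def probabilities(a: complex, b: complex) -> tuple[float, float]:
    """Return (P(measure 0), P(measure 1)) for the state a|0> + b|1>."""
    return abs(a) ** 2, abs(b) ** 2

# A classical bit corresponds to one of the two basis states:
print(probabilities(1, 0))   # certain to measure 0
# An equal superposition puts the qubit "in both states at once":
s = 1 / math.sqrt(2)
print(probabilities(s, s))   # 0 and 1 each measured half the time
```

Unlike a classical bit, the state before measurement carries both amplitudes simultaneously; that is the resource the rest of the article builds on.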

Advantages of Quantum Computing

Quantum computing has several advantages over classical computing. One of the most important is that, for certain classes of problems, it can find solutions much faster than classical computers.

This is because a qubit behaves as both a particle and a wave, which lets a quantum computer explore many candidate solutions at once, a property called "quantum parallelism."
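Quantum parallelism can be made concrete with a small state-vector sketch: applying a Hadamard gate to each of n qubits starting in |0⟩ produces an equal superposition over all 2^n bit strings. The code below is an illustrative simulation of that state, not real hardware:

```python
import itertools
import math

def uniform_superposition(n: int) -> dict[str, float]:
    """Amplitudes after applying a Hadamard gate to each of n qubits
    initialised to |0>: every n-bit string gets amplitude 1/sqrt(2^n)."""
    amp = 1 / math.sqrt(2 ** n)
    return {"".join(bits): amp
            for bits in itertools.product("01", repeat=n)}

state = uniform_superposition(3)
print(len(state))     # 8 basis states represented by only 3 qubits
print(state["101"])   # each with amplitude 1/sqrt(8)
```

Note the catch that the article's later sections touch on: all 2^n amplitudes exist at once, but a measurement returns only one bit string, so useful algorithms must arrange for interference to concentrate probability on the answer.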

Additionally, quantum algorithms promise large speedups on the optimisation and linear-algebra workloads at the heart of AI and machine learning applications.

Quantum AI could prevent another AI winter

“Sceptics are correct in that quantum computing is still a field of research and it is a long way from being applied to neural networks,” Cem Dilmegani writes. “However, in a decade, AI could run into another plateau due to insufficient computing power and quantum computing could rise to help the advance of AI.”

Many proposed quantum machine learning algorithms are best characterised as "heuristics," meaning they come with no formal proof supporting their performance. That was the case until last year.

Last year, IBM Research announced that it found "mathematical proof" of a quantum advantage for quantum machine learning. The proof came in the form of a classification algorithm that, provided access to "classical data," delivered a "provable exponential speedup" over classical ML methods. While there are plenty of caveats to go along with that statement, it provides a glimpse into one potential future where quantum AI is feasible.

They proved that the discrete logarithm problem can be efficiently solved on a quantum computer using Shor's famous algorithm. You can read about it in depth on their blog.
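For context on why that result matters: the discrete logarithm problem is easy to state but classically expensive, with generic classical attacks scaling exponentially in the bit length of the modulus, while Shor's algorithm solves it in polynomial time on a quantum computer. A toy brute-force classical solver (illustrative only; cryptographic instances use moduli hundreds of digits long):

```python
def discrete_log(g: int, h: int, p: int) -> int:
    """Find x with g**x % p == h by brute-force scan. The running time
    grows exponentially in the bit length of p, which is what makes the
    problem classically hard; Shor's algorithm needs only polynomial time."""
    value = 1
    for x in range(p - 1):
        if value == h:
            return x
        value = (value * g) % p
    raise ValueError("no solution")

# Toy instance: 3 is a generator modulo the prime 17.
print(discrete_log(3, 13, 17))  # 3**4 % 17 == 13, so the answer is 4
```

The asymmetry (easy to verify with `pow(g, x, p)`, hard to invert) is exactly what Diffie-Hellman-style cryptography relies on, and what Shor's algorithm undermines.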

Difficulties with Quantum Computers

According to BBVA OpenMind, these are the issues we're currently facing in quantum computing:

Interference — During the computation phase of a quantum calculation, the slightest disturbance in a quantum system (say, a stray photon or a wave of electromagnetic radiation) causes the quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Current quantum computers typically suppress decoherence by isolating the qubits from their environment as well as possible. The trouble is that as the number of qubits multiplies, this isolation becomes extremely hard to maintain: decoherence is bound to happen, and errors creep in.

Error correction — Given the nature of quantum computing, error correction is critical: even a single error in a calculation can cause the validity of the entire computation to collapse.

Output observance — Closely related to the two problems above: retrieving output data after a quantum calculation is complete risks corrupting it, because observing a quantum state collapses it.

Not to mention that the minimum energy requirement for quantum logical operations is reportedly five times that of classical computers; a quantum CPU will have efficiency and heating problems of its own.
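A back-of-the-envelope model shows why the isolation problem above worsens as qubits multiply. Assuming, as a deliberate simplification, that each qubit decoheres independently with a small probability per time step, the chance that an entire register survives a computation untouched shrinks exponentially in the qubit count:

```python
def survival_probability(n_qubits: int, p_decohere: float, steps: int) -> float:
    """Probability that no qubit decoheres during the computation, under
    the simplifying assumption that each qubit independently decoheres
    with probability p_decohere at each of `steps` time steps."""
    return (1 - p_decohere) ** (n_qubits * steps)

# Even a tiny per-step error rate compounds quickly as qubits multiply:
for n in (5, 50, 500):
    print(n, survival_probability(n, 0.001, 100))
```

With a 0.1% per-step rate over 100 steps, 5 qubits survive untouched roughly 60% of the time, while 500 qubits essentially never do, which is why error correction, rather than isolation alone, is considered the path to large machines.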
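The majority-vote idea behind error correction can be sketched with the classical three-bit repetition code. Quantum codes such as the quantum bit-flip code apply the same principle via syndrome measurements, without directly reading the qubits; the simulation below is only the classical analogue:

```python
import random

def encode(bit: int) -> list[int]:
    """Repetition code: store one logical bit as three physical copies."""
    return [bit, bit, bit]

def noisy(channel_bits: list[int], p_flip: float) -> list[int]:
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in channel_bits]

def decode(channel_bits: list[int]) -> int:
    """Majority vote: corrects any single bit flip."""
    return int(sum(channel_bits) >= 2)

random.seed(0)  # make the experiment repeatable
trials = 10_000
errors = sum(decode(noisy(encode(1), 0.05)) != 1 for _ in range(trials))
print(errors / trials)  # far below the raw 5% physical error rate
```

A logical error now requires two of the three copies to flip, so the logical error rate drops from 5% to well under 1%, at the cost of redundancy; quantum error correction pays a similar (much larger) overhead in extra qubits.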

Applications

One potential application of quantum machine learning is simulating quantum systems — for instance, chemical reactions that might yield insights leading to next-generation batteries or new drugs. This might entail creating models of the molecules of interest, having them interact, and using experimental data on how the actual compounds interact as training data to help improve the models.

Quantum computing is also a potential game changer in conventional computational theory: quantum computers hold out the possibility of tackling problems that are impossible or nearly impossible to calculate on a classical computer, including, some hope, the notoriously hard class computer science calls "NP-complete" (although no efficient quantum algorithm for NP-complete problems is currently known). Picking out a single pattern from a collection of patterns, such as your mother's face in a photo of people, is easy for you but beyond the reach of your PC. Google, for instance, reported a machine that needed only 200 seconds to solve a problem it estimated would take the world's fastest supercomputer 10,000 years.
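One way to quantify the pattern-search example above: finding one marked item among N with no exploitable structure takes about N/2 classical lookups on average, while Grover's quantum search algorithm needs only on the order of (π/4)·√N iterations, a quadratic speedup. A small sketch comparing the two query counts:

```python
import math

def classical_queries(n_items: int) -> int:
    """Expected lookups to find one marked item by unstructured scan."""
    return n_items // 2

def grover_queries(n_items: int) -> int:
    """Grover iterations needed: roughly (pi/4) * sqrt(N)."""
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

# The gap widens dramatically as the search space grows:
for n in (10 ** 3, 10 ** 6, 10 ** 9):
    print(n, classical_queries(n), grover_queries(n))
```

A quadratic speedup is far weaker than the exponential one Shor's algorithm offers, but it applies to any brute-force search, which is why Grover's algorithm is so often cited for pattern-matching tasks like the one above.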

According to IBM, quantum computing could enable a range of disruptive use cases for providers and health plans by accelerating diagnoses, personalising medicine, and optimising pricing. Quantum-enhanced machine learning algorithms are particularly relevant to the sector.

Even though an eventual switch to quantum computing may well be inevitable, it is still quite far from being a reality. Massive improvements are needed in error correction and power consumption to make it feasible on a wider scale; although we are seeing some of its applications in the world today, it remains far from extensive use across industries.

Bibliography:

Quantum Computing and AI: https://www.bbvaopenmind.com/en/technology/digital-world/quantum-computing-and-ai/

Is Quantum Computing the Future of AI?: https://www.datanami.com/2021/11/11/is-quantum-computing-the-future-of-ai/

The Father of Quantum Computing: https://www.wired.com/2007/02/the-father-of-quantum-computing/

Paul Benioff Wiki: https://en.wikipedia.org/wiki/Paul_Benioff#Quantum_Computing
