Quantum Computing: The Origin and Its Applications

You have almost certainly heard the word ‘computing’, and you may also have heard the term ‘quantum’. It is far less likely, however, that you have heard the two together.

The term ‘quantum computing’ has yet to gain much traction in the tech world, and those who have explored the subject may find it confusing, to say the least. Yet many experts believe that quantum computing is not just the future of technology but a turning point for humanity, as we move beyond the classical computer bit and into computing based on the subatomic world.

If you don’t have a clue what we are talking about, you are not alone. Stay with us through this article, where we will discuss quantum computing in detail: what it is, how it will change the tech world, and its practical implications (for better and for worse).

But before we usher in the discussion of this potentially life-changing advancement, it is necessary to discuss the platform on which its foundation rests: quantum theory.

The Potential Enabler of Quantum Computing 

The industrial revolution that stretched from the 19th into the 20th century was one of the greatest milestones of modern history. From the invention of the automobile to industrial steel, elevators, and aircraft, it gave birth to a plethora of things that now define our civilization and will continue to shape our future.

Enter the 21st century, and we are watching a transition from the tangible world to the intangible (virtual) one: computer technology, its hardware and software, and the World Wide Web. Among the many incredible developments of this technological revolution is the colossal progress in physics, specifically quantum theory. We will keep the explanation of quantum theory as simple as possible to make this an interesting and informative read.

Modern Physics

It is important to understand that physics is divided into two broad branches: classical and modern. The former took shape during the Renaissance and the Scientific Revolution that followed, and continued to progress thereafter. Classical physics is built largely on the ideas of Galileo and Newton, and its principles focus primarily on the macroscopic (visible to the naked eye), tangible nature of the world around us.

Conversely, modern physics analyzes matter and energy at microscopic levels. It deals heavily with electromagnetism, the wave nature of light and matter, and wave-particle duality. It is interesting to note that all these motifs of modern physics stem from quantum theory.

While we are at it, it is important to clarify that quantum theory does not refer to a single idea or hypothesis; it is a set of interrelated principles. We will discuss them briefly and simply, focusing on those relevant to quantum computing.

  • The work of physicists Max Planck and Albert Einstein in the early 20th century theorized that energy can exist in discrete units, or ‘quanta’. This hypothesis contradicted classical physics, which held that energy exists only as a continuous spectrum.
  • In the following years, Louis de Broglie extended the theory by suggesting that at microscopic (atomic and subatomic) levels there is little difference between particles of matter and energy, and that either can behave as a particle or a wave depending on the conditions.
  • Lastly, Werner Heisenberg proposed the uncertainty principle, which states that complementary properties of a subatomic particle (such as position and momentum) cannot both be measured to arbitrary accuracy at the same time.

Niels Bohr’s Interpretation of Quantum Theory: The Primal Basis of Quantum Computing

While the stipulations of quantum theory were being extensively debated among leading physicists, Niels Bohr offered an important interpretation: the properties, or the reality, of any quantum system (an environment governed by wave-particle duality) are not determined or specified until they are actually measured.

This assertion led to the development of the Principle of Superposition, which, in simple words, holds that a quantum system exists in all of its possible states at the same time until someone measures it and pins down the exact state. The famous Schrödinger’s Cat thought experiment is an easy way to grasp this concept: a cat enclosed in a box (treated as a quantum system) with poison is considered both dead and alive simultaneously until the box is opened and the cat is observed.

Use of Superposition to Develop Computer Algorithms 

Now, this is the point where the theory demonstrates its potential as the basis of a new kind of computer algorithm. To understand quantum-based algorithms, it is essential to first understand how conventional computing systems work.

Whether it’s a handheld gadget or a supercomputer humming in a Google server room, at its core every computing device works on binary logic. In conventional computing systems, every bit of information exists in one of two states: 0 or 1 (hence ‘binary’).

Quantum algorithms, on the other hand, are inspired by the idea that any particle-wave system can exist in multiple states at any given time (the Principle of Superposition). When data is stored in a quantum system, each quantum bit (also referred to as a ‘qubit’) can hold a superposition of 0 and 1 rather than just one of the two. This property makes qubits far more powerful (and, for now, far more expensive to build) than conventional computing bits.
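A minimal sketch may make the superposition idea concrete. Here a qubit is represented by two complex amplitudes, alpha and beta (names of our choosing), and measurement yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes; this is standard quantum mechanics, not a particular vendor’s API.

```python
import math

def measurement_probabilities(alpha: complex, beta: complex) -> tuple[float, float]:
    """Return (P(0), P(1)) for the qubit state alpha|0> + beta|1>."""
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    # A valid qubit state must be normalized: probabilities sum to 1.
    assert math.isclose(p0 + p1, 1.0, abs_tol=1e-9), "state must be normalized"
    return p0, p1

# An equal superposition: alpha = beta = 1/sqrt(2) gives a 50/50 outcome.
p0, p1 = measurement_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2))
print(p0, p1)  # both approximately 0.5
```

Unlike a classical bit, which simply *is* 0 or 1, the qubit’s pair of amplitudes carries continuous information right up until measurement collapses it.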

Standard Binary Computing Vs Quantum Computing 

The fact that a quantum bit can exist in multiple states gives quantum computing an uncontested edge over conventional binary computing. With the help of a simple example, we will try to demonstrate how superior quantum computing could be in comparison to its classical counterpart. 

For example, picture a cylindrical rod whose two ends represent the states of a classical bit: 1 and 0. That’s it; when one end is a 1, the other must be a 0, and there is no in-between.

A qubit, on the other hand, exists in every possible state simultaneously: it can correspond to any point on the surface of the rod, not just its two ends.

This simple illustration shows that qubits can hold an unprecedented amount of information, and that computing governed by this kind of algorithm can far exceed the processing of any classical computing machine.
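The scaling behind that claim can be stated precisely: describing an n-qubit register classically requires 2^n complex amplitudes, a well-known fact of quantum mechanics. This tiny sketch (function name is ours) shows how quickly that count explodes.

```python
# An n-qubit register is described by 2**n complex amplitudes, so the
# classical description of the state doubles in size with every qubit added.

def amplitude_count(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {amplitude_count(n):,} amplitudes")
```

At 50 qubits the description already exceeds a quadrillion amplitudes, which is why simulating even modest quantum systems strains classical machines.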

Apart from storing more information than classical computers, quantum computing can also exploit the principle of entanglement. In simple words, entangled qubits remain correlated no matter how far apart they are: measuring one instantly tells you about the state of the other. This feature further multiplies the processing capability of a quantum computer.
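The canonical example of entanglement is the Bell state (|00> + |11>)/sqrt(2), where the two qubits’ measurement outcomes always agree. The toy simulation below (our own illustrative construction, using nothing beyond the standard library) samples measurements from that state and shows that only the correlated outcomes "00" and "11" ever occur.

```python
import math
import random

# Amplitudes of the four two-qubit basis states for the Bell state.
BELL_STATE = {
    "00": 1 / math.sqrt(2),
    "01": 0.0,
    "10": 0.0,
    "11": 1 / math.sqrt(2),
}

def measure(state: dict[str, float]) -> str:
    """Sample a basis state with probability |amplitude|^2."""
    outcomes = list(state)
    weights = [abs(a) ** 2 for a in state.values()]
    return random.choices(outcomes, weights=weights)[0]

samples = [measure(BELL_STATE) for _ in range(1000)]
# The two bits always agree: only "00" and "11" appear, never "01" or "10".
print(set(samples))
```

Note that this classical sampler only mimics the measurement statistics; it cannot reproduce the full non-local behavior of real entangled qubits.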

Beneficial Uses of Quantum Computing

The supreme processing capabilities of quantum computers make them ideal machines for many tasks where conventional computers lag behind.

Science and Life Sciences 

The study of complex atomic and molecular structures and reactions is no mean feat; simulating such processes demands enormous computing capacity. For instance, the complete, exact simulation of all but the very simplest molecules is beyond available conventional computing technology. Quantum computing could therefore play a significant role in uncovering many concealed facts of nature, and of life in particular. Significant chemical, physical, and biological research that has been stalled for years could take off once quantum computers are developed.

Artificial Intelligence and Machine Learning 

Even though scientists have made significant inroads into machine learning and AI with existing computing resources, quantum computing could deliver the progress we have long aspired to: a machine as intelligent as human cognition. Machine learning feeds on big data; the development of any machine-learning system involves processing humongous datasets.

With the fast processing of quantum computers, even routine machine learning will become more streamlined. In addition, the unrestrained computing power of quantum devices could revamp the development of artificial intelligence.

Improvement of General Optimization Procedures 

In today’s bustling life, we feel the need for optimization more than ever, in both personal and commercial dealings. Whether it is an individual finding the best commute between day-to-day destinations or a financial firm tailoring a plan to each unique customer, good optimization requires accounting for many variables at once.

With more variables come more permutations and combinations, and the amount of data to be processed grows explosively. Optimizing a single financial plan might require processing several petabytes. Bringing such extensive optimization into everyday activities will only be achievable with the processing power of quantum computers.
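The combinatorial explosion behind these optimization problems is easy to quantify. Taking route planning as a hypothetical example: visiting n destinations in the best order means, naively, comparing n! possible routes, and that count outruns any classical machine long before the raw data does.

```python
import math

# Why optimization blows up: the number of orderings of n stops is n!,
# which grows far faster than the data describing the stops themselves.
for n in (5, 10, 15, 20):
    print(f"{n} stops -> {math.factorial(n):,} possible routes")
```

Already at 20 stops there are more than 2.4 quintillion routes, which is why realistic optimizers rely on heuristics, and why the raw search power promised by quantum computing is so attractive here.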

Other Side of the Coin: The Dangers Involved with Quantum Computing 

One should not be surprised by this heading. History shows that the advent of any new technology intended for the benefit of humankind is soon followed by its misuse, and quantum computing is no exception. Worse still, the unrestrained processing power a quantum computer can harness makes its exploitation all the more dangerous. It is worth mentioning that researchers working in the domain are well aware of these unwanted repercussions.

Quantum Computing Puts Data Encryption Practices in Great Danger

Digitization of our everyday activities has shifted nearly every valuable piece of information into the digital form of data. From nuclear codes to personal banking information, everything now exists in the form of digitized data. For that matter, data is now considered a precious commodity. 

As we know, every precious commodity is vulnerable to vandalism, breaches, and theft. To address this vulnerability, computer scientists have developed encryption schemes that lock down data so that only authorized parties can access it.

Encryption can only be undone with the corresponding decryption key, which is held by the authorized parties. Without that key, the only way around encryption is a technique called brute-force cracking: trying every possible key. It is important to mention, however, that brute force is only practical against simple passwords and weak encryption with keys of only a few bits.

The Advanced Encryption Standard (AES), used in most professional-grade data encryption, is effectively immune to brute force on classical hardware. Let’s put this supremacy into numbers.

As per researchers’ calculations, cracking a 128-bit AES key by brute force would take 1.02 × 10^18 years, assuming the attack is run on a supercomputer rated at 10.51 petaflops. That figure is more than a billion billion years; to put things into perspective, our universe is only about 13.8 billion years old. It is therefore impossible for a standard 128-bit AES key to be battered down by brute-force cracking on conventional computers.
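The article’s figure can be reproduced with simple arithmetic, under one assumption of our own: that testing a single key costs roughly 1,000 floating-point operations (the original calculation’s exact assumptions are not stated).

```python
# Back-of-the-envelope reproduction of the brute-force estimate.
KEYSPACE = 2 ** 128                  # number of possible 128-bit keys
FLOPS = 10.51e15                     # 10.51-petaflop supercomputer
OPS_PER_TRIAL = 1_000                # assumed cost of testing one key
SECONDS_PER_YEAR = 365.25 * 24 * 3600

trials_per_second = FLOPS / OPS_PER_TRIAL
years = KEYSPACE / trials_per_second / SECONDS_PER_YEAR
print(f"{years:.2e} years")  # ~1.03e+18 years, matching the cited estimate
```

Even if the per-key cost were a single operation, the result would still be on the order of 10^15 years, vastly longer than the age of the universe.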

Remember, though, that a supercomputer is still built on a binary design in which each bit has only two possible states. Replace that two-state bit with a qubit and its continuum of superposition states, and the tables turn.

The 128-bit key that looks so formidable against the brute force of classical binary supercomputers could fall flat against quantum-powered cracking. No large-scale, fault-tolerant quantum computer exists as of today, but some experts have estimated that a mature quantum machine could crack a 128-bit encryption key within about 100 seconds.
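One way such an estimate could arise is via Grover’s algorithm, which searches an unsorted space of N items in roughly sqrt(N) quantum steps, cutting a 128-bit search from 2^128 to about 2^64 operations. The machine speed below is purely hypothetical, chosen only to show how a "~100 seconds" figure is arithmetically plausible; no existing quantum hardware runs anywhere near it.

```python
import math

# Grover's algorithm: ~sqrt(N) steps to search N items.
grover_steps = math.isqrt(2 ** 128)      # exactly 2**64 ≈ 1.8e19 operations
HYPOTHETICAL_OPS_PER_SECOND = 1.8e17     # assumed future quantum machine speed
seconds = grover_steps / HYPOTHETICAL_OPS_PER_SECOND
print(f"about {seconds:.0f} seconds")
```

The honest caveat: 2^64 quantum operations is still an enormous workload, so such timelines depend entirely on speculative hardware performance.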

Aftermath 

The aftermath of such a scenario would be nothing short of a technological dystopia. With data encryption rendered ineffective, everything would be exposed to the shenanigans of criminal elements. To grasp just a fraction of the devastation, imagine every person connected to the banking system losing access to his or her account. The mere thought of such a situation is enough to send chills down the spine.

Apart from that, the neutralization of data encryption could trigger cyber warfare between nation-states, with rogue elements well placed to capitalize on the chaos. A global outbreak of war in a world with several nuclear powers could end in a dreadful outcome. All things considered, the arrival of quantum computing could bring many irretrievable repercussions.

Preparation to Protect Against the Shenanigans of Quantum Computing 

Google and IBM have already carried out quantum computations in controlled environments, so dismissing quantum computers as a distant reality would not be an insightful judgment. Businesses should therefore start preparing now for the abuse of quantum computing against standard encryption; there is no point in waiting for formal rules and protocols to be issued. Experts in digital security and cryptography recommend several measures to protect business data from future exploitation by quantum computing.

Increase the Key Size

The most basic step a business can take to protect data is to increase the key size of its encryption algorithms. For instance, an organization using 128-bit AES keys would do well to move to the 256-bit version, and asymmetric keys (such as RSA) should likewise be lengthened.
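The reasoning behind doubling symmetric key sizes is a widely cited rule of thumb: Grover’s algorithm roughly halves the effective bit strength of a symmetric key. The sketch below encodes that heuristic (the function is our own naming, and this is an approximation, not a formal security proof).

```python
# Rule of thumb: Grover's quantum search halves a symmetric key's
# effective security, so doubling the key size restores the margin.

def post_quantum_strength(key_bits: int) -> int:
    """Approximate effective security of a symmetric key against Grover."""
    return key_bits // 2

for bits in (128, 256):
    print(f"AES-{bits}: ~{post_quantum_strength(bits)}-bit post-quantum security")
```

By this heuristic, AES-256 retains roughly 128-bit security even against a quantum adversary, which is why it is the commonly recommended target. Asymmetric schemes like RSA fare worse: Shor’s algorithm breaks them outright rather than merely halving their strength, so larger RSA keys buy far less safety.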

Moving to Hash-Based Cryptography 

Dropping conventional AES and RSA encryption in favor of hash-based cryptography might also protect data from quantum-driven decryption. Researchers believe hash-based techniques can produce signature schemes that withstand quantum brute force. For now, however, hash-based signatures have one major downside: each key can only sign a limited number of messages. There are bright prospects that hash-based schemes will improve as we head into the age of quantum computing, and it is speculated that NIST (the National Institute of Standards and Technology) will standardize hash-based signatures for commercial use by next year.

Using a Combination 

Some experts also suggest that combining established encryption algorithms such as RSA, AES, and ECC with newly proposed post-quantum methods in hybrid schemes can build a formidable barrier against decryption exploits driven by quantum computing.

And lastly, it is important to stay in touch with tech experts to keep tabs on the development of quantum computing and the change of pertinent policies and rules. In today’s age, a venture that can’t protect its data certainly can’t protect its own and consumers’ interests. 

Conclusion 

The way technology has progressed over the last few decades clearly indicates that quantum computing is the reality of the future. The arrival of quantum computers is not a question of ‘if’ but of ‘when’. For all its benefits to the life sciences, the financial sector, and AI, quantum computing poses a grave threat to the existing encryption systems that are central to protecting confidential data. The proper approach for any business is to accept this unwanted aspect of quantum computing as a technological hazard and start preparing against it with the help of experts.