Quantum mechanics is the theory that describes the behaviour of particles at the atomic and subatomic scales. It emerged in the late 1800s and early 1900s, when a series of experimental observations convinced scientists that atoms did not obey the rules of classical mechanics, in which everyday objects exist in a specific place at a particular time. This replaced the traditional picture of electrons circling a nucleus with orbitals that represent the probability of finding an electron in a given region at any given time. Electrons can jump from one orbital to another as they gain or lose energy, but they cannot be found between orbitals. From this idea, and over many decades, the rules of quantum mechanics were unveiled, allowing scientists to build devices that exploit them. The result was the first quantum revolution: the invention of the transistor, the laser, and the atomic clock, which gave us computers, optical fibre communications and the global positioning system.
Quantum technology is getting so much attention because we are in the early stages of a second quantum revolution, with scientists now able to control individual atoms, electrons and photons. This control allows the scientific community to build high-speed quantum computers, interception-proof quantum communication and hyper-sensitive quantum measurement methods, all of which established technology companies worldwide are now racing to harness, redefining the limits of our technology and, with it, the fabric of our everyday lives.
Classical computers have billions of transistors that switch on or off to represent a value of 0 or 1; hence, in classical computing, we talk about binary digits or bits. In contrast, quantum computers process data using quantum bits, or qubits, which, thanks to the laws of quantum mechanics, can exist in a superposition of states: each qubit can be 0, 1, or both at the same time.
The magic of quantum computers happens when these qubits are entangled. Entanglement is a type of correlation that ties qubits together, so that the state of one qubit cannot be described independently of the others. By leveraging superposition and entanglement, quantum computers can speed up computation and do things that classical computers cannot.
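Superposition and entanglement can be illustrated numerically with plain state vectors. The following is a toy NumPy sketch (not part of the original article, and of course not a real quantum computer): a superposition assigns amplitudes to 0 and 1 at once, and measuring an entangled Bell pair always yields perfectly correlated outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-qubit basis states |0> and |1> as 2-component vectors.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# A superposition: equal amplitudes of 0 and 1.
plus = (zero + one) / np.sqrt(2)

# The probability of each measurement outcome is the squared amplitude.
probs = plus**2  # equal chance of reading 0 or 1

# An entangled Bell pair: (|00> + |11>)/sqrt(2), a vector over the
# four two-qubit outcomes 00, 01, 10, 11.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Sampling measurements of both qubits: only "00" (index 0) and
# "11" (index 3) ever occur, so the two results are perfectly correlated.
outcomes = rng.choice(4, size=1000, p=bell**2)
print(sorted(set(outcomes)))
```

The correlation here is the point: neither qubit has a definite value before measurement, yet their measured values always agree.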
Entangled qubits can be created in many different ways, for example with superconducting electronic circuits, trapped ionized atoms, or squeezed particles of light (photons). Each technology is currently trying to preserve the quantum effects for as long as possible while scaling up from the current hundreds of qubits to the targeted million that would forever redefine the boundaries of computing technology.
Post-quantum cryptography (also known as quantum-proof, quantum-safe, or quantum-resistant) refers to cryptographic algorithms that are thought to remain secure against attack by future quantum computers. These algorithms are called post-quantum because most standard algorithms today rely on mathematical problems that are hard enough to defend against modern computers but would not resist a quantum computer once it reaches a sufficient number of qubits.
Quantum cryptography, on the other hand, also known as Quantum Key Distribution (QKD), describes the use of quantum effects to enable unconditionally secure key distribution between two legitimate users, guaranteed by the fundamental laws of quantum physics.
Although some people think these two technologies are mutually exclusive, they are meant to be allies in securing future communications.
Quantum Key Distribution is a cryptographic method by which two parties, Alice and Bob, securely establish a shared key for encoding messages, over optical fibre or through free space. To create the key, Alice first encodes random bits into quantum signals (extremely weak pulses of light) and transmits them through the channel. Bob then measures the state of the arriving photons and obtains data that is partially correlated with the data encoded by Alice. From this data, a secret key can be distilled using error correction and privacy amplification.
If a hacker tries to read the information encoded in the photons sent by Alice, the measurement irreversibly changes their properties, because quantum states cannot be cloned or copied. Bob then receives quantum signals that are not correlated with Alice's as they should be, revealing that someone has tried to intercept the message. Alice and Bob discard the compromised key and repeat the process until they obtain one that is guaranteed to be free from attacks.
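The key-establishment and eavesdropper-detection steps above can be sketched with a toy simulation in the style of the BB84 protocol. This is a simplified intercept-and-resend model (all names and parameter values here are illustrative, not taken from the article): Alice encodes random bits in random bases, Bob measures in random bases, and they keep only the rounds where the bases matched. An eavesdropper who measures and resends the photons inevitably raises the error rate of the sifted key to roughly 25%, which is how the attack is detected.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

# Alice picks random bits and random encoding bases
# (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

def transmit(eavesdrop):
    """Send Alice's photons to Bob, optionally through an eavesdropper."""
    bits = alice_bits.copy()
    if eavesdrop:
        # Eve measures each photon in a random basis and resends it;
        # a wrong-basis measurement randomizes the bit she resends.
        eve_wrong = rng.integers(0, 2, n) != alice_bases
        bits[eve_wrong] = rng.integers(0, 2, eve_wrong.sum())
    bob_bases = rng.integers(0, 2, n)
    bob_bits = bits.copy()
    # Bob's wrong-basis measurements also yield random results.
    mismatch = bob_bases != alice_bases
    bob_bits[mismatch] = rng.integers(0, 2, mismatch.sum())
    return bob_bases, bob_bits

def qber(eavesdrop):
    """Quantum bit error rate of the sifted key."""
    bob_bases, bob_bits = transmit(eavesdrop)
    sifted = bob_bases == alice_bases   # keep rounds with matching bases
    return (alice_bits[sifted] != bob_bits[sifted]).mean()

qber_clean = qber(False)
qber_eve = qber(True)
print(f"QBER without Eve: {qber_clean:.3f}")  # 0 in this noiseless model
print(f"QBER with Eve:    {qber_eve:.3f}")    # ~0.25, revealing the attack
```

A real system also sees errors from channel noise, so in practice Alice and Bob abort only when the measured error rate exceeds a protocol-dependent threshold.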
In Discrete Variable QKD (DV-QKD), the emitter (Alice) prepares and sends to a receiver (Bob) quantum signals consisting of single photons with encoded random data. The encoding follows a specific QKD protocol and uses a discrete-valued degree of freedom of the photons, such as polarization, time-bin, or linear momentum. At the receiver, Bob measures the state of the arriving photons using single-photon detectors to distill a secret key.
In Continuous Variable QKD (CV-QKD), the quantum signals typically consist of coherent states of light with information encoded in the quadratures of the electromagnetic field. Instead of single-photon detectors, CV-QKD uses coherent homodyne or heterodyne detection (known in telecommunications as phase-diversity homodyne detection) to retrieve the quadrature values of the signal and thus distill a secret key.
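The difference from DV-QKD can be made concrete with a minimal toy model of Gaussian-modulated CV-QKD (a sketch under simplifying assumptions; the modulation variance and channel transmittance below are illustrative numbers, not from the article): Alice draws a continuous, Gaussian-distributed quadrature value for each coherent state, and Bob's detector recovers an attenuated, shot-noise-corrupted copy. The data are continuous and only partially correlated, and it is that correlation that error correction and privacy amplification turn into a key.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Alice's Gaussian modulation of one quadrature, in shot-noise units.
v_mod = 4.0  # illustrative modulation variance
x_alice = rng.normal(0.0, np.sqrt(v_mod), n)

# The channel attenuates the signal, and the coherent detector adds
# (at least) one unit of vacuum (shot) noise to the measured quadrature.
transmittance = 0.5
x_bob = np.sqrt(transmittance) * x_alice + rng.normal(0.0, 1.0, n)

# Alice's and Bob's quadrature records are partially correlated;
# this shared information is the raw material for the secret key.
corr = np.corrcoef(x_alice, x_bob)[0, 1]
print(f"Alice-Bob quadrature correlation: {corr:.2f}")
```

In this model the correlation is set by the modulation variance, the channel loss, and the detector noise, which is why CV-QKD performance degrades smoothly with distance rather than photon by photon.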
Standardization and certification of QKD technology are vital to enable market penetration and to ensure equipment interoperability and a robust supply chain. To that end, the standards are comprehensive: they define frameworks covering all aspects of the technology and its implementation into a complete system, including performance, best operational practices, and security specifications, to name a few.
All the major standards organizations (national, European, and worldwide) began writing specifications for QKD systems years ago, an indicator of both the increased maturity of the technology and the strong interest in applying and commercializing it.
For further information on QKD standardization, you can read the comprehensive analysis run by OpenQKD here.
Rather than competing, mathematical (post-quantum) cryptography and quantum cryptography are complementary. This view emerges in conversations with encryption and telecom providers and is supported by the European Commission's plans to deploy EuroQCI. The idea is to continuously monitor the evolution of all these technologies and to put together a roadmap that leverages security protocols based on both physical and mathematical complexity.
Numerous European projects and private companies are investing and researching to make the technology affordable, bringing together production experts and the know-how of end-user companies and network infrastructure owners.
European-made technology is preferred for these systems to ensure European sovereignty. For that reason, the European Union, through many funding programmes and industry consortiums such as the European Quantum Industry Consortium (QuIC), is stimulating potential makers and suppliers to develop and produce all critical components in Europe over the coming years.