quantum computing · 5 min read

100% Quantum and Probabilistic

Quantum computing is a paradigm that exploits the quantum mechanical properties of matter and light to perform computations that are beyond the reach of classical computers. Quantum computing has the potential to revolutionize various fields and applications, such as artificial intelligence, cryptography, optimization, simulation, machine learning, and more. However, quantum computing also requires a different way of thinking and programming than classical computing, as it involves uncertainty, randomness, and probability at its core.

Classical computing is based on deterministic bits, which can be either 0 or 1. Classical algorithms operate on these bits using logical operations, such as AND, OR, NOT, and XOR. Classical algorithms can also use random bits, which are generated by some physical process or algorithm, to introduce randomness or uncertainty in the computation. For example, randomized algorithms can use random bits to make probabilistic choices or estimates that can improve the performance or accuracy of the algorithm.
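
As a concrete illustration of this classical use of randomness, here is a minimal Python sketch (not from the original note) that estimates π by the Monte Carlo method: the more random samples it draws, the more accurate the estimate tends to be.

```python
import random

def estimate_pi(num_samples: int = 100_000) -> float:
    """Estimate pi by sampling random points in the unit square
    and counting how many fall inside the quarter circle."""
    inside = 0
    for _ in range(num_samples):
        x, y = random.random(), random.random()  # random bits driving the algorithm
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi / 4
    return 4.0 * inside / num_samples

print(estimate_pi())  # more samples -> better estimate, but the answer stays probabilistic
```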

Quantum computing is based on probabilistic qubits, which can be 0, 1, or a superposition of both at the same time. Quantum algorithms operate on these qubits using quantum operations, such as Hadamard, Pauli-X, Pauli-Y, Pauli-Z, CNOT, and Toffoli. Quantum algorithms can also use measurements, which are operations that collapse a qubit into a definite state of 0 or 1 with some probability. For example, quantum algorithms can use measurements to extract the output of the computation or to implement feedback loops that adapt the computation based on previous outcomes.
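
To make this concrete, here is a minimal NumPy sketch of a single qubit, assuming the usual state-vector representation: the qubit is a two-component vector of amplitudes, a gate is a unitary matrix, and measurement samples 0 or 1 with probabilities given by the squared moduli of the amplitudes. The code is illustrative, not from the original note.

```python
import numpy as np

# Computational basis state |0> as a column vector of amplitudes
ket0 = np.array([1.0, 0.0], dtype=complex)

# A couple of standard single-qubit gates
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                 # Pauli-X (NOT)

state = H @ ket0                             # equal superposition (|0> + |1>)/sqrt(2)
probs = np.abs(state) ** 2                   # Born rule: probabilities are |amplitude|^2
outcome = np.random.choice([0, 1], p=probs)  # measurement collapses to 0 or 1

print("amplitudes:", state)                  # [0.707..., 0.707...]
print("probabilities:", probs)               # [0.5, 0.5]
print("measured:", outcome)
```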

The difference between classical and quantum computing is not only in the type of bits and operations used, but also in the nature of probability and uncertainty involved. In classical computing, probability is an external tool that can be added or removed from the computation as needed. In quantum computing, probability is an intrinsic feature that cannot be avoided or ignored.

In classical computing, probability is a measure of ignorance or uncertainty about the state of a system or the outcome of an event. Probability is based on frequencies or degrees of belief that can be updated by new information or evidence. Probability is always non-negative and additive: the probability of a system being in one of several mutually exclusive states is equal to the sum of the probabilities of those states.

In quantum computing, probability is a measure of interference or coherence of the state of a system or the outcome of an event. Probability is based on amplitudes, complex numbers attached to each state, that can be manipulated by quantum operations and collapsed by measurements. The probabilities themselves remain non-negative, but the amplitudes behind them can be positive, negative, or complex, so they can reinforce or cancel one another: the probability of a system ending up in a given state is the squared modulus of the sum of the amplitudes of all the ways it can get there, not the sum of their individual probabilities.
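
A minimal sketch of the contrast, reusing the NumPy conventions above: flipping a fair classical coin twice leaves a 50/50 distribution because probabilities only add, while applying the Hadamard "quantum coin flip" twice returns the qubit to 0 with certainty, because the two amplitudes leading to 1 cancel.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1.0, 0.0], dtype=complex)

# Classical: a "fair coin" stochastic matrix applied twice to a definite start state
C = np.array([[0.5, 0.5], [0.5, 0.5]])
p0 = np.array([1.0, 0.0])
print("classical after two flips:", C @ (C @ p0))        # [0.5, 0.5]

# Quantum: amplitudes, not probabilities, are transformed
state = H @ (H @ ket0)
print("amplitudes after two Hadamards:", state)           # [1, 0]
print("probabilities:", np.abs(state) ** 2)               # [1, 0] -- the paths to |1> cancel
```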

This difference between classical and quantum probability has profound implications for the design and analysis of algorithms. In classical computing, we can assume that the state of a system or the outcome of an event is deterministic until we introduce randomness or uncertainty by using random bits or making probabilistic choices. In quantum computing, we have to assume that the state of a system or the outcome of an event is probabilistic until we remove randomness or uncertainty by using measurements or making definite choices.

This means that in quantum computing, we have to embrace the uncertainty of quantum mechanics and remove all classical computation and deterministic assumptions by default. We have to run our algorithm on a quantum processor that can manipulate qubits and perform quantum operations and measurements. We have to use quantum features such as superposition (the ability of qubits to exist in a linear combination of two states), interference (the phenomenon that occurs when two or more quantum states combine to form a new quantum state), entanglement (the phenomenon that occurs when two or more qubits share a quantum state and become correlated with each other), and teleportation (the process of transferring the quantum state of one qubit to another distant qubit without physically sending the qubit itself).
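
As one small illustration of superposition and entanglement together (a sketch in the same NumPy style, not the note's own code), a Hadamard followed by a CNOT produces a Bell pair: each qubit alone looks random, but the two measurement outcomes are always perfectly correlated.

```python
import numpy as np

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0                                   # two-qubit state |00>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qubit, then CNOT with the first qubit as control
bell = CNOT @ (np.kron(H, I) @ ket00)            # (|00> + |11>)/sqrt(2)
probs = np.abs(bell) ** 2
print("probabilities over 00,01,10,11:", probs)  # [0.5, 0, 0, 0.5]

# Sampled measurements are random but perfectly correlated between the qubits
samples = np.random.choice(["00", "01", "10", "11"], size=5, p=probs)
print(samples)                                   # only "00" and "11" ever appear
```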

By doing so, we can exploit the power and potential of quantum computing and achieve speedup or advantage over classical computing for certain problems. For example:

  • Shor’s algorithm: This is an algorithm that can factor large numbers into their prime factors in polynomial time, using the quantum Fourier transform to find the period of a modular exponentiation (see the period-finding sketch after this list). This can break many classical cryptographic schemes that rely on the hardness of factoring, such as RSA.
  • Grover’s algorithm: This is an algorithm that can search an unsorted database of N items in roughly √N steps using amplitude amplification, repeated reflections about the marked item and about the mean (see the amplification sketch after this list). This can speed up many classical search problems, such as finding a needle in a haystack or recovering a password from its hash by brute force.
  • Quantum machine learning: This is a branch of machine learning that uses quantum algorithms and quantum data to perform tasks such as classification, regression, clustering, dimensionality reduction, and more. This can enhance the performance and efficiency of machine learning models, such as neural networks or support vector machines.
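
To ground the factoring example, here is a hedged sketch of the classical half of Shor's algorithm: once the period r of a^x mod N is known, the factors of N follow from gcd(a^(r/2) ± 1, N). The quantum speedup lies entirely in finding r; the brute-force period search below, and the choice N = 15, a = 7, are purely illustrative.

```python
from math import gcd

def find_period_classically(a, N):
    """Brute-force the period r with a^r = 1 (mod N).
    Shor's algorithm replaces this step with the quantum Fourier transform."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_postprocessing(N, a):
    r = find_period_classically(a, N)
    if r % 2 != 0:
        raise ValueError("odd period; pick another a")
    y = pow(a, r // 2, N)
    if y == N - 1:
        raise ValueError("trivial root; pick another a")
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical_postprocessing(15, 7))  # (3, 5): the period of 7 mod 15 is 4, and 7^2 = 4 (mod 15)
```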

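And here is a toy simulation of Grover-style amplitude amplification over N = 8 items (the marked index and iteration count are illustrative): each iteration flips the sign of the marked amplitude and then reflects all amplitudes about their mean, concentrating probability on the marked item in about √N iterations.

```python
import numpy as np

N = 8                                    # size of the unsorted "database" (3 qubits)
marked = 5                               # index of the item we are searching for

state = np.full(N, 1 / np.sqrt(N))       # uniform superposition over all items

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) Grover iterations
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip the sign of the marked amplitude
    state = 2 * state.mean() - state     # diffusion: inversion about the mean

probs = state ** 2
print("probability of measuring the marked item:", probs[marked])  # ~0.95 for N = 8
```
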
In conclusion, quantum computing is a paradigm that is 100% quantum and probabilistic. It requires us to run our algorithm on a quantum processor and embrace the uncertainty of quantum mechanics, removing all classical computation and deterministic assumptions by default. By doing so, we can use quantum features such as superposition, interference, entanglement, and teleportation to solve problems faster or better than classical computing.
