Sunday, October 16, 2011

QUANTUM COMPUTERS/QUANTUM PROCESSING


A quantum computer is a device for computation that makes direct use of quantum mechanical phenomena, such as superposition and entanglement. Quantum computers are fundamentally different from digital computers, which are based on transistors. The basic principle behind quantum computation is that quantum properties can be used to represent data and perform operations on these data.1 A theoretical model is the quantum Turing machine, also known as the universal quantum computer. Quantum computers share theoretical similarities with non-deterministic and probabilistic computers (e.g., the ability to be in more than one state simultaneously). The field of quantum computing began with Richard Feynman in 1982.2
Quantum computing is still new: experiments have been carried out in which quantum computational operations were executed on only a very small number of qubits (quantum bits), and quantum computers are being developed for cryptanalysis.3
Large-scale quantum computers could solve certain problems much faster than any classical computer running the best currently known algorithms, such as integer factorization using Shor's algorithm or the simulation of quantum many-body systems. There also exist quantum algorithms, such as Simon's algorithm, which run faster than any possible probabilistic classical algorithm.4 Classical resources are never unlimited in practice, and the computational basis of 500 qubits, for example, would already be too large to be represented on a classical computer, because it would require 2^500 complex values to be stored.5
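The exponential storage cost mentioned above can be made concrete with a short sketch. The function name and the 16-bytes-per-amplitude figure (one complex number as two 64-bit floats) are illustrative assumptions, not from the source:

```python
# Sketch: memory a classical computer would need to store the full
# state vector of n qubits. Each of the 2**n amplitudes is a complex
# number, stored here as two 64-bit floats (16 bytes).

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required to hold 2**n complex amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits -> {state_vector_bytes(n):,} bytes")
```

Already at 50 qubits the count runs into petabytes, and at 500 qubits the number of amplitudes (2^500) exceeds the number of atoms in the observable universe.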
A classical computer has a memory made up of bits, whereas a quantum computer maintains a sequence of qubits. A single qubit can represent a one, a zero, or, crucially, any quantum superposition of these two states; a pair of qubits can be in any quantum superposition of 4 states, and three qubits in any superposition of 8. In general, a quantum computer with n qubits can be in an arbitrary superposition of up to 2^n different states simultaneously, while a normal computer can only be in one of these 2^n states at any one time. A quantum computer operates by applying a fixed sequence of quantum logic gates; such a sequence is called a quantum algorithm.
Integer factorization is infeasible on an ordinary computer for large integers that are the product of a few large primes (e.g., products of two 300-digit primes).6 By comparison, a quantum computer could efficiently solve this problem using Shor's algorithm to find the factors; that is, there would be a polynomial-time algorithm (in the number of digits of the integer) for solving the problem. This ability would allow a quantum computer to decrypt many of the cryptographic systems in use today. The popular public-key ciphers, including forms of RSA, are based on the difficulty of factoring integers (or on the discrete logarithm problem, which can also be solved by Shor's algorithm). These ciphers are used to protect secure Web pages, encrypted email, and many other types of data, so breaking these codes would have significant ramifications for electronic privacy and security. However, other existing cryptographic algorithms do not appear to be broken by these quantum algorithms.7
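Shor's algorithm reduces factoring to finding the period r of f(x) = a^x mod N; only the period-finding step needs a quantum computer, while the rest is classical. The following hedged sketch shows that classical part for tiny numbers, with the period found by brute force standing in for the quantum step (N = 15 and a = 7 are illustrative choices, not from the source):

```python
# Classical skeleton of Shor's algorithm: given the period r of
# a**x mod N (found by brute force here, efficiently by the quantum
# part in the real algorithm), recover nontrivial factors via gcd.
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factors(n: int, a: int):
    r = find_period(a, n)
    if r % 2 != 0:
        return None                 # odd period: retry with another a
    half = pow(a, r // 2, n)        # a**(r/2) mod n
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_factors(15, 7))          # prints (3, 5)
```

The brute-force period search takes exponential time in the number of digits of N, which is precisely the step the quantum Fourier transform makes efficient.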







Works Cited

2. David Deutsch. "Quantum computation". Physics World, 1 June 1992.
3. Quantum Information Science and Technology Roadmap, for a sense of where the research is heading.
4. Simon, D.R. (1994). "On the power of quantum computation". Proceedings of the 35th Annual Symposium on Foundations of Computer Science: 116–123.
5. Nielsen, Michael A.; Chuang, Isaac L. Quantum Computation and Quantum Information. p. 17.
6. Arjen K. Lenstra (2000). "Integer Factoring". Designs, Codes and Cryptography 19: 101–128.
7. Daniel J. Bernstein. "Introduction to Post-Quantum Cryptography". In Daniel J. Bernstein, Johannes Buchmann, Erik Dahmen (eds.), Post-Quantum Cryptography. Springer, Berlin, 2009. ISBN 978-3-540-88701-0.
