Using atoms as digital bits will open a completely new era in computer design. Atoms cannot simply be manipulated and used like bits built from transistors: the behavior of matter on the atomic scale follows the rules of modern physics and cannot be understood through our classical description of the world (i.e., Newtonian mechanics or Maxwell's equations of electromagnetism). The physical theory that describes this behavior is called quantum mechanics. Its use in the computer industry will most probably cause a revolution in the way we use and understand computers. The author describes how such a quantum computer, a computer based on the rules of quantum mechanics, may work, and how it could deliver incredible speed and problem-solving power.
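The contrast drawn above, between a classical bit and an atomic-scale bit governed by quantum mechanics, can be illustrated with a minimal sketch. The function names and representation below are illustrative assumptions, not taken from the article: a qubit is modeled as a pair of complex amplitudes for the states |0⟩ and |1⟩, and a Hadamard gate places a definite basis state into an equal superposition, something a transistor-based bit cannot do.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate: maps |0> (or |1>) into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement yields 0 or 1 with probability |a|^2 and |b|^2 respectively."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)             # a definite classical-like state |0>
qubit = hadamard(qubit)        # now a superposition of |0> and |1>
p0, p1 = probabilities(qubit)  # each measurement outcome is equally likely
```

Running the sketch, the measurement probabilities come out as 1/2 and 1/2: unlike a classical bit, the qubit holds both values at once until measured, which is the source of the problem-solving power the abstract alludes to.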
Date of Publication: Dec 2002/Jan 2003