Date: 17-20 Nov 1994

Impact of locality and dimensionality limits on architectural trends
Publication Year: 1994, Page(s): 30-35
Cited by: Papers (1)
Since computing is a physical activity, all forms of computing must obey locality constraints imposed by physics. Many software abstractions unknowingly violate these constraints because they represent high-dimensional topologies with more degrees of freedom than the underlying physical architecture can uniformly implement. This semantic gap between abstractions implemented in the virtual architecture and the physical machine resources results in poor performance for certain classes of computing problems. The paper discusses and analyzes the impact of locality constraints and dimensionality limits upon software and architecture trends, with the specific goals of improved performance, lower cost, and the longevity of architectural investments.

Multiprocessor architectures and physical law
Publication Year: 1994, Page(s): 24-29
Cited by: Papers (2)
We show that all highly symmetrical interconnection topologies for multiprocessors with low diameter require very long interconnect lengths. Therefore, such multicomputers do not scale well in the physical world with 3 dimensions. On the other hand, highly irregular (random) interconnection topologies have a very large subgraph of diameter two and therefore also require very long interconnect lengths. Hence the only scaling topologies for future massively parallel computers are high-diameter regular ones, like mesh networks. The techniques used are symmetry properties in terms of orbits of automorphism groups of graphs, and a modern notion of randomness of individual objects, Kolmogorov complexity.
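The geometric core of the scaling claim can be sketched with a simplified volume argument (this is a standard packing bound, not the paper's own derivation, which uses automorphism orbits and Kolmogorov complexity): $N$ nodes packed at bounded density in 3-space occupy a ball of radius $R = \Theta(N^{1/3})$; if the network diameter is $D$, two nodes at Euclidean distance about $2R$ must be joined by at most $D$ hops, so some wire has length

```latex
\ell_{\max} \;\ge\; \frac{2R}{D} \;=\; \Omega\!\left(\frac{N^{1/3}}{D}\right),
\qquad R = \Theta\!\left(N^{1/3}\right).
```

Thus low-diameter topologies ($D = O(\log N)$) force wire lengths that grow polynomially in $N$, while a 3-D mesh ($D = \Theta(N^{1/3})$) admits constant-length wires.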

Research toward nanoelectronic computing technologies in Japan
Publication Year: 1994, Page(s): 1-4
A perspective of the research activities in Japan aimed at the development of new computing technologies based on structures with ultrasmall dimensions is presented. Examples are given of work toward the development of resonant tunneling circuits, electron-wave interference and single-electron tunneling devices, and atomic-scale fabrication technologies.

A fast algorithm for entropy estimation of grey-level images
Publication Year: 1994, Page(s): 233-238
Examines an efficient approach to the calculation of the entropy of long binary and non-binary 1-D information sequences. The entropy calculation is accomplished in time linear in the sequence length. The method is extended to estimate the entropy of grey-level images which, under raster scanning, may be represented as 1-D information sequences. The entropy estimate obtained depends on the image scanning method employed; consequently, to achieve a greater reduction in the bit rate, the scanning should be done in the direction of the highest adjacent-pixel statistical dependence. Depending on the image statistics, it is shown that uniform luminance requantization of an image may not lead to an appreciable reduction in the bit rate. The algorithm discussed can be applied to areas such as image compression and string entropy estimation in genetics.
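The flavor of such a linear-time estimate can be illustrated with a minimal sketch (this is a generic first-order conditional-entropy estimator, not the paper's algorithm): count single symbols and adjacent pairs in one pass, then use H(X_n | X_{n-1}) = H(pairs) - H(singles).

```python
import math
from collections import Counter

def first_order_entropy(seq):
    """Estimate the entropy rate (bits/symbol) of a 1-D sequence from
    adjacent-pair statistics: H(X_n | X_{n-1}) = H(pairs) - H(singles).
    One pass over the data, so time is linear in the sequence length.
    (Finite-sample estimate; may be slightly negative near zero.)"""
    singles = Counter(seq)
    pairs = Counter(zip(seq, seq[1:]))

    def shannon(counts):
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    return shannon(pairs) - shannon(singles)

# A raster-scanned "image" with constant rows: strong adjacent-pixel
# dependence pushes the conditional estimate far below log2(4) = 2 bits.
image = [row_val for row_val in (0, 1, 2, 3) for _ in range(64)]
print(first_order_entropy(image))
```

Scanning the image in the direction of strongest pixel dependence (here, along rows) is exactly what makes the conditional estimate small.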

The Boltzmann entropy and randomness tests
Publication Year: 1994, Page(s): 209-216
Cited by: Papers (2)
In the context of the dynamical systems of classical mechanics, we introduce two new notions called "algorithmic fine-grain and coarse-grain entropy". The fine-grain algorithmic entropy is, on the one hand, a simple variant of the Martin-Löf (and other) randomness tests and, on the other hand, a connecting link between description (Kolmogorov) complexity, Gibbs entropy and Boltzmann entropy. The coarse-grain entropy is a slight correction to Boltzmann's coarse-grain entropy. Its main advantage is that it is less partition-dependent, because algorithmic entropies for different coarse-grainings are approximations of one and the same fine-grain entropy. It has the desirable properties of Boltzmann entropy in a somewhat wider range of systems, including those of interest in the "thermodynamics of computation".

Space and time in computation, topology and discrete physics
Publication Year: 1994, Page(s): 44-53
A step can be regarded as an elementary ordering of two objects (or operators). A step is a distinction combined with an action that crosses the boundary of that distinction. The elementary step can be seen as a reference, as a division of space, or as a tick of a clock. By looking at the structure of a step, we provide a context that unifies specific aspects of special relativity, Laws of Form, topology, discrete physics and logic design.

Results on two-bit gate design for quantum computers
Publication Year: 1994, Page(s): 14-23
Cited by: Patents (2)
We present numerical results which show how two-bit logic gates can be used in the design of a quantum computer. We show that the Toffoli gate, which is the universal gate for all classical reversible computation, can be implemented using a particular sequence of exactly five two-bit gates. An arbitrary three-bit unitary gate, which can be used to build up any arbitrary quantum computation, can be implemented exactly with six two-bit gates. The ease of implementation of any particular quantum operation depends upon a very non-classical feature of the operation: its exact quantum phase factor.
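A five-gate construction of this kind can be verified numerically. The sketch below uses the well-known decomposition via controlled-V gates, where V is a square root of NOT (this is one published decomposition, not necessarily the paper's exact sequence):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
V = 0.5 * np.array([[1 + 1j, 1 - 1j], [1 - 1j, 1 + 1j]])  # V @ V == X

def controlled(u, control, target, n=3):
    """Embed a controlled-u gate on the given qubits of an n-qubit register
    (qubit 0 is the most significant bit of the basis index)."""
    P0 = np.array([[1, 0], [0, 0]], dtype=complex)
    P1 = np.array([[0, 0], [0, 1]], dtype=complex)
    def embed(ops):
        m = np.array([[1]], dtype=complex)
        for q in range(n):
            m = np.kron(m, ops.get(q, I2))
        return m
    return embed({control: P0}) + embed({control: P1, target: u})

# Circuit, left to right in time: CV(b->c), CNOT(a->b), CV†(b->c),
# CNOT(a->b), CV(a->c).  As a matrix product, rightmost acts first.
CV_bc   = controlled(V, 1, 2)
CVdg_bc = controlled(V.conj().T, 1, 2)
CNOT_ab = controlled(X, 0, 1)
CV_ac   = controlled(V, 0, 2)
seq = CV_ac @ CNOT_ab @ CVdg_bc @ CNOT_ab @ CV_bc

# Toffoli flips the target c exactly when a = b = 1 (basis states 6 and 7).
toffoli = np.eye(8, dtype=complex)
toffoli[6:8, 6:8] = [[0, 1], [1, 0]]
print(np.allclose(seq, toffoli))
```

The check confirms that exactly five two-bit unitaries reproduce the three-bit Toffoli gate, consistent with the count stated in the abstract.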

On the average-case complexity of the reversibility problem for finite cellular automata
Publication Year: 1994, Page(s): 151-155
Cited by: Papers (1)
Of particular relevance in the theory and applications of cellular automata is the concept of invertibility. We study the computational complexity of deciding whether or not a given finite cellular automaton is invertible. This problem is known to be Co-NP-complete; we prove that the expected-time complexity of its randomized version is "hard": the problem is Co-RNP-complete. Finally, we discuss some consequences of this result in the theory and applications of cellular automata.
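The decision problem itself is easy to state: is the global update map of the finite CA a bijection? A naive exponential-time check makes this concrete (a minimal sketch of the brute-force procedure whose cost motivates the complexity results; the rule names below are illustrative):

```python
from itertools import product

def ca_step(config, rule):
    """One synchronous update of a 1-D binary CA on a cyclic lattice.
    `rule` maps each (left, center, right) neighborhood to a new state."""
    n = len(config)
    return tuple(rule[(config[(i - 1) % n], config[i], config[(i + 1) % n])]
                 for i in range(n))

def is_invertible(rule, n):
    """Decide invertibility by checking that the global map is injective
    on all 2^n configurations -- brute force, hence exponential time."""
    images = {ca_step(c, rule) for c in product((0, 1), repeat=n)}
    return len(images) == 2 ** n

# Rule 90 (cell := left XOR right) is not invertible on a cycle: the
# all-ones and all-zeros configurations both map to all zeros.
rule90 = {(l, c, r): l ^ r for l, c, r in product((0, 1), repeat=3)}
# The left-shift rule (cell := right neighbor) permutes configurations.
shift = {(l, c, r): r for l, c, r in product((0, 1), repeat=3)}
print(is_invertible(rule90, 8), is_invertible(shift, 8))
```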

Chu spaces: automata with quantum aspects
Publication Year: 1994, Page(s): 186-195
Cited by: Papers (1)
Chu spaces are a model of concurrent computation extending automata theory to express branching time and true concurrency. They exhibit in a primitive form the quantum mechanical phenomena of complementarity and uncertainty. The complementarity arises as the duality of information and time, automata and schedules, and states and events. Uncertainty arises when we define a measurement to be a morphism and notice that increasing structure in the observed object reduces clarity of observation. For a Chu space this uncertainty can be calculated numerically in an attractively simple way directly from its form factor to yield the usual Heisenberg uncertainty relation. Chu spaces correspond to wavefunctions as vectors of Hilbert space, whose inner product operation is realized for Chu spaces as right residuation and whose quantum logic becomes Girard's linear logic.

Reversible logic issues in adiabatic CMOS
Publication Year: 1994, Page(s): 111-118
Cited by: Papers (11), Patents (7)
Power dissipation in CMOS circuits has become increasingly important for the design of portable, embedded and high-performance computing systems. Our VLSI research group has investigated a novel form of energy-conserving logic suitable for CMOS. Through small chip-building experiments, we have demonstrated the low-power operation of simple logic functions. These chips have used logical reversibility on a small, sometimes trivial, scale to achieve their low-power operation. In moving towards more complex functions, the role of reversibility will increase. This paper addresses two problem areas that we have found to be crucial to successfully realizing low-power operation of CMOS chips using reversible logic techniques. The first area is the energy-efficient design of the combined power supply and clock generator. The second is the logical overhead needed to support reversible logic functions. The first problem area, though formidable, seems amenable to systematic approaches, and significant inroads have been made towards finding practical, efficient solutions. The second, however, appears to be by far the more difficult hurdle to overcome if reversible logic is to become an attractive approach for reducing power dissipation in CMOS.

Phase transitions and coarse-grained search
Publication Year: 1994, Page(s): 203-208
Abstraction is a method for solving a variety of computational search problems that uses coarse-graining to simplify the search. When a coarse-grained, or abstract, solution is found, it is then refined to give a complete solution. We present a model of this abstraction process for constraint satisfaction problems, a well-known class of NP-complete search problems. This model is then used to identify phase-transition-like behavior in the effectiveness of abstraction, as well as to determine the type of abstraction that is likely to be most useful for relatively hard instances of these search problems.

Computational spacetimes
Publication Year: 1994, Page(s): 239-245
Cited by: Papers (2)
The execution of an algorithm is limited by physical constraints rooted in the finite speed of signal propagation. To optimize the usage of the physical degrees of freedom provided by a computational engine, one must apply all relevant technological and physical constraints to the temporal and spatial structure of a computational procedure. Computational spacetimes make explicit both technological and physical constraints, and facilitate reasoning about the relative efficiency of parallel algorithms through explicit physical complexity measures. Just as Minkowski spacetime is the world model for physical events, computational spacetimes are the world model for computational events. Algorithms are specified in a spatial single-assignment form, which makes all assignments spatially explicit. The computational spacetime and the spatial single-assignment form provide the framework for the design, analysis and execution of fine-grain parallel algorithms.

Space, time, logic, and things
Publication Year: 1994, Page(s): 36-43
We examine the fundamental origins of logic and show how these fundamentals are related to basic concepts of space, time, objects, and events used in both physics and computing. We attempt to show how a universe can be constructed beginning not from first principles, but from no principles. Several possible implications for physics and mathematics are also discussed.

Quantum cellular automata: the physics of computing with arrays of quantum dot molecules
Publication Year: 1994, Page(s): 5-13
Cited by: Papers (28), Patents (1)
We discuss the fundamental limits of computing using a new paradigm for quantum computation: cellular automata composed of arrays of Coulombically coupled quantum-dot molecules, which we term quantum cellular automata (QCA). Any logical or arithmetic operation can be performed in this scheme. QCAs provide a valuable concrete example of quantum computation in which a number of fundamental issues come to light. We examine the physics of the computing process in this paradigm. We show to what extent thermodynamic considerations impose limits on the ultimate size of individual QCA arrays. Adiabatic operation of the QCA is examined and the implications for dissipationless computing are explored.

Toward an information mechanics
Publication Year: 1994, Page(s): 95-110
Presents a chain of reasoning that makes an information mechanics a plausible goal. A radically new model of distributed computation that exceeds Turing's sequential model refutes the perception that quantum mechanics cannot be captured computationally. Our new model, called the `phase web paradigm', is itself captured naturally by a physically relevant mathematics, that of a Clifford algebra. The basic features of the computational model are shown to have natural counterparts in current physical theory, and we close with a discussion of the implications of the framework presented for the fabrication of nanoscale hardware.

The stabilisation of quantum computations
Publication Year: 1994, Page(s): 60-62
Cited by: Papers (2)
A quantum computer is a device capable of performing computational tasks that depend on characteristically quantum mechanical effects, in particular coherent quantum superposition. Such devices can efficiently perform classes of computation (e.g. factorisation) which are believed to be intractable on any classical computer. This makes it highly desirable to construct such devices. In this paper, we address the last remaining theoretical obstacle to such a construction, namely the problem of stability or error correction. This problem is more substantial in quantum computation than in classical computation because of the delicate nature of the interference phenomena on which quantum computation depends. We present a new, purely quantum mechanical method of error correction which has no classical analogue but can serve to stabilise coherent quantum computations. Like the classical methods, it utilises redundancy, but it does not depend on measuring intermediate results of the computation.

Thermal logic circuits
Publication Year: 1994, Page(s): 119-127
Cited by: Papers (2), Patents (2)
Thermal logic is a hypothetical device technology that allows one to analyze the energetics of computing machines in a simpler setting than real device technologies. The paper describes the rudiments of thermal logic and uses it to analyze reversible logic pipelines. The similarity between thermal logic and electronic logic is explained, and thermal analogs of electronic devices and circuits are proposed. We show that adiabatically reversible logic pipelines have a rich mathematical structure, including a local gauge symmetry, and suggest some directions for future research. Adiabatic power supplies are also addressed.

Statistical mechanics of combinatorial search
Publication Year: 1994, Page(s): 196-202
The statistical mechanics of combinatorial search problems is described using the example of the well-known NP-complete graph coloring problem. A simple parameter describing the problem structure predicts the difficulty of solving the problem, on average. However, because of the large variance associated with this prediction, it is of limited direct use for individual instances. Additional parameters, describing the problem structure as well as the heuristic effectiveness, are introduced to address this issue. This also highlights the distinction between the statistical mechanics of combinatorial search problems, with their exponentially large search spaces, and physical systems, whose interactions are often governed by a simple Euclidean metric.
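A toy experiment consistent with this picture uses the edges-to-nodes ratio as the structure parameter (the abstract does not name its parameter; this sketch assumes the constraint-density parameter familiar from related work on graph coloring):

```python
import random
from itertools import product

def is_3colorable(n, edges):
    """Brute-force 3-coloring of an n-node graph -- an exponentially
    large search space, as in the problems the paper analyzes."""
    return any(all(col[u] != col[v] for u, v in edges)
               for col in product(range(3), repeat=n))

def soluble_fraction(n, n_edges, trials=20, seed=0):
    """Fraction of random n-node graphs with n_edges edges that are
    3-colorable; the edges/nodes ratio governs average difficulty."""
    rng = random.Random(seed)
    all_pairs = [(u, v) for u in range(n) for v in range(u + 1, n)]
    hits = sum(is_3colorable(n, rng.sample(all_pairs, n_edges))
               for _ in range(trials))
    return hits / trials

# Under-constrained instances are almost always soluble, over-constrained
# ones almost never; hard instances concentrate near the crossover.
print(soluble_fraction(9, 9), soluble_fraction(9, 30))
```

The large instance-to-instance variance the abstract mentions shows up here as the spread in search effort among graphs with the same edge density.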

On a method of solving SAT efficiently using the quantum Turing machine
Publication Year: 1994, Page(s): 177-185
Cited by: Papers (1)
In this paper, under the assumption that superposed physical states can be observed without collapsing the superposition, we show that the satisfiability problem (SAT, for short) can be solved by a quantum Turing machine in O(2^{n/4}) time. This assumption is not widely accepted among physicists; however, Aharonov et al. (1993) conjecture that a physical state actually exists as a superposition and can be observed without collapsing the superposition.

Evolution, entropy, and parallel computation
Publication Year: 1994, Page(s): 246-254
The relationship between evolution and entropy is described for a model of self-reproducing parallel computation. As was recently shown by Thearling and Ray (1994), the performance of some types of parallel computation can be increased through a process analogous to evolution by natural selection. The work discussed in this paper explores the process by which evolution manipulates the entropy of instruction sequences in a population of parallel programs in an effort to discover more efficient uses of parallelism.

Some results on invertible cellular automata
Publication Year: 1994, Page(s): 143-150
Cited by: Papers (1)
Addresses certain questions concerning invertible cellular automata and presents new results in this area. Specifically, we explicitly construct a cellular automaton in a class (a residual class) previously known to be non-empty only via a non-constructive existence proof. This class contains cellular automata that are invertible on every finite support but not on an infinite lattice. Moreover, we exhibit a class of invertible cellular automata with bounded neighborhood whose inverses constitute a class of cellular automata for which no recursive function bounds all the neighborhoods.

The complexity and entropy of Turing machines
Publication Year: 1994, Page(s): 227-232
Cited by: Papers (1)
Addresses the relationship between dynamical systems theory and theoretical computer science, in particular the dynamical, information-theoretic and computational properties of systems that compute. These properties have been studied in cellular automata and the symbolic dynamics of maps over the unit interval, but have never been addressed in compact systems known to be capable of universal computation. Recent work is described in which the entropy, periodicity and regular-language complexity of a large number of randomly generated Turing machines were calculated. The results are discussed in detail and compared with an identical analysis of a universal Turing machine. This comparison yields the first direct quantitative evidence that universal computation lies between ordered and chaotic behavior. The discussion concludes with a list of questions remaining to be answered about the phase-space portrait of computationally complex systems.

Can quantum computers have simple Hamiltonians?
Publication Year: 1994, Page(s): 63-68
Cited by: Papers (1)
Recently, P. Shor (1994) has shown that quantum computers (computers which can operate simultaneously on a quantum superposition of inputs) permit efficient (i.e. polynomial-time) solutions of problems for which no efficient classical-mechanical solution is known. This has led to renewed interest in the question of whether or not quantum computers can be physically realized. One kind of quantum computer, quantum cellular automata, can be described by relatively simple Hamiltonians that resemble the Hamiltonians of spin systems. In this paper, we report a quantum cellular automaton which, though not itself computation-universal, forms an essential part of any quantum cellular automaton which is synchronized using Feynman's technique. This quantum cellular automaton has as its Hamiltonian the one-dimensional XY Hamiltonian, which is exactly solvable. Furthermore, there is experimental evidence from low-temperature measurements of the heat capacity and electric susceptibility that the Hamiltonian of the quantum cellular automaton is realized in nature by the rare-earth compound praseodymium ethyl sulfate near 1 K.

On physical models of neural computation and their analog VLSI implementation
Publication Year: 1994, Page(s): 255-264
Cited by: Papers (4)
Examines computation in a framework where the problem is essentially that of extracting a signal from noise, filtering (selective amplification) or estimation. The discussion is relevant to computational tasks in sensory communication, such as vision, speech and natural language processing. We consider "real" systems, both natural (neural systems) and human-engineered (silicon integrated circuits), where information processing takes the form of an irreversible physical process. We argue, and demonstrate experimentally, that it is possible to see the emergence of truly complex processing structures that are commensurate with the physical properties of the computational substrate and are therefore energetically efficient.

Entropy cost of information
Publication Year: 1994, Page(s): 217-226
An entropy analysis of Szilard's (1929) one-molecule Maxwell's demon suggests a general theory of the entropy cost of information. The entropy of the demon increases due to the decoupling of the molecule from the measurement information. In general, neither measurement nor erasure is fundamentally a thermodynamically costly operation; however, the decorrelation of the system from the information must always increase entropy in the system-with-information. This causes a net entropy increase in the universe unless, as in the Szilard demon, the information is used to decrease entropy elsewhere before the correlation is lost. Thus information is thermodynamically costly precisely to the extent that it is not used to obtain work from the measured system.
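The standard Szilard-engine bookkeeping behind this claim (a textbook summary, with $k_B$ the Boltzmann constant, not the paper's full analysis): one bit recording which half of the box holds the molecule allows isothermal expansion from $V/2$ to $V$, and discarding the correlation without using it costs at least as much entropy as the expansion gained:

```latex
W_{\text{extracted}} = \int_{V/2}^{V} \frac{k_B T}{V'}\,dV' = k_B T \ln 2,
\qquad
\Delta S_{\text{reservoir}} = -\frac{W_{\text{extracted}}}{T} = -k_B \ln 2,
\qquad
\Delta S_{\text{decorrelation}} \ge k_B \ln 2 .
```

Over a full cycle the inequality cancels the reservoir's entropy decrease, so net work is obtained only while the measurement correlation is still intact, matching the abstract's conclusion.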