40th Annual Symposium on Foundations of Computer Science (Cat. No.99CB37039)

17-19 Oct. 1999

Displaying Results 1 - 25 of 69
  • 40th Annual Symposium on Foundations of Computer Science (Cat. No.99CB37039)

    Publication Year: 1999
  • Author index

    Publication Year: 1999, Page(s):667 - 668
  • Efficient testing of large graphs

    Publication Year: 1999, Page(s):656 - 666
    Cited by:  Papers (9)

    Let P be a property of graphs. An ε-test for P is a randomized algorithm which, given the ability to make queries whether a desired pair of vertices of an input graph G with n vertices are adjacent or not, distinguishes, with high probability, between the case of G satisfying P and the case that it has to be modified by adding and removing more than εn^2 edges to make it satisf...

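    To make the query model concrete, here is a minimal sketch (my own toy illustration, not the paper's construction) of an ε-test for the property "G has no edges": a graph that is ε-far from this property has more than εn^2 edges, so a few random adjacency queries expose one with high probability.

```python
import math
import random

def eps_test_empty(adjacent, n, eps, confidence=0.99):
    """Toy one-sided ε-test for "G has no edges" in the adjacency-query
    model: the only access to G is asking whether a pair is adjacent.

    An edge-free graph is always accepted; a graph that needs more than
    eps * n**2 edge deletions has more than eps * n**2 edges, so a random
    pair is adjacent with probability > eps and O(1/eps) queries find an
    edge with the requested confidence (assumes 0 < eps < 1)."""
    queries = max(1, math.ceil(math.log(1.0 - confidence) / math.log(1.0 - eps)))
    for _ in range(queries):
        u, v = random.sample(range(n), 2)   # one adjacency query
        if adjacent(u, v):
            return False                    # witness edge found: reject
    return True                             # no edge seen: accept
```
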
  • Regular languages are testable with a constant number of queries

    Publication Year: 1999, Page(s):645 - 655
    Cited by:  Papers (8)

    We continue the study of combinatorial property testing, initiated by Goldreich, Goldwasser and Ron (1996). The subject of this paper is testing regular languages. Our main result is as follows. For a regular language L ⊆ {0, 1}* and an integer n there exists a randomized algorithm which always accepts a word w of length n if w∈L, and rejects it with high probability if w has to be modifie...

  • PSPACE has constant-round quantum interactive proof systems

    Publication Year: 1999, Page(s):112 - 119
    Cited by:  Papers (6)

    We introduce quantum interactive proof systems, which are interactive proof systems in which the prover and verifier may perform quantum computations and exchange quantum messages. It is proved that every language in PSPACE has a quantum interactive proof system that requires a total of only three messages to be sent between the prover and verifier and has exponentially small (one-sided) probabili...

  • Learning mixtures of Gaussians

    Publication Year: 1999, Page(s):634 - 644
    Cited by:  Papers (82)

    Mixtures of Gaussians are among the most fundamental and widely used statistical models. Current techniques for learning such mixtures from data are local search heuristics with weak performance guarantees. We present the first provably correct algorithm for learning a mixture of Gaussians. This algorithm is very simple and returns the true centers of the Gaussians to within the precision specifie...

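    For contrast with the provable algorithm announced above, the "local search heuristics" in question are typified by expectation-maximization. A compact sketch for spherical components (illustrative only; names and defaults are mine, and this is not the paper's algorithm):

```python
import numpy as np

def em_spherical_gmm(X, k, iters=100, seed=0):
    """Classic EM local search for a mixture of k spherical Gaussians.
    X is an (n, d) array; returns (weights, means, variances)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, size=k, replace=False)]          # random init
    variances = np.full(k, X.var())
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] proportional to w_j * N(x_i | mu_j, var_j * I)
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        log_r = (np.log(weights) - 0.5 * d * np.log(2 * np.pi * variances)
                 - sq / (2 * variances))
        log_r -= log_r.max(axis=1, keepdims=True)            # numerical stability
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, centers and variances
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r.T @ X) / nk[:, None]
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        variances = (r * sq).sum(axis=0) / (d * nk)
    return weights, means, variances
```
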
  • Hardness of approximating Σ_2^p minimization problems

    Publication Year: 1999, Page(s):465 - 474
    Cited by:  Papers (3)

    We show that a number of natural optimization problems in the second level of the Polynomial Hierarchy are Σ_2^p-hard to approximate to within n^ε factors, for specific ε>0. The main technical tool is the use of explicit dispersers to achieve strong, direct inapproximability results. The problems we consider include Succinct Set Cover, Minimum Equi...

  • Taking a walk in a planar arrangement

    Publication Year: 1999, Page(s):100 - 110
    Cited by:  Papers (3)

    We present a randomized algorithm for computing portions of an arrangement of n arcs in the plane, each pair of which intersect in at most t points. We use this algorithm to perform online walks inside such an arrangement (i.e., compute all the faces that a curve, given in an online manner, crosses), and to compute a level in an arrangement, both in an output-sensitive manner. The expected running...

  • Stochastic load balancing and related problems

    Publication Year: 1999, Page(s):579 - 586
    Cited by:  Papers (21)  |  Patents (2)

    We study the problems of makespan minimization (load balancing), knapsack, and bin packing when the jobs have stochastic processing requirements or sizes. If the jobs are all Poisson, we present a 2-approximation for the first problem using Graham's rule, and observe that polynomial time approximation schemes can be obtained for the last two problems. If the jobs are all exponential, we present ...

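    Graham's rule, cited above, is plain list scheduling: give each job to the currently least-loaded machine. A deterministic sketch of the rule itself (my own illustration, independent of the paper's stochastic analysis):

```python
import heapq

def graham_list_schedule(job_sizes, m):
    """Graham's rule: assign each job, in the order given, to the machine
    with the smallest current load. Returns (makespan, assignment), where
    assignment[i] is the machine that received job i."""
    loads = [(0.0, machine) for machine in range(m)]
    heapq.heapify(loads)
    assignment = []
    for size in job_sizes:
        load, machine = heapq.heappop(loads)        # least-loaded machine
        assignment.append(machine)
        heapq.heappush(loads, (load + size, machine))
    return max(load for load, _ in loads), assignment

# Example: graham_list_schedule([3, 1, 4, 1, 5], 2) -> (7.0, [0, 1, 1, 0, 0])
```
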
  • Boosting and hard-core sets

    Publication Year: 1999, Page(s):624 - 633
    Cited by:  Papers (3)

    This paper connects two fundamental ideas from theoretical computer science: hard-core set construction, a type of hardness amplification from computational complexity, and boosting, a technique from computational learning theory. Using this connection we give fruitful applications of complexity-theoretic techniques to learning theory and vice versa. We show that the hard-core set construction of R...

  • Markovian coupling vs. conductance for the Jerrum-Sinclair chain

    Publication Year: 1999, Page(s):241 - 251

    We show that no Markovian coupling argument can prove rapid mixing of the Jerrum-Sinclair Markov chain for sampling almost uniformly from the set of perfect and near perfect matchings of a given graph. In particular, we show that there exists a bipartite graph G such that any Markovian coupling argument on the Jerrum-Sinclair Markov chain for G must necessarily take time exponential in the number ...

  • On the complexity of SAT

    Publication Year: 1999, Page(s):459 - 464
    Cited by:  Papers (10)

    We show that non-deterministic time NTIME(n) is not contained in deterministic time n^(√2-ε) and polylogarithmic space, for any ε>0. This implies that (infinitely often) satisfiability cannot be solved in time O(n^(√2-ε)) and polylogarithmic space. A similar result is presented for uniform circuits; a log-space uniform circuit of polylogarithmic width computing sat...

  • Approximate nearest neighbor algorithms for Hausdorff metrics via embeddings

    Publication Year: 1999, Page(s):171 - 179
    Cited by:  Papers (6)

    Hausdorff metrics are used in geometric settings for measuring the distance between sets of points. They have been used extensively in areas such as computer vision, pattern recognition and computational chemistry. While computing the distance between a single pair of sets under the Hausdorff metric has been well studied, no results are known for the nearest-neighbor problem under Hausdorff metric...

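    For reference, the Hausdorff distance between two finite point sets is the larger of the two directed distances, each taking every point of one set to its nearest neighbor in the other. A brute-force sketch (my own; the paper is about nearest-neighbor search under this metric, not this computation):

```python
import math

def hausdorff(A, B):
    """Symmetric Hausdorff distance between finite point sets A and B,
    given as sequences of coordinate tuples (brute force, O(|A|*|B|))."""
    def directed(S, T):
        # farthest that any point of S is from its nearest point of T
        return max(min(math.dist(p, q) for q in T) for p in S)
    return max(directed(A, B), directed(B, A))

# Example: hausdorff([(0, 0), (1, 0)], [(0, 2), (1, 2)]) == 2.0
```
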
  • Dynamic planar convex hull operations in near-logarithmic amortized time

    Publication Year: 1999, Page(s):92 - 99
    Cited by:  Papers (1)

    We give a data structure that allows arbitrary insertions and deletions on a planar point set P and supports basic queries on the convex hull of P, such as membership and tangent-finding. Updates take O(log^(1+ε) n) amortized time and queries take O(log n) time each, where n is the maximum size of P and ε is any fixed positive constant. For some advanced queries such as bridge-fi...

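    The data structure above maintains the hull under updates; for contrast, the standard static computation rebuilds it from scratch in O(n log n). A sketch of Andrew's monotone chain (a textbook algorithm, not taken from the paper):

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull of 2-D points, returned in
    counter-clockwise order without repeating the starting vertex."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                          # lower hull, left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):                # upper hull, right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]         # endpoints would be duplicated
```
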
  • Bounds for small-error and zero-error quantum algorithms

    Publication Year: 1999, Page(s):358 - 368
    Cited by:  Papers (10)

    We present a number of results related to quantum algorithms with small error probability and quantum algorithms that are zero-error. First, we give a tight analysis of the trade-offs between the number of queries of quantum search algorithms, their error probability, the size of the search space, and the number of solutions in this space. Using this, we deduce new lower and upper bounds for quant...

  • Fairness in routing and load balancing

    Publication Year: 1999, Page(s):568 - 578
    Cited by:  Papers (41)  |  Patents (3)

    We consider the issue of network routing subject to explicit fairness conditions. The optimization of fairness criteria interacts in a complex fashion with the optimization of network utilization and throughput; in this work, we undertake an investigation of this relationship through the framework of approximation algorithms. In this work we consider the problem of selecting paths for routing so a...

  • Approximation schemes for minimizing average weighted completion time with release dates

    Publication Year: 1999, Page(s):32 - 43
    Cited by:  Papers (19)

    We consider the problem of scheduling n jobs with release dates on m machines so as to minimize their average weighted completion time. We present the first known polynomial time approximation schemes for several variants of this problem. Our results include PTASs for the case of identical parallel machines and a constant number of unrelated machines with and without preemption allowed. Our scheme...

  • An approximate L1-difference algorithm for massive data streams

    Publication Year: 1999, Page(s):501 - 511
    Cited by:  Papers (13)  |  Patents (2)

    We give a space-efficient, one-pass algorithm for approximating the L_1 difference Σ_i |a_i - b_i| between two functions, when the function values a_i and b_i are given as data streams, and their order is chosen by an adversary. Our main technical innovation is a method of constructing families {V_j} of limited independence ...

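    The paper's construction rests on the limited-independence families {V_j} mentioned above; a different (and later) way to obtain a one-pass linear sketch of the L_1 difference is Indyk's stable-distribution approach, sketched here only to illustrate the sketching idea, not as the paper's method. In a genuinely small-space implementation the Cauchy matrix would be generated pseudorandomly on demand rather than stored.

```python
import numpy as np

def cauchy_sketch_matrix(dim, rows=201, seed=0):
    """Random matrix with iid standard Cauchy entries (1-stable), so each
    coordinate of S @ x is distributed as ||x||_1 times a standard Cauchy."""
    return np.random.default_rng(seed).standard_cauchy((rows, dim))

def estimate_l1_difference(S, a_stream, b_stream):
    """One pass over two streams of (index, value) updates: maintain
    z = S @ (a - b) incrementally, then return median(|z|), which
    concentrates around sum_i |a_i - b_i|."""
    z = np.zeros(S.shape[0])
    for i, v in a_stream:
        z += v * S[:, i]
    for i, v in b_stream:
        z -= v * S[:, i]
    return float(np.median(np.abs(z)))
```
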
  • An algorithmic theory of learning: robust concepts and random projection

    Publication Year: 1999, Page(s):616 - 623
    Cited by:  Papers (31)

    We study the phenomenon of cognitive learning from an algorithmic standpoint. How does the brain effectively learn concepts from a small number of examples despite the fact that each example contains a huge amount of information? We provide a novel analysis for a model of robust concept learning (closely related to “margin classifiers”), and show that a relatively small number of examp...

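    A minimal sketch of the random-projection step the abstract alludes to: multiplying the data by a scaled random Gaussian matrix drops the dimension while approximately preserving lengths and inner products with high probability (Johnson-Lindenstrauss style; the function name and scaling convention are mine).

```python
import numpy as np

def random_project(X, target_dim, seed=0):
    """Project the rows of X (shape (n, d)) down to target_dim dimensions
    with a random Gaussian matrix, scaled so that squared norms are
    preserved in expectation."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[1], target_dim)) / np.sqrt(target_dim)
    return X @ R
```
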
  • Random walks on truncated cubes and sampling 0-1 knapsack solutions

    Publication Year: 1999, Page(s):230 - 240
    Cited by:  Papers (4)

    We solve an open problem concerning the mixing time of a symmetric random walk on an n-dimensional cube truncated by a hyperplane, showing that it is polynomial in n. As a consequence, we obtain a fully polynomial randomized approximation scheme for counting the feasible solutions of a 0-1 knapsack problem. The key ingredient in our analysis is a combinatorial construction we call a “balanced...

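    The chain in question is essentially the natural single-flip walk on feasible 0-1 knapsack solutions, i.e. on the cube truncated by the knapsack constraint. A small simulation sketch (my own, with the usual lazy self-loop):

```python
import random

def knapsack_walk(weights, capacity, steps, seed=0):
    """Lazy single-flip random walk on feasible 0-1 knapsack solutions
    (the n-cube truncated by sum_i weights[i] * x[i] <= capacity).
    Starts from the all-zeros solution and returns the final state;
    the walk's stationary distribution is uniform over feasible x."""
    rng = random.Random(seed)
    n = len(weights)
    x = [0] * n
    load = 0
    for _ in range(steps):
        if rng.random() < 0.5:              # lazy step: stay put
            continue
        i = rng.randrange(n)                # pick a coordinate uniformly
        new_load = load + (weights[i] if x[i] == 0 else -weights[i])
        if new_load <= capacity:            # flip only if still feasible
            x[i] ^= 1
            load = new_load
    return x
```
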
  • Finely-competitive paging

    Publication Year: 1999, Page(s):450 - 457
    Cited by:  Papers (4)

    We construct an online algorithm for paging that achieves an O(r+log k) competitive ratio when compared to an offline strategy that is allowed the additional ability to “rent” pages at a cost of 1/r. In contrast, the competitive ratio of the Marking algorithm for this scenario is O(r log k). Our algorithm can be thought of in the standard setting as having a “fine-grained” ...

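    For context, the Marking algorithm mentioned above works in phases: requested pages get marked, a fault evicts a uniformly random unmarked page, and when every cached page is marked the marks are cleared. A compact sketch (page identifiers are assumed hashable and comparable; naming is mine):

```python
import random

def marking_paging(requests, k, seed=0):
    """Randomized Marking algorithm for paging with a cache of size k.
    Returns the number of page faults on the given request sequence."""
    rng = random.Random(seed)
    cache, marked = set(), set()
    faults = 0
    for page in requests:
        if page not in cache:
            faults += 1
            if len(cache) == k:                 # cache full: must evict
                if not (cache - marked):        # everything marked: new phase
                    marked.clear()
                victim = rng.choice(sorted(cache - marked))
                cache.discard(victim)
            cache.add(page)
        marked.add(page)                        # requested pages are marked
    return faults
```
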
  • Efficient regular data structures and algorithms for location and proximity problems

    Publication Year: 1999, Page(s):160 - 170
    Cited by:  Papers (3)

    Investigates data structures obtained by a recursive partitioning of the input domain into regions of equal size. One of the most well-known examples of such a structure is the quadtree, which is used in this paper as a basis for more complex data structures; we also provide multidimensional versions of the stratified tree of P. van Emde Boas (1997). We show that, under the assumption that the inp...

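    The "recursive partitioning of the input domain into regions of equal size" is exactly what a point-region quadtree does, so a bare-bones version is sketched below (illustration only, assuming distinct input points; the paper's structures are considerably richer):

```python
class QuadTree:
    """Point-region quadtree over the square [x, x+size) x [y, y+size):
    every split divides a cell into four equal quadrants."""

    def __init__(self, x, y, size, capacity=1):
        self.x, self.y, self.size, self.capacity = x, y, size, capacity
        self.points, self.children = [], None

    def insert(self, px, py):
        if self.children is not None:              # internal node: recurse
            self._child(px, py).insert(px, py)
        elif len(self.points) < self.capacity:     # leaf with spare room
            self.points.append((px, py))
        else:                                      # full leaf: split it
            half = self.size / 2
            self.children = [QuadTree(self.x + dx * half, self.y + dy * half,
                                      half, self.capacity)
                             for dy in (0, 1) for dx in (0, 1)]
            for qx, qy in self.points + [(px, py)]:
                self._child(qx, qy).insert(qx, qy)
            self.points = []

    def _child(self, px, py):
        half = self.size / 2
        col = int(px >= self.x + half)
        row = int(py >= self.y + half)
        return self.children[row * 2 + col]
```
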
  • Fully dynamic algorithms for maintaining all-pairs shortest paths and transitive closure in digraphs

    Publication Year: 1999, Page(s):81 - 89
    Cited by:  Papers (19)

    This paper presents the first fully dynamic algorithms for maintaining all-pairs shortest paths in digraphs with positive integer weights less than b. For approximate shortest paths with an error factor of (2+ε), for any positive constant ε, the amortized update time is O(n^2 log^2 n / log log n); for an error factor of (1+ε) the amortized update time is O(n...

  • Error reduction for extractors

    Publication Year: 1999, Page(s):191 - 201
    Cited by:  Papers (5)

    An extractor is a function which extracts (almost) truly random bits from a weak random source, using a small number of additional random bits as a catalyst. We present a general method to reduce the error of any extractor. Our method works particularly well in the case that the original extractor extracts up to a constant fraction of the source min-entropy and achieves a polynomially small error....

  • Improved combinatorial algorithms for the facility location and k-median problems

    Publication Year: 1999, Page(s):378 - 388
    Cited by:  Papers (46)  |  Patents (2)

    We present improved combinatorial approximation algorithms for the uncapacitated facility location and k-median problems. Two central ideas in most of our results are cost scaling and greedy improvement. We present a simple greedy local search algorithm which achieves an approximation ratio of 2.414+ε in Õ(n^2/ε) time. This also yields a bicriteria approximation tradeoff...

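    In the spirit of the "simple greedy local search" referred to above, here is a toy local search for uncapacitated facility location using only single open/close moves (a rough sketch with my own cost conventions; it does not implement the paper's scaled algorithm or carry its 2.414+ε guarantee):

```python
def total_cost(open_set, opening_cost, service_cost):
    """Opening costs of the open facilities plus each client's cost of
    being served by its cheapest open facility."""
    return (sum(opening_cost[f] for f in open_set) +
            sum(min(row[f] for f in open_set) for row in service_cost))

def local_search_ufl(opening_cost, service_cost):
    """Toy local search for uncapacitated facility location: repeatedly
    open or close one facility while that strictly lowers the total cost.
    service_cost[c][f] is the cost of serving client c from facility f."""
    open_set = set(range(len(opening_cost)))        # start with all open
    best = total_cost(open_set, opening_cost, service_cost)
    improved = True
    while improved:
        improved = False
        for f in range(len(opening_cost)):
            candidate = open_set ^ {f}              # toggle facility f
            if not candidate:                       # keep at least one open
                continue
            cost = total_cost(candidate, opening_cost, service_cost)
            if cost < best:
                open_set, best, improved = candidate, cost, True
    return open_set, best
```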