Proceedings 38th Annual Symposium on Foundations of Computer Science

20-22 Oct. 1997

Displaying Results 1 - 25 of 65
  • Proceedings 38th Annual Symposium on Foundations of Computer Science

    Publication Year: 1997
    PDF (277 KB)
    Freely Available from IEEE
  • Author index

    Publication Year: 1997, Page(s):605 - 606
    PDF (94 KB)
    Freely Available from IEEE
  • A 2-approximation algorithm for the directed multiway cut problem

    Publication Year: 1997, Page(s):548 - 553
    Cited by:  Papers (3)
    PDF (428 KB)

    A directed multiway cut separates a set of terminals s_1, ..., s_k in a directed capacitated graph G=(V, E). Finding a minimum-capacity directed multiway cut is an NP-complete problem. We give a polynomial-time algorithm that achieves an approximation factor of 2 for this problem. This improves the result of Garg, Vazirani and Yannakakis (1994), who gave an algorithm that ac...
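
    For concreteness, here is one standard way to state the directed multiway cut problem as an integer program; this is a textbook-style formulation added for illustration, not necessarily the one used in the paper. Here c_e is the capacity of edge e and x_e = 1 means that e is removed:

        \begin{aligned}
        \text{minimize}\quad   & \sum_{e \in E} c_e x_e \\
        \text{subject to}\quad & \sum_{e \in P} x_e \ge 1 \quad \text{for every directed path } P \text{ from } s_i \text{ to } s_j,\ i \ne j, \\
                               & x_e \in \{0,1\} \quad \text{for every } e \in E.
        \end{aligned}

    A 2-approximation returns a feasible cut whose capacity is at most twice the optimum value of this program.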

  • Buy-at-bulk network design

    Publication Year: 1997, Page(s):542 - 547
    Cited by:  Papers (46)  |  Patents (2)
    PDF (452 KB)

    The essence of the simplest buy-at-bulk network design problem is buying network capacity “wholesale” to guarantee connectivity from all network nodes to a certain central network switch. Capacity is sold with “volume discount”: the more capacity is bought, the cheaper the price per unit of bandwidth. We provide an O(log^2 n) randomized approximation algorithm for ...
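
    One common way to formalize the volume discount is a concave edge-cost function; the single-sink formulation below is a simplification introduced here for illustration (a single cable type and unit demands), not necessarily the exact model of the paper. With f concave, f(0) = 0, and x_e the total bandwidth routed through edge e toward the central switch t:

        \text{minimize} \; \sum_{e \in E} f(x_e) \quad \text{subject to } x \text{ routing one unit of demand from every node } v \in V \text{ to } t.

    Concavity (for example f(x) = x^{\alpha} with 0 < \alpha < 1) gives f(a) + f(b) \ge f(a+b), which is why aggregating traffic onto shared edges is cheaper and why the problem is nontrivial.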

  • Improved approximations for shallow-light spanning trees

    Publication Year: 1997, Page(s):536 - 541
    Cited by:  Papers (7)
    PDF (524 KB)

    We consider the bicriteria optimization problem of computing a shallow-light tree. Given a directed graph with two unrelated cost functions, weight and length, defined on its edges, and a designated root vertex, the goal is to find a minimum-weight spanning tree such that the path lengths from the root to the rest of the vertices are bounded. This problem has several applications in network and VLS...
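
    In symbols, a simplified single-bound version of the problem reads as follows; this restatement is added for illustration, and the paper's bicriteria results relax both the weight and the length objective. Given weight w, length l, root r, and a length bound D:

        \min_{T \text{ a spanning tree rooted at } r} \; \sum_{e \in T} w(e) \qquad \text{subject to} \qquad \mathrm{length}_T(r, v) \le D \ \text{ for every vertex } v.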

  • Storage management for evolving databases

    Publication Year: 1997, Page(s):353 - 362
    PDF (864 KB)

    The problem of maintaining data that arrives continuously over time is increasingly prevalent in databases and digital libraries. Building on a model for sliding window indices developed by N. Shivakumar and H. Garcia-Molina (1997), we devise efficient algorithms for some of the central problems that arise. We also show connections between the problems in this model and some fundamental problems i...

  • Nearly tight bounds on the learnability of evolution

    Publication Year: 1997, Page(s):524 - 533
    Cited by:  Papers (2)
    PDF (952 KB)

    Evolution is often modeled as a stochastic process which modifies DNA. Among the most popular and successful such processes are the Cavender-Farris (CF) trees, which are represented as edge-weighted trees. The Phylogeny Construction Problem is, given k samples drawn from a CF tree, to output a CF tree which is close to the original. Each CF tree naturally defines a random variable, and...

  • Contention resolution with guaranteed constant expected delay

    Publication Year: 1997, Page(s):213 - 222
    PDF (880 KB)

    We study contention resolution in multiple-access channels such as the Ethernet. Under a stochastic model of continuous packet generation from a set of n processors, we construct a protocol which guarantees constant expected delay for generation rates up to a fixed constant λ_0 < 1. Previous protocols which are stable for constant arrival rates do not guarantee constant expected ...
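
    A toy Python simulation of the underlying model may help fix ideas: a slotted channel on which a slot succeeds only when exactly one processor transmits. The arrival process, the transmission probability 1/n, and the parameter values below are illustrative assumptions; the protocol constructed in the paper is considerably more elaborate and is what achieves constant expected delay.

        import random

        def simulate(n=50, lam=0.3, slots=20000, seed=1):
            # Toy slotted multiple-access channel: each processor receives a new
            # packet with probability lam/n per slot (total arrival rate lam) and,
            # if backlogged, transmits with probability 1/n.  A slot succeeds only
            # if exactly one processor transmits.  Returns the average delay of the
            # packets that were delivered.
            rng = random.Random(seed)
            queues = [[] for _ in range(n)]          # arrival times of waiting packets
            delays = []
            for t in range(slots):
                for q in queues:                     # packet generation
                    if rng.random() < lam / n:
                        q.append(t)
                senders = [i for i, q in enumerate(queues) if q and rng.random() < 1.0 / n]
                if len(senders) == 1:                # success iff no collision
                    delays.append(t - queues[senders[0]].pop(0))
            return sum(delays) / len(delays) if delays else float("inf")

        print(simulate())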

  • An improved algorithm for quantifier elimination over real closed fields

    Publication Year: 1997, Page(s):56 - 65
    Cited by:  Papers (7)  |  Patents (2)
    PDF (864 KB)

    We give a new algorithm for quantifier elimination in the first-order theory of real closed fields that improves on the complexity of the best previously known algorithm for this problem. Unlike previously known algorithms, the combinatorial part of the complexity of this new algorithm is independent of the number of free variables. Moreover, under the assumption that each polynomial in the input depend...

  • Minimizing flow time nonclairvoyantly

    Publication Year: 1997, Page(s):345 - 352
    Cited by:  Papers (5)
    PDF (604 KB)

    We consider the problem of scheduling a collection of dynamically arriving jobs with unknown execution times so as to minimize the average response/flow time. This is the classic CPU scheduling problem faced by time-sharing operating systems. In the standard 3-field scheduling notation this is the nonclairvoyant version of 1|pmtn, r_j|ΣF_j. It's easy to see that every algo...
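
    As an illustration of the objective rather than of the paper's algorithm, the Python sketch below simulates unit-quantum round robin, a nonclairvoyant policy that never inspects remaining processing times, and evaluates the total flow time ΣF_j = Σ(C_j - r_j).

        from collections import deque

        def rr_total_flow_time(jobs):
            # jobs: list of (release_time, processing_time) pairs with integer
            # times and processing_time >= 1.  Round robin with a unit quantum is
            # nonclairvoyant: it never looks at a job's remaining processing time.
            n = len(jobs)
            release = [r for r, _ in jobs]
            remaining = [p for _, p in jobs]
            completion = [None] * n
            admitted = [False] * n
            queue = deque()
            t = 0
            while any(c is None for c in completion):
                for j in range(n):                    # admit newly released jobs
                    if not admitted[j] and release[j] <= t:
                        queue.append(j)
                        admitted[j] = True
                if not queue:                         # idle until the next release
                    t += 1
                    continue
                j = queue.popleft()                   # run the head job for one unit
                remaining[j] -= 1
                t += 1
                if remaining[j] == 0:
                    completion[j] = t
                else:
                    queue.append(j)
            return sum(completion[j] - release[j] for j in range(n))

        # two jobs released at time 0 with processing times 1 and 3:
        # flow times 1 and 4, so the total is 5
        print(rr_total_flow_time([(0, 1), (0, 3)]))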

  • Learning noisy perceptrons by a perceptron in polynomial time

    Publication Year: 1997, Page(s):514 - 523
    Cited by:  Papers (10)
    PDF (768 KB)

    Learning perceptrons (linear threshold functions) from labeled examples is an important problem in machine learning. We consider the problem where labels are subjected to random classification noise. The problem was known to be PAC learnable via a hypothesis that consists of a polynomial number of linear thresholds (due to A. Blum, A. Frieze, R. Kannan, and S. Vempala (1996)). The question of whet...

  • Making nondeterminism unambiguous

    Publication Year: 1997, Page(s):244 - 253
    Cited by:  Papers (8)
    PDF (776 KB)

    We show that in the context of nonuniform complexity, nondeterministic logarithmic space bounded computation can be made unambiguous. An analogous result holds for the class of problems reducible to context-free languages. In terms of complexity classes, this can be stated as: NL/poly = UL/poly and LogCFL/poly = UAuxPDA(log n, n^O(1))/poly.

  • Constant depth circuits and the Lutz hypothesis

    Publication Year: 1997, Page(s):595 - 604
    Cited by:  Papers (2)
    PDF (976 KB)

    Resource-bounded measure theory is a study of complexity classes via an adaptation of the probabilistic method. The central hypothesis in this theory is the assertion that NP does not have measure zero in Exponential Time. This is a quantitative strengthening of NP≠P. We show that the analog in P of this hypothesis fails dramatically. In fact, we show that NTIME[n^(1/11)] has measure ze...

  • The analysis of a list-coloring algorithm on a random graph

    Publication Year: 1997, Page(s):204 - 212
    Cited by:  Papers (7)
    PDF (692 KB)

    We introduce a natural k-coloring algorithm and analyze its performance on random graphs with constant expected degree c (G_{n,p} with p = c/n). For k=3 our results imply that almost all graphs with n vertices and 1.923n edges are 3-colorable. This improves the lower bound on the threshold for random 3-colorability significantly and settles the last case of a long-standing open question of Bolloba...
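
    The sketch below is a natural greedy list-colouring heuristic of the kind the abstract alludes to, written in Python; the vertex-selection rule, the palette handling, and the parameter values are assumptions made for illustration, and the algorithm analysed in the paper may differ in its details.

        import random

        def greedy_list_color(n, c, k=3, seed=0):
            # Sample G(n, p) with p = c/n, so the expected degree is about c, and
            # run a greedy list-colouring pass: every vertex starts with the full
            # palette {0, ..., k-1}; repeatedly colour an uncoloured vertex with the
            # fewest remaining colours and delete that colour from the lists of its
            # uncoloured neighbours.  Returns a proper k-colouring, or None if a
            # list runs empty (the heuristic got stuck).
            rng = random.Random(seed)
            p = c / n
            adj = [set() for _ in range(n)]
            for u in range(n):
                for v in range(u + 1, n):
                    if rng.random() < p:
                        adj[u].add(v)
                        adj[v].add(u)
            lists = [set(range(k)) for _ in range(n)]
            colour = [None] * n
            uncoloured = set(range(n))
            while uncoloured:
                v = min(uncoloured, key=lambda u: len(lists[u]))
                if not lists[v]:
                    return None
                colour[v] = rng.choice(sorted(lists[v]))
                uncoloured.remove(v)
                for u in adj[v]:
                    if u in uncoloured:
                        lists[u].discard(colour[v])
            return colour

        # on sparse random graphs this simple heuristic usually succeeds
        print(greedy_list_color(n=500, c=2.5) is not None)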

  • Lower bounds for the signature size of incremental schemes

    Publication Year: 1997, Page(s):438 - 447
    Cited by:  Papers (2)
    PDF (924 KB)

    We show lower bounds for the signature size of incremental schemes which are secure against substitution attacks and support single block replacement. We prove that for documents of n blocks such schemes produce signatures of Ω(n^(1/(2+c))) bits for any constant c>0. For schemes accessing only a single block, resp. a constant number of blocks, for each replacement this bound can be ...

  • Exploiting locality for data management in systems of limited bandwidth

    Publication Year: 1997, Page(s):284 - 293
    Cited by:  Papers (15)
    PDF (1120 KB)

    This paper deals with data management in computer systems in which the computing nodes are connected by a relatively sparse network. We consider the problem of placing and accessing a set of shared objects that are read and written from the nodes in the network. These objects are, e.g., global variables in a parallel program, pages or cache lines in a virtual shared memory system, shared files in ...

  • Deciding properties of polynomials without factoring

    Publication Year: 1997, Page(s):46 - 55
    PDF (888 KB)

    The polynomial time algorithm of Lenstra, Lenstra, and Lovasz (1982) for factoring integer polynomials and variants thereof have been widely used to show that various computational problems in number theory have polynomial time solutions. Among them is the problem of factoring polynomials over algebraic number fields, which is itself used as a major subroutine for several other algorithms. Althoug...

  • The competitive analysis of risk taking with applications to online trading

    Publication Year: 1997, Page(s):336 - 344
    PDF (660 KB)

    Competitive analysis is concerned with minimizing a relative measure of performance. When applied to financial trading strategies, competitive analysis leads to the development of strategies with minimum relative performance risk. This approach is too inflexible. Many investors are interested in managing their risk: they may be willing to increase their risk for some form of reward. They may also ...

  • A random sampling based algorithm for learning the intersection of half-spaces

    Publication Year: 1997, Page(s):508 - 513
    Cited by:  Papers (13)
    PDF (564 KB)

    We present an algorithm for learning the intersection of half-spaces in n dimensions. Over nearly uniform distributions, it runs in polynomial time for up to O(log n/log log n) half-spaces or, more generally, for any number of half-spaces whose normal vectors lie in an O(log n/log log n)-dimensional subspace. Over less restricted “non-concentrated” distributions it runs in polynomial time ...

  • Satisfiability Coding Lemma

    Publication Year: 1997, Page(s):566 - 574
    Cited by:  Papers (17)
    PDF (708 KB)

    We present and analyze two simple algorithms for finding satisfying assignments of k-CNFs (Boolean formulae in conjunctive normal form with at most k literals per clause). The first is a randomized algorithm which, with probability approaching 1, finds a satisfying assignment of a satisfiable k-CNF formula F in time O(n^2 |F| 2^(n-n/k)). The second algorith...
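
    A minimal Python sketch of a permutation-based randomized procedure along the lines the abstract describes: variables are examined in random order, set by force whenever they are the last unassigned literal of a not-yet-satisfied clause, and set by a coin flip otherwise. The clause encoding (signed integers) and the fixed trial budget are illustrative choices, not taken from the paper.

        import random

        def ppz_trial(clauses, n, rng):
            # One randomized pass: visit the variables in random order, force a
            # variable when a unit clause demands it, and guess uniformly otherwise.
            assignment = {}
            for v in rng.sample(range(1, n + 1), n):
                forced = None
                for clause in clauses:
                    if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                        continue                      # clause already satisfied
                    live = [l for l in clause if abs(l) not in assignment]
                    if len(live) == 1 and abs(live[0]) == v:
                        forced = live[0] > 0          # unit clause forces v
                        break
                assignment[v] = forced if forced is not None else rng.random() < 0.5
            if all(any(assignment[abs(l)] == (l > 0) for l in c) for c in clauses):
                return assignment
            return None

        def ppz_solve(clauses, n, trials=20000, seed=0):
            # The analysis suggests a trial budget on the order of 2^(n - n/k);
            # the fixed budget here is only for demonstration.
            rng = random.Random(seed)
            for _ in range(trials):
                a = ppz_trial(clauses, n, rng)
                if a is not None:
                    return a
            return None

        # (x1 or x2) and (not x1 or x3), encoded as signed integers
        print(ppz_solve([[1, 2], [-1, 3]], n=3))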

  • Does parallel repetition lower the error in computationally sound protocols?

    Publication Year: 1997, Page(s):374 - 383
    Cited by:  Papers (10)  |  Patents (12)
    PDF (980 KB)

    Whether or not parallel repetition lowers the error has been a fundamental question in the theory of protocols, with applications in many different areas. It is well known that parallel repetition reduces the error at an exponential rate in interactive proofs and Arthur-Merlin games. It seems to have been taken for granted that the same is true in arguments, or other proofs where the soundness onl...

  • Separation of the monotone NC hierarchy

    Publication Year: 1997, Page(s):234 - 243
    Cited by:  Papers (8)
    PDF (864 KB)

    We prove tight lower bounds, of up to n^ε, for the monotone depth of functions in monotone-P. As a result we achieve the separation of the following classes. 1. Monotone-NC ≠ monotone-P. 2. ∀i⩾1, monotone-NC^i ≠ monotone-NC^(i+1). 3. More generally: for any integer function D(n), up to n^ε (for some ε>0), we give an explicit exa...

  • Computable obstructions to wait-free computability

    Publication Year: 1997, Page(s):80 - 89
    Cited by:  Papers (1)
    PDF (932 KB)

    Effectively computable obstructions are associated to a distributed decision task (ℐ,𝒪,Δ) in the asynchronous, wait-free, read-write shared-memory model. The key new ingredient of this work is the association of a simplicial complex 𝒯, the task complex, to the input-output relation Δ. The task determines a simplicial map α from 𝒯 to the input complex ℐ. The exi...

  • Tight bounds for depth-two superconcentrators

    Publication Year: 1997, Page(s):585 - 594
    Cited by:  Papers (12)
    PDF (728 KB)

    We show that the minimum size of a depth-two N-superconcentrator is Θ(N log^2 N / log log N). Before this work, optimal bounds were known for all depths except two. For the upper bound, we build superconcentrators by putting together a small number of disperser graphs; these disperser graphs are obtained using a probabilistic argument. We present two different methods for showing lower b...

  • Alternating-time temporal logic

    Publication Year: 1997, Page(s):100 - 109
    Cited by:  Papers (55)
    PDF (984 KB)

    Temporal logic comes in two varieties: linear-time temporal logic assumes implicit universal quantification over all paths that are generated by system moves; branching-time temporal logic allows explicit existential and universal quantification over all paths. We introduce a third, more general variety of temporal logic: alternating-time temporal logic offers selective quantification over those p...
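
    To make the selective quantification concrete: ATL replaces the path quantifiers of CTL by a coalition modality over a set A of agents. The two formulas below are standard illustrative examples, not quotations from the paper.

        \langle\!\langle A \rangle\!\rangle \Diamond\, \mathit{goal}
        \qquad\qquad
        \langle\!\langle \emptyset \rangle\!\rangle \Box\, \mathit{safe}

    The first asserts that the agents in A have a joint strategy which eventually brings about goal no matter how the remaining agents behave; in the second, the empty coalition can enforce safe only if it holds on every computation, so the formula coincides with the CTL formula ∀□ safe.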
