IBM Journal of Research and Development

Issue 1 • January 2003

  • Preface

    Page(s): 3

    The IBM Journal of Research and Development has presented several special issues on the work of the Mathematical Sciences Department at the IBM Thomas J. Watson Research Center. The present issue coincides with the Department's 40th anniversary. The included papers reflect both our legacy and our future directions.

  • The mathematics of halftoning

    Page(s): 5 - 15

    This paper describes some mathematical aspects of halftoning in digital printing. Halftoning is the technique of rendering a continuous range of colors using only a few discrete ones. There are two major classes of methods: dithering and error diffusion. Some discussion is presented concerning the method of dithering, but the main emphasis is on error diffusion.

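The error-diffusion class of methods mentioned in the abstract can be illustrated with a minimal sketch of the classic Floyd–Steinberg variant, in which each pixel is quantized and the quantization error is pushed onto unprocessed neighbors (a textbook illustration with the standard 7/16, 3/16, 5/16, 1/16 weights, not necessarily the paper's formulation):

```python
def error_diffuse(image, threshold=0.5):
    """Floyd-Steinberg error diffusion: quantize each grayscale pixel
    (values in [0, 1]) to 0 or 1, distributing the quantization error
    to the right and lower neighbors that have not yet been processed."""
    rows, cols = len(image), len(image[0])
    work = [row[:] for row in image]          # mutable copy of the input
    out = [[0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            old = work[y][x]
            new = 1 if old >= threshold else 0
            out[y][x] = new
            err = old - new
            # classic weights: 7/16 right, 3/16 down-left, 5/16 down, 1/16 down-right
            if x + 1 < cols:
                work[y][x + 1] += err * 7 / 16
            if y + 1 < rows:
                if x > 0:
                    work[y + 1][x - 1] += err * 3 / 16
                work[y + 1][x] += err * 5 / 16
                if x + 1 < cols:
                    work[y + 1][x + 1] += err * 1 / 16
    return out
```

On a uniform 50% gray patch, the output is a binary pattern whose average ink coverage matches the input gray level, which is the defining property of error diffusion.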
  • Data-intensive analytics for predictive modeling

    Page(s): 17 - 23

    The Data Abstraction Research Group was formed in the early 1990s to bring focus to the work of the Mathematical Sciences Department in the emerging area of knowledge discovery and data mining (KD & DM). Most activities in this group have been performed in the technical area of predictive modeling, roughly at the intersection of machine learning, statistical modeling, and database technology. There has been a major emphasis on using business and industrial problems to motivate the research agenda. Major accomplishments include advances in methods for feature analysis, rule-based pattern discovery, and probabilistic modeling, and novel solutions for insurance risk management, targeted marketing, and text mining. This paper presents an overview of the group's major technical accomplishments.

  • On greedy algorithms, partially ordered sets, and submodular functions

    Page(s): 25 - 30

    Recent developments in the use of greedy algorithms in linear programming are reviewed and extended. We find a common generalization of some theorems of Queyranne–Spieksma–Tardella, Faigle–Kern, and Fujishige about greedy algorithms for linear programs in diverse contexts. Additionally, we extend a well-known theorem of Topkis about submodular functions on the product of chains to submodular functions on the product of lattices.

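As a sketch of the classical setting these results generalize, Edmonds' greedy algorithm maximizes a linear objective over the polymatroid of a monotone submodular function by processing elements in decreasing order of weight and assigning each its marginal gain (a standard textbook construction, not the paper's poset/lattice generalization):

```python
def greedy_polymatroid(weights, f):
    """Edmonds' greedy algorithm: maximize sum(weights[e] * x[e]) over
    the polymatroid {x >= 0 : sum(x[e] for e in S) <= f(S) for all S},
    where f is a monotone submodular set function with f(frozenset()) == 0.
    `weights` maps elements to nonnegative weights; `f` takes a frozenset."""
    x = {}
    chosen = frozenset()
    # process elements in decreasing order of weight
    for e in sorted(weights, key=weights.get, reverse=True):
        # the optimal coordinate is the marginal gain of f along this order
        x[e] = f(chosen | {e}) - f(chosen)
        chosen = chosen | {e}
    return x
```

For example, with f(S) = min(|S|, 2) (a rank function of a uniform matroid) and weights a:3, b:2, c:1, the greedy solution assigns x = {a: 1, b: 1, c: 0}, achieving the optimum objective value 5.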
  • High-performance linear algebra algorithms using new generalized data structures for matrices

    Page(s): 31 - 55

    We present a novel way to produce dense linear algebra factorization algorithms. The current state-of-the-art (SOA) dense linear algebra algorithms have a performance inefficiency, and thus they give suboptimal performance for most LAPACK factorizations. We show that using standard Fortran and C two-dimensional arrays is the main source of this inefficiency. For the other standard format (packed one-dimensional arrays for symmetric and/or triangular matrices), the situation is much worse. We show how to correct these performance inefficiencies by using new data structures (NDS) along with so-called kernel routines. The NDS generalize the current storage layouts for both standard formats. We use the concept of Equivalence and Elementary Matrices along with coordinate (linear) transformations to prove that our method works for an entire class of dense linear algebra algorithms. Also, we use the Algorithms and Architecture approach to explain why our new method gives higher efficiency. The simplest forms of the new factorization algorithms are a direct generalization of the commonly used LINPACK algorithms. On IBM platforms they can be generated from simple, textbook-type codes by the XLF Fortran compiler. On the IBM POWER3 processor, our implementation of Cholesky factorization achieves 92% of peak performance, whereas conventional SOA full-format LAPACK DPOTRF achieves 77% of peak performance. All programming for our NDS can be accomplished in standard Fortran through the use of three- and four-dimensional arrays. Thus, no new compiler support is necessary. Finally, we describe block hybrid formats (BHF). BHF allow one to use no additional storage over conventional (full and packed) matrix storage. This means that new algorithms based on BHF can be used as a backward-compatible replacement for LAPACK or LINPACK algorithms.

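The storage idea behind generalized data structures of this kind can be sketched as square-block (tile) layout: the matrix is repacked so that each nb-by-nb submatrix is contiguous in memory, which is what lets small kernel routines run out of cache at near-peak speed (a simplified illustration of the concept; the paper's actual layouts and kernels are more elaborate):

```python
def to_square_blocks(a, nb):
    """Repack a square row-major matrix (list of lists) into square-block
    (tile) layout: tiles are visited in row-major order, and the elements
    of each nb-by-nb tile are stored contiguously, row-major within the tile."""
    n = len(a)
    tiles = []
    for bi in range(0, n, nb):          # block-row index
        for bj in range(0, n, nb):      # block-column index
            tile = [a[i][j]
                    for i in range(bi, min(bi + nb, n))
                    for j in range(bj, min(bj + nb, n))]
            tiles.append(tile)
    return tiles
```

A blocked factorization then operates tile by tile, so each kernel call touches one contiguous chunk instead of strided columns of a conventional two-dimensional array.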
  • The Common Optimization INterface for Operations Research: Promoting open-source software in the operations research community

    Page(s): 57 - 66

    The Common Optimization INterface for Operations Research (COIN-OR, http://www.coin-or.org/) is an initiative to promote open-source software for the operations research (OR) community. In OR practice and research, software is fundamental. The dependence of OR on software implies that the ways in which software is developed, managed, and distributed can have a significant impact on the field. Open source is a relatively new software development and distribution model which offers advantages over current practices. Its viability depends on the precise definition of open source, on the culture of a distributed developer community, and on a version-control system which makes distributed development possible. In this paper, we review open-source philosophy and culture, and present the goals and status of COIN-OR.

  • Ergodic theory of one-dimensional dynamics

    Page(s): 67 - 76

    During the past fifty years a clearer understanding of one-dimensional dynamics has emerged. This paper summarizes the main results of the probabilistic theory of one-dimensional dynamics and shows the behavior to be surprisingly rich and a good starting point for the general theory of dynamics.

  • Estimating the efficiency of collaborative problem-solving, with applications to chip design

    Page(s): 77 - 88

    We present a statistical framework to address questions that arise in general problems involving collaboration of several contributors. One instance of this problem occurs in the complex process of designing ultralarge-scale-integration (ULSI) semiconductor chips. In these processes, computer-aided design tools are treated as “black boxes.” In most cases, the automated design tools operate on designs and successfully complete a specified task to create designs that satisfy specified design criteria. In other cases involving complex designs, however, the tools are unable to create designs that satisfy the specified criteria. In both situations, the performance of the tools can be enhanced with systematic external intervention that is implemented with some supplemental algorithm. This algorithm can either be fully automated or be implemented by hand, relying on formally describable human expertise. In this intervention, the supplemental algorithm and the automated design tool take turns to move the design from one configuration to another until either the task is complete or further improvements are not possible or necessary. In such a setting, a number of questions arise about how to measure the effectiveness of the external intervention. One question, for example, is whether the external intervention consistently assists the progress of the automated program. This situation is an instance of a general problem that we address in this paper. As an example, we apply the statistical framework to the problem of routing a functional unit of the IBM POWER4 microprocessor.

  • Mathematical sciences in the nineties

    Page(s): 89 - 96

    In the last decade of the twentieth century, we saw great progress in the mathematical sciences as well as changes in the activities of the mathematical scientist. These included the resolution of some well-known conjectures, the introduction of new areas of study, and the adoption of new tools and new methods of operation. The IBM Research Division was an active participant in many of these events. We discuss here a selection of these, focusing on some to which contributions were made by mathematicians at IBM Research.


Aims & Scope

The IBM Journal of Research and Development is a peer-reviewed technical journal, published bimonthly, which features the work of authors in the science, technology and engineering of information systems.


Meet Our Editors

Editor-in-Chief
Clifford A. Pickover
IBM T. J. Watson Research Center