
IBM Journal of Research and Development

Issue 6 • Nov. 1982

  • Document Analysis System

    Publication Year: 1982 , Page(s): 647 - 656
    Cited by:  Papers (118)  |  Patents (69)

    This paper outlines the requirements and components for a proposed Document Analysis System, which assists a user in encoding printed documents for computer processing. Several critical functions have been investigated and the technical approaches are discussed. The first is the segmentation and classification of digitized printed documents into regions of text and images. A nonlinear, run-length smoothing algorithm has been used for this purpose. By using the regular features of text lines, a linear adaptive classification scheme discriminates text regions from others. The second technique studied is an adaptive approach to the recognition of the hundreds of font styles and sizes that can occur on printed documents. A preclassifier is constructed during the input process and used to speed up a well-known pattern-matching method for clustering characters from an arbitrary print source into a small sample of prototypes. Experimental results are included.
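
    A minimal sketch of the run-length smoothing step mentioned above, assuming a NumPy array of 0/1 pixels with 1 marking black; the threshold values and function names are illustrative choices, not parameters from the paper.

```python
import numpy as np

def rlsa_1d(line, threshold):
    """Blacken white runs shorter than `threshold` that lie between black pixels."""
    out = line.copy()
    n = len(line)
    i = 0
    while i < n:
        if line[i] == 0:                     # start of a white run
            j = i
            while j < n and line[j] == 0:
                j += 1
            if 0 < i and j < n and (j - i) <= threshold:
                out[i:j] = 1                 # bridge the short gap
            i = j
        else:
            i += 1
    return out

def segment_rlsa(image, h_thresh=30, v_thresh=50):
    """Run-length smoothing along rows and columns, combined with a logical
    AND to produce candidate text/image blocks for later classification."""
    horiz = np.array([rlsa_1d(row, h_thresh) for row in image])
    vert = np.array([rlsa_1d(col, v_thresh) for col in image.T]).T
    return horiz & vert
```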
  • Automatic Scaling of Digital Print Fonts

    Publication Year: 1982 , Page(s): 657 - 666
    Cited by:  Patents (7)

    New raster-based printers form character patterns using carefully designed matrices of dots. It is desirable to be able to use fonts designed for one printer on a different machine, but to do so the dot matrix patterns should first be scaled to the second printer's resolution. If the scaling is carried out as a simple interpolation, however, severe degradation in the appearance of the characters may occur. A new algorithm reduces such degradation by recognizing attributes associated with print character quality in the original patterns and then correcting the scaled patterns in order to maintain those attributes. Attributes that are detected and preserved during scaling include local and global symmetries, stroke width, sharpness of corners, and smoothness of contour. The method has been used both to scale low-resolution fonts to a finer representation and to reduce the scale of high-resolution photocomposer fonts for output on an office-type printer.
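
    The attribute-preserving algorithm itself is not reproduced here; as a point of reference, below is a minimal sketch of the kind of naive scaling the paper identifies as inadequate. The function name and array layout are illustrative assumptions.

```python
import numpy as np

def scale_glyph_nearest(glyph, out_h, out_w):
    """Naive nearest-neighbor scaling of a binary dot-matrix glyph.
    This simple resampling baseline preserves neither stroke width,
    symmetry, nor corner sharpness, which is what motivates the paper."""
    in_h, in_w = glyph.shape
    rows = (np.arange(out_h) * in_h) // out_h    # source row for each output row
    cols = (np.arange(out_w) * in_w) // out_w    # source column for each output column
    return glyph[np.ix_(rows, cols)]
```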
  • Analysis of Linear Interpolation Schemes for Bi-Level Image Applications

    Publication Year: 1982 , Page(s): 667 - 680
    Cited by:  Papers (3)  |  Patents (2)

    In the office, it is often necessary to scan a picture at a certain resolution and then reproduce it at a different (usually higher) resolution. This conversion can be achieved by interpolating the scanned signal between the sample intervals. This paper discusses a class of linear interpolating methods based on resampling polynomial functions. In addition, we introduce new methods to compare the performance of these interpolating schemes. The signal models used are one-dimensional step and pulse functions. These bi-level models are sufficient to describe many black/white documents. The performance of the linear interpolators is determined by evaluating their accuracy in reconstructing the original bi-level signal. The analysis considers the effects of the coarse scan and fine print intervals as well as the quantization effects. Experiments using the IEEE facsimile chart as input verify the analytical findings. The results show the advantage of using odd-order polynomials, such as the first order and TRW cubic. Also, we discuss the relationship between the interpolating ratio and the number of quantization levels needed to represent the scanned signal.
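
    A small sketch of the first-order (linear) case for a one-dimensional scan line, one of the polynomial interpolators analyzed above; the use of np.interp and the 0.5 re-thresholding are illustrative assumptions.

```python
import numpy as np

def resample_first_order(scanline, ratio):
    """Linearly interpolate a coarsely sampled scan line to a finer grid,
    then threshold back to a bi-level signal."""
    n = len(scanline)
    x_coarse = np.arange(n)
    x_fine = np.linspace(0, n - 1, int(n * ratio))
    fine = np.interp(x_fine, x_coarse, scanline)
    return (fine >= 0.5).astype(np.uint8)

# Example: a coarsely scanned step edge reproduced at four times the resolution
step = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
print(resample_first_order(step, 4))
```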
  • Word Autocorrelation Redundancy Match (WARM) Technology

    Publication Year: 1982 , Page(s): 681 - 686
    Cited by:  Papers (1)  |  Patents (47)

    Word Autocorrelation Redundancy Match (WARM) is an intelligent facsimile technology which compresses the image of textual documents at nominally 145:1 by use of complex symbol matching on both the word and character level. At the word level, the complex symbol match rate is enhanced by the redundancy of the word image. This creates a unique image compression capability that allows a document to be scanned for the 150 most common words, which make up roughly 50% of the text by usage, and upon their match the words are replaced for storage/transmission by a word identification number. The remaining text is scanned to achieve compaction at the character level and compared to both a previously stored library and a dynamically built library of complex symbol (character) shapes. Applying the complex symbol matching approach at both the word and character levels results in greater efficiency than is achievable by state-of-the-art CCITT methods.
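
    A toy sketch of the word-level substitution idea only: the most frequent words map to small identification numbers and everything else passes through for character-level coding. The word list, token format, and numbering are invented for illustration; the actual WARM system matches scanned word images, not character strings.

```python
# Hypothetical stand-in for the 150-word common-word library described above
COMMON_WORDS = ["the", "of", "and", "to", "a", "in", "that", "is", "for", "it"]
WORD_ID = {w: i for i, w in enumerate(COMMON_WORDS)}

def encode_words(text):
    """Replace frequent words with identification numbers; leave the rest
    as raw tokens for character-level compaction downstream."""
    tokens = []
    for word in text.split():
        key = word.lower()
        if key in WORD_ID:
            tokens.append(("ID", WORD_ID[key]))
        else:
            tokens.append(("RAW", word))
    return tokens

print(encode_words("the quality of the scanned image"))
# -> [('ID', 0), ('RAW', 'quality'), ('ID', 1), ('ID', 0), ('RAW', 'scanned'), ('RAW', 'image')]
```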
  • Digital Halftoning of Images

    Publication Year: 1982 , Page(s): 687 - 697
    Cited by:  Papers (9)  |  Patents (12)

    Most printers and some display devices are bilevel (black or white) and therefore not capable of reproducing continuous tone pictures. Digital halftoning algorithms transform digital gray scale images into bilevel ones which give the appearance of containing various shades of gray. A halftoning algorithm is presented in which novel concepts are combined resulting in an output image in which moiré patterns are suppressed and, at the same time, the edges are enhanced. Various other artifacts associated with the halftoning process, such as contouring due to coarse quantization or to textural changes, are also absent from the output images in the proposed scheme. The algorithm separates the image into many small clusters which are processed independently and, therefore, it is capable of parallel implementation.
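
    The cluster-based algorithm of the paper is not reproduced here; the sketch below only illustrates the basic halftoning operation it builds on, comparing each gray value against a spatially varying threshold to produce a bilevel image. The 4×4 Bayer matrix is a generic choice, not the paper's screen.

```python
import numpy as np

# 4x4 Bayer ordered-dither thresholds, scaled to the 0..1 gray range
BAYER_4 = np.array([[ 0,  8,  2, 10],
                    [12,  4, 14,  6],
                    [ 3, 11,  1,  9],
                    [15,  7, 13,  5]]) / 16.0

def ordered_dither(gray):
    """Bilevel output for a gray image in [0, 1]: 1 where the pixel value
    exceeds the tiled threshold, 0 otherwise."""
    h, w = gray.shape
    tiled = np.tile(BAYER_4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (gray > tiled).astype(np.uint8)
```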
  • An Improved Segmentation and Coding Algorithm for Binary and Nonbinary Images

    Publication Year: 1982 , Page(s): 698 - 707
    Cited by:  Papers (8)  |  Patents (3)

    This paper presents a new segmentation and coding algorithm for nonbinary images. The algorithm performs contour coding of regions of equally valued and connected pixels. It consists of two distinct phases: raster scanning and border following. In this sense it is similar to algorithms presented by Kruse. However, the algorithm of this paper is considerably improved since it correctly segments truly nonbinary images. The basic idea of the algorithm is to “coat” (color, label) the borders (the cracks) between the regions from both sides in two separate border-following procedures called island following and object following. Thus, all adjacencies between the objects are systematically explored and noted. Furthermore, the raster scanner, which exhaustively searches the image for new regions, can easily determine from existing/nonexisting coating which boundaries have been traced out and which have not. The algorithm can be considerably simplified for the binary image case.
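
    A rough illustration of the segmentation goal for the simplified case: the sketch below labels connected regions of equally valued pixels using a raster scan and breadth-first growth. The crack-coating, island-following, and object-following bookkeeping of the actual algorithm is not modeled.

```python
from collections import deque

def label_regions(image):
    """Assign a label to each 4-connected region of equally valued pixels.
    `image` is a list of rows of pixel values; returns a parallel label map."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):                      # raster scan for unlabeled pixels
        for sx in range(w):
            if labels[sy][sx]:
                continue
            next_label += 1
            value = image[sy][sx]
            labels[sy][sx] = next_label
            queue = deque([(sy, sx)])
            while queue:                     # grow the region breadth-first
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not labels[ny][nx] \
                            and image[ny][nx] == value:
                        labels[ny][nx] = next_label
                        queue.append((ny, nx))
    return labels
```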
  • Reduced Data Re-Order Complexity Properties of Polynomial Transform 2D Convolution and Fourier Transform Methods

    Publication Year: 1982 , Page(s): 708 - 714

    This paper presents new results concerning the matrix data re-order requirements of polynomial-transform-based 2D convolution and 2D Fourier Transform methods which can be employed in digital processing of images and other 2D problems. The results indicate that several power-of-2 length-modified ring polynomial transform methods developed by Nussbaumer allow the total avoidance of the row-column matrix transpose commonly encountered in other algorithmic approaches, while also providing a number of other computational advantages. It is demonstrated that this property can be the source of significantly improved throughput on a number of existing data processing structures. An execution time comparison with an efficient Fast Fourier Transform algorithm base is made assuming the use of general register architecture and array processor units. It is also assumed that one makes use of recently developed efficient matrix transpose methods by Eklundh and Ari to support 2D FFT data re-order requirements. These comparisons demonstrate a two to four times throughput improvement for the use of the polynomial transform method in place of the 2D FFT approach to circularly convolve or generate 2D Fourier transforms for large 2D fields in the range 1024 × 1024 to 8192 × 8192.
  • Importance of Higher-Order Components to Multispectral Classification

    Publication Year: 1982 , Page(s): 715 - 723
    Cited by:  Papers (2)

    A Landsat multispectral image was combined with the corresponding digital terrain elevation data to study several information extraction procedures. Principal component and limited multispectral classification procedures were conducted on 1024 × 1024 four-band Landsat and five-band (Landsat plus terrain data) images, and color composites as well as quantitative information were generated. Selected results of this preliminary investigation confirm the usefulness of the principal component analysis in a qualitative presentation of the multi-band data and its association with a significant reduction in dimensionality. However, unlike some other investigators, we found that the full dimensionality must be retained when the information content of the data has to be preserved quantitatively.
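
    A minimal sketch of the principal-component step for a stack of co-registered bands, assuming a NumPy array of shape (bands, height, width); the band count and layout are placeholders rather than details from the study.

```python
import numpy as np

def principal_components(bands):
    """Project a (num_bands, H, W) image stack onto its principal components,
    ordered by decreasing variance, using the band-to-band covariance matrix."""
    nb, h, w = bands.shape
    X = bands.reshape(nb, -1).astype(float)          # one row per band
    X -= X.mean(axis=1, keepdims=True)
    eigvals, eigvecs = np.linalg.eigh(np.cov(X))     # nb x nb covariance
    order = np.argsort(eigvals)[::-1]                # largest variance first
    components = eigvecs[:, order].T @ X
    return components.reshape(nb, h, w), eigvals[order]
```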
  • Some Experiments in Image Vectorization

    Publication Year: 1982 , Page(s): 724 - 734
    Cited by:  Papers (9)  |  Patents (3)

    The application of vectorization algorithms to digital images derived from natural scenes is discussed. It is argued that the fractal nature of these scenes precludes some of the savings in storage expected from vector over raster representation, although considerable savings still result. Experimental results are given. Algorithms for contour following, line thinning, and polygonal approximation well adapted to complex images are presented. Finally, the Map Manipulation System, an experimental program package designed to explore the interaction between vector and raster information, is described briefly.
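
    One common way to realize the polygonal-approximation step is a recursive split scheme in the style of Douglas and Peucker; the sketch below follows that style and is not necessarily the scheme used in the paper. The tolerance parameter is illustrative.

```python
import math

def polygonal_approx(points, tolerance):
    """Approximate a traced contour, given as a list of (x, y) points,
    by a polygon whose vertices stay within `tolerance` of the contour."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    chord = math.hypot(x2 - x1, y2 - y1) or 1e-12
    # perpendicular distance from each interior point to the end-to-end chord
    dists = [abs((x2 - x1) * (y1 - y) - (x1 - x) * (y2 - y1)) / chord
             for x, y in points[1:-1]]
    split = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[split - 1] <= tolerance:
        return [points[0], points[-1]]       # the chord is a good enough fit
    left = polygonal_approx(points[:split + 1], tolerance)
    right = polygonal_approx(points[split:], tolerance)
    return left[:-1] + right                 # drop the duplicated split vertex
```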
  • Digital Multi-Image Analysis: Application to the Quantification of Rock Microfractography

    Publication Year: 1982 , Page(s): 735 - 745
    Cited by:  Patents (1)

    The microfissuration of a rock sample is analyzed using a multi-image formed of micrographs which were obtained under fluorescence and polarizing microscopy of the same sample area. Image analysis methods are applied to obtain descriptions of each type of picture, one showing the microfissure network and the other the texture of the rock. Descriptions in the form of tables of coordinates are used to quantify the features contained in the pictures. Finally, it is shown that relationships between these descriptions can result in the integration of the available information, providing more knowledge about microfissuration in the sample, including characterization and quantification of microcrack types according to their position with respect to the texture of the rock.
  • Backscatter and Attenuation Imaging from Ultrasonic Scanning in Medicine

    Publication Year: 1982 , Page(s): 746 - 758
    Cited by:  Papers (3)  |  Patents (5)

    Images of backscatter, attenuation, and frequency dependence of attenuation are obtained based on a three-parameter model and computation techniques described in this paper. There are several critical sources of error: backscatter speckle, beamwidth distortion, and cross-coupling artifacts between attenuation and backscatter. An iterative method of imaging and filtering is developed which effectively reduces these errors. Stability of the numerical solution involving the large number of unknowns is obtained by image iteration as opposed to parameter iteration along individual transmitter rays. This method incorporates three basic functional aspects: (1) multiple scans to reduce speckle, beamwidth distortions, and certain cross-coupling artifacts, (2) pre-image filtering to decrease beam distortions and post-image filtering to reduce cross-coupling artifacts, and (3) proper sequencing of image reconstruction and filtering. Backscatter images formed by this image iteration method are significantly superior to standard B-scan images. Further, the image iteration method yields three images of the same scan field. The present investigation is based on simulated echo data from cyst-like and complex targets.
  • An Algorithm for Separating Patterns by Ellipsoids

    Publication Year: 1982 , Page(s): 759 - 764
    Cited by:  Papers (2)

    We give an algorithm for finding the ellipsoid of least volume containing a set of points in a finite-dimensional Euclidean space. Such ellipsoids have been proposed for separating patterns in a feature space.
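
    One standard way to compute a minimum-volume enclosing ellipsoid is Khachiyan's iterative weighting scheme; the sketch below follows that formulation, which may differ from the algorithm given in the paper, and the tolerance is an illustrative choice.

```python
import numpy as np

def min_volume_ellipsoid(points, tol=1e-4):
    """Khachiyan-style iteration for the least-volume ellipsoid enclosing
    `points` (an n x d array). Returns (A, c) with (x - c)' A (x - c) <= 1."""
    P = np.asarray(points, dtype=float)
    n, d = P.shape
    Q = np.vstack([P.T, np.ones(n)])             # lift points to d+1 dimensions
    u = np.full(n, 1.0 / n)                      # weights over the points
    err = tol + 1.0
    while err > tol:
        V = Q @ np.diag(u) @ Q.T
        scores = np.einsum('ij,ji->i', Q.T @ np.linalg.inv(V), Q)
        j = int(np.argmax(scores))               # most "outlying" point
        step = (scores[j] - d - 1.0) / ((d + 1.0) * (scores[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step
        err = np.linalg.norm(new_u - u)
        u = new_u
    c = P.T @ u                                  # center of the ellipsoid
    A = np.linalg.inv(P.T @ np.diag(u) @ P - np.outer(c, c)) / d
    return A, c
```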
  • Cursive Script Recognition by Elastic Matching

    Publication Year: 1982 , Page(s): 765 - 771
    Cited by:  Papers (42)  |  Patents (9)

    Dynamic programming has been found useful for performing nonlinear time warping for matching patterns in automatic speech recognition. Here, this technique is applied to the problem of recognizing cursive script. The parameters used in the matching are derived from time sequences of x-y coordinate data of words handwritten on an electronic tablet. Chosen for their properties of invariance with respect to size and translation of the writing, these parameters are found particularly suitable for the elastic matching technique. A salient feature of the recognition system is the establishment, in a training procedure, of prototypes by each writer using the system. In this manner, the system is tailored to the user. Processing is performed on a word-by-word basis after the writing is separated into words. Using prototypes for each letter, the matching procedure allows any letter to follow any letter and finds the letter sequence which best fits the unknown word. A major advantage of this procedure is that it combines letter segmentation and recognition in one operation by, in essence, evaluating recognition at all possible segmentations, thus avoiding the usual segmentation-then-recognition philosophy. Results on cursive writing are presented where the alphabet is restricted to the lower-case letters. Letter recognition accuracy is over 95 percent for each of three writers.
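
    A minimal sketch of the elastic (dynamic-programming) match between an unknown feature sequence and a single letter prototype; the feature vectors and local distance are placeholders, and the letter-sequence search that the paper layers on top of this is not shown.

```python
def elastic_match(unknown, prototype, dist):
    """Dynamic-programming alignment cost between two feature sequences,
    allowing nonlinear (elastic) warping of one against the other."""
    n, m = len(unknown), len(prototype)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(unknown[i - 1], prototype[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # repeat the prototype sample
                                 D[i][j - 1],      # repeat the unknown sample
                                 D[i - 1][j - 1])  # advance both together
    return D[n][m]

# Example with scalar "features" and absolute difference as the local distance
print(elastic_match([0, 1, 2, 2, 1], [0, 1, 2, 1], lambda a, b: abs(a - b)))  # -> 0.0
```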
  • Recent papers by IBM authors

    Publication Year: 1982 , Page(s): 772 - 779

    Reprints of the papers listed here may usually be obtained by writing directly to the authors. The authors' IBM divisions or groups are identified as follows: CHQ is Corporate Headquarters; CPD, Communication Products Division; DSD, Data Systems Division; FED, Field Engineering Division; FSD, Federal Systems Division; GPD, General Products Division; GSD, General Systems Division; GTD, General Technology Division; IPD, Information Products Division; ISG, Information Systems Group; IS&CG, Information Systems & Communications Group; IS&TG, Information Systems & Technology Group; NAD, National Accounts Division; NMD, National Marketing Division; RES, Research Division; SPD, System Products Division; and SRI, Systems Research Institute. Journals are listed alphabetically by title; papers are listed sequentially for each journal.
  • Recent IBM patents

    Publication Year: 1982 , Page(s): 780 - 783
  • Author Index for Papers in Volume 26

    Publication Year: 1982 , Page(s): 784 - 787
  • Subject Index for Papers in Volume 26

    Publication Year: 1982 , Page(s): 788 - 790

Aims & Scope

The IBM Journal of Research and Development is a peer-reviewed technical journal, published bimonthly, which features the work of authors in the science, technology and engineering of information systems.

Meet Our Editors

Editor-in-Chief
Clifford A. Pickover
IBM T. J. Watson Research Center