By Topic

Computer

Issue 10 • Oct. 1996

  • Essays on Fifty Years of Computing [Guest Editor's Introduction]

    Publication Year: 1996
    Freely Available from IEEE
  • Software patterns: Design utilities

    Publication Year: 1996
  • Theoretical contributions of AI

    Publication Year: 1996
  • Software engineering programs for the next century

    Publication Year: 1996 , Page(s): 117 - 118

    The field of software engineering and its subdisciplines have grown tremendously in the past six years. IEEE Computer Society conferences and working committees have helped this process by greatly increasing information dissemination and the opportunity to publish quality work in many topic areas. These activities are coordinated by the Technical Council on Software Engineering, an international voluntary organization consisting of Computer Society members and others interested in promoting the field. The author looks at the work of the council, which provides a forum for the exchange of ideas among interested practitioners, researchers, developers, maintainers, and students throughout the world.

  • The challenge of artificial intelligence

    Publication Year: 1996 , Page(s): 86 - 98
    Cited by:  Papers (3)

    Artificial intelligence (AI) is a relatively young discipline, yet it has already led to general-purpose problem-solving methods and novel applications. Ultimately, AI's goals of creating models and mechanisms of intelligent action can be realized only in the broader context of computer science. Creating mechanisms for sharing knowledge, know-how, and literacy is the challenge. The great Chinese philosopher Kuan-Tzu once said: “If you give a fish to a man, you will feed him for a day. If you give him a fishing rod, you will feed him for life.” We can go one step further: If we can provide him with the knowledge and the know-how for making that fishing rod, we can feed the whole village. Therein lies the promise, and the challenge, of AI.

  • Perspectives on supercomputing: three decades of change

    Publication Year: 1996 , Page(s): 99 - 111
    Cited by:  Papers (6)

    I am fortunate to have had access to supercomputers for the last 28 years. Over this time I have used them to simulate time-dependent fluid flows in the compressible regime. Strong shocks and unstable multifluid boundaries, along with the phenomenon of fluid turbulence, have provided the simulation complexity that demands supercomputer power. The supercomputers I have used-the CDC 6600, 7600, and Star-100, the Cray-1, Cray-XMP, Cray-2, and Cray C-90, the Connection Machines CM-2 and CM-5, the Cray T3D, and the Silicon Graphics Challenge Array and Power Challenge Array-span three revolutions in supercomputer design: the introduction of vector supercomputing, parallel supercomputing on multiple CPUs, and supercomputing on hierarchically organized clusters of microprocessors with cache memories. The last revolution is still in progress, so its outcome is somewhat uncertain. I view these design revolutions through the prism of my specialty and through applications of the supercomputers I have used. Also, because these supercomputer design changes have driven equally important changes in numerical algorithms and the programs that implement them, I describe the three revolutions from this perspective.

  • A fast-track standards process

    Publication Year: 1996 , Page(s): 115 - 116

    The IEEE Portable Application Standards Committee (PASC), in cooperation with the IEEE Standards Department, will be trying a new fast-track process that would let us adopt a new standard in as little as six months. Normally, it takes IEEE standards committees from two to ten years to adopt a standard. If the fast-track process trial is successful, other IEEE standards committees could adopt the procedure. The fast-track process includes all of the procedures used in the normal IEEE standards-adoption process, with two small exceptions. The fast-track method saves time by allowing some of the adoption procedure's steps to be undertaken at the same time, rather than in sequence, and by simplifying the procedure used for votes on proposed standards. The process would let us adopt standards more quickly, which would let vendors, users, and others reap the benefits of these standards sooner. The fast-track process is not suitable for many proposed standards. For example, it won't work for standards that are very complex or controversial, because they need more time for study and discussion. The process is best suited for small, stand-alone standards and amendments to standards. PASC wants to use the process for proposals that are backed by vendors, users, and others who would be affected. PASC would not sponsor a ballot for a standard when a competing version is being developed by another group.

  • WWW: past, present, and future

    Publication Year: 1996 , Page(s): 69 - 77
    Cited by:  Papers (21)

    The World Wide Web is simply defined as the universe of global network-accessible information. It is an abstract space within which people can interact, and it is chiefly populated by interlinked pages of text, images, and animations, with occasional sounds, videos, and three-dimensional worlds. The Web marks the end of an era of frustrating and debilitating incompatibility between computer systems. It has created an explosion of accessibility, with many potential social and economic impacts. The Web was designed to be a space within which people could work on a project. This was a powerful concept, in that: people who build a hypertext document of their shared understanding can refer to it at all times; people who join a project team can have access to a history of the team's activities, decisions, and so on; the work of people who leave a team can be captured for future reference; and a team's operations, if placed on the Web, can be machine-analyzed in a way that could not be done otherwise. The Web was originally supposed to be a personal information system and a tool for groups of all sizes, from a team of two to the entire world. People have rapidly developed new features for the Web, because of its tremendous commercial potential. This has made the maintenance of global Web interoperability a continuous task. This has also created a number of areas into which research must continue.

  • Microprocessor-based computers

    Publication Year: 1996 , Page(s): 27 - 37
    Cited by:  Papers (4)

    The development of the computer and the evolution of the integrated circuit have been intertwined since the first commercial IC appeared in 1961. The author explores microprocessor history and ponders future developments.

  • History and impact of computer standards

    Publication Year: 1996 , Page(s): 79 - 85

    A computer trade publication recently devoted a column in one of its issues to complaints about the uselessness of standards developing organizations and, by extension, of standards. Despite those complaints, the author of the column probably saved a great deal of time and effort by composing it on a computer system made possible only through standards. Without such standards, the author probably would have been using a manual typewriter with a proprietary keyboard, paper, and ribbon. There are many other examples of how important standards are to the computer industry and to many other industries. For example, various sources estimate that US industry spends between $17 billion and $30 billion each year on standards. The computer industry contributes its share to that total. In fact, standards are especially important to a young and quickly changing industry like the computer industry. They stabilize technology and encourage investment.

  • Toward the information network

    Publication Year: 1996 , Page(s): 59 - 67
    Cited by:  Papers (8)  |  Patents (1)

    The information service industry will drive economic wealth in the twenty-first century. The transport part of that industry, information networks, is emerging as the critical infrastructure for commerce, similar to the role of ships and railroads in the 1800s and automobiles and airlines in the 1900s. Information networks will transport the primary commercial commodity of the next century-digitized information-and information appliances will put it at your fingertips. Computer communications networks, or “information highways”, have evolved from their beginnings in the Arpanet into the modern Internet. And the suite of Internet protocols has spread into commercial enterprises, in the form of intranets. Simultaneously, advances in computer processing and storage have enabled agents, directory services, and network computing to enrich the available information and tools. The continuing proliferation of media-rich applications, such as videoconferencing, on-line gaming, and virtual reality will drive the further evolution of the next generation of connectivity, the Information Network. This article explains the sequential evolution of three phases of networking-the Arpanet, the Internet, and the Information Network-and explores the barriers to realizing the new information network paradigm.

  • At a crossroads: Connectivity model determines distribution of power

    Publication Year: 1996 , Page(s): 122 - 124

    More expectation and hyperbole surround the Internet than perhaps any other innovation in recent history. We hear a lot about the “information superhighway” and how we want to get everyone connected, but we rarely hear any serious discussion about the appropriate model for that connectivity. Whether or not the Internet lives up to our expectations may well depend on the model of connectivity we adopt-that is, on who holds the tools of communication and information dissemination in our society. Historically, access to information and our ability to communicate with large numbers of people over a wide area have been limited, and control of information has been a means to power. However, technological innovations have vastly increased our communication options. Indeed, technology is power, and the model of Internet connectivity we choose may determine not only how power is distributed but also how we relate to each other and to our institutions. A look back at earlier technology-driven power shifts can show why.

  • Don't judge a software license by its cover

    Publication Year: 1996 , Page(s): 114 - 115

    Software licenses are of vital concern to vendors and users. Software vendors use contracts, called licenses, to make sure that their products are used in a way that will benefit them. Users want to know the conditions that licenses impose on software so they can buy software that meets their needs. Beyond this, however, licenses and their enforceability are not always a straightforward matter. Are you bound by the conditions of a license even if the license is inside a container of shrinkwrap software, and you can't see its terms until after you buy the product? What if you can't see the license until you load your software into your computer and its terms appear on the monitor? This is particularly an issue with software sold by phone or mail, or over the Internet. In some of these cases, buyers purchase only a serial number or security code that activates publicly accessible software. In many cases, buyers don't even receive a physical product. They receive only a stream of electrons that contains data, an application program, instructions, and license conditions. The thorny legal issues that these situations raise recently confronted the US Court of Appeals for the Seventh Circuit, which hears appeals of cases from US District Courts in Illinois, Indiana, and Wisconsin. The changing nature of the software business has raised questions about the enforceability of shrinkwrap licenses.

  • Schema evolution: Concepts, terminology, and solutions

    Publication Year: 1996 , Page(s): 119 - 121
    Cited by:  Patents (2)

    Most applications must keep objects from one session to the next. This is known as persistence. But objects are not raw data: They are instances of classes. What happens if an object's class (its generator) changes from one session to the next? This problem is known as schema evolution (the term schema is borrowed from relational databases). This column defines a framework for addressing schema evolution in object technology.

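    The schema evolution problem described above can be illustrated with a minimal sketch (not taken from the column; all names and the versioning scheme are hypothetical): an object persisted under an old version of its class is upgraded to the current schema when it is loaded.

    ```python
    # Minimal sketch of schema evolution (illustrative; names hypothetical).
    # Records persisted under an old schema version are upgraded on load.

    import json

    SCHEMA_VERSION = 2

    def migrate(record):
        """Upgrade a persisted record to the current schema, one version at a time."""
        version = record.get("_version", 1)
        if version == 1:
            # Version 2 added a 'z' coordinate; supply a default for old objects.
            record["z"] = 0
            record["_version"] = 2
        return record

    class Point:
        def __init__(self, x, y, z=0):
            self.x, self.y, self.z = x, y, z

        def dump(self):
            return json.dumps({"_version": SCHEMA_VERSION,
                               "x": self.x, "y": self.y, "z": self.z})

        @classmethod
        def load(cls, text):
            record = migrate(json.loads(text))
            return cls(record["x"], record["y"], record["z"])

    # An object saved by the old (version 1) code lacks 'z'; loading migrates it.
    old_record = '{"_version": 1, "x": 3, "y": 4}'
    p = Point.load(old_record)
    print(p.z)  # → 0
    ```

    Chaining one migration step per version, as sketched here, lets arbitrarily old objects reach the current schema without a separate converter for every version pair.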
  • Advances in software engineering

    Publication Year: 1996 , Page(s): 47 - 58
    Cited by:  Papers (10)

    Software is the key technology in applications as diverse as accounting, hospital management, aviation, and nuclear power. Application advances in different domains such as these-each with different requirements-have propelled software development from small batch programs to large, real-time programs with multimedia capabilities. To cope, software's enabling technologies have undergone tremendous improvement in hardware, communications, operating systems, compilers, databases, programming languages, and user interfaces, among others. In turn, those improvements have fueled even more advanced applications. Improvements in VLSI technology and multimedia, for example, have resulted in faster, more compact computers that significantly widened the range of software applications. Database and user interface enhancements, on the other hand, have spawned more interactive and collaborative development environments. Such changes have a ripple effect on software development processes as well as on software techniques and tools. In this article, we highlight software development's crucial methods and techniques of the past 30 years.

  • Evolution of data management

    Publication Year: 1996 , Page(s): 38 - 46
    Cited by:  Papers (2)  |  Patents (1)

    Computers can now store all forms of information: records, documents, images, sound recordings, videos, scientific data, and many new data formats. Society has made great strides in capturing, storing, managing, analyzing, and visualizing this data. These tasks are generically called data management. This article sketches the evolution of data management systems. There have been six distinct phases in data management. Initially, data was manually processed. The next step used punched-card equipment and electromechanical machines to sort and tabulate millions of records. The third phase stored data on magnetic tape and used stored-program computers to perform batch processing on sequential files. The fourth phase introduced the concept of a database schema and on-line navigational access to the data. The fifth step automated access to relational databases and added distributed and client-server processing. We are now in the early stages of sixth-generation systems that store richer data types, notably documents, images, voice, and video data. These sixth-generation systems are the storage engines for the emerging Internet and intranets. Early data management systems automated traditional information processing. Today they allow fast, reliable, and secure access to globally distributed data. Tomorrow's systems will access and summarize richer forms of data. It is argued that multimedia databases will be a cornerstone of cyberspace.


Aims & Scope

Computer, the flagship publication of the IEEE Computer Society, publishes highly acclaimed peer-reviewed articles written for and by professionals representing the full spectrum of computing technology from hardware to software and from current research to new applications.


Meet Our Editors

Editor-in-Chief
Sumi Helal
University of Florida
sumi.helal@gmail.com