Computer

Issue 8 • August 1995

  • Intellectual property protection for multimedia applications. 2. Putting the pieces together

    Publication Year: 1995, Page(s): 99-100
    Cited by:  Papers (1)

    In its recently published guidelines (60 Fed. Reg. 28778, June 2, 1995), the US Patent and Trademark Office (PTO) said that computer software programs stored in a tangible medium, such as a floppy disk, are patentable and must be examined to determine whether the substance of a computer-program-related invention is a significant enough advance over prior technical achievement to justify the grant of a patent. In the past, the PTO had simply refused to examine the substance of such inventions. The PTO attributed its new approach to recent decisions favoring the patenting of software-related inventions by the Federal Circuit Court of Appeals, which decides all patent appeals. The paper discusses the items affected and considers multimedia applications.

  • What goes into an information warehouse?

    Publication Year: 1995, Page(s): 84-85

    Recent articles dealing with the “information superhighway” and the “information warehouse” have raised public awareness of and interest in these topics, but they have contributed little toward answering a fundamental question: what kind of information do managers and workers use? For the Information Age, we need measures of the cost of creating, using, and transporting information, as well as the cost of finding and fixing errors. We could also use some supplemental facts, such as the ratio of useful to extraneous information, the quantity of information created and destroyed annually, and the relative volume stored in paper, magnetic, or optical form, or redundantly in multiple forms. Personnel, products, and clients are the three areas that constitute the basic operating information driving modern business. A hypothetical case study is presented that strongly suggests four topics in need of significant research before truly effective data-warehouse concepts and tools can be developed: (i) cross-references between domains, (ii) redundancy between paper and online storage, (iii) the ratio of graphics and images to alphanumeric or textual information, and (iv) effective metrics for normalizing information on data volumes and data quality.
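    The ratios proposed above are straightforward to prototype once an organization has even rough inventory counts. The sketch below, in Python, is purely illustrative: the categories and counts are invented by the editor, not drawn from the article, and only show how the suggested measures (useful-to-extraneous information, paper/online redundancy, graphics-to-text volume) might be computed.

        # Hypothetical illustration of the ratios the article calls for.
        # All categories and counts are invented; the article proposes the
        # metrics but supplies no data of its own.
        inventory = {
            "useful_items": 42_000,
            "extraneous_items": 18_000,
            "paper_only_items": 9_000,
            "online_only_items": 36_000,
            "both_forms_items": 15_000,   # stored redundantly on paper and online
            "graphic_items": 12_000,
            "text_items": 48_000,
        }

        total_stored = (inventory["paper_only_items"]
                        + inventory["online_only_items"]
                        + inventory["both_forms_items"])

        print("useful : extraneous =",
              round(inventory["useful_items"] / inventory["extraneous_items"], 2))
        print("redundant share     =",
              round(inventory["both_forms_items"] / total_stored, 2))
        print("graphics : text     =",
              round(inventory["graphic_items"] / inventory["text_items"], 2))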

  • From nationalism to pragmatism: IT policy in China

    Publication Year: 1995, Page(s): 64-73
    Cited by:  Papers (2)

    The challenges and obstacles China faces are many, but through foreign investment, joint ventures, and technology transfer, China is slowly achieving hard-won advances in IT development and use. Its IT strategy has shifted from developing indigenous technological capabilities and producing a full range of computers to promoting IT use and producing PCs and components. Tariffs have been lowered to encourage use. Massive investments are planned to expand and upgrade the telecommunications network. To promote production, the government has established software parks, encouraged joint ventures with foreign IT firms, and organized state computer enterprises on a more commercial basis. China's policies have shown signs of success, as computer use and production have grown dramatically in recent years. The key to the success of China's technology policy seems to be rooted in pragmatism. Policy makers appear willing to change and adapt when existing policies are not achieving their goals or when new opportunities appear. This flexibility is critical when responding to rapid changes in technology and international markets. If the trend toward increased market orientation and pragmatism continues through the transition to the post-Deng Xiaoping era, IT use and production should continue to flourish. An improved information infrastructure and increased IT use can in turn benefit the economy as a whole by improving economic productivity and by making timely market information available to producers and consumers in the huge Chinese economy.

  • Program comprehension during software maintenance and evolution

    Publication Year: 1995, Page(s): 44-55
    Cited by:  Papers (108)

    Code cognition models examine how programmers understand program code. The authors survey the current knowledge in this area by comparing six program comprehension models: the Letovsky (1986) model; the Shneiderman and Mayer (1979) model; the Brooks (1983) model; Soloway, Adelson and Ehrlich's (1988) top-down model; Pennington's (1987) bottom-up model; and the integrated metamodel of von Mayrhauser and Vans (1994). While these general models can foster a complete understanding of a piece of code, they may not always apply to specialized tasks that more efficiently employ strategies geared toward partial understanding. They identify open questions, particularly concerning the maintenance and evolution of large-scale code. These questions concern whether experimental results obtained on small programs scale up, the validity and credibility of results based on current experimental procedures, and the challenges of data availability.

  • SPEC as a performance evaluation measure

    Publication Year: 1995, Page(s): 33-42
    Cited by:  Papers (15)  |  Patents (3)

    Potential computer system users or buyers usually employ a computer performance evaluation technique only if they believe its results provide valuable information. System Performance Evaluation Cooperative (SPEC) measures are perceived to provide such information and are therefore the ones most commonly used. SPEC measures are designed to evaluate the performance of engineering and scientific workstations, personal vector computers, and even minicomputers and superminicomputers. Along with the Transaction Processing Performance Council (TPC) measures for database I/O performance, they have become de facto industry standards. But do SPEC's evaluation outcomes actually provide added information value? In this article, we examine these measures by considering their structure, advantages, and disadvantages. We use two criteria in our examination: are the programs used in the SPEC suite properly blended to reflect a representative mix of different applications, and are they properly synthesized so that the aggregate measures correctly rank computers by performance? We conclude that many programs in the SPEC suites are superfluous; the benchmark size can be reduced by more than 50%. The way the measure is calculated may also cause distortion: substituting the harmonic mean for the geometric mean used by SPEC roughly preserves the measure while giving better consistency. Furthermore, SPEC measures reflect the performance of the CPU rather than the entire system, so they might be inaccurate in ranking an entire system. To remedy these problems, we propose a revised methodology for obtaining SPEC measures.
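    The aggregation question raised here, geometric versus harmonic mean, can be illustrated with a small sketch. The per-benchmark SPECratios below (reference time divided by measured time, higher is faster) are invented numbers, not results from the article; they only show that the two aggregates can rank two machines differently when a single outlier benchmark dominates.

        # Invented SPECratios for two hypothetical machines.
        from math import prod

        def geometric_mean(ratios):
            return prod(ratios) ** (1.0 / len(ratios))

        def harmonic_mean(ratios):
            return len(ratios) / sum(1.0 / r for r in ratios)

        machine_a = [2.0, 2.0, 2.0, 2.0]     # uniform speedup over the reference machine
        machine_b = [10.0, 1.2, 1.2, 1.2]    # one outlier benchmark dominates

        for name, ratios in (("A", machine_a), ("B", machine_b)):
            print(name,
                  "geometric:", round(geometric_mean(ratios), 2),   # A: 2.0, B: ~2.04
                  "harmonic:", round(harmonic_mean(ratios), 2))     # A: 2.0, B: ~1.54

    The geometric mean ranks machine B ahead of A on the strength of one benchmark, while the harmonic mean, which weights the slower benchmarks more heavily, ranks A ahead; this is the kind of consistency concern behind the authors' proposal.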

  • Computer science research in Mexico

    Publication Year: 1995, Page(s): 56-62
    Cited by:  Papers (1)

    The North American Free Trade Agreement (NAFTA) brings Canada, Mexico, and the USA closer as commercial partners and raises many questions because of the disparity in the size and strength of their economies. In the long run, the low labor costs said to give Mexico a competitive edge will become less relevant, since Mexican government officials see NAFTA as a tool to create jobs, increase salaries, and raise average family income. The real issue is whether Mexico can effectively compete in a larger market and stimulate economic growth. To this end, technological development is one of the most important factors. To compete in the global economy, Mexico must stimulate its latent research community. NAFTA can help accomplish this, but scientists, universities, and industry must play their roles.

  • Fibre Channel: a connection to the future

    Publication Year: 1995, Page(s): 88-90
    Cited by:  Papers (5)  |  Patents (7)

    With commodity microprocessors capable of processing hundreds of MIPS on the horizon, current transmission rates cannot accommodate “Amdahl's Law”, which calls for 1 Mbps of I/O for every MIPS of processing power, and will become a bottleneck to system performance in data-intensive applications. To remedy this shortcoming, ANSI Committee X3T11 initiated development of Fibre Channel, a switched protocol capable of transmitting at rates exceeding 1 Gbps while still supporting existing protocols over both optical fiber and copper cables. Fibre Channel combines the best attributes of legacy channels and networks into a single standard that provides a generic transport mechanism for data, voice, and video. It is the key to scientific and business applications implemented in open and distributed architectures, because it removes the performance barriers presented by older methods of data communications. Fibre Channel introduces the high-performance, easy-to-use, low-cost communications required by a new breed of processors and applications. Available today are new high-speed, scalable links to storage; high-performance networks enabling clusters, backbones, imaging, and visualization; and low-cost arbitrated loops providing efficient peripheral I/O.
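    The rule of thumb cited above invites a quick back-of-the-envelope check. The processor ratings and link speeds in the sketch below are illustrative figures chosen by the editor, not values from the article; the 1000 Mbps entry simply stands in for the "exceeding 1 Gbps" rate the abstract mentions.

        # Rough check of the cited rule of thumb: about 1 Mbps of I/O per MIPS.
        MBPS_PER_MIPS = 1.0

        links_mbps = {                      # illustrative link rates
            "10 Mbps Ethernet": 10,
            "100 Mbps FDDI": 100,
            "Fibre Channel (~1 Gbps)": 1000,
        }

        for mips in (100, 200, 500):        # hypothetical processor ratings
            needed = mips * MBPS_PER_MIPS
            adequate = [name for name, rate in links_mbps.items() if rate >= needed]
            print(f"{mips} MIPS needs ~{needed:.0f} Mbps of I/O; adequate links: {adequate}")

    Even a 200-MIPS processor already outruns a 100 Mbps LAN under this rule, which is the bottleneck argument the abstract uses to motivate gigabit-class links.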

  • Of deck chairs, suits, and quiz shows [Internet]

    Publication Year: 1995, Page(s): 80-81

    Should marketing-based, centralized management and control be forced onto a distributed and fundamentally anarchic network? Those responsible for representing the public interest, specifically the Federal Communications Commission (FCC), appear surprisingly uninterested in examining whether the public is currently being well served by the existing Internet architecture. No one seems to be asking the real National Information Infrastructure (NII) decision makers the awkward questions on the public's behalf.

  • The value of the formal standards process

    Publication Year: 1995, Page(s): 82-83

    The purpose of this article is to critique the process of developing formal standards, that is, standards approved by an official standards-making body. The bodies that most affect US computer standards include the International Organization for Standardization (ISO), the International Telecommunication Union (ITU), the American National Standards Institute (ANSI), and the Institute of Electrical and Electronics Engineers (IEEE). Other groups develop important standards outside the formal process, including the Open Software Foundation (OSF), X/Open, and the Internet Engineering Task Force (IETF). The main difference between a formal standards organization and the other groups is the legal framework in which the body operates. The formal organizations are often chartered by the government, with strict procedures and rules imposed on the standards development process. OSF and X/Open are each directed by a board of directors, whereas the IETF is an independent, self-governing body that develops its own rules and procedures.

  • The Linux operating system

    Publication Year: 1995, Page(s): 74-79
    Cited by:  Papers (8)

    The enormous consumer market for IBM PCs and compatibles has made them affordable. Now, with a free operating system called Linux, these inexpensive machines can be converted into powerful workstations for teaching, research, and software development. For professionals who use Unix-based workstations at work, Linux permits virtually identical working environments on their personal home machines. For cost-conscious educational institutions, especially in developing nations, Linux can create world-class computing environments from inexpensive, easily maintained PC clones. And for university students, especially in science and engineering, Linux provides an essentially cost-free path into Unix and the X Window System.

  • Where is software headed? A virtual roundtable

    Publication Year: 1995, Page(s): 20-32
    Cited by:  Papers (4)

    To find out where software is headed, experts from academia and industry share their visions of software's future. The result is a snapshot in time of where we have been and possibly where we are headed. The subjects discussed are the desktop; software technology; objects; software agents; software engineering; parallel software; and the curriculum. The responses suggest a strong polarization within the software community: a chasm exists between academia and industry, and the two groups hold radically different views on where software is headed. The overall impression is of a heavy emphasis on programming languages, operating systems, and algorithms among the academics, in contrast to a clear emphasis on standards and market-leading trends among the industrial participants. Academics worry about evolutionary or incremental changes to already poorly designed languages and systems, while industrialists race to keep up with revolutionary changes in everything. Academics are looking for better ideas, industrialists for better tools. To an industrial person, things are moving fast: they are revolutionary. To an academic, things are moving too slowly and in the wrong direction: they are only evolutionary changes, slaves to an installed base.


Aims & Scope

Computer, the flagship publication of the IEEE Computer Society, publishes highly acclaimed peer-reviewed articles written for and by professionals representing the full spectrum of computing technology from hardware to software and from current research to new applications.

