
Computer

Issue 7 • July 2000

  • How perspective-based reading can improve requirements inspections

    Page(s): 73 - 79

    Because defects constitute an unavoidable aspect of software development, discovering and removing them early is crucial. Overlooked defects (such as faults in the software system requirements, design, or code) propagate to subsequent development phases, where detecting and correcting them becomes more difficult. At best, developers will eventually catch the defects, but at the expense of schedule delays and additional product-development costs. At worst, the defects will remain, and customers will receive a faulty product. The authors explain their perspective-based reading (PBR) technique, which provides a set of procedures to help developers solve software requirements inspection problems. PBR reviewers stand in for specific stakeholders in the document to verify the quality of requirements specifications. The authors show how PBR leads to improved defect detection rates for both individual reviewers and review teams working with unfamiliar application domains.

  • A staged model for the software life cycle

    Page(s): 66 - 71

    Software engineers have traditionally considered any work after initial delivery as simply software maintenance. Some researchers have divided this work into various tasks: changing functionality (perfective), changing the environment (adaptive), correcting errors (corrective), and making improvements to avoid future problems (preventive). However, many have considered maintenance essentially uniform over time. Because software development has changed considerably since its early days, the authors believe this approach no longer suffices. They describe a new view of the software life cycle in which maintenance is actually a series of distinct stages, each with different activities, tools, and business consequences, and they argue that both business and engineering can benefit from understanding these stages and their transitions.

  • Reliability and survivability of wireless and mobile networks

    Page(s): 49 - 55

    The world is becoming more dependent on wireless and mobile services, but the ability of wireless network infrastructures to handle the growing demand is questionable. As wireless and mobile services grow, weaknesses in network infrastructures become clearer. Failures not only affect current voice and data use but could also limit emerging wireless applications such as e-commerce and high-bandwidth Internet access. As wireless and mobile systems play greater roles in emergency response, including 911 and enhanced 911 services, network failures take on life-or-death significance. Therefore, in addition to directing some attention to designing survivable wireless and mobile networks, developers must also keep in mind that increasingly pervasive and demanding services will further escalate the importance of reliability and survivability requirements. The authors explain several options providers must consider to decrease the number of network failures and to cope with failures when they do occur.

  • Requirements that handle IKIWISI, COTS, and rapid change

    Page(s): 99 - 102

    In the good old days, dealing with software requirements was relatively easy. Requirements came first, before design, cost estimation, planning, or programming. Of course, it wasn't simple: the requirements had to satisfy certain straightforward criteria, including completeness, consistency, traceability, and testability. The recent developments of IKIWISI (I'll know it when I see it), COTS (commercial off-the-shelf) software, and increasingly rapid change in information technology have combined to unsettle the foundations of the old airtight requirements approach. The author examines these complicating factors.

  • SPEC CPU2000: measuring CPU performance in the New Millennium

    Page(s): 28 - 35

    As computers and software have become more powerful, it seems almost human nature to want the biggest and fastest toy you can afford. But how do you know if your toy is tops? Even if your application never does any I/O, it's not just the speed of the CPU that dictates performance: cache, main memory, and compilers also play a role, and software applications have differing performance requirements. So whom do you trust to provide this information? The Standard Performance Evaluation Corporation (SPEC) is a nonprofit consortium whose members include hardware vendors, software vendors, universities, customers, and consultants. SPEC's mission is to develop technically credible and objective component- and system-level benchmarks for multiple operating systems and environments, including high-performance numeric computing, Web servers, and graphical subsystems. On 30 June 2000, SPEC retired the CPU95 benchmark suite. Its replacement is CPU2000, a new CPU benchmark suite with 19 applications that have never before been in a SPEC CPU suite. The article discusses how SPEC developed this benchmark suite and what the benchmarks do. (A sketch of the scoring arithmetic appears after this entry.)

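    A minimal sketch of the arithmetic behind a SPEC CPU-style score, assuming invented reference and measured times (SPEC's published numbers and official tooling are not reproduced here): each benchmark's ratio compares the reference machine's run time to the measured run time, and the suite score summarizes the ratios with a geometric mean.

        # Illustrative scoring arithmetic for a SPEC CPU-style suite.
        # The reference times, measured times, and the 100x scaling are
        # assumptions for illustration, not SPEC's published values.
        from math import prod

        def spec_ratio(reference_seconds: float, measured_seconds: float) -> float:
            """Reference machine's run time divided by the measured run time."""
            return 100.0 * reference_seconds / measured_seconds

        def suite_score(ratios: list[float]) -> float:
            """Geometric mean of the per-benchmark ratios."""
            return prod(ratios) ** (1.0 / len(ratios))

        # A hypothetical three-benchmark run:
        ratios = [spec_ratio(1400, 320), spec_ratio(1800, 410), spec_ratio(1100, 290)]
        print(f"suite score: {suite_score(ratios):.1f}")

    The geometric mean is the conventional choice for ratio-valued results; an arithmetic mean would let a single outlier benchmark swamp the rest of the suite.
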
  • Using paths to measure, explain, and enhance program behavior

    Page(s): 57 - 65

    What happens when a computer program runs? The answer can be frustratingly elusive, as anyone who has debugged or tuned a program knows. As it runs, a program overwrites its previous state, which might have provided a clue as to how the program got to the point at which it computed the wrong answer or otherwise failed. This all-too-common experience is symptomatic of a more general problem: the difficulty of accurately and efficiently capturing and analyzing the sequence of events that occur when a program executes. Program paths offer an insight into a program's dynamic behavior that is difficult to achieve any other way. Unlike simpler measures such as program profiles, which aggregate information to reduce the cost of collecting or storing data, paths capture some of the usually invisible dynamic sequencing of statements. The article exploits the insight that program statements do not execute in isolation, but are typically correlated with the behavior of previously executed code. (A sketch contrasting path and statement profiles appears after this entry.)

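    A minimal sketch of the distinction the authors draw: two runs can produce identical aggregate statement counts yet follow different paths. The block names and trace format are invented for illustration; real path profilers instrument the running program rather than post-processing traces.

        # Contrast an aggregate statement profile with a path profile.
        from collections import Counter

        def statement_profile(traces):
            """Count each block in isolation; execution order is lost."""
            counts = Counter()
            for trace in traces:
                counts.update(trace)
            return counts

        def path_profile(traces):
            """Count whole paths, preserving the executed sequence."""
            return Counter(tuple(trace) for trace in traces)

        # Two runs through blocks A..D that look identical in aggregate:
        traces = [["A", "B", "D"], ["A", "C", "D"]]
        print(statement_profile(traces))  # A and D twice, B and C once
        print(path_profile(traces))       # A->B->D vs. A->C->D kept distinct
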
  • Managing system and active-content integrity

    Page(s): 108 - 110

    In a shared, multiuser environment, protecting data from damage or misappropriation by unauthorized users is a major concern. The widespread use of active (executable) content such as Microsoft ActiveX controls and JavaScript has given rise to a dangerous, common practice: executing unknown, untrusted code. Security-minded users typically address this problem by executing only signed content that a familiar entity has verified. However, code signing does not protect against bugs already present in the signed code. Patched or new versions of the code can be issued, but the loader (which verifies and loads the executable content, and then transfers execution control to the module) will still accept the old version unless the newer version is installed over it. We propose a method that addresses the executable content management problem. Our method employs an executable content loader (which we call a strong loader) and a short-lived configuration management file to address the software aging problem. The loader is tightly integrated with the operating system. It downloads the configuration file from an integrity server; then it verifies and loads executable modules by applying the policy in this configuration file. (A sketch of this verification check appears after this entry.)

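    A minimal sketch of the core check such a strong loader could perform, assuming an invented configuration format and an in-memory stand-in for the integrity server (the authors' loader is integrated with the operating system, which is not modeled here): a module loads only if the policy file is fresh and the module's hash matches the currently approved version.

        # Sketch of a strong-loader verification step. Field names, the TTL,
        # and the in-memory "download" are illustrative assumptions.
        import hashlib
        import time

        CONFIG_TTL_SECONDS = 3600  # short-lived: a stale policy must be re-fetched

        def fetch_config() -> dict:
            """Stand-in for downloading the policy from the integrity server."""
            return {
                "issued_at": time.time(),
                "approved_sha256": {
                    # Only the current version of each module is listed, so a
                    # superseded (buggy) version fails even if validly signed.
                    "report.exe": hashlib.sha256(b"version-2 bytes").hexdigest(),
                },
            }

        def load_module(name: str, module_bytes: bytes, config: dict) -> bytes:
            """Verify freshness and integrity before handing off to execution."""
            if time.time() - config["issued_at"] > CONFIG_TTL_SECONDS:
                raise RuntimeError("configuration expired; re-fetch before loading")
            digest = hashlib.sha256(module_bytes).hexdigest()
            if config["approved_sha256"].get(name) != digest:
                raise RuntimeError(f"{name} is not the currently approved version")
            return module_bytes

        config = fetch_config()
        load_module("report.exe", b"version-2 bytes", config)   # accepted
        # load_module("report.exe", b"version-1 bytes", config) # rejected: old version
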
  • Fashioning a foundation for the computing profession

    Page(s): 97 - 98

    Members of a true computing profession must look beyond the problems they are paid to solve, paying heed to how their profession and the world at large affect each other. The body of knowledge and skill that defines the profession to which the IEEE Computer Society's members belong is suggested by the technical articles that appear in the Society's publications. The computing profession is relatively new, still forming, and as yet uncertain of its identity. It is in danger of losing the respect of its wider community, which associates the profession with apparent fiascoes like the Y2K affair, or feared fiascoes like the Internet stock bubble. The future health of the computing profession depends on its members taking an interest in issues outside its body of knowledge and skills. These issues fall into three classes: those that relate to the profession itself, the constraints that might or should be imposed on the profession by the external community, and the profession's effect on the external community.

  • PicoRadio supports ad hoc ultra-low power wireless networking

    Page(s): 42 - 48

    Technology advances have made it conceivable to build and deploy dense wireless networks of heterogeneous nodes collecting and disseminating wide ranges of environmental data. Applications of such sensor and monitoring networks include smart homes equipped with security, identification, and personalization systems; intelligent assembly systems; warehouse inventory control; interactive learning toys; and disaster mitigation. The opportunities emerging from this technology give rise to new definitions of distributed computing and the user interface. Crucial to the success of these ubiquitous networks is the availability of small, lightweight, low-cost network elements, which the authors call PicoNodes. The authors present a configurable architecture that enables these opportunities to be efficiently realized in silicon. They believe that this energy-conscious system design and implementation methodology will lead to radio nodes that are two orders of magnitude more efficient than existing solutions.

  • The wireless Internet: promises and challenges

    Page(s): 36 - 41

    In the 1980s, the PC migrated from the hobbyist's den to the corporate desktop, a huge development in information technology. Ten years later, that honor fell to the Internet and wireless telephones, which until now have followed separate paths. The author predicts that this decade will see the convergence of wireless communications and the Internet. Although the commercial impact of wireless communications has thus far been limited to cellular telephones, the business and technical communities anticipate rapid growth in wireless data services. Almost daily, some prominent company announces plans for a “wireless e-commerce” enhancement to its business. The author examines the outlook for wireless data. Specifically, he considers the utility of wireless data services and why they have not been widely adopted until now. He also looks at the technology trends promoting wireless Internet convergence, and the obstacles preventing their implementation. As computing becomes increasingly mobile, the limitations of third-generation cellular telephony and the wireless applications protocol become increasingly apparent. The author asserts that only a new approach can make the Internet truly wireless.

