
Computer

Issue 12 • December 1999


Displaying Results 1 - 17 of 17
  • Operations are free; data motion isn't [Letters]

    Page(s): 4 - 6
    PDF (207 KB)
    Freely Available from IEEE
  • A semantic approach adds meaning to the web

    Page(s): 13 - 16
    PDF (322 KB)

    It is argued that new standards would let users apply the power of their technology to make the most of the Web.

  • IETF rejects Net wiretap proposal

    Page(s): 20 - 21
    PDF (355 KB)

  • Subject index

    Page(s): 74 - 82
    PDF (639 KB)
    Freely Available from IEEE
  • Author Index

    Page(s): 82 - 83
    PDF (298 KB)
    Freely Available from IEEE
  • Directories don't get no respect

    Page(s): 150 - 152
    PDF (271 KB)

    Ignored for more than a decade, directory technology may finally get deserved recognition. Information systems administrators will soon feel pressure to convert to Windows 2000 simply to get Active Directory services.

  • Where teamwork and software meet [New Books]

    Page(s): 137
    PDF (189 KB)
    Freely Available from IEEE
  • Toward a new generation of simpler PCs

    Page(s): 17 - 19
    PDF (280 KB)

  • Internet Streaming SIMD Extensions

    Page(s): 26 - 34
    PDF (420 KB)

    Because floating-point computation is at the heart of 3D geometry, speeding it up is vital to overall 3D performance. To produce a visually perceptible difference in graphics applications, Intel's 32-bit processors, based on the IA-32 architecture, required an increase of 1.5 to 2 times the native floating-point performance. One path to better performance involves studying how the system uses data. Today's 3D applications can execute much faster by distinguishing between data used repeatedly and streaming data, which is used only once and then discarded. The Pentium III's new floating-point extension lets programmers designate data as streaming and provides instructions that handle this data efficiently. The authors designed the Internet Streaming SIMD Extensions (ISSE) to enable a new level of visual computing on the volume PC platform, and they discuss their results in terms of boosting the performance of 3D and video applications. (A brief C sketch of the streaming-store idea follows this list.)

  • Solving Einstein's equations on supercomputers

    Page(s): 52 - 58
    PDF (988 KB)

    In 1916, Albert Einstein published his famous general theory of relativity, which contains the rules of gravity and provides the basis for modern theories of astrophysics and cosmology. For many years, physicists, astrophysicists and mathematicians have striven to develop techniques for unlocking the secrets contained in Einstein's theory of gravity; more recently, computational science research groups have added their expertise to the endeavor. Because the underlying scientific project provides such a demanding and rich system for computational science, techniques developed to solve Einstein's equations will apply immediately to a large family of scientific and engineering problems. The authors have developed a collaborative computational framework that allows remote monitoring and visualization of simulations, at the center of which lies a community code called Cactus. Many researchers in the general scientific computing community have already adopted Cactus, as have numerical relativists and astrophysicists. In June 1999, an international team of researchers at various sites ran some of the largest such simulations in numerical relativity yet undertaken, using a 256-processor SGI Origin 2000 supercomputer at the National Center for Supercomputing Applications (NCSA). Other globally distributed scientific teams are running visual simulations of Einstein's equations to study the gravitational effects of colliding black holes. (The field equations themselves are restated after this list.)

  • Smart cards aren't always the smart choice

    Page(s): 142 - 143
    PDF (272 KB)

    Smart cards are beneficial in some scenarios, but they are not the security panacea that some people believe them to be. In some user environments, the costs and inconveniences clearly outweigh the potential benefits of using smart cards.

  • Virtue: performance visualization of parallel and distributed applications

    Page(s): 44 - 51
    PDF (916 KB)

    High-speed, wide-area networks have made it both possible and desirable to interconnect geographically distributed applications that control distributed collections of scientific data, remote scientific instruments and high-performance computer systems. Historically, performance analysis has focused on monolithic applications executing on large, stand-alone, parallel systems. In such a domain, measurement, postmortem analysis and code optimization suffice to eliminate performance bottlenecks and optimize applications. Distributed visualization, data mining and analysis tools allow scientists to collaboratively analyze and understand complex phenomena. Likewise, real-time performance measurement and immersive performance display systems (that is, systems providing large stereoscopic displays of complex data) enable collaborating groups to interact with executing software, tuning its behavior to meet research and performance goals. To satisfy these demands, the authors designed Virtue, a prototype system that integrates collaborative, immersive performance visualization with real-time performance measurement and adaptive control of applications on computational grids. These tools enable physically distributed users to explore and steer the behavior of complex software in real time and to analyze and optimize distributed application dynamics.

  • What software reality is really about

    Page(s): 148 - 149
    PDF (228 KB)

    Software reality is more about people working together than it is about defined processes; it's more about science than it is about computer science; it's more about understanding than it is about documentation; it's more about inquiry than it is about metrics; it's more about skill than it is about methods; it's more about becoming better than it is about being good; and it's more about being good enough than it is about being right or wrong.

  • Visualization in teleimmersive environments

    Page(s): 66 - 73
    PDF (876 KB)

    In collaborative virtual reality (VR), the goal is to reproduce a face-to-face meeting in minute detail. Teleimmersion moves beyond this idea, integrating collaborative VR with audio- and video-conferencing that may involve data mining and heavy computation. In teleimmersion, collaborators at remote sites share the details of a virtual world that can autonomously control computation, query databases and gather results. They don't meet in a room to discuss a car engine; they meet in the engine itself. The University of Illinois at Chicago's Electronic Visualization Laboratory (EVL) has hosted several applications that demonstrate rudimentary teleimmersion. All users are members of Cavern (CAVE Research Network, http://www.evl.uic.edu/cavern), a collection of participating industrial and research institutions equipped with CAVE (CAVE Automatic Virtual Environment) and ImmersaDesk VR systems and high-performance computing resources, including high-speed networks. There are more than 100 CAVE and ImmersaDesk installations worldwide. The pressing challenge now is how to support collaborative work among Cavern users without having them worry about the details of sustaining a collaboration. Another problem is providing both synchronous and asynchronous collaboration. The authors detail how they've built new display devices to serve as more convenient teleimmersion end-points and how they support their international networking infrastructure with sufficient bandwidth to meet the needs of teleimmersive applications.

  • Interactive simulation and visualization

    Page(s): 59 - 65
    PDF (836 KB)

    Most researchers who perform data analysis and visualization do so only after everything else is finished, which often means they don't discover errors that invalidate their simulation results until post-processing. A better approach is to integrate simulation and visualization into the entire process so that researchers can make adjustments along the way. This approach, called computational steering, provides the capacity to control all aspects of the computational science pipeline. Recently, several tools and environments for computational steering have begun to emerge. These tools range from those that modify an application's performance characteristics (either by automated means or by user interaction) to those that modify the underlying computational application. A refined problem-solving environment should facilitate everything from algorithm development to application steering. The authors discuss tools that provide a mechanism to integrate modeling, simulation, data analysis and visualization. (A minimal steering-loop sketch follows this list.)

  • Distance visualization: data exploration on the grid

    Page(s): 36 - 43
    PDF (584 KB)

    Our increased ability to model and measure a wide variety of phenomena has left us awash in data. In the immediate future, the authors anticipate collecting data at the rate of terabytes per day from many classes of applications, including simulations running on teraFLOPS-class computers and experimental data produced by increasingly sensitive and accurate instruments, such as telescopes, microscopes, particle accelerators and satellites. Generating or acquiring data is not an end in itself but a vehicle for obtaining insights. While data analysis and reduction have a role to play, in many situations we achieve understanding only when a human being interprets the data. Visualization has emerged as an important tool for extracting meaning from the large volumes of data that scientific instruments and simulations produce. The authors describe an online system that supports 3D tomographic image reconstruction, and subsequent collaborative analysis, of data from remote scientific instruments.

  • A really good idea [object-oriented software development]

    Page(s): 144 - 147
    PDF (376 KB)

    The author reflects on the contributions of object-oriented (OO) development and on its future. Don't blame OO programming in general for the limitations of those who don't know how to apply its principles! Only a minority of the industry has tried it seriously and without compromise; the experience of others, those who go at it half-heartedly, is not a fair test of the technology. So the author's answer to the OO critics applies to object technology Gandhi's retort when he was asked for his thoughts about Western civilization: it would be a good idea.

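For "Internet Streaming SIMD Extensions" (pp. 26-34), here is a minimal C sketch of the streaming-data idea the abstract describes, using the SSE compiler intrinsics that expose the Pentium III instructions; it is not code from the article. The non-temporal store _mm_stream_ps writes results around the cache, so write-once output does not evict data the program will reuse. The function name scale_stream, the scale factor, and the array sizes are illustrative assumptions. Compile with, e.g., gcc -std=c99 -msse.

    /* Minimal sketch, not from the article: scale an array of floats with
       SSE, writing the once-used results with a non-temporal store. */
    #include <stdio.h>
    #include <xmmintrin.h>   /* SSE intrinsics and _mm_malloc/_mm_free */

    /* n must be a multiple of 4; src and dst must be 16-byte aligned. */
    static void scale_stream(float *dst, const float *src, int n, float k)
    {
        __m128 vk = _mm_set1_ps(k);                    /* k in all 4 lanes */
        for (int i = 0; i < n; i += 4) {
            __m128 v = _mm_load_ps(src + i);           /* 4 floats at once */
            _mm_stream_ps(dst + i, _mm_mul_ps(v, vk)); /* bypass the cache */
        }
        _mm_sfence();    /* order the streaming stores before continuing   */
    }

    int main(void)
    {
        enum { N = 8 };
        float *src = _mm_malloc(N * sizeof(float), 16);
        float *dst = _mm_malloc(N * sizeof(float), 16);
        for (int i = 0; i < N; ++i) src[i] = (float)i;
        scale_stream(dst, src, N, 2.0f);
        printf("dst[7] = %.1f\n", dst[7]);             /* prints 14.0      */
        _mm_free(src);
        _mm_free(dst);
        return 0;
    }
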
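For "Solving Einstein's equations on supercomputers" (pp. 52-58), the equations in question are the field equations of general relativity, restated here in standard textbook notation (LaTeX) for readers unfamiliar with them:

    % Einstein field equations: spacetime curvature (left-hand side) is
    % determined by the stress-energy of matter and radiation (right side).
    G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu}
               = \frac{8\pi G}{c^{4}} T_{\mu\nu}

Written out, these are ten coupled nonlinear partial differential equations for the metric components, which is why codes such as Cactus need supercomputer-scale resources to evolve them numerically.
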
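For "Interactive simulation and visualization" (pp. 59-65), here is a minimal C sketch of the computational-steering pattern the abstract describes; it is not one of the surveyed tools. Between solver steps the program polls an external control source and applies parameter changes mid-run. The file name steer.txt and the steered time step dt are illustrative assumptions.

    /* Minimal sketch, not from the article: a simulation loop that accepts
       steered parameter updates between steps instead of only at startup. */
    #include <stdio.h>

    /* Read a replacement value from a control file written by a steering
       client; return the current value unchanged if no update is present. */
    static double poll_steering(const char *path, double current)
    {
        FILE *f = fopen(path, "r");
        if (f) {
            double v;
            if (fscanf(f, "%lf", &v) == 1)
                current = v;                 /* accept the steered value   */
            fclose(f);
        }
        return current;
    }

    int main(void)
    {
        double dt = 0.01, t = 0.0;
        for (int step = 0; step < 1000; ++step) {
            t += dt;                             /* one solver step        */
            dt = poll_steering("steer.txt", dt); /* steer between steps    */
        }
        printf("final t = %.3f (dt = %.4f)\n", t, dt);
        return 0;
    }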

Aims & Scope

Computer, the flagship publication of the IEEE Computer Society, publishes highly acclaimed peer-reviewed articles written for and by professionals representing the full spectrum of computing technology, from hardware to software and from current research to new applications.


Meet Our Editors

Editor-in-Chief
Ron Vetter
University of North Carolina Wilmington