
Proceedings of the IEEE 1993 Symposium on Research Frontiers in Virtual Reality

Date: 25-26 Oct. 1993


  • Proceedings of 1993 IEEE Research Frontiers in Virtual Reality Symposium

  • VR as a forcing function: Software implications of a new paradigm

    Page(s): 5 - 8

    Immersive virtual reality systems provide a qualitatively different perceptual experience than do conventional desktop systems and impose greater demands on support hardware and software. The authors address three software issues for VR applications: interaction models, object-oriented systems, and application development frameworks.

  • DIVER: A Distributed Virtual Environment Research platform

    Page(s): 10 - 15

    DIVER is the Virtual Environment software architecture library developed at the University of Virginia. It is similar to SGI's Inventor, but was developed specifically as a toolkit for creating Virtual Environments rather than for mouse-based applications. Like Inventor, DIVER was implemented on top of SGI's GL library and provides a powerful hierarchical graphics database. Unlike Inventor, DIVER is distributed, divorcing the application computations from the rendering computations. DIVER runs the application on a remote CPU and transparently spawns asynchronous rendering processes onto one or more SGIs. DIVER extends the graphics database hierarchy by allowing programmers to perform graphics transformations in any other object's nested coordinate system, and by allowing the programmer to nest the virtual camera (viewpoint) anywhere in the hierarchy. DIVER has been in active use for over a year by more than twenty researchers.

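    The key architectural idea in the abstract above, performing transformations in any other object's nested coordinate system, can be illustrated with a minimal scene-graph sketch. This is a hypothetical illustration in Python, not the DIVER API (DIVER itself sits on SGI's GL): composing local transforms up the hierarchy gives each node's world transform, and the relative transform between any two nodes follows from those.

        # Hypothetical sketch (not the DIVER API): expressing one scene-graph
        # node's frame relative to any other node's frame by composing the
        # local transforms along each node's path to the root.
        import numpy as np

        class Node:
            def __init__(self, local, parent=None):
                self.local = np.asarray(local, dtype=float)   # 4x4 local transform
                self.parent = parent

            def world(self):
                m = self.local
                p = self.parent
                while p is not None:                          # walk up to the root
                    m = p.local @ m
                    p = p.parent
                return m

        def relative_transform(src, dst):
            """Matrix mapping points in src's frame into dst's frame."""
            return np.linalg.inv(dst.world()) @ src.world()

        root = Node(np.eye(4))
        table = Node(np.eye(4), parent=root)
        camera = Node(np.eye(4), parent=table)                # a viewpoint nested in the hierarchy
        print(relative_transform(camera, root))
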
  • Volume haptization

    Page(s): 16 - 23

    The authors describe haptic representation of volume data. Volume visualization is a powerful tool in the field of scientific visualization. However, a visual representation of a full three-dimensional volume is hard to comprehend because of occlusion, and higher-dimensional and multi-parameter data sets are also difficult to present as visual images. The authors propose methods for presenting volume data by force sensation. A six degree-of-freedom force-reflective master manipulator is used for haptization and is coupled with a real-time visual image of the volume data. Methods of haptic representation of scalar, vector, and tensor data are discussed, and recognition performance tests on scalar and multi-parameter volume data are reported.

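    As a rough illustration of mapping volume data to force sensation, the sketch below renders the local gradient of a scalar field as a force on the haptic handle. This gradient-based mapping is a common convention and an assumption here; the paper's own scalar, vector, and tensor mappings may differ.

        # Illustrative scalar-to-force mapping: central-difference gradient of the
        # volume at the probe position, scaled by a stiffness constant (values
        # are hypothetical, not taken from the paper).
        import numpy as np

        def haptic_force(volume, pos, stiffness=1.0):
            """Force vector at integer voxel position `pos` in a 3-D scalar volume."""
            i, j, k = pos
            grad = 0.5 * np.array([
                volume[i + 1, j, k] - volume[i - 1, j, k],
                volume[i, j + 1, k] - volume[i, j - 1, k],
                volume[i, j, k + 1] - volume[i, j, k - 1],
            ])
            return -stiffness * grad        # push the handle down the gradient

        vol = np.random.rand(16, 16, 16)
        print(haptic_force(vol, (8, 8, 8)))
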
  • Interactive collision detection

    Page(s): 24 - 31

    Collision detection and response can make a virtual-reality application seem more believable. Unfortunately, existing collision-detection algorithms are too slow for interactive use. The authors present a new algorithm that is not only fast but also interruptible, allowing an application to trade quality for more speed. The algorithm uses simple four-dimensional geometry to approximate motion, and sets of spheres to approximate three-dimensional surfaces. The algorithm allows a sample application to run five to seven times faster than it runs with existing algorithms.

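    The abstract above describes approximating surfaces with sets of spheres and motion with simple four-dimensional (space-time) geometry. The sketch below shows one standard building block consistent with that description, a swept sphere-sphere test over a time step; it is a generic illustration, not the paper's interruptible algorithm.

        # Swept sphere-sphere test: solve |p + t*v|^2 = (r1 + r2)^2 for the
        # earliest t in [0, dt], where p and v are relative position and velocity.
        # Generic illustration only, not the authors' algorithm.
        import math

        def swept_spheres_collide(p1, v1, r1, p2, v2, r2, dt):
            """Earliest contact time in [0, dt], or None if the spheres never meet."""
            p = [a - b for a, b in zip(p1, p2)]        # relative position
            v = [a - b for a, b in zip(v1, v2)]        # relative velocity
            rsum = r1 + r2
            a = sum(c * c for c in v)
            b = 2.0 * sum(pc * vc for pc, vc in zip(p, v))
            c = sum(pc * pc for pc in p) - rsum * rsum
            if c <= 0.0:
                return 0.0                             # already overlapping
            if a == 0.0:
                return None                            # no relative motion
            disc = b * b - 4.0 * a * c
            if disc < 0.0:
                return None
            t = (-b - math.sqrt(disc)) / (2.0 * a)
            return t if 0.0 <= t <= dt else None

        print(swept_spheres_collide((0, 0, 0), (1, 0, 0), 0.5,
                                    (3, 0, 0), (-1, 0, 0), 0.5, dt=2.0))   # -> 1.0
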
  • Calibration and evaluation of virtual environment displays

    Page(s): 33 - 40

    Designing safe and effective virtual environment (VE) systems requires human factors research to produce specifications for VE systems that are based on the sensory and perceptual requirements necessary for optimal human performance. Part of this research involves establishing performance benchmarks in physical environments against which performance in virtual environments can be compared. The authors present two studies in which they used benchmarks based on two fundamental perceptual and motor components of spatial perception and orientation. (1) A virtual visual display was calibrated by comparing the accuracy of pointing to targets in physical and virtual environments. (2) The authors assessed the degree to which a VE represented a physical environment by comparing users' judgments of target direction in the two environments. They discuss implications of the research for the design of VEs.

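    A small helper, purely illustrative and not taken from the paper, makes the pointing-accuracy comparison concrete: one natural measure of that accuracy is the angular error between the direction a user points and the true direction from the observer to the target.

        # Angular error (degrees) between a pointing direction and the true
        # observer-to-target direction; illustrative helper, not from the paper.
        import math

        def angular_error_deg(observer, target, pointing_dir):
            to_target = [t - o for t, o in zip(target, observer)]

            def unit(v):
                n = math.sqrt(sum(c * c for c in v))
                return [c / n for c in v]

            a, b = unit(to_target), unit(pointing_dir)
            dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
            return math.degrees(math.acos(dot))

        print(angular_error_deg((0, 0, 0), (1, 0, 1), (1, 0, 0.9)))
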
  • A user study comparing head-mounted and stationary displays

    Page(s): 41 - 45

    Head-mounted displays, as popularized by virtual reality systems, offer the opportunity to immerse a user in a synthetically generated environment. While there is much anecdotal evidence that this is a qualitative jump in the user interface, there is little quantitative data to establish that immersion improves task performance. The authors present the results of a user study: users performing a generic search task decrease task performance time by roughly half (a 42% reduction) when they change from a stationary display to a head-mounted display with identical properties (resolution, field of view, etc.). A second result is that users who practice with the head-mounted display reduce task completion time by 23% in later trials with the stationary display, suggesting a transfer effect.

  • Perceptual decomposition of virtual haptic surfaces

    Page(s): 46 - 53

    The analysis and construction of virtual haptic surfaces are considered from a perceptual point of view rather than from the dynamics and controls approach of prior work. The authors developed a perceptual decomposition of surface contact sensation by examining three qualities associated with the different stages of interaction with a haptic wall simulation. These qualities are the crispness of initial contact, the hardness of surface rigidity, and the cleanness of final release from the virtual wall's surface. These qualities, plus an overall rating of wall quality, were employed consistently by seven subjects to evaluate a set of six simple haptic wall simulations. Three of the wall models consisted of single linear springs; the remainder, single viscous dampers. Highest rankings of subjective hardness were associated with the spring models; damper models received the highest crispness rankings. Subjects favored the simple spring models as having, overall, the more wall-like perceptual character.

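    The wall models named in the abstract above, single linear springs and single viscous dampers, take a simple textbook form: a spring wall resists in proportion to penetration depth, while a damper wall resists in proportion to velocity whenever the probe is inside the wall. The gains below are illustrative placeholders, not the values used in the study.

        # Textbook forms of the two wall models in the abstract; the gain values
        # are illustrative placeholders, not those used in the experiments.
        def spring_wall_force(penetration, k=500.0):
            """Restoring force (N); penetration depth in metres, 0 outside the wall."""
            return -k * penetration if penetration > 0.0 else 0.0

        def damper_wall_force(penetration, velocity, b=10.0):
            """Damping force (N) opposing motion, applied only inside the wall."""
            return -b * velocity if penetration > 0.0 else 0.0

        print(spring_wall_force(0.002))           # 2 mm into the wall
        print(damper_wall_force(0.002, 0.05))     # moving at 5 cm/s inside the wall
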
  • Understanding synthetic experience must begin with the analysis of ordinary perceptual experience

    Page(s): 54 - 57

    The emergence of teleoperation and virtual environments has greatly increased interest in synthetic experience, a mode of experience made possible both by these newer technologies and by earlier ones, such as telecommunication and sensory prosthetics. The authors maintain that understanding synthetic experience must begin with the recognition that the phenomenology of synthetic experience is continuous with that of ordinary experience. They demonstrate the continuity of synthetic experience and normal perceptual experience with respect to two issues: the determination of a person's phenomenal location in space, and the experience of being in touch with near and remote objects.

  • Scientists in wonderland: A report on visualization applications in the CAVE virtual reality environment

    Page(s): 59 - 66

    The authors present their experiences at the Electronic Visualization Laboratory (EVL) in introducing computational scientists to the use of virtual reality as a research tool. They describe their virtual environment, the CAVE. They then describe several applications currently being developed at EVL using the CAVE and conclude with a discussion of possible research paths to follow in making virtual reality an effective tool for visualization.

  • Applying virtual reality in education: A prototypical virtual physics laboratory

    Page(s): 67 - 74

    A prototypical virtual physics laboratory has been constructed that allows students to control the laboratory environment as well as the physical properties of objects in that laboratory. The environment factors that can be controlled in the current implementation include gravity (both magnitude and direction), surface friction, and atmospheric drag. The coefficients of restitution of elastic bodies can also be altered. Trajectories of objects can be traced to facilitate measurements, and the laboratory allows students to measure both displacements and elapsed time. Time may be frozen to allow precise observation of time-varying phenomena. This laboratory will ultimately be extended into the macroscopic and microscopic domains, giving students access to direct observations that were heretofore impossible. This new application of computer graphics in education has the potential to augment or replace traditional laboratory instruction with approaches that offer superior motivation, retention, and intellectual stimulation.

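    A toy integration step, purely illustrative of how some of the controllable factors listed above (gravity, atmospheric drag, restitution) might enter a simulation, and not the laboratory's actual implementation:

        # Toy time step: a point mass under adjustable gravity and linear drag,
        # with a coefficient of restitution applied when it bounces off the floor
        # at y = 0. Parameters and model are illustrative only.
        def step(pos, vel, dt, gravity=(0.0, -9.81, 0.0), drag=0.1, restitution=0.8):
            vel = [v + (g - drag * v) * dt for v, g in zip(vel, gravity)]
            pos = [p + v * dt for p, v in zip(pos, vel)]
            if pos[1] < 0.0:                      # hit the floor: bounce
                pos[1] = 0.0
                vel[1] = -restitution * vel[1]
            return pos, vel

        pos, vel = [0.0, 2.0, 0.0], [1.0, 0.0, 0.0]
        for _ in range(200):
            pos, vel = step(pos, vel, dt=0.01)
        print(pos, vel)
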
  • On recording virtual environments

    Page(s): 80 - 83

    Documentation is fundamental to science. For virtual reality to be an effective scientific visualization tool, it must be recordable. This position statement outlines the benefits of image-based recording of virtual environments and introduces the VR-VCR, a device for immersive playback of recorded virtual environments.

  • Auditory distance perception by translating observers

    Page(s): 92 - 99

    The authors consider auditory distance perception of a moving observer and its relevance for the perception of stationary and moving sources. They begin with a review of some of the acoustic cues to source distance, focusing on the dynamic cues available under observer translation (motion parallax and acoustic tau). They report an experiment indicating the level of accuracy with which stationary and translating observers are able to localize stationary sources from 2 to 6 m away. Given the significant errors associated with these near distances, it would appear that the perceptual assessment of the 3-D trajectory of a real or virtual source, especially a distant one, is likely to be substantially in error. Even so, motion parallax and acoustic tau are informative about the relative motion between observer and source.

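    Acoustic tau, like its visual counterpart, is the ratio of source distance to its rate of change, i.e. a time-to-contact estimate. The sketch below illustrates the generic definition from two successive range samples; it is the standard definition of the cue, not the authors' experimental procedure.

        # Generic time-to-contact (tau) estimate from two range samples r(t) and
        # r(t + dt); illustrative of the cue, not the authors' procedure.
        def acoustic_tau(r_now, r_next, dt):
            r_dot = (r_next - r_now) / dt          # rate of change of source distance
            return None if r_dot >= 0.0 else -r_now / r_dot

        print(acoustic_tau(6.0, 5.9, 0.1))         # closing at 1 m/s from 6 m: tau = 6 s
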
  • Is "presence" a training issue?

    Page(s): 124 - 125

    A fighter pilot must create an accurate mental model of dynamic three-dimensional spatial relationships from two-dimensional head-up displays. The authors describe a US Air Force training research project to develop and assess visualization training media for their effectiveness in promoting mental model building. The theoretical approach guiding this project is based on the psychological construct of "presence" afforded by virtual world technologies.

  • Virtual gain for audio windows

    Page(s): 85 - 91

    Audio windowing is a front-end, or user interface, to an audio system with a spatial-sound backend. Besides the directionalization performed by the DSP spatialization, gain adjustment is used to control the volume of the various sources. Virtual gain can be synthesized from components derived from iconic size, distance, orientation, and directivity, and selectively enabled according to room-wise partitioning of sources across sinks. The authors describe the mathematical derivation of the calculation of virtual gain and outline the deployment of these calculations in an audio windowing system.

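    The abstract above lists the components from which virtual gain is synthesized (iconic size, distance, orientation, directivity). The sketch below assumes the overall gain is a simple product of per-component factors; the individual factor forms here (1/r roll-off, cardioid-like directivity) are assumptions for illustration, not the paper's derivation.

        # Assumed composition of virtual gain as a product of per-component
        # factors; the individual factor forms are illustrative, not the paper's.
        import math

        def virtual_gain(icon_size, distance, source_angle, sink_angle,
                         ref_size=1.0, ref_distance=1.0):
            size_factor = icon_size / ref_size                     # larger icon, louder
            distance_factor = ref_distance / max(distance, 1e-6)   # 1/r roll-off
            directivity = 0.5 * (1.0 + math.cos(source_angle))     # cardioid-like source
            orientation = 0.5 * (1.0 + math.cos(sink_angle))       # sink facing factor
            return size_factor * distance_factor * directivity * orientation

        print(virtual_gain(icon_size=2.0, distance=4.0,
                           source_angle=0.0, sink_angle=math.pi / 4))
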
  • What you see is what you hear-Acoustics applied in virtual worlds

    Page(s): 100 - 107

    Up to now, virtual reality (VR) systems have emphasized visual graphics and display technology. With audio hardware and system software readily available, it is now possible to take the next evolutionary step and use acoustic simulations to enhance virtual worlds. An audiovisual system addresses two important human senses and provides realistic impressions that are natural to everyday life and environments. The VR toolkit of IGD features several visual and acoustic renderers, which are applied by IGD's general-purpose VR system "Virtual Design". Examples demonstrate several virtual audiovisual worlds.

  • X11 in virtual environments

    Page(s): 118 - 119

    The authors present a brief description of a method for running 2-D text-based or graphical applications from within a virtual environment. This is accomplished by a port of the X11 server that renders into a shared memory segment, which is in turn texture-mapped onto the surface of an object in a 3-D virtual world. Events that occur in the virtual space can be translated into pseudo mouse or keyboard input events, which are passed to the X server via shared memory queues, FIFOs, or network connections. This system allows any X application to be used from within a virtual space without rewriting any software. Any number of such servers can exist and be manipulated inside the virtual environment. Use of transparency as a background color allows text windows to float in space or be used as heads-up displays.

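    One small piece of the pipeline described in the abstract above, translating an interaction in the virtual space into a pseudo mouse event, amounts to mapping the texture coordinates of a ray hit on the window-textured surface to X11 pixel coordinates. The sketch below illustrates that mapping; the function name and event tuple are hypothetical, not the authors' code.

        # Map the (u, v) texture coordinates of a ray hit on a window-textured
        # quad to X11 pixel coordinates, packaged as a pseudo mouse-motion event.
        # Names and the event representation are hypothetical.
        def uv_to_x11_event(u, v, width=1024, height=768):
            """u, v in [0, 1], with v measured from the top of the texture."""
            x = int(round(u * (width - 1)))
            y = int(round(v * (height - 1)))
            return ("MotionNotify", x, y)      # would be queued to the X server

        print(uv_to_x11_event(0.5, 0.25))
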
  • An eye tracking computer user interface

    Page(s): 120 - 121

    The authors describe an inexpensive eye-movement-controlled user interface for 2-D and 3-D interaction. It is based on electro-oculography (EOG) rather than the very expensive reflectance-based methods. The authors have built the hardware and software to demonstrate the viability of EOG for human-computer communication. The experiments indicate that EOG provides the basis for an adequate input interaction device. Being very inexpensive, the system is applicable to many virtual reality systems and video games, as well as for the handicapped.

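    The EOG signal is roughly proportional to horizontal gaze angle over a moderate range, so a simple two-point calibration against known fixation targets is enough for a basic pointer. The calibration below is a generic sketch under that linearity assumption, not the authors' procedure, and the voltages and angles are made-up examples.

        # Two-point linear calibration from EOG voltage to gaze angle; generic
        # sketch under a linearity assumption, not the authors' procedure.
        def calibrate(v_left, angle_left, v_right, angle_right):
            slope = (angle_right - angle_left) / (v_right - v_left)
            offset = angle_left - slope * v_left
            return lambda v: slope * v + offset

        gaze_deg = calibrate(v_left=-120e-6, angle_left=-20.0,
                             v_right=120e-6, angle_right=20.0)
        print(gaze_deg(60e-6))                 # about +10 degrees
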
  • Robots and simulated environments-first steps towards virtual robotics

    Page(s): 122 - 123

    The context of the paper is the design of an appropriate working environment for learning robots. The authors are interested in robots which are able to structure incoming sensor data on the basis of internal reference schemes and robot tasks. For efficiency reasons, they want to perform the learning cycles in simulated environments explored by the robot. The authors outline how advanced visualization and interaction techniques as developed in the field of virtual reality could be employed to study the development and properties of the internal data of a semi-autonomous robot, as well as the learning process itself.

  • Cosmic Explorer: A virtual reality environment for exploring cosmic data

    Page(s): 75 - 79

    Supercomputer simulations can now produce multilevel, multi-scale, large data sets that require new techniques in scientific visualization and higher levels of hardware performance. The authors explore the use of virtual reality (VR) technology in this connection. Cosmic Explorer is a VR environment for visualizing numerical and observational cosmology data. The authors have implemented multi-scale visualization techniques that work on large, multilevel, time-dependent data sets. VR provides natural ways to navigate in an immersive environment, and scientific visualization requires overviewing; the techniques bridge the two in a VR system involving the BOOM and the DataGlove. Combining modern visualization hardware, software, and VR technology, the authors are able to create a system that lets users explore the virtual space created by numerical simulations with ease and naturalness.

  • A virtual environment architecture

    Page(s): 126 - 127

    Part of MITRE's charter as a Federally Funded Research and Development Center (FFRDC) is to objectively evaluate and compare current technologies, and to recommend courses of action for numerous government programs. As such, the authors have been involved in assessing workstation, graphics, and user interface technology. They are currently developing a virtual environment architecture (VEA), to be used as a foundation for several prototype applications.
