Proceedings of the IEEE and ACM International Symposium on Augmented Reality, 2001

Date: 29-30 October 2001

Entries 1-25 of 33
  • Proceedings IEEE and ACM International Symposium on Augmented Reality

    Publication Year: 2001
  • Author index

    Publication Year: 2001, Page(s): 224-225
  • Current status of the Varioscope AR, a head-mounted operating microscope for computer-aided surgery

    Publication Year: 2001, Page(s): 20-29

    Computer-aided surgery (CAS), the intraoperative application of biomedical visualization techniques, appears to be one of the most promising fields of application for augmented reality (AR), the display of additional computer-generated graphics over a real-world scene. Typically, a device such as a head-mounted display (HMD) is used for AR. However, considerable technical problems have so far limited the intraoperative use of HMDs. Among these difficulties are the requirement for a common optical focal plane for both the real-world scene and the computer-generated image, and acceptance of the HMD by the user in a surgical environment. In order to increase the clinical acceptance of AR, we have adapted the Varioscope (Life Optics, Vienna), a miniature, cost-effective head-mounted operating microscope, for AR. In this work, we present the basic design of the modified HMD, together with the method and results of an extensive laboratory study for photogrammetric calibration of the Varioscope's computer displays to a real-world scene. In a series of sixteen calibrations with varying zoom factors and object distances, the mean calibration error was 1.24±0.38 pixels, or 0.12±0.05 mm, for a 640×480 display; the maximum error was 3.33±1.04 pixels, or 0.33±0.12 mm. The location of a position measurement probe of an optical tracking system was transformed to the display with a real-world error of less than 1 mm in 56% of all cases; in the remaining cases, the error was below 2 mm. We conclude that the accuracy achieved in our experiments is sufficient for a wide range of CAS applications.

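The error statistics quoted above (mean ± standard deviation, in pixels and in millimetres) are straightforward to reproduce. Below is a minimal sketch, not the authors' code; the mm-per-pixel scale and the synthetic data are hypothetical stand-ins.

```python
import numpy as np

def calibration_error(projected, measured, mm_per_pixel=0.1):
    """projected, measured: (N, 2) display coordinates in pixels.
    Returns mean and std of the 2D error, in pixels and in mm."""
    err_px = np.linalg.norm(projected - measured, axis=1)
    return (err_px.mean(), err_px.std(),
            err_px.mean() * mm_per_pixel, err_px.std() * mm_per_pixel)

# Synthetic example standing in for one calibration series of 16 targets:
rng = np.random.default_rng(0)
proj = rng.uniform((0, 0), (640, 480), size=(16, 2))
meas = proj + rng.normal(scale=0.9, size=(16, 2))
print(calibration_error(proj, meas))
```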
  • Real time tomographic reflection: phantoms for calibration and biopsy

    Publication Year: 2001, Page(s): 11-19
    Cited by: Papers (2)

    We aim to validate Real Time Tomographic Reflection (RTTR) as an image guidance technique for needle biopsy. RTTR is a new method of in situ visualization, which merges the visual outer surface of a patient with a simultaneous ultrasound scan of the patient's interior using a half-silvered mirror. The ultrasound image is visually merged with the patient, along with the operator's hands and the invasive tool, in the operator's natural field of view. Geometric relationships are preserved in a single environment, without the tool being restricted to lie in the plane of the ultrasound slice. The present experiment illustrates the effectiveness of needle biopsy using RTTR on a phantom consisting of an olive embedded in a turkey breast, and discusses several prototypes of calibration phantoms.

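The geometric principle behind RTTR can be stated compactly: the half-silvered mirror reflects the monitor image so that its virtual image coincides with the ultrasound scan plane inside the patient. A minimal sketch of that check follows; the plane placement is hypothetical, not taken from the paper.

```python
import numpy as np

def reflect(p, o, n):
    """Reflect point p across the mirror plane through o with normal n:
    p' = p - 2((p - o)·n)n."""
    n = n / np.linalg.norm(n)
    return p - 2.0 * np.dot(p - o, n) * n

# Hypothetical setup: monitor at z = +d, scan plane at z = -d, and the
# mirror halfway between them in the z = 0 plane.
d = 100.0  # mm
display_point = np.array([10.0, 20.0, d])
mirrored = reflect(display_point, o=np.zeros(3), n=np.array([0.0, 0.0, 1.0]))
print(mirrored)  # -> [10. 20. -100.], i.e. on the scan plane
```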
  • Optical see-through calibration with vision-based trackers: propagation of projection matrices

    Publication Year: 2001, Page(s): 147-156
    Cited by: Papers (2)

    Recently, M. Tuceryan and N. Navab (2000) introduced a method for calibrating an optical see-through system based on the alignment of a set of 2D markers on the display with a single point in the scene, while not restricting the user's head movements (the single point active alignment method, or SPAAM). This method is applicable with any tracking system, provided that it gives the pose of the sensor attached to the see-through display. When cameras are used for tracking, one can avoid the computationally intensive and potentially unstable pose estimation process. A vision-based tracker usually consists of a camera attached to the optical see-through display, which observes a set of known features in the scene. From the observed locations of these features, the pose of the camera can be computed; most pose computation methods are very involved and can be unstable at times. The authors propose to keep the projection matrix of the tracker camera without decomposing it into intrinsic and extrinsic parameters, and to use it within the SPAAM method directly. The propagation of the projection matrices from the tracker camera to the virtual camera, representing the eye and optical see-through display combination as a pinhole camera model, allows us to skip the most time-consuming and potentially unstable step of registration, namely estimating the pose of the tracker camera.

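The step the authors keep linear is the classical direct linear transform (DLT): the 3×4 projection matrix of the virtual camera (eye plus display, modeled as a pinhole) is estimated from 2D display / 3D scene correspondences and used as-is, never decomposed into intrinsic and extrinsic parameters. A minimal DLT sketch (the standard technique, not the paper's code):

```python
import numpy as np

def dlt_projection(X, x):
    """X: (N, 3) 3D scene points; x: (N, 2) 2D display points; N >= 6.
    Returns the 3x4 projection matrix, defined up to scale."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        P = [Xw, Yw, Zw, 1.0]
        rows.append([*P, 0, 0, 0, 0, *[-u * c for c in P]])
        rows.append([0, 0, 0, 0, *P, *[-v * c for c in P]])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)      # least-squares null vector of A
    return Vt[-1].reshape(3, 4)
```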
  • Augmented reality visualization of ultrasound images: system description, calibration, and features

    Publication Year: 2001, Page(s): 30-39
    Cited by: Papers (5)

    We developed a system for augmented reality visualization of ultrasound images. This system is based on our earlier "augmented workspace" setup. The user wears a custom video-see-through head-mounted display (HMD). Two color video cameras attached to the HMD provide a stereo view of the scene; a third head-mounted video camera is added for tracking. A set of markers is attached to the ultrasound transducer; a set of stationary markers can be positioned above the workspace. The system runs at the full 30 Hz video frame rate with a latency of about 0.1 s, generating a stable augmentation with no apparent jitter visible in the composite images. Three SGI Visual Workstations provide the computing power for the system. The authors describe the details of the system, its calibration with a configuration of optical markers partially immersed in a water bath, and some system features, such as spatiotemporal freezing and 3D target localization, that promise to be helpful for practical applications.

  • A quick method for synthesizing photorealistic color images under various illumination conditions

    Publication Year: 2001, Page(s): 173-174

    We propose a fast method to synthesize realistic images under novel lighting directions using three color images. In the method, we use a look-up table procedure to estimate the surface orientations, instead of complex computations. Moreover, we explain a way to separate the diffuse and specular reflection components from the three color images.

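For context, the relation the look-up table exploits is the classic photometric-stereo one: under a Lambertian model, three images taken under known, non-coplanar light directions give intensities I = Ln per pixel, so the normal follows by inverting L. A sketch with hypothetical light directions; the paper's LUT tabulates this inversion rather than computing it per pixel:

```python
import numpy as np

L = np.array([[0.0, 0.0, 1.0],     # three known, non-coplanar light dirs
              [0.7, 0.0, 0.7],
              [0.0, 0.7, 0.7]])
L_inv = np.linalg.inv(L)           # the quantity a LUT would tabulate

def surface_normal(I):
    """I: intensities of one pixel in the three images."""
    g = L_inv @ np.asarray(I, dtype=float)
    return g / np.linalg.norm(g)   # unit normal; ||g|| is the albedo

print(surface_normal([1.0, 0.7, 0.7]))
```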
  • Augmented maintenance of powerplants: a prototyping case study of a mobile AR system

    Publication Year: 2001, Page(s): 124-133
    Cited by: Papers (3) | Patents (4)

    Augmented reality (AR) research has progressed in great strides over the past few years. Most current demonstrations focus on providing robust tracking solutions, since this is the most critical issue when demonstrating AR systems. An issue that is typically neglected concerns the online access, analysis, and visualization of information: the information required by AR demonstration systems is kept to a minimum, is prepared ahead of time, and is stored locally in the form of three-dimensional geometric descriptions. In complex mobile settings, these simplifying assumptions do not work. The authors report on recent efforts at TU Munich to analyze the information generation, retrieval, transmission, and visualization process in the context of maintenance procedures performed in nuclear power plants. The use of AR to present such information online has significant implications for the way information must be acquired, stored, and transmitted. The paper focuses on pointing out open questions, discussing options for addressing them, and evaluating those options in prototypical implementations.

  • Design of a component-based augmented reality framework

    Publication Year: 2001, Page(s): 45-54
    Cited by: Papers (23) | Patents (1)

    The authors propose a new approach to building augmented reality (AR) systems using a component-based software framework. This has advantages for all parties involved with AR systems: a project manager can reuse existing components in new applications; an end user can reconfigure his system by plugging modules together; an application developer can view the system at a high level of abstraction; and a component developer can focus on technical problems. Our proposed framework consists of reusable distributed services for key subproblems of AR, the middleware to combine them, and an extensible software architecture. We have implemented services for tracking, modeling real and virtual objects, modeling structured navigation or maintenance instructions, and multimodal user interfaces. As a working proof of our concept, we have built an indoor and outdoor campus navigation system using different modes of tracking and user interaction.

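A minimal sketch of the component idea, with illustrative names rather than the paper's actual interfaces: services declare what they consume and produce, and the middleware wires matching components together, so an application is assembled by plugging rather than programming.

```python
from typing import Dict, List

class Service:
    """A reusable component; subclasses set needs/provides and handle()."""
    needs: List[str] = []    # event types this component consumes
    provides: str = ""       # event type it produces
    def handle(self, event): ...

class Middleware:
    """Routes events between registered services by type."""
    def __init__(self):
        self.subscribers: Dict[str, List[Service]] = {}
    def register(self, svc: Service):
        for t in svc.needs:
            self.subscribers.setdefault(t, []).append(svc)
    def publish(self, type_: str, event):
        for svc in self.subscribers.get(type_, []):
            svc.handle(event)

# e.g. a tracking service publishes "pose" events that a rendering
# component consumes; swapping trackers does not touch the renderer.
```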
  • Using a head-mounted projective display in interactive augmented environments

    Publication Year: 2001, Page(s): 217-223
    Cited by: Papers (4)

    Head-mounted projective displays (HMPDs) have recently been proposed as an alternative to conventional eyepiece-type head-mounted displays (HMDs). HMPDs consist of a pair of miniature projection lenses, beamsplitters, and displays mounted on the helmet, together with retro-reflective sheeting materials placed strategically in the environment. We first briefly review the HMPD technology, including its features and capabilities, a comparison with conventional visualization techniques, and our recent implementation of a compact HMPD prototype. We then present some preliminary findings on retro-reflective materials and discuss a framework for collaborative AR environments that supports at least three modes of collaboration: interactive local collaboration in an AR environment, passive distant collaboration, and interactive distant collaboration. Finally, two preliminary application examples of the HMPD technology for interactive collaboration in augmented environments are included, which demonstrate some of the HMPD characteristics and embody the framework for distant collaboration.

  • Finger tracking for interaction in augmented environments

    Publication Year: 2001, Page(s): 55-64
    Cited by: Papers (17) | Patents (6)

    Optical tracking systems allow three-dimensional input for virtual environment applications with high precision and without annoying cables. Spontaneous and intuitive interaction is possible through gestures. The authors present a finger tracker that allows gestural interaction and is simple, cheap, fast, robust against occlusion, and accurate. It is based on a marked glove, a stereoscopic tracking system, and a kinematic 3D model of the human finger. Within our augmented reality application scenario, the user is able to grab, translate, rotate, and release objects in an intuitive way. We demonstrate our tracking system in an augmented reality chess game, allowing a user to interact with virtual objects.

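A minimal sketch of the kinematic ingredient: a planar three-link finger chain whose forward kinematics predicts the fingertip position from joint flexion angles. The link lengths and the planar simplification are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def fingertip(angles, lengths=(45.0, 25.0, 18.0)):
    """angles: flexion angles (rad) at the three finger joints, proximal
    to distal; lengths: hypothetical link lengths in mm. Returns (x, y)."""
    x = y = 0.0
    theta = 0.0
    for a, l in zip(angles, lengths):
        theta += a                  # angles accumulate along the chain
        x += l * np.cos(theta)
        y += l * np.sin(theta)
    return x, y

print(fingertip(np.radians([20, 30, 15])))
```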
  • Augmented reality (AR) for assembly processes - an experimental evaluation

    Publication Year: 2001, Page(s): 185-186

    A PAL video camera (Toshiba ACM 413 E with a 2.2 mm lens) and an HMD were connected to a Silicon Graphics O2 workstation (SGI). As the HMD, a clip-on display (MicroOptical) with 640×480 resolution was used. The clip-on display can be used like other HMDs in video see-through mode; it works like a small monitor in front of the user's eyes that covers only a small part of the field of view, so the assembling person was not restricted by the inconveniences of the video see-through mode. Test subjects had to grasp a small wooden cylinder in a box. When a subject passed a flap to reach the cylinder, the workstation received a mouse signal, wrote the time to a log file, and showed the assembly position on a pegboard with 48 holes in the clip-on display. For the tests, 12 apprentices and students with similar practical experience in mechanics and electronics were selected.

  • MR2 (MR Square): a mixed-reality meeting room

    Publication Year: 2001, Page(s): 169-170
    Cited by: Papers (1)

    Our new meeting room, the MR Square, was built as a testing ground for a variety of multimedia user interfaces and applications. The MR Square features mixed-reality technologies such as wide-area trackers and tens of head-mounted displays, as well as one 200-inch and two 150-inch screens. Participants sit and collaborate in fully wired, retractable seats. The MR Square illustrates how state-of-the-art multimedia technologies can be incorporated into a meeting room.

  • Mobile AR4ALL

    Publication Year: 2001, Page(s): 181-182
    Cited by: Papers (4) | Patents (2)

    The AR-PDA project develops a framework that allows the use of mobile AR applications for the consumer market. By adapting existing technologies for PDAs (personal digital assistants), high-speed wireless computing, computer graphics, and computer vision, AR services will be provided at low cost. We discuss the suitability of mobile devices for AR and address problems of wireless network transmissions and latencies. We present a first running prototype of a wireless PDA with AR features, demonstrating a proof of concept for a mobile AR system that addresses the consumer market.

  • Extendible tracking by line auto-calibration

    Publication Year: 2001, Page(s): 97-103
    Cited by: Papers (4) | Patents (3)

    One of the key requirements for an augmented reality system is a tracking system that accurately determines the user's viewpoint. Many vision-based tracking systems exhibit limited tracking range due to their dependence on pre-calibrated features. Previous extendible-tracking systems make use of point features only. This paper describes an extendible tracking method that integrates pre-calibrated landmarks and natural line features for camera tracking. Extendible and robust tracking is achieved by dynamically calibrating the previously uncalibrated 3D structure of line features. The experimental results indicate the effectiveness of this approach for extending the tracking range and reducing dependence on the prepared environment. (Color plates included.)

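The measurement underlying line-based tracking and auto-calibration can be sketched as follows (a standard formulation, not the paper's code): project the endpoints of a candidate 3D line, form the image line through them, and measure how far observed edge pixels fall from it; calibration adjusts the 3D line to shrink this residual across frames.

```python
import numpy as np

def line_residual(P, X1, X2, edge_points):
    """P: 3x4 camera matrix; X1, X2: 3D endpoints of the candidate line;
    edge_points: (N, 2) observed edge pixels. Returns signed distances."""
    x1 = P @ np.append(X1, 1.0)                 # project both endpoints
    x2 = P @ np.append(X2, 1.0)
    l = np.cross(x1, x2)                        # homogeneous image line
    l = l / np.linalg.norm(l[:2])               # normalize: pixel distances
    pts = np.hstack([edge_points, np.ones((len(edge_points), 1))])
    return pts @ l                              # per-pixel residuals
```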
  • Augmented reality as a new media experience

    Publication Year: 2001, Page(s): 197-206
    Cited by: Papers (5)

    The authors discuss their work on applying media theory to the creation of narrative augmented reality (AR) experiences. We summarize the concepts of remediation and media forms as they relate to our work, argue for their importance to the development of a new medium such as AR, and present two example AR experiences we have designed using these conceptual tools. In particular, we focus on leveraging the interaction between the physical and virtual worlds, remediating existing media (film, stage, and interactive CD-ROM), and building on the cultural expectations of our users.

  • Real-time and markerless vision-based tracking for outdoor augmented reality applications

    Publication Year: 2001, Page(s): 189-190
    Cited by: Papers (10) | Patents (2)

    A novel concept for markerless optical tracking called "tracking with reference images" is introduced. This concept provides a flexible and practicable framework and is especially relevant for outdoor augmented reality applications, for which no tracking solution currently exists. The implementation uses an image matching technique that compares the current live video image with one or more of the reference images. The complete system has been tested outdoors in the context of a mobile AR application for archeology. The system runs on a laptop at around 10 Hz and provides views of virtual monuments superimposed on their ruins.

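A sketch of the "tracking with reference images" idea using modern off-the-shelf tools (OpenCV ORB features and a RANSAC homography, which postdate the paper; the authors' own matching technique is not reproduced here). The live frame is matched against a stored reference image whose pose is known.

```python
import cv2
import numpy as np

def match_to_reference(live_gray, ref_gray):
    """Returns the homography mapping reference coords into the live view."""
    orb = cv2.ORB_create()
    k1, d1 = orb.detectAndCompute(ref_gray, None)
    k2, d2 = orb.detectAndCompute(live_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H
```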
  • Dynamic virtual convergence for video see-through head-mounted displays: maintaining maximum stereo overlap throughout a close-range work space

    Publication Year: 2001, Page(s): 137-146
    Cited by: Papers (3)

    We present a technique that allows users of video see-through head-mounted displays to work at close range without the typical loss of stereo perception due to reduced nasal-side stereo overlap in most of today's commercial HMDs. Our technique dynamically selects parts of the imaging frustums acquired by wide-angle head-mounted cameras and re-projects them for the narrower field-of-view displays. In addition to dynamically maintaining maximum stereo overlap for objects at a heuristically estimated working distance, it also reduces the accommodation-vergence conflict, at the expense of a newly introduced disparity-vergence conflict. We describe the hardware (assembled from commercial components) and software implementation of our system and report on our experience using this technique within two different AR applications. (Color plates included.)

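Under a pinhole assumption, the re-projection described above reduces, for a pure rotation of the virtual camera, to a homography between camera and display intrinsics: H = K_disp · R · K_cam⁻¹. A minimal sketch with hypothetical calibration matrices standing in for the calibrated values:

```python
import numpy as np

def convergence_homography(K_cam, K_disp, vergence_rad):
    """Homography re-projecting the wide-angle camera image for the
    narrower display after toeing the virtual camera in by vergence_rad."""
    c, s = np.cos(vergence_rad), np.sin(vergence_rad)
    R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])  # rotation about y
    return K_disp @ R @ np.linalg.inv(K_cam)

K_cam = np.array([[400., 0, 320], [0, 400., 240], [0, 0, 1]])   # wide FOV
K_disp = np.array([[700., 0, 320], [0, 700., 240], [0, 0, 1]])  # narrow FOV
H = convergence_homography(K_cam, K_disp, np.radians(3.0))
```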
  • Markerless augmented reality with a real-time affine region tracker

    Publication Year: 2001, Page(s): 87-96
    Cited by: Papers (12) | Patents (2)

    We present a system for planar augmented reality based on a new real-time affine region tracker. Instead of tracking fiducial points, we track planar local image patches and bring these into complete correspondence, so that a virtual texture can be added to them directly. Moreover, the local image patches can be extracted in an invariant way, even without any a priori information from previous frames. Hence, it is possible to use them as natural beacons that can be used to recognize the scene and to identify the individual patches. This results in a powerful system that can work without artificial markers or fiducial points and with a minimal amount of user intervention.

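The final step named above, adding a virtual texture to a tracked patch, amounts to warping the texture by the patch's 2×3 affine transform and compositing it into the frame. A sketch using OpenCV as a convenient stand-in for the authors' implementation:

```python
import cv2
import numpy as np

def augment_patch(frame, texture, A):
    """A: 2x3 affine map from texture coordinates to frame coordinates."""
    h, w = frame.shape[:2]
    warped = cv2.warpAffine(texture, A, (w, h))
    # Warp a full-white mask the same way to know which pixels to replace.
    mask = cv2.warpAffine(np.full(texture.shape[:2], 255, np.uint8), A, (w, h))
    frame[mask > 0] = warped[mask > 0]
    return frame
```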
  • Real-time 3D object recognition for automatic tracker initialization

    Publication Year: 2001, Page(s): 175-176
    Cited by: Patents (1)

    We propose a vision-based real-time object recognition system that provides object identification and 3D position data for the automatic initialization of a 3D tracking system. A priori information is generated from the models of objects that may be present in the image. During recognition this data is accessed, using the positions of corner features in the video image for indexing. A voting scheme recognizes objects and their positions in a single pass. Upon recognition, a 3D object tracker can automatically be provided with initialization data.

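A minimal sketch of the voting scheme as described: an a priori table, built offline from the object models, maps quantized corner-feature keys to (object, pose) hypotheses; features found in the live image cast votes, and the best-supported hypothesis initializes the tracker. The keys and table contents here are hypothetical.

```python
from collections import Counter

def recognize(features, table):
    """features: quantized feature keys found in the live image;
    table: dict key -> list of (object_id, pose_id), built offline."""
    votes = Counter()
    for f in features:
        for hypothesis in table.get(f, []):
            votes[hypothesis] += 1
    return votes.most_common(1)[0] if votes else None

table = {("corner", 3): [("cube", 0)],
         ("corner", 7): [("cube", 0), ("cone", 2)]}
print(recognize([("corner", 3), ("corner", 7)], table))  # (('cube', 0), 2)
```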
  • Illuminating the mixed reality stage: applying complex lighting conditions to AR

    Publication Year: 2001, Page(s): 187-188
    Cited by: Papers (1)

    The Mixed Reality Stage is an augmented reality environment which aims to support creative and collaborative stage design processes. In this application scenario, the proper simulation of real lighting is crucial for a seamless integration of virtual objects into a physical model stage. The authors present their approach to emulating complex real-world lighting conditions for rendering in real time. They introduce mechanisms that significantly reduce the effective number of light sources while simultaneously minimizing the loss of visualization quality.

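The abstract does not detail the reduction mechanism, so the following is only one plausible reading, not the paper's method: merge light sources with similar directions into a single source of summed intensity, keeping the rendered result close while cutting the light count. A greedy angular-threshold merge as an illustrative stand-in:

```python
import numpy as np

def reduce_lights(directions, intensities, max_angle_deg=15.0):
    """directions: unit vectors; intensities: scalars. Lights within
    max_angle of an existing cluster are folded into it (intensity-
    weighted direction, summed intensity)."""
    cos_thresh = np.cos(np.radians(max_angle_deg))
    clusters = []  # each entry: [weighted_direction_sum, intensity_sum]
    for d, w in zip(directions, intensities):
        for c in clusters:
            mean_dir = c[0] / np.linalg.norm(c[0])
            if np.dot(mean_dir, d) >= cos_thresh:
                c[0] += w * np.asarray(d, float)
                c[1] += float(w)
                break
        else:
            clusters.append([w * np.asarray(d, float), float(w)])
    return [(c[0] / np.linalg.norm(c[0]), c[1]) for c in clusters]
```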
  • Dynamic superimposition of synthetic objects on rigid and simple-deformable real objects

    Publication Year: 2001, Page(s): 5-10
    Cited by: Papers (3) | Patents (3)

    A current challenge in augmented reality applications is the accurate superimposition of synthetic objects on real objects within the environment. This challenge is heightened when the real objects are in motion and/or are nonrigid. In this article, we present a robust method for real-time optical superimposition of synthetic objects on dynamic rigid and simple-deformable real objects. Moreover, we illustrate this general method with the VRDA Tool, a medical education application for the visualization of internal human knee joint anatomy on a real human knee.

  • Testing information delivery methods using augmented reality

    Publication Year: 2001, Page(s): 171-172

    This paper describes an experiment comparing three technologies for delivering a multi-step maintenance procedure to a factory worker: optical see-through augmented reality, a web browser, and a traditional paper-based manual used as the control. Both augmented reality and the web browser exploited voice recognition and text-to-speech capability, enabling a hands-free user interface. The measure of the experiment was time to completion, and entrance and exit questionnaires were given to subjects to obtain qualitative measurements. Thirty-six individuals participated in a randomized block experiment design with repeated measures.

  • Mobile collaborative augmented reality

    Publication Year: 2001, Page(s): 114-123
    Cited by: Papers (16)

    The combination of mobile computing and collaborative augmented reality into a single system makes the power of computer-enhanced interaction and communication in the real world accessible anytime and everywhere. The paper describes our work to build a mobile collaborative augmented reality system that supports true stereoscopic 3D graphics, a pen-and-pad interface, and direct interaction with virtual objects. The system is assembled from off-the-shelf hardware components and serves as a basic test bed for user interface experiments related to computer-supported collaborative work in augmented reality. A mobile platform implementing the described features and collaboration between mobile and stationary users are demonstrated.

  • Linear solutions for visual augmented reality registration

    Publication Year: 2001, Page(s): 183-184
    Cited by: Patents (1)

    Correct registration of virtual objects into real scenes requires robust estimation of camera pose. Since most augmented reality applications also require real-time performance in potentially restricted environments with no a priori motion model, we seek pose estimation algorithms that are fast, perform well with few reference objects, and require no initialization. We present a pair of linear pose estimation algorithms for arbitrary point and line correspondences and demonstrate their suitability for augmented reality applications.

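For context, linear pose methods of this kind rest on constraints that are linear in the twelve entries of the projection matrix P, so no iterative initialization is needed; the authors' exact formulation is in the paper. A sketch of the standard point and line constraints:

```latex
\begin{aligned}
  &\text{point } (\mathbf{x}_i \leftrightarrow \mathbf{X}_i):\quad
     \mathbf{x}_i \times (P\,\mathbf{X}_i) = \mathbf{0}
     \quad\text{(two independent linear equations in } P\text{)}\\
  &\text{line } (\mathbf{l} \leftrightarrow \mathbf{X}_a, \mathbf{X}_b):\quad
     \mathbf{l}^{\top} P\,\mathbf{X}_a = 0,\qquad
     \mathbf{l}^{\top} P\,\mathbf{X}_b = 0\\
  &\text{stacked: } A\,\operatorname{vec}(P) = \mathbf{0};\quad
     P \text{ is the smallest right singular vector of } A
     \text{ (up to scale).}
\end{aligned}
```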