2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)

Date: 5-8 Nov. 2012

Displaying Results 1 - 25 of 92
  • Committee

    Page(s): 1 - 2
  • Demos

    Page(s): 1
  • Doctoral Consortium

    Page(s): 1
  • General chairs

    Page(s): 1 - 2
  • Portable reality: Expanding available space

    Page(s): 1
  • A new era of Human Computer Interaction

    Page(s): 1 - 2
  • Preface

    Page(s): 1 - 3
  • [Program Committee]

    Page(s): 1
  • Reviewers

    Page(s): 1 - 6
  • [Steering Committee]

    Page(s): 1
  • Tracking competition

    Page(s): 1
  • Tutorial 1: Adaptive augmented reality (A2R): Where AR meets user's interest

    Page(s): 1

    Adaptive behavior is one of the main challenges in building computerized systems, especially systems that deliver information to end users. Indeed, since information overload has become the main obstacle to the further development of such systems (cf. the Big Data challenge), there is a strong movement in the research community to develop concepts for better adapting the form and amount of information delivered to a user (usually through some form of personalization). However, the main effort has been dedicated to contextualizing the user's task in order to determine the best way to tailor the presentation of information to the user, neglecting the role of the user's internal context, expressed as the user's (short-term) interest. The same is true of AR systems. In this tutorial we present novel results in modeling users' interest in the context of AR systems and demonstrate practical results from realizing such an approach in a multisensor AR system based on see-through AR glasses. Because the AR content must be continuously adapted to the user's interest, such models face many challenges in sensing the user's behavior (using acoustic, video, gesture and bio-sensors), interpreting it as interest, and deciding in real time what kind of adaptation to perform. We argue that this leads to a new class of AR systems that we have coined adaptive AR (A2R) systems. This work has been partially realized within the scope of the FP7 ICT research project ARtSENSE (www.artsense.eu), which is developing new AR concepts for improving the personalized museum experience. The tutorial will present practical results from applying the approach in three cultural heritage institutions in Europe (Paris, Madrid and Liverpool).

  • Tutorial 2: Integrating and using panoramas and photographic images in AR experiences

    Page(s): 1

    Some AR browsers and other mobile phone apps (e.g. Argon, Photosynth, 360 Cities, Tourwrist) allow the user to create, display or interact with panoramas and other forms of historical and contemporary imagery. These technologies open up exciting possibilities for cultural heritage, entertainment and other uses in location-based experiences. Full panoramas or historical photographs merged into the visual field can provide the user with a perspective on a place as it looked in the past or might look in a possible future. We propose to offer the participants in this tutorial an introduction to the technical issues involved in creating and integrating such imagery into an AR/MR application. We will also provide relevant historical background regarding panoramas and consider issues of aesthetics and user experience design.

  • Tutorial 3: AR mobile game development: Getting started

    Page(s): 1

    This half-day, project-based tutorial demonstrates how to create an AR mobile game prototype, from game design through art, animation and technical production. Tools such as Unity, Maya and Vuforia will be used. The tutorial covers standard game development topics that apply to all digital game projects, such as the game design process, pre-production planning, 3D modeling, rigging and animation techniques, and game engine workflow, as well as the elements unique to AR mobile game development.

  • Tutorials

    Page(s): 1
  • Workshop

    Page(s): 1
  • Workshop 1: 2nd IEEE ISMAR workshop on authoring solutions for augmented reality

    Page(s): 1 - 2

    The motivation of this workshop is to discuss future directions of content authoring in the field of Augmented Reality, as well as the current state of the art in content creation and authoring for AR. The workshop will comprise a paper session in which authoring papers, late-breaking results and state-of-the-art overviews are presented. In the afternoon, we will follow up with discussion sessions on topics ranging from content creation and authoring to content distribution for AR, and a short closing session.

  • Workshop 2: Classifying the AR presentation space

    Page(s): 1 - 2

    3D visualization environments already provide a large design space that has not been investigated to the same extent as traditional WIMP spaces. When this design space is combined with AR, it grows even further: information can not only be presented in 3D space, but AR also puts virtual information in relation to real objects, locations or events. The different properties of presentation in AR need to be investigated in order to develop a comprehensive set of dimensions of presentation principles.

  • Workshop 3: IEEE ISMAR 2012 workshop on tracking methods and applications (TMA)

    Page(s): 1 - 2

    The focus of this workshop is on presenting, discussing and demonstrating recent tracking methods and applications that work well in practice and that show some superiority over state-of-the-art methods. Rather than focusing on pure novelty, this workshop encourages presentations that concentrate on complete systems and integrated approaches. The TMA workshop looks at pose tracking from an end-to-end point of view.

  • Science & technology papers

    Page(s): 1 - 2
  • Wide-area scene mapping for mobile visual tracking

    Page(s): 3 - 12

    We propose a system for easily preparing arbitrary wide-area environments for subsequent real-time tracking with a handheld device. Our system evaluation shows that minimal user effort is required to initialize a camera tracking session in an unprepared environment. We combine panoramas captured using a handheld omnidirectional camera from several viewpoints to create a point cloud model. After the offline modeling step, live camera pose tracking is initialized by feature point matching, and continuously updated by aligning the point cloud model to the camera image. Given a reconstruction made with less than five minutes of video, we achieve below 25 cm translational error and 0.5 degrees rotational error for over 80% of images tested. In contrast to camera-based simultaneous localization and mapping (SLAM) systems, our methods are suitable for handheld use in large outdoor spaces.

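    The initialization step described above (matching live image features against a pre-built point cloud model) is, at its core, a 2D-3D registration problem. The following Python sketch illustrates that step with OpenCV; the descriptor format, parameter values and function name are illustrative assumptions, not the authors' actual pipeline.

        import numpy as np
        import cv2

        # Sketch: initialize a camera pose by matching live image features
        # against a point cloud model, then solving PnP robustly.
        # `model_descriptors` (N x 32 ORB descriptors, uint8) and
        # `model_points3d` (N x 3 positions) stand in for the offline
        # reconstruction; K is the 3x3 camera intrinsic matrix.
        def initialize_pose(frame_gray, model_descriptors, model_points3d, K):
            orb = cv2.ORB_create(nfeatures=2000)
            keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
            if descriptors is None:
                return None

            # Match live descriptors to the model's descriptors.
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = matcher.match(descriptors, model_descriptors)
            if len(matches) < 6:
                return None

            image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
            object_pts = np.float32([model_points3d[m.trainIdx] for m in matches])

            # Robust 6-DoF pose from the 2D-3D correspondences.
            ok, rvec, tvec, inliers = cv2.solvePnPRansac(
                object_pts, image_pts, K, None, reprojectionError=4.0)
            if not ok or inliers is None or len(inliers) < 10:
                return None
            return rvec, tvec  # camera pose w.r.t. the point cloud model

    Continuous tracking would then refine this pose frame-to-frame by re-aligning the point cloud model to the camera image, as the abstract describes.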
  • Live tracking and mapping from both general and rotation-only camera motion

    Page(s): 13 - 22

    We present an approach to real-time tracking and mapping that supports any type of camera motion in 3D environments, that is, general (parallax-inducing) as well as rotation-only (degenerate) motions. Our approach effectively generalizes both a panorama mapping and tracking system and a keyframe-based Simultaneous Localization and Mapping (SLAM) system, behaving like one or the other depending on the camera movement. It seamlessly switches between the two and is thus able to track and map through arbitrary sequences of general and rotation-only camera movements. Key elements of our approach are to design each system component such that it is compatible with both panoramic data and Structure-from-Motion data, and the use of the "Geometric Robust Information Criterion" to decide whether the transformation between a given pair of frames can best be modeled with an essential matrix E, or with a homography H. Further key features are that no separate initialization step is needed, that the reconstruction is unbiased, and that the system continues to collect and map data after tracking failure, thus creating separate tracks which are later merged if they overlap. The latter is in contrast to most existing tracking and mapping systems, which suspend tracking and mapping, thus discarding valuable data, while trying to relocalize the camera with respect to the initial map. We tested our system on a variety of video sequences, successfully tracking through different camera motions and fully automatically building panoramas as well as 3D structures.

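    The E-versus-H decision mentioned in the abstract above is a two-view model selection problem, and the cited Geometric Robust Information Criterion (GRIC, due to Torr) scores each candidate model from its residuals, its parameter count and the dimension of its manifold. A minimal Python sketch of that scoring, assuming the models have already been fitted and using the standard constants from the GRIC literature (the paper's exact formulation may differ):

        import numpy as np

        def gric(residuals_sq, sigma, n, d, k, r=4):
            """Torr's GRIC score (sketch); lower is better.

            residuals_sq: squared geometric residuals of a fitted model
            sigma:        assumed image noise standard deviation (pixels)
            n:            number of point correspondences
            d:            dimension of the model manifold (2 for H, 3 for E)
            k:            number of model parameters (8 for H, 5 for E)
            r:            dimension of each observation (two 2D points -> 4)
            """
            lam1, lam2, lam3 = np.log(r), np.log(r * n), 2.0
            # Robust per-point cost: outliers pay a fixed, capped penalty.
            rho = np.minimum(residuals_sq / sigma ** 2, lam3 * (r - d))
            return rho.sum() + lam1 * d * n + lam2 * k

        # Fit both models to the same matches, then keep the cheaper one:
        # prefer_H = (gric(res_H, sigma, n, d=2, k=8)
        #             < gric(res_E, sigma, n, d=3, k=5))

    Choosing the model with the lower score is what lets such a system switch between panorama-style (homography) and SLAM-style (essential matrix) operation per frame pair.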
  • Kinectrack: Agile 6-DoF tracking using a projected dot pattern

    Page(s): 23 - 29

    We present Kinectrack, a new six degree-of-freedom (6-DoF) tracker which allows real-time and low-cost pose estimation using only commodity hardware. We decouple the dot pattern emitter and IR camera of the Kinect. Keeping the camera fixed and moving the IR emitter in the environment, we recover the 6-DoF pose of the emitter by matching the observed dot pattern in the field-of-view of the camera to a pre-captured reference image. We propose a novel matching technique to obtain dot pattern correspondences efficiently in wide- and adaptive-baseline scenarios. We also propose an auto-calibration method to obtain the camera intrinsics and dot pattern reference image. The performance of Kinectrack is evaluated and the rotational and translational accuracy of the system is measured relative to ground truth for both planar and multi-planar scene geometry. Our system can simultaneously recover the 6-DoF pose of the device and also recover piecewise planar 3D scene structure, and can be used as a low-cost method for tracking a device without any on-board computation, with small size and only simple electronics.

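    For the planar-scene case described above, the mapping between the reference dot pattern and its observed projection is a homography, and relative pose candidates can be recovered by decomposing it. A hedged Python/OpenCV sketch (`ref_pts`, `live_pts` and the helper name are hypothetical; Kinectrack's own matching technique and auto-calibration are not reproduced here):

        import cv2

        # Sketch: relative pose candidates from a planar dot pattern.
        # `ref_pts` and `live_pts` are matched dot centers (M x 2, float32)
        # in the reference image and the live IR image; K is the IR
        # camera intrinsic matrix.
        def planar_pose_candidates(ref_pts, live_pts, K):
            H, mask = cv2.findHomography(ref_pts, live_pts, cv2.RANSAC, 3.0)
            if H is None:
                return None
            # Decomposition yields up to four (R, t, n) candidates;
            # selecting the physical one needs extra constraints, e.g.
            # requiring scene points to lie in front of the camera.
            num, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
            return list(zip(Rs, ts, normals))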