
SonifEye: Sonification of Visual Information Using Physical Modeling Sound Synthesis


Abstract:

Sonic interaction as a technique for conveying information has advantages over conventional visual augmented reality methods, especially when augmenting the visual field with extra information introduces distraction. Sonification of knowledge extracted by applying computational methods to sensory data is a well-established concept. However, some aspects of sonic interaction design, such as aesthetics, the cognitive effort required to perceive information, and the avoidance of alarm fatigue, are not well studied in the literature. In this work, we present a sonification scheme based on physical modeling sound synthesis, targeting focus-demanding tasks that require extreme precision. The proposed mapping techniques are designed to require minimal training to adopt and minimal mental effort to interpret the conveyed information. Two experiments were conducted to assess the feasibility of the proposed method and to compare it against visual augmented reality in high-precision tasks. The quantitative results suggest that sound patches generated by physical modeling achieve the desired goal of improving user experience and overall task performance with minimal training.
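The page does not include source code; as a rough illustration of the kind of mapping the abstract describes, the sketch below drives a Karplus-Strong plucked-string model (a simple physical modeling synthesis technique) with a scalar visual error signal. The `sonify_error` function, its pitch and loudness ranges, and the `tolerance` parameter are hypothetical choices for illustration only, not the authors' actual mapping scheme.

```python
import numpy as np

SAMPLE_RATE = 44100

def karplus_strong(frequency, duration, amplitude, sample_rate=SAMPLE_RATE):
    """Plucked-string tone via the Karplus-Strong algorithm,
    a basic form of physical modeling sound synthesis."""
    n_samples = int(duration * sample_rate)
    delay = int(sample_rate / frequency)                # delay-line length sets the pitch
    buf = amplitude * (2 * np.random.rand(delay) - 1)   # noise burst acts as the "pluck"
    out = np.empty(n_samples)
    for i in range(n_samples):
        out[i] = buf[i % delay]
        # averaging adjacent samples models string damping (low-pass loss in the loop)
        buf[i % delay] = 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
    return out

def sonify_error(error, tolerance):
    """Hypothetical mapping: a larger visual deviation (e.g. tool-tip error in mm)
    excites the string harder and at a higher pitch."""
    severity = min(abs(error) / tolerance, 1.0)
    frequency = 220.0 + 440.0 * severity   # assumed pitch range, 220-660 Hz
    amplitude = 0.2 + 0.8 * severity       # assumed loudness range
    return karplus_strong(frequency, duration=0.3, amplitude=amplitude)

# Example: a 1.5 mm deviation against an assumed 2 mm tolerance
tone = sonify_error(error=1.5, tolerance=2.0)
```

Because the excitation and damping come from a physical model rather than a synthetic beep, the resulting tone degrades gracefully with severity, which is consistent with the abstract's goal of avoiding alarm fatigue.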
Published in: IEEE Transactions on Visualization and Computer Graphics (Volume: 23, Issue: 11, November 2017)
Page(s): 2366 - 2371
Date of Publication: 10 August 2017

PubMed ID: 28809687
