Temporal, Environmental, and Social Constraints of Word-Referent Learning in Young Infants: A Neurorobotic Model of Multimodal Habituation

Authors: Veale, R.; Schermerhorn, P.; Scheutz, M. (Human-Robot Interaction Lab., Indiana Univ., Bloomington, IN, USA)

Infants can adaptively associate auditory stimuli with visual stimuli even in their first year of life, as demonstrated by multimodal habituation studies. Unlike language acquisition at later developmental stages, this adaptive learning in young infants is temporary and still largely stimulus-driven. Hence, the temporal aspects of environmental and social factors figure crucially in the formation of prelexical multimodal associations. Studying these associations can offer important clues about how semantics are bootstrapped in real-world embodied infants. In this paper, we present a neuroanatomically based embodied computational model of multimodal habituation to explore the temporal and social constraints on the learning observed in very young infants. In particular, the model is able to explain empirical results showing that auditory word stimuli must be presented synchronously with visual stimulus movement for the two to be associated.
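
The synchrony constraint described above can be illustrated with a toy sketch. This is not the authors' neuroanatomically based model; the habituation_trial function, its parameters, and the stimulus traces below are illustrative assumptions. The sketch uses a Hebbian association between a word signal and a visual-movement signal that strengthens only when the two are co-active and otherwise decays, so only synchronous presentation builds up a word-referent link.

    import numpy as np

    # Minimal sketch (not the authors' model): a synchrony-gated Hebbian rule.
    # The association weight grows only when the auditory signal and visual
    # movement are active at the same time step, and slowly decays otherwise,
    # reflecting the temporary, stimulus-driven learning the abstract describes.
    def habituation_trial(audio, visual, w=0.0, lr=0.05, decay=0.01):
        """Update the association weight w over one trial.

        audio, visual: 1-D arrays of activation levels in [0, 1], sampled at
        the same rate, so index t compares the two modalities at the same moment.
        """
        for a_t, v_t in zip(audio, visual):
            # Hebbian term: grows only when word and movement co-occur.
            w += lr * a_t * v_t
            # Passive decay: the association fades without continued co-stimulation.
            w -= decay * w
        return w

    # Example: synchronous vs. asynchronous presentation of the same stimuli.
    t = np.linspace(0, 2 * np.pi, 200)
    movement = (np.sin(t) > 0).astype(float)   # visual object moving half the time
    word_sync = movement.copy()                # word spoken during movement
    word_async = np.roll(movement, 100)        # word spoken while object is still

    print("synchronous :", habituation_trial(word_sync, movement))
    print("asynchronous:", habituation_trial(word_async, movement))

Running the example should yield a clearly larger association weight for the synchronous presentation than for the asynchronous one, mirroring the qualitative pattern reported in the abstract.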

Published in:

IEEE Transactions on Autonomous Mental Development (Volume: 3, Issue: 2)