IEEE Transactions on Autonomous Mental Development

Issue 2 • June 2011

Displaying Results 1 - 14 of 14
  • Table of contents

    Publication Year: 2011, Page(s): C1
    PDF (118 KB) | Freely Available from IEEE
  • IEEE Transactions on Autonomous Mental Development publication information

    Publication Year: 2011, Page(s): C2
    PDF (36 KB) | Freely Available from IEEE
  • Grounding Language in Action

    Publication Year: 2011, Page(s): 109 - 112
    Cited by: Papers (1)
    PDF (442 KB) | HTML | Freely Available from IEEE
  • Language Does Something: Body Action and Language in Maternal Input to Three-Month-Olds

    Publication Year: 2011, Page(s): 113 - 128
    Cited by: Papers (2)
    PDF (3082 KB) | HTML

    We conducted a naturalistic study in which German mothers interacted with their three-month-old infants during diaper changing as an everyday activity. Following the idea that “acoustic packaging” educates infants' attention, we explored whether the verbal input to the infants in natural interactions simultaneously contains action information. Applying a microanalysis method, we first analyzed the data qualitatively by identifying classes of body movements and vocal activities (which we call vocal types). We used these categories to observe the mothers' multimodal interaction practices and to describe the interaction ecology of this everyday activity. Second, we quantitatively analyzed the co-occurrence of language (in the form of different vocal activities) and action (in the form of body movements). We found that during early interaction with infants, German mothers vocalize in a tight temporal relationship with action over a considerable part of the overall interaction time, thereby making the vocal signal both perceivable and tangible to the infants.

  • Temporal, Environmental, and Social Constraints of Word-Referent Learning in Young Infants: A Neurorobotic Model of Multimodal Habituation

    Publication Year: 2011, Page(s): 129 - 145
    Cited by: Papers (1)
    PDF (1024 KB) | HTML

    Infants are able to adaptively associate auditory stimuli with visual stimuli even in their first year of life, as demonstrated by multimodal habituation studies. Unlike language acquisition during later developmental stages, this adaptive learning in young infants is temporary and still very much stimulus-driven. Hence, temporal aspects of environmental and social factors figure crucially in the formation of prelexical multimodal associations, and studying these associations can offer important clues about how semantics are bootstrapped in real-world embodied infants. In this paper, we present a neuroanatomically based embodied computational model of multimodal habituation to explore the temporal and social constraints on the learning observed in very young infants. In particular, the model is able to explain empirical results showing that auditory word stimuli must be presented synchronously with visual stimulus movement for the two to be associated.

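    The synchrony requirement reported in the abstract above can be illustrated with a toy associator. The sketch below is not the authors' neuroanatomical model; all names and parameters are hypothetical. It strengthens an auditory-visual link only when the word token and the visual-motion signal are active at roughly the same time step, so a temporally shifted word ends up with a much weaker association.

    import numpy as np

    def associate(audio, motion, lr=0.1, decay=0.01, window=0):
        """Toy synchrony-gated Hebbian association (illustrative only).
        audio, motion: binary 1-D arrays marking stimulus presence per time step."""
        w = 0.0
        for t in range(len(audio)):
            lo, hi = max(0, t - window), min(len(motion), t + window + 1)
            co_active = 1.0 if (audio[t] and motion[lo:hi].any()) else 0.0  # synchrony gate
            w = max(0.0, w + lr * co_active - decay * w)  # Hebbian growth with passive decay
        return w

    rng = np.random.default_rng(0)
    motion = (rng.random(200) < 0.3).astype(int)
    word_sync = motion.copy()             # word tokens presented with the motion
    word_async = np.roll(motion, 25)      # same tokens, shifted out of synchrony
    print(associate(word_sync, motion))   # strong association
    print(associate(word_async, motion))  # much weaker association
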
  • Emergence of Protosentences in Artificial Communicating Systems

    Publication Year: 2011, Page(s): 146 - 153
    Cited by: Papers (3)
    PDF (697 KB) | HTML

    This paper investigates the relationship between embodied interaction and symbolic communication. We report on an experiment in which simulated autonomous robotic agents, whose control systems were evolved through an artificial evolutionary process, use abstract communication signals to coordinate their behavior in a context-independent way. This use of signals exhibits some fundamental aspects of sentences in natural languages, which we discuss using the concept of joint attention in relation to the grammatical structure of sentences.

  • Acoustic Packaging: Maternal Speech and Action Synchrony

    Publication Year: 2011, Page(s): 154 - 162
    Cited by: Papers (2)
    PDF (842 KB) | HTML

    The current study addressed the degree to which maternal speech and action are synchronous in interactions with infants. English-speaking mothers demonstrated the function of two toys, stacking rings and nesting cups, to younger infants (6-9.5 months) and older infants (9.5-13 months). Action and speech units were identified, and speech units were coded as either ongoing action descriptions or nonaction descriptions (examples of nonaction descriptions include attention-getting utterances such as “Look!” and statements of action completion such as “Yay, we did it!”). Descriptions of ongoing actions were found to be more synchronous with the actions themselves than other types of utterances, suggesting that: 1) mothers align speech and action to provide synchronous “acoustic packaging” during action demonstrations; and 2) mothers selectively pair utterances directly related to actions with the action units themselves, rather than simply aligning speech in general with actions. Our results complement past studies of acoustic packaging in two ways. First, we provide a quantitative temporal measure of the degree to which speech and action onsets and offsets are aligned. Second, we offer a semantically based analysis of the phenomenon, which we argue may be meaningful to infants, who are known to process global semantic messages in infant-directed speech. In support of this possibility, we determined that adults were capable of classifying low-pass filtered action- and nonaction-describing utterances at rates above chance.

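    As a rough illustration of the kind of quantitative temporal measure mentioned above (the paper's actual measure may differ), the sketch below scores speech-action alignment as the overlap between a speech unit and an action unit, normalized by the speech unit's duration; the intervals are hypothetical (onset, offset) pairs in seconds.

    def overlap_ratio(speech, action):
        """Fraction of the speech unit that overlaps the action unit (illustrative only)."""
        s_on, s_off = speech
        a_on, a_off = action
        overlap = max(0.0, min(s_off, a_off) - max(s_on, a_on))
        return overlap / (s_off - s_on)

    action = (2.0, 5.0)                       # hypothetical action unit
    print(overlap_ratio((2.1, 4.8), action))  # ongoing-action description: full overlap (1.0)
    print(overlap_ratio((5.2, 6.0), action))  # post-action "Yay, we did it!": no overlap (0.0)
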
  • Are We There Yet? Grounding Temporal Concepts in Shared Journeys

    Publication Year: 2011, Page(s): 163 - 175
    Cited by: Papers (3)
    PDF (1377 KB) | HTML

    An understanding of time and temporal concepts is critical for interacting with the world and with other agents in the world. What does a robot need to know to refer to the temporal aspects of events? Could a robot gain a grounded understanding of “a long journey” or “soon”? Cognitive maps constructed by individual agents from their own journey experiences have been used for grounding spatial concepts in robot languages. In this paper, we test whether a similar methodology can be applied to learning temporal concepts and an associated lexicon in order to answer the question of “how long” it took to complete a journey. Using evolutionary language games for specific and generic journeys, successful communication was established for concepts based on representations of time, distance, and amount of change. The studies demonstrate that a lexicon for journey duration can be grounded using a variety of concepts. Spatial and temporal terms are not identical, but the studies show that both can be learned using similar language evolution methods, and that time, distance, and change can serve as proxies for one another under noisy conditions. Effective concepts and names for duration provide a first step towards a grounded lexicon for temporal interval logic.

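    The evolutionary language games used in the study above are considerably richer than the following, but a bare-bones naming game conveys the basic mechanism: agents repeatedly pair up, a speaker names a duration category, and on success both parties prune their lexicons, so a shared word per category emerges. The category names and parameters below are made up for the illustration.

    import random

    random.seed(1)
    CATEGORIES = ["short_journey", "long_journey"]
    agents = [{c: [] for c in CATEGORIES} for _ in range(10)]  # agent = category -> known words

    def invent():
        return "w" + str(random.randint(0, 999))

    for _ in range(3000):
        speaker, hearer = random.sample(agents, 2)
        cat = random.choice(CATEGORIES)
        if not speaker[cat]:
            speaker[cat].append(invent())
        word = random.choice(speaker[cat])
        if word in hearer[cat]:
            speaker[cat] = [word]   # success: both align on the agreed word
            hearer[cat] = [word]
        else:
            hearer[cat].append(word)

    # Population-wide lexicon; typically collapses to about one word per category.
    print({cat: {w for a in agents for w in a[cat]} for cat in CATEGORIES})
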
  • An Experiment on Behavior Generalization and the Emergence of Linguistic Compositionality in Evolving Robots

    Publication Year: 2011, Page(s): 176 - 189
    Cited by: Papers (5)
    PDF (1694 KB) | HTML

    Populations of simulated agents controlled by dynamical neural networks are trained by artificial evolution to access linguistic instructions and to execute them by indicating, touching, or moving specific target objects. During training, each agent experiences only a subset of all object/action pairs. During postevaluation, some of the successful agents proved able to access and execute linguistic instructions that they had not experienced during training. This generalization owes to the development of a semantic space that is grounded in the sensory-motor capabilities of the agent and organized systematically so as to facilitate linguistic compositionality and behavioral generalization. Compositionality seems to be underpinned by the agents' capability to access and execute instructions by temporally decomposing their linguistic and behavioral aspects into their constituent parts (i.e., finding the target object and executing the required action). A comparison between two experimental conditions, in one of which the agents are required to ignore rather than indicate objects, shows that the composition of the behavioral set significantly influences the development of compositional semantic structures.

  • IEEE Foundation [advertisement]

    Publication Year: 2011, Page(s): 190
    PDF (320 KB) | Freely Available from IEEE
  • Introducing ieee.tv [advertisement]

    Publication Year: 2011, Page(s): 191
    PDF (203 KB) | Freely Available from IEEE
  • Scitopia.org [advertisement]

    Publication Year: 2011, Page(s): 192
    PDF (269 KB) | Freely Available from IEEE
  • IEEE Computational Intelligence Society Information

    Publication Year: 2011, Page(s): C3
    PDF (37 KB) | Freely Available from IEEE
  • IEEE Transactions on Autonomous Mental Development Information for authors

    Publication Year: 2011, Page(s): C4
    PDF (28 KB) | Freely Available from IEEE

Aims & Scope

IEEE Transactions on Autonomous Mental Development (TAMD) covers computational modeling of mental development, including mental architecture, theories, algorithms, properties, and experiments.

Meet Our Editors

Editor-in-Chief
Zhengyou Zhang
Microsoft Research