In this paper we present a computational model for incremental word meaning acquisition. It is designed to rapidly build category representations that correspond to the meanings of words. In contrast to existing approaches, our model additionally extracts word meaning-relevant features using a statistical learning technique; category learning and feature extraction are performed simultaneously. To reconcile the conflicting demands of rapid and statistical learning, we employ mechanisms inspired by Complementary Learning Systems theory. Our framework is therefore composed of two recurrently coupled components: (1) an adaptive Normalized Gaussian network performs one-shot memorization of new word-scene associations and uses the acquired knowledge to categorize novel situations; it further reactivates memorized associations based on its internal representation. (2) Based on the reactivated patterns, a second component extracts features that facilitate the categorization task. Iterative application of this learning mechanism results in gradual memory consolidation, which makes the internal representation of a word's meaning more efficient and robust. We present simulation results for a scenario in which words for object relations concerning position, size, and color are trained. The results demonstrate that the model learns from few training exemplars and correctly extracts word meaning-relevant features.
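The first component described above, a normalized Gaussian network that memorizes word-scene associations in one shot and categorizes novel scenes by normalized kernel activations, can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, the shared Gaussian width `sigma`, and the toy scene features are assumptions for illustration only.

```python
import numpy as np

class NormalizedGaussianNet:
    """Minimal sketch of a normalized Gaussian network: one-shot
    memorization of word-scene associations, categorization of new
    scenes via normalized Gaussian responsibilities (an assumption,
    not the paper's exact architecture)."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma   # shared Gaussian width (assumed hyperparameter)
        self.centers = []    # memorized scene feature vectors
        self.labels = []     # word labels associated with each center

    def memorize(self, scene, word):
        # One-shot memorization: each exemplar becomes a new unit.
        self.centers.append(np.asarray(scene, dtype=float))
        self.labels.append(word)

    def categorize(self, scene):
        # Each unit's responsibility is its Gaussian response
        # normalized by the sum over all units.
        x = np.asarray(scene, dtype=float)
        d2 = np.array([np.sum((x - c) ** 2) for c in self.centers])
        act = np.exp(-d2 / (2 * self.sigma ** 2))
        resp = act / act.sum()
        # Aggregate responsibilities per word label and pick the best.
        scores = {}
        for word, r in zip(self.labels, resp):
            scores[word] = scores.get(word, 0.0) + r
        return max(scores, key=scores.get)

net = NormalizedGaussianNet(sigma=0.5)
net.memorize([0.1, 0.9], "above")   # hypothetical scene feature vectors
net.memorize([0.9, 0.1], "below")
print(net.categorize([0.2, 0.8]))   # nearest memorized exemplar is "above"
```

A novel scene is categorized as soon as a single exemplar per word has been stored, which illustrates the rapid-learning side of the framework; the statistical feature-extraction component would then reweight the dimensions of these scene vectors over repeated reactivations.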