This paper deals with the potential benefits of perceptual learning for artificial intelligence. On the one hand, perceptual learning is increasingly studied in neurobiology and is now considered an essential capability of any living system. Indeed, perceptual learning and cognitive learning are both necessary for learning and often depend on each other. On the other hand, much work in machine learning is concerned with "abstraction" as a means of reducing the complexity of certain learning tasks. Within the abstraction framework, perceptual learning can be seen as a specific process that learns how to transform the data before the traditional learning task itself takes place. In this paper, we argue that biologically inspired perceptual learning mechanisms could be used to build efficient low-level abstraction operators that deal with real-world data. To illustrate this, we present an application in which perceptual-learning-inspired meta-operators perform an abstraction on an autonomous robot's visual perception. The goal of this work is to enable the robot to learn to identify objects it encounters in its environment.