A conceptual framework for tactually guided exploration and shape perception using a robotic medium is presented. The framework identifies the required sensory information; the spatial and temporal transformations of that information; the control mechanisms, both feedforward and feedback, for performing the tasks; and the perception machinery. These attributes, in turn, sharpen the focus on the major building blocks needed in the design of artificial skins, tactile sensing, and processing systems of the future. These building blocks are: identification of the central nervous system (CNS) machinery in living systems that performs the required computations; mappings and processors for extraction of the needed manipulation and recognition parameters; processing of the outputs of populations of natural tactile sensors; and, finally, understanding the dynamics of sensor-embedded skin and artificial skin in fixed, gliding, and rolling contact with known and unknown objects and surfaces.
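One of the building blocks named above, processing the outputs of a population of tactile sensors, can be illustrated with a minimal sketch. The example below is not from the paper; it assumes a hypothetical array of pressure-sensing elements (taxels) at known skin coordinates and reads out the contact location as an intensity-weighted centroid, a simple population-code readout.

```python
import numpy as np

def contact_centroid(taxels, positions):
    """Estimate the contact location on a skin patch from a population
    of taxel readings via an intensity-weighted centroid.

    taxels    : (N,) non-negative pressure readings
    positions : (N, 2) taxel coordinates on the skin patch
    Returns the (x, y) centroid, or None if no contact is sensed.
    """
    w = np.asarray(taxels, dtype=float)
    total = w.sum()
    if total == 0.0:
        return None  # no taxel activated: no contact
    pts = np.asarray(positions, dtype=float)
    return (w[:, None] * pts).sum(axis=0) / total

# Hypothetical 3-taxel line array with pressure peaking at x = 1.0
readings = [0.2, 1.0, 0.2]
coords = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(contact_centroid(readings, coords))  # -> [1. 0.]
```

A richer readout would also extract contact force and local surface orientation from the same population, which is the kind of parameter-extraction processing the framework calls for.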