Algorithms for dextrous robot grasping must cope with the challenge of achieving highly object-specific grasps while covering a wide range of grasping contexts. In this paper, we present a tactile-driven approach that exploits the robot's grasping experience to address this issue. During the grasp movement, the current contact information is used to dynamically adapt the grasp control by targeting the best-matching posture in the experience base. Thus, the robot recalls and actuates a grasp it has already performed successfully in a similar tactile context. To represent this experience efficiently, we introduce the grasp manifold, which rests on the assumption that grasp postures form a smooth manifold in hand posture space. We present a simple way of approximating grasp manifolds using self-organising maps (SOMs). The algorithm is evaluated on three geometric primitives (box, cylinder and sphere) in a physics-based computer simulation.
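The grasp-manifold idea can be illustrated with a toy sketch: a small self-organising map is fitted to synthetic posture vectors lying near a smooth one-dimensional manifold in joint-angle space, and a partial observation (only some joints known, standing in for the tactile context) is matched against the map to recall a full posture. All dimensions, parameters, and function names below are illustrative assumptions and not the paper's actual implementation.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Fit a 2-D self-organising map to posture vectors (rows of `data`)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    # Grid coordinates of the SOM nodes, used for the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * (1.0 - t)                 # linearly decaying learning rate
            sigma = sigma0 * (1.0 - t) + 0.5     # shrinking neighbourhood radius
            # Best-matching unit: node whose weight vector is closest to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            # Gaussian neighbourhood around the BMU on the map grid.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                       / (2.0 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

def recall_posture(weights, partial, mask):
    """Match only the observed components (mask == 1), return the BMU's full posture."""
    d = np.linalg.norm((weights - partial) * mask, axis=-1)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    return weights[bmu]

# Toy data: 1000 "grasp postures" near a smooth 1-D manifold in a 6-D joint space.
t = np.linspace(0.0, np.pi, 1000)[:, None]
noise = 0.01 * np.random.default_rng(1).normal(size=(1000, 6))
postures = np.hstack([np.sin(t + k) for k in range(6)]) + noise

som = train_som(postures)
# Recall a full posture from a partial observation of the first three joints.
obs = np.concatenate([postures[500][:3], np.zeros(3)])
mask = np.array([1, 1, 1, 0, 0, 0], dtype=float)
full = recall_posture(som, obs, mask)
```

The recall step mirrors the abstract's control loop: the mask plays the role of the currently available contact information, and the returned node is the experience-base posture that best matches it.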