ABSTRACT
Haptic perception remains a grand challenge for artificial hands. The functionality of artificial dexterous manipulators could be enhanced by artificial “haptic intelligence” that enables identification of objects and their features via touch alone. This could be especially useful when reliable visual and/or proprioceptive feedback is unavailable. Studies will be presented in which a robot hand outfitted with a deformable, multimodal tactile sensor was used to replay human-inspired haptic “exploratory procedures” to perceive salient geometric features such as edges and fingertip-sized bumps and pits. Tactile signals generated by active fingertip motions were used to extract inputs for offline support vector classification and regression models. More recently, we have been using reinforcement learning to learn goal-based policies for a functional contour-following task: the closure of a ziplock bag. Q-learning was used to learn a policy for online decision-making according to a reward structure that favored functional fingerpad-zipper spatial relationships. Preliminary results will be shown for Q-learning and contextual multi-armed bandit approaches. The ability to perceive local shape could be used to advance semi-autonomous robot systems and to provide haptic feedback to human teleoperators of devices ranging from neuroprostheses to bomb-defusal robots. The ability to perceive and manipulate deformable contours could be extended to the robotic manipulation of thread, wire, and rope.
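
The abstract names tabular Q-learning with a reward favoring fingerpad-zipper spatial relationships but does not detail the implementation. The sketch below shows the general form of such a one-step Q-learning update under stated assumptions: the action set, state encoding, reward thresholds, and hyperparameter values are illustrative placeholders, not the authors' actual system.

```python
# Minimal sketch of a tabular Q-learning update for a contour-following
# (ziplock-bag closure) task. Action names, reward values, and hyperparameters
# are assumptions for illustration only.
import random
from collections import defaultdict

ACTIONS = ["advance_along_zipper", "move_up", "move_down", "regrasp"]  # assumed action set
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

Q = defaultdict(float)  # Q[(state, action)] -> estimated return


def choose_action(state):
    """Epsilon-greedy selection over the discrete action set."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])


def reward_fn(fingerpad_zipper_offset_mm):
    """Reward structure favoring functional fingerpad-zipper spatial relationships:
    positive reward when the zipper lies near the fingerpad center, small penalty
    otherwise (threshold and magnitudes are assumptions)."""
    return 1.0 if abs(fingerpad_zipper_offset_mm) < 2.0 else -0.1


def q_update(state, action, reward, next_state):
    """One-step Q-learning update:
    Q(s, a) += alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

In this framing, the state would be a discretized estimate of the zipper's location relative to the fingerpad (derived from the tactile signals), and a contextual multi-armed bandit variant would drop the `next_state` term and select actions from per-context value estimates only.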