This was a hybrid event with in-person attendance in Wu and Chen and virtual attendance…
In this talk, I will introduce the development of GelSight, a high-resolution robotic tactile sensor, and show how it can help robots understand and interact with the physical world. GelSight is a vision-based tactile sensor that measures the geometry of the contact surface with a spatial resolution of around 25 micrometers, along with the shear forces and torques at the contact. With this high-resolution information, a robot can detect the precise shape and texture of object surfaces and thereby recognize them. Beyond recognition, the sensor helps robots extract richer information from contact, such as the physical properties of objects, and assists with manipulation tasks. The talk will cover our work on using GelSight to detect slip during grasping and to perceive object properties such as hardness and the viscosity of liquids. I will also present our work on simulating the tactile sensor and using the simulated sensor input to improve the robot's ability to perform perception and grasping tasks in the real world. These simulation tools also prompt us to rethink sensor design and how tactile sensors can be used on various types of robots.
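As a rough illustration of the kind of signal the abstract refers to (shear at the contact and slip during grasping), the sketch below tracks the marker dots printed on a GelSight gel between two frames and flags possible slip from their displacement statistics. This is a minimal, hypothetical example, not the speaker's actual method; the thresholds and the slip heuristic are illustrative assumptions.

```python
# Hypothetical sketch: estimate tangential (shear-like) marker motion on a
# GelSight image pair with optical flow, and flag possible slip when marker
# displacements become large and inconsistent. Thresholds are assumptions.
import cv2
import numpy as np

def marker_displacements(prev_gray, curr_gray):
    # Track the dark marker dots on the gel surface between consecutive frames.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.05, minDistance=7)
    if pts is None:
        return np.empty((0, 2))
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    ok = status.ravel() == 1
    return (nxt[ok] - pts[ok]).reshape(-1, 2)  # per-marker (dx, dy) in pixels

def slip_detected(disp, mag_thresh=2.0, spread_thresh=1.0):
    # Heuristic: large average tangential motion with high variance across
    # markers suggests the object is sliding over the gel rather than
    # moving together with it.
    if len(disp) == 0:
        return False
    mag = np.linalg.norm(disp, axis=1)
    return mag.mean() > mag_thresh and mag.std() > spread_thresh
```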