This event was held LIVE in Wu and Chen Auditorium, with virtual attendees…
A haptic interface is a mechatronic system that modulates the physical interaction between a human and their tangible surroundings. Such systems typically take the form of grounded kinesthetic devices, ungrounded wearable devices, or surface devices, and they enable the user to act on and feel a remote or virtual environment. I will elucidate key approaches to creating effective haptic interfaces by showcasing several systems my team created and evaluated over the years. I will go into more detail about Haptipedia, our online database of grounded force-feedback devices, and Haptify, the system we recently created to quantitatively benchmark the performance of such interfaces.

The talk will then transition to physical human-robot interaction (pHRI), where the engineered system acts as a social agent rather than a tool. In addition to inventing tactile sensors, we have created a robot that plays exercise games with its human partner and have developed methods for learning dynamic physical interactions from demonstrations, both with applications to rehabilitation. Finally, I will present HuggieBot, a custom robot that uses visual and haptic sensing to give good interactive hugs.

The presented research stems from collaborations with Hasti Seifi, Karon MacLean, Farimah Fazlollahi, Naomi Fitter, Mayumi Mohan, Michelle Johnson, Siyao “Nick” Hu, Alexis Block, and many others from Penn, MPI-IS, and elsewhere.