This was a hybrid event with in-person attendance in Levine 307 and virtual attendance…
Coupling advanced wearable and environmental sensors with dynamic AI frameworks has the potential to transform how we engage with machines and with the natural world. By co-developing intelligent sensors and deployable machine learning pipelines, we can unlock the power of data to address pressing challenges in fields ranging from human-robot collaboration to environmental science. Realizing this vision will require a multifaceted approach, including networks that extract insights from large streams of continuous data, algorithms that adapt to new subjects or environments from limited examples, unobtrusive sensing with embedded autonomy, scalable multimodal dataset curation, and paradigms for fluid human interactions with AI systems.

Moving toward these goals, this talk will present approaches to creating more intelligent and fluid human-robot interactions by combining wearable sensors for brain, muscle, and motion activity with adaptive learning pipelines. It will explore how these techniques can scale to curate multimodal datasets of human behavior that support foundation models of physical intelligence. To help improve sensing capabilities, it will also discuss techniques for creating soft wearable sensors with embedded learning pipelines. Finally, it will introduce recent results from using wearable and deployable sensors to study non-human species: combining robust wearable sensor tags, drones, and new machine learning pipelines can reveal exciting insights into the language and culture of sperm whales.