LIVESTREAM - GRASP on Robotics


GRASP on Robotics is an inaugural series of talks hosted by the GRASP Laboratory. The series leverages the lab's academic, research, and industry connections to deliver high-caliber technical talks and to foster meaningful discussion around them.

Join the livestream here on Fridays starting at 10:30am; each talk is followed by a Q&A panel between our speaker, faculty, and students, running until 11:45am.

Spring 2025 GRASP on Robotics: Bruno Olshausen, University of California, Berkeley & Redwood Center for Theoretical Neuroscience, “Invariance and equivariance in brains and machines”

This will be a hybrid event with in-person attendance in Wu and Chen Auditorium and virtual attendance on Zoom.

ABSTRACT

The goal of building machines that can perceive and act in the world as humans and other animals do has been a focus of AI research efforts for over half a century. Over this same period, neuroscience has sought to achieve a mechanistic understanding of the brain processes underlying perception and action. It stands to reason that these parallel efforts could inform one another. Here I propose an approach to the long-standing problem of invariant and equivariant representation in vision – that is, how do we recognize objects independent of pose, lighting, and other variations, and how do we perceive such variations independent of object shape? The approach is rooted in observations of animal behavior and informed by both neurobiological mechanisms (recurrence, dendritic nonlinearities, phase coding) and mathematical principles (group theory, residue numbers). What emerges from this approach is a neural circuit for factorization that can learn about shapes and their transformations from image data, and a model of the grid-cell system based on high-dimensional encodings of residue numbers. These models provide efficient solutions to long-studied problems that are well-suited for implementation in neuromorphic hardware or as a basis for forming hypotheses about visual cortex and entorhinal cortex.
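For readers unfamiliar with the residue numbers mentioned in the abstract: a residue number system represents an integer by its remainders modulo a set of pairwise-coprime moduli, and the Chinese Remainder Theorem recovers the integer from those remainders. The short Python sketch below illustrates only this basic idea with arbitrarily chosen moduli; it is not drawn from the speaker's high-dimensional grid-cell encoding.

    # Minimal sketch of a residue number system (illustrative moduli, not the speaker's model).
    from math import prod

    MODULI = (3, 5, 7)  # pairwise coprime; together they cover 0..104

    def encode(x, moduli=MODULI):
        """Represent x by its residues modulo each modulus."""
        return tuple(x % m for m in moduli)

    def decode(residues, moduli=MODULI):
        """Recover x from its residues via the Chinese Remainder Theorem."""
        M = prod(moduli)
        x = 0
        for r, m in zip(residues, moduli):
            Mi = M // m
            x += r * Mi * pow(Mi, -1, m)  # pow(..., -1, m) gives the modular inverse
        return x % M

    assert decode(encode(52)) == 52
    # Addition acts componentwise on the residues (carry-free), which is part of
    # what makes such codes attractive for parallel or neuromorphic arithmetic.
    assert decode(tuple((a + b) % m for a, b, m in
                        zip(encode(20), encode(30), MODULI))) == 50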