Mobile robot systems operate in dynamic environments, such as forests, caves, or roadways, where they must perceive and react to incoming stimuli. Each of these environments presents unique challenges, such as perception in low-light conditions, high-speed understanding of objects, and low power requirements. Neuromorphic vision systems such as event-based cameras have properties well suited to these challenges: high temporal resolution, a low power footprint, and high dynamic range. Unlike conventional frame-based imagers, event-based cameras output a stream of asynchronous events, each signaling a change in illumination at an individual pixel. This presents an expansive new processing model for event-based computer vision applications. In this talk, I will begin with a tutorial on event-based cameras, review processing techniques for the event stream, and highlight our recent work on developing high-speed perception-action systems for object catching.
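To make the output model concrete, below is a minimal sketch of how an asynchronous event stream is often represented and converted into a frame-like image by accumulating event polarities per pixel. The `Event` fields and the `accumulate` helper are illustrative assumptions for this sketch, not the API of any particular camera or library.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One asynchronous event: a brightness change at a single pixel.

    Fields are an illustrative convention, not a specific camera format.
    """
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp in seconds (events arrive asynchronously)
    polarity: int  # +1 for a brightness increase, -1 for a decrease

def accumulate(events, width, height):
    """Sum signed polarities per pixel to form a simple event frame.

    This is one of the simplest event-stream representations; many
    processing pipelines start from an accumulation like this.
    """
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.polarity
    return frame

# Tiny example stream on a 2x1 sensor: two positive events at pixel (0, 0)
# and one negative event at pixel (1, 0).
events = [Event(0, 0, 0.001, +1), Event(1, 0, 0.002, -1), Event(0, 0, 0.003, +1)]
frame = accumulate(events, width=2, height=1)
# frame is [[2, -1]]
```

Because each event carries its own timestamp, downstream algorithms can also process the stream event by event rather than accumulating it into frames, which is what enables the high-temporal-resolution processing mentioned above.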