ABSTRACT
Computer science and optics are usually studied separately: by separate people, in separate departments, who meet at separate conferences. This is changing. The exciting promise of technologies like virtual reality and self-driving cars demands solutions that draw on the best aspects of computer vision, computer graphics, and optics. Bridging these communities has historically proved difficult. For instance, laboratory setups in optics are often designed to image millimeter-scale scenes in a vibration-free darkroom, far from the uncontrolled environments that computer vision must handle.
Specifically, this talk explores the potential of computational photography in the context of 3D imaging. First, we demonstrate a mapping from the polarization of light (the orientation of a light wave's oscillation in space) to the 3D geometry of a scene. Second, we show how it may be possible to modify existing 3D camera technology to venture “beyond geometry”, demonstrating new forms of photography such as imaging through scattering media, relighting of photographs, and fluorescence imaging. These applications are enabled by a time-of-flight camera that has been altered to spatially and temporally encode light transport into the captured images. Finally, we discuss the broader impact of this design paradigm on the future of 3D depth sensors, interferometers, computational photography, medical imaging, and many other applications.
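To make the polarization-to-geometry mapping concrete, the sketch below shows one standard shape-from-polarization formulation, not necessarily the pipeline used in the talk. It assumes a diffuse Fresnel polarization model with a known refractive index n and three captures through a linear polarizer at 0°, 45°, and 90°; the function names are hypothetical.

```python
import numpy as np

def diffuse_dop(theta, n=1.5):
    """Degree of linear polarization for diffuse reflection (Fresnel model),
    as a function of the zenith angle theta of the surface normal."""
    s2 = np.sin(theta) ** 2
    num = (n - 1.0 / n) ** 2 * s2
    den = (2 + 2 * n**2 - (n + 1.0 / n) ** 2 * s2
           + 4 * np.cos(theta) * np.sqrt(n**2 - s2))
    return num / den

def normals_from_polarization(I0, I45, I90, n=1.5):
    """Recover per-pixel surface normals from three images taken through a
    linear polarizer at 0, 45, and 90 degrees.

    Model: I(phi_pol) = A + B * cos(2*phi_pol - 2*phi), where phi is the
    azimuth of the surface normal (known only up to a pi ambiguity).
    """
    A = 0.5 * (I0 + I90)                            # unpolarized intensity
    Bc = 0.5 * (I0 - I90)                           # B * cos(2*phi)
    Bs = I45 - A                                    # B * sin(2*phi)
    phi = 0.5 * np.arctan2(Bs, Bc)                  # azimuth (mod pi)
    rho = np.hypot(Bc, Bs) / np.maximum(A, 1e-8)    # degree of polarization

    # Invert rho(theta) numerically; the diffuse model is monotonic in theta.
    theta_grid = np.linspace(0.0, np.pi / 2 - 1e-3, 1024)
    rho_grid = diffuse_dop(theta_grid, n)
    theta = np.interp(np.clip(rho, 0.0, rho_grid[-1]), rho_grid, theta_grid)

    # Assemble unit normals; the pi ambiguity in phi is left unresolved here.
    return np.stack([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)], axis=-1)
```

In practice the π-ambiguity in the azimuth, and the choice between diffuse and specular polarization models, must be resolved with additional cues, for example coarse depth from a 3D sensor, which is one reason polarization pairs naturally with the depth-camera technology discussed in this talk.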