The concept of “x-ray vision” is widely understood as the ability to see through structures that are opaque to the human eye. This capability would be valuable for surgeons and surgical robots, particularly when navigating complex anatomy. The Photoacoustic & Ultrasonic Systems Engineering (PULSE) Lab is developing imaging systems that offer this capability, but not with ionizing x-rays. Instead, we utilize a different portion of the electromagnetic spectrum, specifically the nanometer-scale optical wavelengths required to induce the photoacoustic effect and enable photoacoustic imaging. To implement this vision, laser pulses delivered through optical fibers illuminate surgical regions of interest, generating an acoustic response that is detectable with ultrasound transducers. Beamforming is then applied to create a photoacoustic image. In this talk, I will highlight novel light delivery systems, new spatial coherence beamforming theory, deep learning alternatives to beamforming, and robotic integration methods, each pioneered by the PULSE Lab to enable an exciting new frontier known as photoacoustic-guided surgery. This new paradigm has the potential to eliminate major complications (e.g., inaccurate targeting, excessive bleeding, paralysis, accidental patient death) during a wide range of delicate surgeries and procedures, including neurosurgery, cardiac catheter-based interventions, liver surgery, spinal fusion surgery, hysterectomies, biopsies, and teleoperative robotic surgeries.
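To make the image-formation step above concrete, the sketch below illustrates conventional delay-and-sum beamforming applied to simulated photoacoustic channel data from a single point absorber. This is a generic textbook illustration, not the PULSE Lab's spatial coherence or deep learning methods; all geometry and signal parameters (element count, pitch, sampling rate, pulse shape) are hypothetical. Note the one-way time of flight: in photoacoustic imaging the tissue itself emits sound after laser excitation, so no transmit delay is included.

```python
import numpy as np

# Hypothetical imaging parameters (illustrative only).
C = 1540.0          # speed of sound in tissue (m/s)
FS = 40e6           # sampling frequency (Hz)
N_ELEM = 64         # number of transducer elements
PITCH = 0.3e-3      # element spacing (m)

# Lateral positions of transducer elements, centered at x = 0.
elem_x = (np.arange(N_ELEM) - (N_ELEM - 1) / 2) * PITCH

def simulate_channel_data(src_x, src_z, n_samples=2048):
    """Synthesize channel data: a short pulse arriving at each element
    after one-way propagation from a point absorber at (src_x, src_z)."""
    t = np.arange(n_samples) / FS
    data = np.zeros((N_ELEM, n_samples))
    for i, ex in enumerate(elem_x):
        delay = np.hypot(src_x - ex, src_z) / C   # one-way time of flight
        data[i] = np.exp(-((t - delay) * 5e6) ** 2)  # Gaussian pulse
    return data

def das_beamform(data, xs, zs):
    """Delay-and-sum: for each pixel, sum the channel samples at the
    one-way time of flight from that pixel to each element."""
    image = np.zeros((len(zs), len(xs)))
    n_samples = data.shape[1]
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            dist = np.hypot(x - elem_x, z)              # per-element distance
            idx = np.round(dist / C * FS).astype(int)   # sample indices
            valid = idx < n_samples
            image[iz, ix] = data[np.arange(N_ELEM)[valid], idx[valid]].sum()
    return image

# Reconstruct a point absorber placed at (1 mm, 15 mm).
xs = np.linspace(-5e-3, 5e-3, 41)
zs = np.linspace(5e-3, 25e-3, 41)
img = das_beamform(simulate_channel_data(1e-3, 15e-3), xs, zs)
iz, ix = np.unravel_index(np.argmax(img), img.shape)
print(f"peak at x={xs[ix]*1e3:.2f} mm, z={zs[iz]*1e3:.2f} mm")
```

The image peak localizes at the true absorber position because the per-element delays are coherent only there; the spatial coherence beamformers mentioned in this talk replace the simple summation with coherence statistics across the element dimension.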