Abstract
I will present recent results from the HKUST UAV group towards robust autonomous navigation using only the sensory information from a monocular camera and an IMU. We argue that this is the minimum sensor suite that enables the full capability for autonomous navigation, including state estimation, dense mapping, trajectory planning, and feedback control. We will present a self-calibrating visual-inertial estimator that handles high-speed motion and remains stable from ground level to high altitude. We pay special attention to the initialization problem and propose a numerically stable method for high-altitude estimator initialization and failure recovery. Building on top of the estimator, we propose an onboard method to generate large-scale dense maps that are sufficient for obstacle detection and avoidance. We close the perception-action loop with an optimization-based trajectory generation method that finds safe trajectories online. Experimental results are demonstrated on our custom-built quadrotor testbed.