Monocular Dense Mapping and Blur-aware Localization for Aerial Robots
3:30pm
Room 2611 (Lifts 31 & 32), 2/F Academic Building, HKUST


Examination Committee

Prof Ling SHI, ECE/HKUST (Chairperson)
Prof Shaojie SHEN, ECE/HKUST (Thesis Supervisor)
Prof Ming LIU, ECE/HKUST
 
 

Abstract

Autonomous micro aerial vehicles (MAVs) offer cost and mobility benefits that make them ideal robotic platforms for applications such as aerial photography, surveillance, and search and rescue. As the platform scales down, MAVs become more capable of operating in confined environments, but they also face challenges such as environment perception with a minimal sensor suite and state estimation under aggressive motion. A monocular camera together with an inertial measurement unit (IMU) constitutes the minimum sensor suite that enables autonomous flight with sufficient environmental awareness. Greater agility, however, brings more severe motion blur in the images, which disrupts classic vision-based localization methods.


In this thesis, we first present a GPU-accelerated monocular dense mapping module that, conditioned on the estimated pose, provides wide-angle situational awareness. Through truncated signed distance function (TSDF) fusion, a global dense mesh map is built that supports autonomous flight as well as the depth perception required for blur-aware motion estimation. A spline-based pose representation is then adopted and optimized using both blurred images and IMU measurements. Extensive experimental results are provided to validate the individual modules. Finally, an autonomous navigation application built on our dense mapping is presented, and its overall performance is demonstrated in both indoor and outdoor environments.
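For readers unfamiliar with the two core components mentioned above, the sketches below give their standard textbook formulations; they illustrate the general techniques rather than the exact variants used in this thesis. In TSDF fusion (following the classic Curless-Levoy weighted-average update), each voxel x keeps a running signed distance D(x) and weight W(x), updated with every new truncated depth observation d_i(x) carrying a per-observation weight w_i(x):

\[
D_i(\mathbf{x}) = \frac{W_{i-1}(\mathbf{x})\,D_{i-1}(\mathbf{x}) + w_i(\mathbf{x})\,d_i(\mathbf{x})}{W_{i-1}(\mathbf{x}) + w_i(\mathbf{x})},
\qquad
W_i(\mathbf{x}) = W_{i-1}(\mathbf{x}) + w_i(\mathbf{x}).
\]

Spline-based pose representations for blur-aware estimation commonly use a cumulative cubic B-spline on SE(3) (as in Lovegrove et al.'s continuous-time formulation), so the pose at any time t within the knot interval [t_k, t_{k+1}) is interpolated from four control poses T_{k-1}, ..., T_{k+2}:

\[
\mathbf{T}(t) = \mathbf{T}_{k-1} \prod_{j=1}^{3} \exp\!\big(\tilde{B}_j(u(t))\,\boldsymbol{\Omega}_{k-1+j}\big),
\qquad
\boldsymbol{\Omega}_{m} = \log\!\big(\mathbf{T}_{m-1}^{-1}\,\mathbf{T}_{m}\big),
\]

where the \tilde{B}_j are the cumulative B-spline basis functions and u(t) is the normalized time within the interval. Such a continuous-time pose allows the image formation to be integrated over the shutter interval, which is what makes it possible to use blurred images and IMU measurements jointly in the optimization.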

Speaker / Performer:
Yi LIN
Language
English