Mobile Robot Navigation, Active Localization, and Stealth Recovery

The ability to operate for long periods of time and then return safely is a critical capability for Unmanned Underwater Vehicles (UUVs) in many important applications such as subsea inspection, remote surveillance, and seasonal monitoring. A major challenge for a robot in such long-term missions is estimating its location accurately, since GPS signals cannot penetrate the ocean's surface and Wi-Fi or radio communication infrastructure is not available underwater. Using a dedicated surface vessel for acoustic referencing, or surfacing for GPS fixes, is power-hungry, computationally expensive, and often impossible (e.g., in stealth applications). This project makes scientific and engineering advances by using a novel optics-based framework and on-board AI technologies to solve this problem. The proposed algorithms and systems allow underwater robots to estimate their location with GPS-quality accuracy without ever resurfacing. More importantly, these capabilities enable long-term autonomous navigation and safe recovery of underwater robots without the need for dedicated surface vessels for acoustic guidance.

Overall, the outcomes of this project contribute to long-term marine ecosystem monitoring and ocean climate observatory research, as well as to remote stealth mission execution for defense applications. This project is supported by the NSF Foundational Research in Robotics (FRR) program. We are working on this project in collaboration with the FOCUS Lab (Dr. Koppal) and the APRIL Lab (Dr. Shin) at UF.


This project advocates a novel solution to the foundational problem of underwater robot localization and navigation by introducing the notion of "optical homing and penning". This new optics-based framework incorporates three sets of novel technologies for (a) distant UUV positioning with blue-green laser speckles, (b) accurate 3D orientation measurement from coded bokeh spectra, and (c) GPS-quality pose estimates from a directionally controlled adaptive LiDAR. The combined optical sensory system will be deployable from specialized buoys acting as floating lighthouses. An intelligent visual SLAM system will also be developed for robust state estimation in deep waters when no lighthouse beacons are visible. For feasibility analysis and assessment, this project will formalize real-world deployment strategies on two UUV platforms through comprehensive ocean trials in the northern Gulf of Mexico and the Atlantic Ocean.
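To give a flavor of how known-position "lighthouse" beacons can localize a vehicle without GPS, the sketch below shows a minimal, hypothetical 2D bearing-only triangulation: a robot that measures the world-frame bearing to two buoys at known positions can recover its own position by intersecting the two sightlines. This is an illustrative toy example only; the project's actual framework estimates full 3D pose from laser speckles, coded bokeh, and adaptive LiDAR, not from simple bearings.

```python
import math

def triangulate(beacons, bearings):
    """Estimate the robot's 2D position from world-frame bearings
    (radians) measured from the robot toward two beacons at known
    positions. Each sightline gives the collinearity constraint
    (beacon - p) x d = 0, where d is the unit bearing direction;
    stacking both constraints yields a 2x2 linear system in p."""
    (b1x, b1y), (b2x, b2y) = beacons
    d1x, d1y = math.cos(bearings[0]), math.sin(bearings[0])
    d2x, d2y = math.cos(bearings[1]), math.sin(bearings[1])
    # Rows of A p = c, one per sightline: d_y*p_x - d_x*p_y = d_y*b_x - d_x*b_y
    a11, a12, c1 = d1y, -d1x, d1y * b1x - d1x * b1y
    a21, a22, c2 = d2y, -d2x, d2y * b2x - d2x * b2y
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-9:
        # Parallel sightlines: the two beacons and the robot are
        # (nearly) collinear, so position is unobservable.
        raise ValueError("bearings are parallel; position unobservable")
    px = (c1 * a22 - c2 * a12) / det
    py = (a11 * c2 - a21 * c1) / det
    return px, py

# Robot at (3, 4); buoys at (0, 0) and (10, 0).
bearing1 = math.atan2(0 - 4, 0 - 3)    # bearing toward first buoy
bearing2 = math.atan2(0 - 4, 10 - 3)   # bearing toward second buoy
px, py = triangulate([(0.0, 0.0), (10.0, 0.0)], [bearing1, bearing2])
```

In practice each bearing would be noisy, so a real system would fuse many such constraints (and range or orientation cues) in a least-squares or filtering framework rather than intersecting exactly two lines.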