Unmanned aerial vehicles (UAVs) have become increasingly popular for inspection tasks due to their speed, agility, and reduced need for human intervention. Unlike ground-based inspections, UAVs are not restricted by challenging terrain or difficult-to-access spaces. As a result, they excel in both outdoor and indoor environments and can be employed for tasks ranging from agricultural monitoring to aircraft maintenance checks. Although many types of UAVs exist—each optimised for a particular environment—they share a growing focus on autonomous functionality. Localisation is a key element of this autonomy: the ability to accurately determine a vehicle’s position and orientation (attitude).
While GPS-based navigation combined with inertial measurement units (IMUs) often provides reliable data, GPS signals can be weak or completely unavailable in certain conditions (for example, indoors or in areas with interference). In these situations, relying solely on IMU data causes significant drift errors over time. One way to overcome this limitation is through vision-based localisation, using a camera to determine where the UAV is in relation to its environment. This approach dovetails perfectly with inspection tasks, as most UAVs already carry cameras for visual inspection, offering a natural path to a fully autonomous solution without adding extra sensors.
Marker-Based vs. Markerless Localisation
Vision-based localisation methods typically fall into two categories:
- Marker-Based Localisation
  This method involves placing fiducial markers in the inspection area. These markers help the UAV's system calculate position and orientation but require additional setup and maintenance.
- Markerless Localisation
  In this scenario, the UAV's camera detects natural features (or "natural markers") in the environment without any added physical markers. Markerless localisation is highly desirable because it minimises human intervention and avoids altering the inspection area.
Because the goal is to reduce external inputs and allow for flexible, wide-ranging inspections, markerless localisation is particularly well-suited for UAV-based inspection tasks.
Research Goals and Objectives
The overarching goal of this research project is to develop a system capable of estimating the position and orientation of a vehicle relative to an inspected object or environment. This system is designed to function in real time and leverages vision-based techniques, making it particularly well-suited for UAVs that already carry onboard cameras for inspection tasks.
To achieve this goal, the research first seeks to create a 3D feature catalogue derived from pre-recorded video footage. This catalogue serves as a “memory” of the inspection object or environment. Once this map is generated, the next step is to identify areas with a high density of features and cluster them to form natural markers. This clustering process allows the system to recognise and track these feature hotspots more efficiently in subsequent analysis.
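The thesis summary does not name a specific clustering algorithm, but the idea of grouping dense regions of the 3D feature map into "natural markers" can be sketched with a small density-based routine. Everything below is illustrative: the `eps` neighbourhood radius, the `min_pts` threshold, and the naive merge strategy are assumptions, and a production system would more likely use an established method such as DBSCAN.

```python
import math

def cluster_features(points, eps=0.5, min_pts=4):
    """Group 3D feature points into dense clusters ("natural markers").

    A point joins a cluster when it lies within `eps` of any point already
    in that cluster; clusters smaller than `min_pts` are discarded as
    isolated features rather than usable markers.
    """
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if c and any(math.dist(p, q) <= eps for q in c):
                if merged is None:
                    c.append(p)       # p joins this cluster
                    merged = c
                else:
                    merged.extend(c)  # p bridges two clusters: merge them
                    c.clear()
        if merged is None:
            clusters.append([p])      # p starts a new cluster
        clusters = [c for c in clusters if c]
    return [c for c in clusters if len(c) >= min_pts]
```

The surviving clusters act as the feature "hotspots" that the system can recognise and track efficiently in later stages.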
Building on the ability to generate and cluster features, the project then focuses on developing a method to detect and segment these natural markers in real-time video feeds. This would enable a UAV’s onboard camera to identify them without artificial fiducial markers. Subsequently, the system uses these detected features to localise the vehicle relative to the pre-recorded 3D map, providing precise positional and attitudinal information.
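Localising against the pre-recorded map requires associating features seen in the live video with features stored in the catalogue; the resulting 2D–3D correspondences are what a pose solver (typically a PnP-style algorithm) then consumes. The matching step alone can be sketched with a nearest-neighbour search filtered by Lowe's ratio test. The descriptor vectors and the `ratio` threshold here are hypothetical placeholders, not values from the thesis.

```python
def match_descriptors(query, catalogue, ratio=0.75):
    """Match each query descriptor to its nearest catalogue descriptor,
    keeping only matches that pass Lowe's ratio test: the best distance
    must be clearly smaller than the second-best, which rejects
    ambiguous matches between similar-looking features."""
    matches = []
    for qi, q in enumerate(query):
        # Squared Euclidean distance to every catalogue descriptor.
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(q, d)), ci)
            for ci, d in enumerate(catalogue)
        )
        # ratio**2 because the distances above are squared.
        if len(dists) >= 2 and dists[0][0] < (ratio ** 2) * dists[1][0]:
            matches.append((qi, dists[0][1]))
    return matches
```

Each `(query_index, catalogue_index)` pair links an image feature to a 3D map point; feeding those pairs into a pose solver yields the vehicle's position and attitude relative to the map.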
Further enhancing this vision-based approach, the project incorporates an inertial measurement unit (IMU) to propagate position and orientation estimates between image frames. The fusion of IMU data with vision-based localisation improves the system’s robustness and increases its sampling rate. Finally, the project entails practical implementation and testing to evaluate the proposed solution’s performance, reliability, and computational demands, thereby ensuring it meets the requirements for autonomous inspection in both controlled and real-world conditions.
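The fusion scheme is described only at a high level above. As a one-dimensional sketch under stated assumptions: IMU acceleration dead-reckons the state between camera frames, and each vision fix pulls the prediction back toward the measured position. The constant `gain` blend stands in for the covariance-weighted update a Kalman-style filter would actually compute.

```python
def propagate(state, accel, dt):
    """Dead-reckon (position, velocity) from IMU acceleration over dt.
    Run at the IMU rate, between the slower camera frames."""
    pos, vel = state
    return (pos + vel * dt + 0.5 * accel * dt * dt, vel + accel * dt)

def fuse(predicted_pos, vision_pos, gain=0.8):
    """Blend the IMU prediction with a vision fix when a frame arrives.
    `gain` weights the vision measurement; it is a fixed illustrative
    value here, not a tuned or covariance-derived one."""
    return predicted_pos + gain * (vision_pos - predicted_pos)
```

Because `propagate` runs at the IMU rate while `fuse` runs only when a frame is processed, the combined estimator outputs pose at the higher IMU sampling rate, which is the robustness and rate benefit the text describes.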
Final Thoughts
UAVs offer remarkable versatility for inspection tasks—whether in agriculture, construction, or aviation. By leveraging vision-based localisation, these vehicles can become more autonomous and precise in environments where GPS might be unreliable or unavailable. This research focuses on building a self-contained pipeline, starting from creating a robust 3D feature map and ending with real-time detection and position estimation. Through careful clustering of natural features and the integration of sensor data, the system aims to operate accurately and efficiently, even in challenging conditions.
By shifting away from fiducial markers, UAVs gain greater autonomy and flexibility, making them better suited for complex or large-scale inspections. Although limitations remain, such as limited depth range and the assumption of a largely static environment, this vision-based approach provides a strong foundation for advanced UAV navigation and inspection capabilities. As technology continues to evolve, UAVs stand to become even more critical in streamlining and enhancing inspection tasks across various industries.
Download and read the complete research: https://scholar.sun.ac.za/items/aeb9a6da-5ff5-4017-beff-6db6da077a4e