Indoor Visual-Based Localization System for Multi-Rotor UAVs

Massimiliano Bertoni et al.

Sensors (Basel). 2022 Aug 3;22(15):5798. doi: 10.3390/s22155798.

Abstract

Industry 4.0, smart homes, and the Internet of Things are boosting the use of autonomous aerial vehicles in indoor environments, where localization is still challenging, especially in close and cluttered areas. In this paper, we propose a Visual Inertial Odometry (VIO) localization method based on fiducial markers. Our approach enables multi-rotor aerial vehicle navigation in indoor environments and tackles the most challenging aspects of image-based indoor localization. In particular, we focus on proper and continuous pose estimation, working from take-off to landing at several different flying altitudes. To this end, we designed a map of fiducial markers that is both dense and heterogeneous: narrowly placed tags minimize information loss during rapid aerial movements, while four different classes of marker size keep the estimate consistent as the apparent tag size changes with the vehicle's distance from the ground. We validated our approach by comparing the output of the localization algorithm with ground-truth information collected through an optoelectronic motion capture system, using two different platforms in different flying conditions. The results show that the error mean and standard deviation remain consistently below 0.11 m and do not degrade when the aerial vehicle increases its altitude, thus strongly improving on similar state-of-the-art solutions.
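
The dense, heterogeneous marker map described above can be illustrated with a minimal sketch. The grid spacing, side lengths, and ID assignment below are hypothetical placeholders for illustration, not the layout used in the paper.

```python
# Sketch of a dense fiducial map with four marker-size classes.
# Side lengths, spacing, and ID assignment are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class TagEntry:
    tag_id: int
    side_m: float   # physical side length of the marker
    x_m: float      # marker centre in the map frame
    y_m: float


SIZE_CLASSES_M = [0.05, 0.10, 0.20, 0.40]   # four classes, small to large


def build_map(rows: int, cols: int, pitch_m: float) -> list[TagEntry]:
    """Place tags on a regular grid, cycling through the size classes so that
    every camera footprint contains markers of several scales."""
    tags = []
    tag_id = 0
    for r in range(rows):
        for c in range(cols):
            side = SIZE_CLASSES_M[(r + c) % len(SIZE_CLASSES_M)]
            tags.append(TagEntry(tag_id, side, c * pitch_m, r * pitch_m))
            tag_id += 1
    return tags


if __name__ == "__main__":
    fiducial_map = build_map(rows=6, cols=6, pitch_m=0.5)
    print(f"{len(fiducial_map)} tags, e.g. {fiducial_map[0]}")
```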

Keywords: Visual Inertial Odometry; aerial vehicles; fiducial markers; indoor localization.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Typical cascaded controller structure: subscripts indicating the reference frames have been removed for the sake of clarity.
Figure 2
Tag-based localization: this method identifies the roto-translation between the reference frame associated with the detected fiducial marker (AprilTag-frame) and the reference frame of the camera mounted on the aerial platform (camera_color_optimal-frame), which has a known roto-translation with respect to the UAV body frame.
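
As a sketch of the transform chain described in this caption, the vehicle pose in the map frame can be obtained by composing the known map-to-tag pose, the inverse of the detected camera-to-tag pose, and the known camera-to-body extrinsics. The numeric values below are illustrative placeholders, not the real calibration.

```python
# Sketch of the transform chain behind tag-based localization.
# T_a_b is a 4x4 homogeneous transform giving the pose of frame b in frame a.
import numpy as np


def make_T(R: np.ndarray, t) -> np.ndarray:
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T


# Known from the map design: pose of the detected AprilTag in the map frame.
T_map_tag = make_T(np.eye(3), [1.0, 2.0, 0.0])

# Returned by the detector: pose of the tag in the camera optical frame.
T_cam_tag = make_T(np.eye(3), [0.1, -0.05, 1.2])

# Known extrinsics: pose of the camera in the UAV body frame.
T_body_cam = make_T(np.eye(3), [0.0, 0.0, -0.05])

# Compose: map->tag, tag->camera (inverse of the detection), camera->body.
T_map_body = T_map_tag @ np.linalg.inv(T_cam_tag) @ np.linalg.inv(T_body_cam)
print("UAV position in the map frame:", T_map_body[:3, 3])
```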
Figure 3
Basic tag design: this pattern has been used to generate the fiducial map by applying roto-translation and mirroring operations.
Figure 4
ROS2 architecture: the ROS2 implementation of the proposed VIO localization strategy involves three principal nodes: the camera driver, the fiducial marker detector, and the visual odometry estimator.
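
A minimal rclpy sketch of the last node in this three-node pipeline is given below; the topic names, message types, and frame id are assumptions for illustration, not the actual interfaces of the implementation.

```python
# Sketch of a visual odometry estimator node in a ROS2 pipeline.
# Topic names and message types are assumptions for illustration only.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from nav_msgs.msg import Odometry


class VisualOdometryEstimator(Node):
    def __init__(self):
        super().__init__('visual_odometry_estimator')
        # Vehicle pose derived from the fiducial marker detector.
        self.sub = self.create_subscription(
            PoseStamped, '/tag_pose', self.on_tag_pose, 10)
        # Odometry consumed by the flight controller interface.
        self.pub = self.create_publisher(Odometry, '/vio/odometry', 10)

    def on_tag_pose(self, msg: PoseStamped) -> None:
        odom = Odometry()
        odom.header = msg.header
        odom.child_frame_id = 'base_link'
        odom.pose.pose = msg.pose
        self.pub.publish(odom)


def main():
    rclpy.init()
    rclpy.spin(VisualOdometryEstimator())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```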
Figure 5
Star-shaped multi-rotor UAVs used for validation: (a) a small-size quadrotor with a mass of approximately 0.5 kg and (b) a medium-size hexarotor with a mass of approximately 3.5 kg.
Figure 6
PX4 Autopilot internal control architecture: the inputs of the whole control block are the estimated position and yaw angle of the vehicle; the output is the set of duty cycles to apply to the actuators in order to realize the computed normalized force commands, expressed as aileron, elevator, rudder, and thrust. Note that the attitude is modeled using the quaternion convention.
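
The mapping from normalized force commands to actuator duty cycles can be sketched with a simple mixer. The quad-X mixer matrix and the clipping below are a generic illustration, not the actual PX4 mixing logic.

```python
# Sketch: mixing normalized roll/pitch/yaw/thrust commands into per-motor
# duty cycles for a generic quad-X layout (not the PX4 implementation).
import numpy as np

# Columns: aileron (roll), elevator (pitch), rudder (yaw), thrust.
MIXER_QUAD_X = np.array([
    [-1.0,  1.0,  1.0, 1.0],   # front-right motor
    [ 1.0, -1.0,  1.0, 1.0],   # rear-left motor
    [ 1.0,  1.0, -1.0, 1.0],   # front-left motor
    [-1.0, -1.0, -1.0, 1.0],   # rear-right motor
])


def mix(roll: float, pitch: float, yaw: float, thrust: float) -> np.ndarray:
    """Return one duty cycle in [0, 1] per motor."""
    cmd = MIXER_QUAD_X @ np.array([roll, pitch, yaw, thrust])
    return np.clip(cmd, 0.0, 1.0)


print(mix(roll=0.05, pitch=-0.02, yaw=0.0, thrust=0.6))
```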
Figure 7
PX4 Autopilot EKF and Output Predictor internal architecture: the Output Predictor provides the state prediction without delays, and the concurrent EKF estimate is used to correct the resulting prediction. When the measurements of both IMU sensors in the flight controller are taken into account, two EKF instances run in parallel, and a selector compares the internal coherence of each one to determine the best sensor mix in terms of data consistency.
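
The output-predictor idea can be sketched in one dimension: the EKF estimate refers to a delayed time instant, so the IMU samples buffered since then are re-integrated to recover a delay-free state. This toy class is only an illustration of the concept, not the PX4 code.

```python
# Toy 1D sketch of delay compensation via an output predictor: buffer recent
# IMU samples, and when a delayed EKF estimate arrives, reset to it and
# re-integrate the buffer to obtain a delay-free state.
from collections import deque


class OutputPredictor:
    def __init__(self, dt: float, horizon: int):
        self.dt = dt
        self.accel_buffer = deque(maxlen=horizon)  # most recent IMU samples
        self.pos = 0.0
        self.vel = 0.0

    def on_imu(self, accel: float) -> None:
        """Store the sample and propagate the delay-free output state."""
        self.accel_buffer.append(accel)
        self.vel += accel * self.dt
        self.pos += self.vel * self.dt

    def on_ekf_update(self, pos_delayed: float, vel_delayed: float) -> None:
        """Reset to the (delayed) EKF estimate, then re-integrate the buffered
        samples so the published state again refers to the current time."""
        self.pos, self.vel = pos_delayed, vel_delayed
        for accel in self.accel_buffer:
            self.vel += accel * self.dt
            self.pos += self.vel * self.dt
```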
Figure 8
T1-QR01: trend of the UAV position components (a–c) and 3D path (d), comparing the output of the VIO localization (vio) and of the VICON motion capture system (vcn).
Figure 9
T1-HR01: trend of the UAV position components (a–c) and 3D path (d), comparing the output of the VIO localization (vio) and of the VICON motion capture system (vcn).
Figure 10
T1-QR01: statistical description of error e.
Figure 11
T1-HR01: statistical description of error e.
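
A minimal way to reproduce such a statistical description of the error e, assuming time-aligned VIO and VICON position samples, is sketched below; the arrays are placeholders standing in for the logged trajectories.

```python
# Sketch: per-axis statistics of the error e between time-aligned VIO and
# VICON position samples. The generated arrays are placeholders.
import numpy as np


def error_statistics(p_vio: np.ndarray, p_vcn: np.ndarray) -> dict:
    """p_vio, p_vcn: (N, 3) arrays of x, y, z positions at matching timestamps."""
    e = p_vio - p_vcn
    return {
        "mean_per_axis": e.mean(axis=0),
        "std_per_axis": e.std(axis=0),
        "mean_norm": np.linalg.norm(e, axis=1).mean(),
    }


# Placeholder data standing in for recorded flight logs.
rng = np.random.default_rng(0)
p_vcn = rng.uniform(-1.0, 1.0, size=(500, 3))
p_vio = p_vcn + rng.normal(0.0, 0.03, size=(500, 3))
print(error_statistics(p_vio, p_vcn))
```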
Figure 12
T2-QR01: trend of the UAV position components (a–c) and 3D path (d), comparing the output of the VIO localization (vio) and of the VICON motion capture system (vcn).
Figure 13
T2-HR01: trend of the UAV position components (a–c) and 3D path (d), comparing the output of the VIO localization (vio) and of the VICON motion capture system (vcn).
Figure 14
T2-QR01: statistical description of error e.
Figure 15
T2-HR01: statistical description of error e.
Figure 16
T2-QR01 & HR01: mean and standard deviation of error e with respect to the AprilTag map distance.

Grants and funding

This work was partly supported by the University of Padova under the BIRD-SEED CAR.
