Abstract
Disaster response and search-and-rescue missions are among the most difficult missions in which an autonomous robot can be deployed. They require a robot to autonomously navigate chaotic, unstructured indoor and outdoor environments. However, popular state estimation and mapping methods based on vision and lidar are severely handicapped by fog, smoke, and other airborne particulates, conditions common in disaster scenarios. This work presents radar-based methods for state estimation and mapping that are unaffected by smoke and fog. We demonstrate that the performance of these methods is comparable to that of other popular methods in favorable conditions. We also demonstrate that vision- and lidar-based methods degrade quickly in fog, while our methods do not.
Acknowledgements
The authors would like to thank the members of the CoStar team for sharing both their lab space and technical expertise. We would also like to thank Shakeeb Ahmad of the CU Boulder Mechanical Engineering department for his help in demonstrating our radar state estimator’s use in closed-loop control of a micro-aerial vehicle as shown in our video submission.
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Kramer, A., Heckman, C. (2021). Radar-Inertial State Estimation and Obstacle Detection for Micro-Aerial Vehicles in Dense Fog. In: Siciliano, B., Laschi, C., Khatib, O. (eds) Experimental Robotics. ISER 2020. Springer Proceedings in Advanced Robotics, vol 19. Springer, Cham. https://doi.org/10.1007/978-3-030-71151-1_1
DOI: https://doi.org/10.1007/978-3-030-71151-1_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-71150-4
Online ISBN: 978-3-030-71151-1
eBook Packages: Intelligent Technologies and Robotics (R0)