Accurate and Robust Rotation-Invariant Estimation for High-Precision Outdoor AR Geo-Registration
Abstract
1. Introduction
- Heading accuracy from low-cost hardware is improved by fusing RTK-GPS with visual-inertial measurements to compute a true-north direction vector (a minimal heading sketch follows this list). This stabilizes geospatial azimuth estimation, allowing virtual objects to maintain an accurate, stable orientation in complex environments and to adapt to arbitrary motion of the user's device in space.
- A fusion method establishes the initial attitude of the rigid body relative to geographic north by combining the true-north direction vector with the gravity vector. Integrating the visual-inertial fusion method with RTK-GPS multi-point positioning reduces the initialization error of the rotation matrix estimated by the visual-inertial method, effectively eliminating alignment errors between geospatial data and the ground surface and enabling accurate initial alignment of virtual objects with real-world features such as terrain and buildings.
- The initial attitude relative to geographic north is dynamically combined with the motion-estimated attitude from the visual-inertial fusion method. This enables high-precision, multi-source coordinate transformations and preserves the spatial rotational invariance of the rigid body with respect to the geospatial reference frame, while relaxing the precision requirements, and hence the cost, of the visual-inertial sensor's global pose and the RTK-GPS's high-frequency pose. The approach therefore suits a variety of AR systems, including smartphones and AR glasses.
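To make the first contribution concrete, the following is a minimal, illustrative sketch (not this paper's implementation) of deriving a true-north heading angle from two RTK-GPS fixes and expressing it as a direction vector in a local east-north-up (ENU) frame; the function names and the great-circle bearing formula are our own assumptions here.

```python
import math

def true_north_heading(lat1, lon1, lat2, lon2):
    """Forward azimuth (degrees clockwise from true north) between two
    RTK-GPS fixes, using the standard great-circle bearing formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def heading_to_enu(heading_deg):
    """Unit true-north direction vector of the device path in a local
    east-north-up frame: (east, north, up)."""
    h = math.radians(heading_deg)
    return (math.sin(h), math.cos(h), 0.0)

# Example: two fixes a few metres apart along a walked baseline.
heading = true_north_heading(31.23000, 121.47000, 31.23008, 121.47003)
print(round(heading, 2), heading_to_enu(heading))
```

Over the short baselines used during initialization, this simple bearing is typically adequate; a geodesic formula (e.g., Vincenty-type) can replace it when higher accuracy is required.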
2. Related Work
2.1. Sensor-Based Methods
2.2. Vision-Based Methods
2.3. Hybrid Methods
- Matching the actual geographic surface: maintaining the orientation and angles of virtual objects relative to real terrain features, enabling high-precision alignment between the two and enhancing the realism and immersion of AR maps.
- Stability of geo-registration for virtual spatial objects: ensuring that virtual objects remain stable in orientation and rotational state as users observe them, avoiding jitter, drift, and similar artifacts, thereby improving the user experience.
- Accuracy of geographic orientation estimation: leveraging the true-north direction vector, the gravity vector, and the rotational invariance of the geographic reference to improve pose initialization and update fusion, reducing both initialization errors and accumulated errors in attitude estimation while improving the accuracy and stability of the results.
- Precision of coordinate transformation: combining high-precision RTK-GPS positions with visual-inertial pose estimates to construct translation matrices and rotation-invariant matrices, achieving high-precision coordinate transformation and reducing errors in the coordinate system conversions of AR maps.
3. Methods
3.1. Overview of the Rotation-Invariant Estimation Approach
- Estimation of geographic north orientation: the true-north heading angle is calculated from RTK-GPS and visual-inertial fusion, and a true-north direction vector is constructed as the reference for the geographic coordinate system.
- Extraction of the gravity vector: the RANSAC algorithm extracts the gravity vector from the plane equation of the ground surface (see the RANSAC sketch after this list). This gravity vector serves as an initial constraint for the visual-inertial fusion method.
- Attitude construction and fusion: the initial attitude is constructed by combining the true-north direction vector and the gravity vector (see the attitude sketch after this list). This fuses the high-frequency, high-precision relative heading from the visual-inertial method with the stable, accurate global heading from RTK-GPS. A continuous rotation-invariant matrix is then constructed to reduce both the initialization error and the cumulative error in the visual-inertial method's rotation-matrix estimate.
- Coordinate system transformation: a translation matrix is constructed by fusing the high-precision RTK-GPS position with the visual-inertial pose estimate. Combined with the rotation-invariant matrix, this enables high-precision coordinate conversion and reduces errors in the complex coordinate transformations required for AR mapping.
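For the gravity-vector step above, a minimal RANSAC plane-fitting sketch is shown below; it fits a ground plane n·x + d = 0 to a 3D point cloud and takes the downward plane normal as the gravity direction. The iteration count, inlier threshold, and y-up world-frame assumption are illustrative choices, not parameters from this paper.

```python
import numpy as np

def ransac_ground_plane(points, iters=200, inlier_thresh=0.05, seed=0):
    """Fit a plane n.x + d = 0 to an Nx3 point array with RANSAC."""
    rng = np.random.default_rng(seed)
    best = (None, None, -1)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                      # skip degenerate (collinear) samples
            continue
        n /= norm
        d = -np.dot(n, p0)
        inliers = int(np.sum(np.abs(points @ n + d) < inlier_thresh))
        if inliers > best[2]:
            best = (n, d, inliers)
    return best[0], best[1]

def gravity_direction(plane_normal):
    """Downward unit normal of the ground plane (assumes a y-up AR frame)."""
    return -plane_normal if plane_normal[1] > 0 else plane_normal

pts = np.random.default_rng(1).normal(size=(500, 3)) * [5.0, 0.02, 5.0]  # near-flat cloud
n, d = ransac_ground_plane(pts)
print("gravity direction ≈", gravity_direction(n))
```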
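The attitude-construction step can likewise be sketched as assembling an orthonormal east-north-up basis from the true-north and gravity vectors and converting it to a quaternion. This is a hedged illustration under our own frame conventions, not the paper's exact initialization.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def initial_attitude(north, gravity):
    """Rotation whose columns are the east, north, and up directions
    expressed in the measurement (device/world) frame."""
    up = -np.asarray(gravity, float)
    up /= np.linalg.norm(up)
    n = np.asarray(north, float)
    n = n - np.dot(n, up) * up               # project north onto the horizontal plane
    n /= np.linalg.norm(n)
    east = np.cross(n, up)                   # completes a right-handed ENU basis
    return Rotation.from_matrix(np.column_stack([east, n, up]))

# Gravity along -y (y-up frame) and a roughly north-pointing vector.
att = initial_attitude(north=[0.1, 0.0, -0.99], gravity=[0.0, -9.81, 0.0])
print(att.as_quat())                         # quaternion as (x, y, z, w)
```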
- The AR world coordinate system locates the AR device's camera in the real world; each point in this system corresponds to a specific physical location. It is established by the camera and can represent arbitrary objects, with units typically in meters (m).
- The AR camera coordinate system has its origin at the camera's optical center. The z-axis coincides with the optical axis, pointing forward from the camera and perpendicular to the imaging plane; the positive x- and y-axes are parallel to the corresponding axes of the object coordinate system. Units are typically in meters (m).
- The AR geographic coordinate system maps the entire Earth, with the z-axis pointing toward the Earth's center and aligned with the gravity vector (a coordinate-transformation sketch follows this list).
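To make the transformations of Section 3.5 concrete, the sketch below composes a rotation-invariant rotation matrix and an RTK-GPS-derived translation into one homogeneous 4×4 transform mapping AR geographic coordinates into the AR world frame. The matrix layout and placeholder values are illustrative assumptions.

```python
import numpy as np

def make_transform(R, t):
    """Homogeneous 4x4 transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def geo_to_world(T_geo2world, p_geo):
    """Map a point from the AR geographic frame into the AR world frame."""
    p = np.append(np.asarray(p_geo, float), 1.0)
    return (T_geo2world @ p)[:3]

R = np.eye(3)                       # placeholder rotation-invariant matrix
t = np.array([12.0, 0.0, -7.5])     # placeholder translation of the geo origin (m)
print(geo_to_world(make_transform(R, t), [100.0, 0.0, 50.0]))
```

The world-to-camera step of Section 3.5.2 follows the same pattern, applying the inverse of the camera's pose transform.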
3.2. Calculation of True-North Direction Vector Based on RTK-GPS and Visual-Inertial Fusion
3.3. Extraction of Gravity Vector by Solving the Ground Surface Plane Equation Using the RANSAC Algorithm
3.4. Attitude Construction and Fusion
3.4.1. Initialization of Quaternion Attitude Based on True-North Vector and Gravity Vector
3.4.2. Continuous Rotation-Invariant Matrix Based on True-North Direction Vector and Gravity Vector
3.5. Coordinate Transformation Based on Rotation-Invariant Matrices
3.5.1. Transformation from AR Geographic Coordinate System to AR World Coordinate System
3.5.2. Transformation from the AR World Coordinate System to the AR Camera Coordinate System
4. Experiments
4.1. Experimental Platform
4.2. Testing of Positional Alignment Results in AR Geo-Registration
4.3. Testing of Surface Registration Results in AR Geo-Registration
4.4. Stability Testing of Motion Tracking in AR Geo-Registration
5. Conclusions
- The proposed method showed slightly higher errors at short range (within 10 m) than visual-inertial fusion methods, mainly because RTK-GPS accuracy is lower over small distances. A more flexible sensor-switching mechanism should therefore be designed to select the most suitable sensor combination for each distance range, so as to achieve optimal AR geo-registration results.
- In complex and varying outdoor terrain, high-precision, markerless pose tracking remains challenging. Recursive filtering may improve the final orientation accuracy of the rotation-invariant matrices and enable dynamic calibration during motion (a minimal filtering sketch follows this list), and combining filtering with smoother coordinate transformation techniques could yield smoother visualization in AR geo-registration. In addition, the proposed method assumes AR scenarios in which the gravity direction is perpendicular to the ground; future work will extend the method to settings where this assumption does not hold.
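As one hedged reading of the filtering suggestion above, a simple recursive (complementary) filter could blend the high-rate visual-inertial heading with the drift-free RTK-GPS heading; the gain and wrap-around handling below are our own illustrative choices, not this paper's design.

```python
def fuse_heading(vio_heading, rtk_heading, gain=0.02):
    """One complementary-filter step on headings in degrees: pull the
    high-rate VIO heading toward the drift-free RTK heading by `gain`,
    using the shortest signed angular difference to handle 0/360 wrap."""
    err = (rtk_heading - vio_heading + 180.0) % 360.0 - 180.0
    return (vio_heading + gain * err) % 360.0

heading = 359.0                                  # current fused heading (deg)
for vio_step, rtk in [(2.0, 3.0), (1.0, 4.5), (0.5, 5.0)]:
    heading = fuse_heading((heading + vio_step) % 360.0, rtk)
    print(round(heading, 2))
```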
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Cheng, Y.; Zhu, G.; Yang, C.; Miao, G.; Ge, W. Characteristics of augmented map research from a cartographic perspective. Cartogr. Geogr. Inf. Sci. 2022, 49, 426–442.
- Behzadan, A.H.; Kamat, V.R. Georeferenced Registration of Construction Graphics in Mobile Outdoor Augmented Reality. J. Comput. Civ. Eng. 2007, 21, 247–258.
- Ren, X.; Sun, M.; Jiang, C.; Liu, L.; Huang, W. An Augmented Reality Geo-Registration Method for Ground Target Localization from a Low-Cost UAV Platform. Sensors 2018, 18, 3739.
- Liu, D.; Chen, J.; Hu, D.; Zhang, Z. Dynamic BIM-augmented UAV safety inspection for water diversion project. Comput. Ind. 2019, 108, 163–177.
- Portalés, C.; Lerma, J.L.; Navarro, S. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments. ISPRS J. Photogramm. Remote Sens. 2010, 65, 134–142.
- Xiao, W.; Mills, J.; Guidi, G.; Rodríguez-Gonzálvez, P.; Gonizzi Barsanti, S.; González-Aguilera, D. Geoinformatics for the conservation and promotion of cultural heritage in support of the UN Sustainable Development Goals. ISPRS J. Photogramm. Remote Sens. 2018, 142, 389–406.
- Ma, X.; Sun, J.; Zhang, G.; Ma, M.; Gong, J. Enhanced Expression and Interaction of Paper Tourism Maps Based on Augmented Reality for Emergency Response. In Proceedings of the 2018 2nd International Conference on Big Data and Internet of Things—BDIOT 2018, Beijing, China, 24–26 October 2018; ACM Press: New York, NY, USA, 2018; pp. 105–109.
- Gazcón, N.F.; Trippel Nagel, J.M.; Bjerg, E.A.; Castro, S.M. Fieldwork in Geosciences assisted by ARGeo: A mobile Augmented Reality system. Comput. Geosci. 2018, 121, 30–38.
- Li, W.; Han, Y.; Liu, Y.; Zhu, C.; Ren, Y.; Wang, Y.; Chen, G. Real-time location-based rendering of urban underground pipelines. ISPRS Int. J. Geo-Inf. 2018, 7, 32.
- Suh, J.; Lee, S.; Choi, Y. UMineAR: Mobile-tablet-based abandoned mine hazard site investigation support system using augmented reality. Minerals 2017, 7, 198.
- Huang, K.; Wang, C.; Wang, S.; Liu, R.; Chen, G.; Li, X. An Efficient, Platform-Independent Map Rendering Framework for Mobile Augmented Reality. ISPRS Int. J. Geo-Inf. 2021, 10, 593.
- Li, P.; Qin, T.; Hu, B.; Zhu, F.; Shen, S. Monocular Visual-Inertial State Estimation for Mobile Augmented Reality. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nantes, France, 9–13 October 2017; pp. 11–21.
- Von Stumberg, L.; Usenko, V.; Cremers, D. Direct Sparse Visual-Inertial Odometry Using Dynamic Marginalization. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 2510–2517.
- Trimpe, S.; D'Andrea, R. Accelerometer-based tilt estimation of a rigid body with only rotational degrees of freedom. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 2630–2636.
- Zhang, Z.-Q.; Yang, G.-Z. Calibration of Miniature Inertial and Magnetic Sensor Units for Robust Attitude Estimation. IEEE Trans. Instrum. Meas. 2014, 63, 711–718.
- Tedaldi, D.; Pretto, A.; Menegatti, E. A robust and easy to implement method for IMU calibration without external equipments. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 3042–3049.
- Thong, Y.K.; Woolfson, M.S.; Crowe, J.A.; Hayes-Gill, B.R.; Challis, R.E. Dependence of inertial measurements of distance on accelerometer noise. Meas. Sci. Technol. 2002, 13, 1163–1172.
- Ryohei, H.; Michael, C. Outdoor Navigation System by AR. SHS Web Conf. 2021, 102, 04002.
- Wang, Y.J.; Gao, J.Q.; Li, M.H.; Shen, Y.; Hasanyan, D.; Li, J.F.; Viehland, D. A review on equivalent magnetic noise of magnetoelectric laminate sensors. Philos. Trans. R. Soc. Math. Phys. Eng. Sci. 2014, 372, 20120455.
- Morales, Y.; Tsubouchi, T. DGPS, RTK-GPS and StarFire DGPS Performance Under Tree Shading Environments. In Proceedings of the 2007 IEEE International Conference on Integration Technology, Shenzhen, China, 20–24 March 2007; pp. 519–524.
- Kim, M.G.; Park, J.K. Accuracy Evaluation of Internet RTK GPS by Satellite Signal Reception Environment. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2013, 31, 277–283.
- Burkard, S.; Fuchs-Kittowski, F. User-Aided Global Registration Method using Geospatial 3D Data for Large-Scale Mobile Outdoor Augmented Reality. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020; pp. 104–109.
- Randeniya, D.I.B.; Sarkar, S.; Gunaratne, M. Vision–IMU Integration Using a Slow-Frame-Rate Monocular Vision System in an Actual Roadway Setting. IEEE Trans. Intell. Transp. Syst. 2010, 11, 256–266.
- Suwandi, B.; Kitasuka, T.; Aritsugi, M. Low-cost IMU and GPS fusion strategy for apron vehicle positioning. In Proceedings of the TENCON 2017—2017 IEEE Region 10 Conference, Penang, Malaysia, 5–8 November 2017; pp. 449–454.
- Wang, S.; Deng, Z.; Yin, G. An Accurate GPS-IMU/DR Data Fusion Method for Driverless Car Based on a Set of Predictive Models and Grid Constraints. Sensors 2016, 16, 280.
- Mahdi, A.E.; Azouz, A.; Abdalla, A.; Abosekeen, A. IMU-Error Estimation and Cancellation Using ANFIS for Improved UAV Navigation. In Proceedings of the 2022 13th International Conference on Electrical Engineering (ICEENG), Cairo, Egypt, 29–31 March 2022; pp. 120–124.
- Huang, W.; Sun, M.; Li, S. A 3D GIS-based interactive registration mechanism for outdoor augmented reality system. Expert Syst. Appl. 2016, 55, 48–58.
- Qimin, X.; Bin, C.; Xu, L.; Xixiang, L.; Yuan, T. Vision-IMU Integrated Vehicle Pose Estimation based on Hybrid Multi-Feature Deep Neural Network and Federated Filter. In Proceedings of the 2021 28th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS), Saint Petersburg, Russia, 31 May–2 June 2021; pp. 1–5.
- Liu, R.; Zhang, J.; Chen, S.; Yang, T.; Arth, C. Accurate real-time visual SLAM combining building models and GPS for mobile robot. J. Real-Time Image Process. 2021, 18, 419–429.
- Toker, A.; Zhou, Q.; Maximov, M.; Leal-Taixé, L. Coming Down to Earth: Satellite-to-Street View Synthesis for Geo-Localization. In Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 20–25 June 2021; pp. 6484–6493.
- Mithun, N.C.; Minhas, K.S.; Chiu, H.-P.; Oskiper, T.; Sizintsev, M.; Samarasekera, S.; Kumar, R. Cross-View Visual Geo-Localization for Outdoor Augmented Reality. In Proceedings of the 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), Shanghai, China, 25–29 March 2023; pp. 493–502.
- Ventura, J.; Höllerer, T. Wide-area scene mapping for mobile visual tracking. In Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Atlanta, GA, USA, 5–8 November 2012; pp. 3–12.
- Qin, T.; Cao, S.; Pan, J.; Shen, S. A General Optimization-based Framework for Global Pose Estimation with Multiple Sensors. arXiv 2019, arXiv:1901.03642.
- Qu, X.; Soheilian, B.; Habets, E.; Paparoditis, N. Evaluation of SIFT and SURF for vision based localization. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B3, 685–692.
- Wan, G.; Yang, X.; Cai, R.; Li, H.; Zhou, Y.; Wang, H.; Song, S. Robust and Precise Vehicle Localization Based on Multi-Sensor Fusion in Diverse City Scenes. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 4670–4677.
- Hesch, J.A.; Kottas, D.G.; Bowman, S.L.; Roumeliotis, S.I. Consistency Analysis and Improvement of Vision-aided Inertial Navigation. IEEE Trans. Robot. 2014, 30, 158–176.
- Corke, P.; Lobo, J.; Dias, J. An Introduction to Inertial and Visual Sensing. Int. J. Robot. Res. 2007, 26, 519–535.
- Foxlin, E.; Naimark, L. VIS-Tracker: A wearable vision-inertial self-tracker. In Proceedings of the IEEE Virtual Reality 2003, Los Angeles, CA, USA, 22–26 March 2003; pp. 199–206.
- Schall, G.; Wagner, D.; Reitmayr, G.; Taichmann, E.; Wieser, M.; Schmalstieg, D.; Hofmann-Wellenhof, B. Global pose estimation using multi-sensor fusion for outdoor Augmented Reality. In Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality, Orlando, FL, USA, 19–22 October 2009; pp. 153–162.
- Waegel, K.; Brooks, F.P. Filling the gaps: Hybrid vision and inertial tracking. In Proceedings of the 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Adelaide, SA, Australia, 1–4 October 2013; pp. 1–4.
- Oskiper, T.; Samarasekera, S.; Kumar, R. Global Heading Estimation for Wide Area Augmented Reality Using Road Semantics for Geo-referencing. In Proceedings of the 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Bari, Italy, 4–8 October 2021; pp. 427–428.
- Hansen, L.H.; Fleck, P.; Stranner, M.; Schmalstieg, D.; Arth, C. Augmented Reality for Subsurface Utility Engineering, Revisited. IEEE Trans. Vis. Comput. Graph. 2021, 27, 4119–4128.
- Leick, A.; Rapoport, L.; Tatarnikov, D. Geodesy. In GPS Satellite Surveying; Wiley: Hoboken, NJ, USA, 2015; pp. 129–206. ISBN 978-1-119-01861-2.
- Chen, D.; Zhang, L.; Li, J.; Liu, R. Urban building roof segmentation from airborne lidar point clouds. Int. J. Remote Sens. 2012, 33, 6497–6515.
- Pujol, J. Hamilton, Rodrigues, Gauss, Quaternions, and Rotations: A Historical Reassessment. Commun. Math. Anal. 2012, 13, 1–14.
- Huang, W.; Wan, W.; Liu, H. Optimization-Based Online Initialization and Calibration of Monocular Visual-Inertial Odometry Considering Spatial-Temporal Constraints. Sensors 2021, 21, 2673.
- Lu, R.S.; Li, Y.F. A global calibration method for large-scale multi-sensor visual measurement systems. Sens. Actuators Phys. 2004, 116, 384–393.
- Han, T.; Zhou, G. Pseudo-spectrum-based multi-sensor multi-frame detection in mixed coordinates. Digit. Signal Process. 2023, 134, 103931.
- Thomas, C.M.; Featherstone, W.E. Validation of Vincenty's Formulas for the Geodesic Using a New Fourth-Order Extension of Kivioja's Formula. J. Surv. Eng. 2005, 131, 20–26.
- Nowak, E.; Nowak Da Costa, J. Theory, strict formula derivation and algorithm development for the computation of a geodesic polygon area. J. Geod. 2022, 96, 20.
- Huang, K.; Wang, C.; Liu, R.; Chen, G. A Fast and Accurate Spatial Target Snapping Method for 3D Scene Modeling and Mapping in Mobile Augmented Reality. ISPRS Int. J. Geo-Inf. 2022, 11, 69.
- SDK Downloads | ARCore. Available online: https://developers.google.com/ar/develop/downloads (accessed on 27 May 2023).
- Nowacki, P.; Woda, M. Capabilities of ARCore and ARKit Platforms for AR/VR Applications. In Proceedings of the Engineering in Dependability of Computer Systems and Networks, Brunów, Poland, 1–5 July 2019; Zamojski, W., Mazurkiewicz, J., Sugier, J., Walkowiak, T., Kacprzyk, J., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 358–370.
- Li, H.; Wang, J.; He, S.; Lee, C.-H. Nonlinear Optimal Impact-Angle-Constrained Guidance with Large Initial Heading Error. J. Guid. Control Dyn. 2021, 44, 1663–1676.
- Sukhareva, E.; Tomchinskaya, T.; Serov, I. SLAM-based Indoor Navigation in University Buildings. In Proceedings of the 31st International Conference on Computer Graphics and Vision, Nizhny Novgorod, Russia, 27–30 September 2021; Volume 2, pp. 611–617.
- Servières, M.; Renaudin, V.; Dupuis, A.; Antigny, N. Visual and Visual-Inertial SLAM: State of the Art, Classification, and Experimental Benchmarking. J. Sens. 2021, 2021, 2054828.
- Ménoret, V.; Vermeulen, P.; Le Moigne, N.; Bonvalot, S.; Bouyer, P.; Landragin, A.; Desruelle, B. Gravity measurements below 10⁻⁹ g with a transportable absolute quantum gravimeter. Sci. Rep. 2018, 8, 12300.
Parameters | Value |
---|---|
RTK-GPS device | HUACE H20 receiver equipped with the RTK kit |
RTK-GPS kit | RTK-GPS single-antenna kit [51] |
Smartphone device and OS | Mi 12 (the Android operating system) |
Dependencies of geo-registration API | C++ and OpenGL libraries, along with ARCore [52,53] |
The magnetic sensor kit | AKM kit (Asahi Kasei Microdevices) |
The walking speed during the experiment | Restricted to approximately 0.8–1.2 m per second to conform to typical walking motion patterns |
Top-mounted fixed receiver on the gimbal bracket | Fixed on top of a gimbal bracket at a height of 2 m, facing the same direction as the smartphone |
Parameters | Value |
---|---|
Map data layers | 6 vector layers, 3 scene layers, 1 navigation layer, 1 POI (Point of Interest) layer, and 1 text annotation layer |
Data source of 3D building data | National BIM Library |
Data source of the vector road data | OpenStreetMap |
Total dataset size | 3.7 GB |
AR Geo-Registration Method | Error at 10.0 m | Error at 50.0 m | Error at 100.0 m | Error at 150.0 m |
---|---|---|---|---|
Visual-inertial fusion method | 4.7 cm | 23.7 cm | 131 cm | 249 cm |
RTK-GPS+IMU | 27.1 cm | 33.0 cm | 44.9 cm | 52.1 cm |
Our method | 5.1 cm | 8.9 cm | 11.3 cm | 15.8 cm |
Method | Average Angle (deg) | Standard Deviation (deg) |
---|---|---|
RTK-GPS+IMU | 4.722 | 2.314 |
Our method | 1.901 | 1.106 |
Distance from the Origin (m) | RTK-GPS+IMU Error | Our Method Error |
---|---|---|
0.0 | 12.2 cm | 3.1 cm |
50.0 | 14.1 cm | 2.7 cm |
100.0 | 16.7 cm | 3.6 cm |
−50.0 | 14.6 cm | 2.5 cm |