Abstract
High-precision localization is an essential guarantee for the Chang'E-3 lunar rover to carry out scientific exploration tasks on the lunar surface. This paper proposes a stereo image strip network localization (SISNL) method suited to the lunar environment, where no high-precision control points are available. First, accurate camera calibration parameters are obtained in a ground control field by control-point-based bundle adjustment, from which the photographic baseline and relative orientation elements of the stereo cameras are computed. Second, a stereoscopic model is built by image-point matching and forward intersection, and a stereo image strip network is then constructed from the sequences of connection points between different stations. Based on the weight matrix of the connection-point coordinate observations and the extended orthogonal Procrustes theory, a least squares adjustment of the strip network's mathematical model directly yields the position and attitude of the lunar rover. Finally, the new method was compared against the traditional bundle adjustment method in an indoor test field; the results show that the two methods achieve comparable rover localization accuracy, while the new method is superior in stability and computational efficiency. On this basis, the method was applied to the teleoperation system of the Chang'E-3 lunar rover and fully met its localization requirements.
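The pose-recovery step named in the abstract, orthogonal Procrustes alignment, can be illustrated with a short sketch. The fragment below is a minimal, hypothetical example (not the authors' implementation): it recovers the rotation R and translation t between two rover stations from corresponding 3D connection points, using the standard SVD-based (Kabsch) Procrustes solution with optional per-point weights standing in for the paper's weight matrix of coordinate observations. The paper's actual method extends this to a whole strip network in a least squares adjustment, which is not reproduced here.

```python
# Minimal weighted orthogonal Procrustes sketch (assumed illustration,
# not the paper's SISNL implementation): estimate the rigid transform
# between two stations from corresponding 3D connection points.
import numpy as np

def procrustes_pose(src, dst, weights=None):
    """Estimate R, t such that dst ~= R @ src + t in the least squares sense.

    src, dst : (N, 3) arrays of corresponding connection points
    weights  : optional (N,) positive weights for the points
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    w = np.ones(len(src)) if weights is None else np.asarray(weights, dtype=float)
    w = w / w.sum()

    # Weighted centroids of both point sets
    c_src = w @ src
    c_dst = w @ dst

    # Weighted cross-covariance matrix of the centered points
    H = (src - c_src).T @ np.diag(w) @ (dst - c_dst)

    # SVD-based orthogonal Procrustes (Kabsch) solution for the rotation
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    t = c_dst - R @ c_src
    return R, t

# Hypothetical usage: points triangulated (e.g. by forward intersection)
# and expressed in the frames of two adjacent stations.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts_a = rng.uniform(-5, 5, size=(8, 3))           # station-A frame
    R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
    t_true = np.array([2.0, 0.5, -0.1])
    pts_b = pts_a @ R_true.T + t_true                  # station-B frame
    R_est, t_est = procrustes_pose(pts_a, pts_b)
    print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

Weighting the points lets better-measured connection points dominate the estimate, which mirrors the role the abstract assigns to the weight matrix of the coordinate observations in the network adjustment.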
Source
《科学通报》 (Chinese Science Bulletin)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
2015, No. 4, pp. 372-378 (7 pages)
Funding
Supported by the National Natural Science Foundation of China (Grants 41071298, 41401535)