Journal Articles — 2 results found
Shape Sensing for Single-Port Continuum Surgical Robot Using Few Multicore Fiber Bragg Grating Sensors
1
Authors: 黎定佳, 王重阳, 郭伟, 王志东, 张忠涛, 刘浩. 《Journal of Shanghai Jiaotong University (Science)》, EI, 2023, Issue 3, pp. 312-322, 11 pages.
We proposed a method for shape sensing using a few multicore fiber Bragg grating (FBG) sensors in a single-port continuum surgical robot (CSR). The traditional method of utilizing a forward kinematic model to calculate the shape of a single-port CSR is limited by the accuracy of the model. If FBG sensors are used for shape sensing, their accuracy will be affected by their number, especially in long and flexible CSRs. A fusion method based on an extended Kalman filter (EKF) was proposed to solve this problem. Shape reconstruction was performed using the CSR forward kinematic model and FBG sensors, and the two results were fused using an EKF. The CSR reconstruction method adopted the incremental form of the forward kinematic model, while the FBG sensor method adopted the discrete arc-segment assumption method. The fusion method can eliminate the inaccuracy of the kinematic model and obtain more accurate shape reconstruction results using only a small number of FBG sensors. We validated our algorithm through experiments on multiple bending shapes under different load conditions. The results show that our method significantly outperformed the traditional methods in terms of robustness and effectiveness.
Keywords: single-port continuum surgical robot, multicore fiber Bragg grating (FBG), forward kinematic model, extended Kalman filter (EKF), shape reconstruction
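The fusion idea described in this abstract can be illustrated with a minimal sketch: a per-segment curvature state predicted by the kinematic model is corrected by a small number of FBG readings through a Kalman update. The segment count, sensor locations, noise covariances, and helper names below are illustrative assumptions, not the paper's actual formulation; with the linear measurement model assumed here the EKF update reduces to a plain Kalman update.

```python
# Minimal sketch of fusing kinematic-model shape with a few FBG curvature
# readings via an (extended) Kalman filter. Segment count, sensor locations,
# and covariances are illustrative assumptions.
import numpy as np

N = 20                       # hypothetical number of arc segments along the CSR
Q = np.eye(N) * 1e-4         # process noise: uncertainty of the kinematic model
R = np.eye(3) * 1e-6         # measurement noise of the 3 (few) FBG sensors
H = np.zeros((3, N))         # each FBG sensor observes one segment's curvature
for i, seg in enumerate([4, 10, 16]):   # assumed sensor positions on the backbone
    H[i, seg] = 1.0

def ekf_fuse(kappa_pred, P_pred, z_fbg):
    """One filter update: fuse model-predicted curvatures with FBG readings."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
    kappa_est = kappa_pred + K @ (z_fbg - H @ kappa_pred)
    P_est = (np.eye(N) - K @ H) @ P_pred
    return kappa_est, P_est

# Toy usage: the model predicts uniform curvature, the sensors see more bend.
kappa_model = np.full(N, 0.05)                    # curvature from forward kinematics
P = np.eye(N) * 1e-3
z = np.array([0.055, 0.060, 0.052])               # FBG-derived curvatures
kappa_fused, P = ekf_fuse(kappa_model, P, z)
```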
ArUco-Assisted Autonomous Localization Method for Wall-Climbing Robots
2
Authors: 张文, 杨耀鑫, 黄天帜, 孙振国. 《机器人》(Robot), EI, CSCD, Peking University Core Journal, 2024, Issue 1, pp. 27-35, 44, 10 pages.
To address the shortcomings of existing wall-climbing robot localization techniques in special environments with weak texture features, relatively enclosed spaces, and strong magnetic interference, a new localization scheme is proposed in which an onboard fisheye camera observes ArUco markers fixed on the ground, and on this basis an autonomous localization method, A-IEF, is implemented by fusing an inertial measurement unit (IMU), wheel encoders, and the fisheye camera. The method first detects the ArUco markers and selects keyframes according to their positions in the fisheye image. Then, the reprojection behavior of the corners of the ground-fixed ArUco markers in the fisheye image is studied, and relocalization is optimized together with the robot attitude constraints. Next, within each keyframe interval, the Jacobian matrix of the corner reprojection error with respect to the increments of the robot position and attitude is derived. A multi-information fusion method based on an error-state extended Kalman filter (ES-EKF) is then designed, taking the encoder-estimated displacement error and the ArUco corner reprojection error as observations to correct the robot heading angle and position. Finally, tests were conducted on large steel structures. The experimental results show that the proposed method achieves higher localization accuracy, with the position estimation error kept within 0.06 m and the heading angle estimation error within 3.7°; compared with localization algorithms such as ArUco-rectified and dead reckoning, the position error is reduced by about 47% and the heading angle error by about 68%, and localization in low-light environments is also achieved.
Keywords: multi-sensor fusion, error-state extended Kalman filter (ES-EKF), wall-climbing robot, pose estimation, ArUco marker
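The correction loop described in this abstract can likewise be illustrated with a minimal planar sketch: encoder/IMU dead reckoning propagates a nominal pose while an error-state Kalman update folds an ArUco-derived observation back into it. For brevity the sketch assumes a direct pose measurement from the marker, whereas the paper observes corner reprojection errors through a fisheye model with its own Jacobian; all matrices, noise values, and function names here are illustrative assumptions.

```python
# Minimal error-state EKF (ES-EKF) sketch for a planar robot pose [x, y, yaw].
# Dead reckoning propagates the nominal state; an ArUco-derived pose
# measurement corrects it. All values are illustrative assumptions.
import numpy as np

def wrap(a):
    """Keep an angle within (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def propagate(x_nom, P, d, dyaw, Q):
    """Dead-reckoning step: advance the nominal pose, grow the error covariance."""
    x, y, yaw = x_nom
    x_nom = np.array([x + d * np.cos(yaw), y + d * np.sin(yaw), wrap(yaw + dyaw)])
    F = np.array([[1.0, 0.0, -d * np.sin(yaw)],
                  [0.0, 1.0,  d * np.cos(yaw)],
                  [0.0, 0.0,  1.0]])          # motion Jacobian w.r.t. the error state
    return x_nom, F @ P @ F.T + Q

def correct(x_nom, P, z, R):
    """ES-EKF update: estimate the error state and fold it into the nominal pose."""
    H = np.eye(3)                              # simplified: direct pose observation
    r = z - x_nom                              # innovation
    r[2] = wrap(r[2])
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    dx = K @ r                                 # estimated error state
    x_nom = x_nom + dx
    x_nom[2] = wrap(x_nom[2])
    return x_nom, (np.eye(3) - K @ H) @ P

# Toy usage: drift accumulates over encoder steps, then an ArUco fix corrects it.
x, P = np.zeros(3), np.eye(3) * 1e-4
Q, R = np.diag([1e-4, 1e-4, 1e-5]), np.diag([1e-3, 1e-3, 1e-4])
for _ in range(50):
    x, P = propagate(x, P, d=0.01, dyaw=0.002, Q=Q)
x, P = correct(x, P, z=np.array([0.49, 0.03, 0.09]), R=R)
```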