TY - CONF
T1 - Vision-guided state estimation and control of robotic manipulators which lack proprioceptive sensors
AU - Ortenzi, Valerio
AU - Marturi, Naresh
AU - Stolkin, Rustam
AU - Kuo, Jeffrey A.
AU - Mistry, Michael
PY - 2016/12/1
AB - This paper presents a vision-based approach for estimating the configuration of, and providing control signals for, an under-sensored robot manipulator using a single monocular camera. Some remote manipulators, used for decommissioning tasks in the nuclear industry, lack proprioceptive sensors because electronics are vulnerable to radiation. Additionally, even if proprioceptive joint sensors could be retrofitted, such heavy-duty manipulators are often deployed on mobile vehicle platforms, which are significantly and erratically perturbed when powerful hydraulic drilling or cutting tools are deployed at the end-effector. In these scenarios, it would be beneficial to use external sensory information, e.g. vision, for estimating the robot configuration with respect to the scene or task. Conventional visual servoing methods typically rely on joint encoder values for controlling the robot. In contrast, our framework assumes that no joint encoders are available, and estimates the robot configuration by visually tracking several parts of the robot, and then enforcing equality between a set of transformation matrices which relate the frames of the camera, world and tracked robot parts. To accomplish this, we propose two alternative methods based on optimisation. We evaluate the performance of our framework by visually tracking the pose of a conventional robot arm, where the joint encoders are used to provide ground truth for evaluating the precision of the vision system. Additionally, we evaluate the precision with which visual feedback can be used to control the robot's end-effector to follow a desired trajectory.
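N1 - Editor's note: a minimal sketch, not the authors' formulation, of the transform-consistency idea described in the abstract. Assuming a known kinematic model giving each tracked part's pose $T^{base}_{i}(q)$ as a function of the joint configuration $q$, a camera-to-base transform $T^{cam}_{base}$, and visually tracked part poses $\hat{T}^{cam}_{i}$, the configuration can be estimated by enforcing consistency of these transforms in a least-squares sense: $\hat{q} = \arg\min_{q} \sum_{i} \big\| \log\big( (\hat{T}^{cam}_{i})^{-1} \, T^{cam}_{base} \, T^{base}_{i}(q) \big)^{\vee} \big\|^{2}$, where $\log(\cdot)^{\vee}$ maps the $SE(3)$ logarithm to a residual in $\mathbb{R}^{6}$. All symbols here are illustrative assumptions, not the paper's notation.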
KW - Cameras
KW - Robot vision systems
KW - Visualization
KW - Manipulators
UR - http://www.scopus.com/inward/record.url?scp=85006399800&partnerID=8YFLogxK
DO - 10.1109/IROS.2016.7759525
M3 - Conference contribution
AN - SCOPUS:85006399800
SN - 978-1-5090-3763-6 (PoD)
T3 - IEEE International Conference on Intelligent Robots and Systems. Proceedings
SP - 3567
EP - 3574
BT - 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
PB - Institute of Electrical and Electronics Engineers (IEEE)
T2 - 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2016
Y2 - 9 October 2016 through 14 October 2016
ER -