TY - JOUR
T1 - Vision-based framework to estimate robot configuration and kinematic constraints
AU - Ortenzi, Valerio
AU - Marturi, Naresh
AU - Mistry, Michael
AU - Kuo, Jeffrey A.
AU - Stolkin, Rustam
PY - 2018/8/17
Y1 - 2018/8/17
AB - This paper addresses the problem of estimating the configuration of robots that lack proprioceptive sensors and are subject to kinematic constraints while performing tasks. Our work is motivated by the use of unsensored (industrial) manipulators, currently tele-operated in rudimentary ways, in hazardous environments such as nuclear decommissioning. For such robots, basic proprioceptive sensors are often unavailable. Even if radiation-hardened sensors could be retrofitted, such manipulators are typically deployed on a mobile base and equipped with powerful end-effector tools for forceful contact tasks, which significantly perturb the robot base with respect to the scene. This work contributes a step towards advanced control and increased autonomy in nuclear applications, but could also be applied to mechanically compliant, under-actuated arms and hands, and soft manipulators. Our proposed framework estimates the robot configuration by casting it as an optimisation problem over visually tracked information, detects contacts during task execution, and triggers an exploration task for detected kinematic constraints, which are then modelled by comparing observed and commanded velocity vectors. Unlike previous approaches, our method requires no additional sensors. We demonstrate the framework on a KUKA iiwa 14 R820, reliably estimating and controlling robot motions, validating our estimates against ground-truth values, and accurately reconstructing kinematic constraints.
KW - robots
KW - robot kinematics
KW - robot vision systems
DO - 10.1109/TMECH.2018.2865758
M3 - Article
SN - 1083-4435
JO - IEEE/ASME Transactions on Mechatronics
JF - IEEE/ASME Transactions on Mechatronics
ER -