Model Driven Robotic Assistance for Human-Robot Collaboration

Embargo until: 2014-12-01
Date: 2013-10-18
Publisher: Johns Hopkins University
Abstract
While robots routinely perform complex assembly tasks in highly structured factory environments, it remains challenging to apply completely autonomous robotic systems to less structured manipulation tasks, such as surgery and machine assembly/repair, due to limitations in machine intelligence, sensor-data interpretation, and environment modeling. A practical yet effective way to accomplish these tasks is through human-robot collaboration, in which the human operator and the robot form a partnership and complement each other in performing a complex task. We recognize that humans excel at determining task goals and recognizing constraints, provided they receive sufficient feedback about the interaction between the tool (e.g., the robot's end-effector) and the environment. Robots, in turn, are precise, unaffected by fatigue, and able to work in environments unsuitable for humans. We hypothesize that by providing the operator with adequate information about the task, through visual and force (haptic) feedback, the operator can: (1) define the task model, in terms of task goals and virtual fixture constraints, through an interactive or immersive augmented-reality interface, and (2) have the robot actively assist the operator to improve the execution time, quality, and precision of the task. We validate our approaches through implementations of both cooperative (i.e., hands-on) control and telerobotic systems: image-guided robotic neurosurgery, and telerobotic manipulation for satellite servicing under significant time delay.
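The abstract mentions virtual fixture constraints as part of the task model. As a minimal illustrative sketch (not the thesis's actual implementation), a guidance virtual fixture can be expressed as an anisotropic admittance that passes motion along a preferred direction while attenuating off-axis motion; the function name `guidance_vf` and the admittance ratio `c` below are assumptions for illustration.

```python
import numpy as np

def guidance_vf(v_cmd, direction, c=0.1):
    """Guidance virtual fixture: attenuate motion orthogonal to a
    preferred direction.

    v_cmd:     commanded tool velocity (3-vector, e.g. from the operator)
    direction: preferred motion direction (need not be unit length)
    c:         off-axis admittance ratio; 0 gives a hard constraint,
               1 disables the fixture entirely
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)           # normalize the fixture direction
    v = np.asarray(v_cmd, dtype=float)
    v_along = np.dot(v, d) * d          # component along the fixture
    v_ortho = v - v_along               # component the fixture resists
    return v_along + c * v_ortho

# Example: fixture along x; the operator pushes diagonally, but with a
# hard constraint (c=0) only the x component of motion survives.
print(guidance_vf([1.0, 1.0, 0.0], [1.0, 0.0, 0.0], c=0.0))  # -> [1. 0. 0.]
```

With an intermediate `c` the operator can still deviate from the fixture, just with reduced gain, which is one common way to trade guidance against operator authority.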
Keywords: robotics, human-machine collaboration system