Human-robot interaction for object manipulation

kuka_leap

Project Coordinator: Dr Riccardo Secoli

Research Student: Alexander Koenig

Robots are often controlled with unintuitive devices such as handheld controllers, or through computer programs that follow predefined motion paths. In this project, we interface the Leap Motion gesture-tracking device with a KUKA IIWA robot and a ReFlex TakkTile robotic hand. This setup allows for more natural control of the robotic rig through simple hand gestures: the robotic arm executes all relative motions of the operator's palm, while the robotic hand carries out the finger movements.
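The core mapping can be sketched as follows: the arm tracks the palm's displacement relative to its starting point, and the hand tracks a normalised finger-flexion signal. This is a minimal, self-contained sketch for illustration only; function names, units, and the exact message flow are assumptions, and the real system exchanges these values over ROS topics between the Leap Motion driver and the robot drivers.

```python
# Sketch of the palm-to-arm and finger-to-hand mapping described above.
# Pure Python for illustration; names and conventions are hypothetical.

def arm_target(home_pose, palm_start, palm_now, scale=1.0):
    """Map the palm's motion relative to its starting position onto the
    arm's end-effector position (relative, 'clutched' control).

    All positions are (x, y, z) tuples in the same frame and units.
    """
    return tuple(h + scale * (p - p0)
                 for h, p0, p in zip(home_pose, palm_start, palm_now))

def hand_closure(finger_flexion, flex_min=0.0, flex_max=1.0):
    """Map a finger-flexion reading onto a [0, 1] closure command for
    the robotic hand, clamped to valid bounds."""
    value = (finger_flexion - flex_min) / (flex_max - flex_min)
    return min(1.0, max(0.0, value))
```

For example, if the palm moves one unit along x from where tracking started, the commanded end-effector target shifts by the same amount (times `scale`) from the robot's home pose, while out-of-range flexion readings saturate at fully open or fully closed.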

Furthermore, the software platform supports teleoperation in virtual reality: the operator views the workspace through a virtual-reality headset, as recorded by a camera near the robot. The system can easily be configured for different workspaces and works with both KUKA IIWA robot models. The platform is built on the modular Robot Operating System (ROS), which makes it easy to use and extend.
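Because the platform is built on ROS, switching robot models or workspaces typically comes down to launch-file configuration rather than code changes. The snippet below is purely illustrative of that pattern: every package, node, and parameter name here is hypothetical and not taken from the project's actual configuration.

```xml
<!-- Hypothetical launch file; all names are illustrative placeholders. -->
<launch>
  <!-- Select the robot model; the platform supports both IIWA variants. -->
  <arg name="robot_model" default="iiwa7" />

  <!-- Driver node publishing Leap Motion hand-tracking data. -->
  <node pkg="leap_driver_pkg" type="leap_driver_node" name="leap_driver" />

  <!-- Teleoperation node mapping palm motion to arm commands. -->
  <node pkg="kuka_leap" type="teleop_node" name="teleop">
    <param name="robot_model" value="$(arg robot_model)" />
  </node>
</launch>
```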

The goal of this project is to use this intuitive interface to study human grasp strategies. We want to transfer the extensive grasping experience that humans gain throughout their lives to the robot in the form of autonomous algorithms. Ultimately, this could benefit industrial robotics, where assembly-line robots could perform versatile grasping tasks. In the medical robotics domain, we can imagine scenarios in which autonomous robots assist surgeons by handing over or safely stowing away surgical instruments.

Publications: (in review)

[VIDEO DEMO] [CODE]