2021_CTR_UR5_EPN

HGR and Tracking Control for a Virtual UR5 Robot

Abstract: In this project, we developed a human-machine interface (HMI) that combines an EMG-based hand gesture recognition (HGR) system, which recognizes six gestures (waveIn, waveOut, fist, open, pinch, and the relaxed state), with the orientation data provided by the IMU of the Myo Armband sensor. These signals are used as control commands and reference inputs for controllers that govern the position and orientation of a virtual six-degree-of-freedom (DoF) UR5 robotic manipulator. The robot operates in a scene built in the open-source robotic simulation software CoppeliaSim, where it paints a set of walls and surfaces with different positions and orientations to demonstrate the usefulness of the proposed HMI. The controllers designed to achieve the control objectives are a PID controller based on the minimal-norm solution and a controller based on linear algebra.
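As a rough illustration of the minimal-norm tracking idea mentioned in the abstract, the sketch below shows one kinematic control step that computes the minimum-norm joint velocities from the robot Jacobian. This is not the repository's code: the function name, the gesture-to-command dictionary, the gains, and the use of NumPy's pseudoinverse are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical mapping from recognized gestures to high-level commands
# (names chosen for illustration only, not taken from the repository).
GESTURE_COMMANDS = {
    "waveIn": "previous_target",
    "waveOut": "next_target",
    "fist": "start_painting",
    "open": "stop_painting",
    "pinch": "toggle_orientation_mode",
    "relax": "hold_position",
}

def minimal_norm_step(jacobian, pose_error, desired_twist, kp, q, dt):
    """One kinematic control step: minimal-norm joint velocities that track
    the desired end-effector twist plus a proportional correction of the
    pose error (illustrative sketch of a pseudoinverse-based law)."""
    # q_dot = J^+ (x_dot_d + Kp * e), with J^+ the Moore-Penrose pseudoinverse
    q_dot = np.linalg.pinv(jacobian) @ (desired_twist + kp @ pose_error)
    return q + q_dot * dt  # simple Euler integration of the joint positions
```

In the actual HMI, the reference twist and pose error would come from the Myo Armband's IMU and the CoppeliaSim scene, and the integral and derivative terms of the PID controller would augment the proportional correction shown here.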

Demonstration video:

Source code:

https://github.com/laboratorioAI/HGR_TRACKING_CONTROL_UR5_ROBOT

ADDRESS

  • Ladrón de Guevara E11-253, Quito – Ecuador
  • “José Rubén Orellana” Polytechnic Campus
    Faculty of Systems Engineering
    Fourth floor
