Hand Gesture

Overview

  • OpenCV is a great tool for image processing and performing computer vision tasks. It is an open-source library that can be used to perform tasks like face detection, object tracking, landmark detection, and much more.
  • ROS, short for Robot Operating System, is a set of software libraries and tools to help you build robot applications. The point of ROS is to create a robotics standard, so you don’t need to reinvent the wheel anymore when building new robotic software.
  • Hand gestures can help people who have difficulty controlling or operating systems and devices.
    Gesturing is a natural and intuitive way to interact with people and the environment, so it makes perfect sense to use hand gestures as a method of human-computer interaction.

Aim

This project deals with the implementation of OpenCV and ROS. It involves detecting hand gestures and publishing different velocities to the TurtleBot3 in the Gazebo environment.

Applications

Hand gesture detection can be used in various industries as advances in computer vision, sensors, machine learning, and deep learning have made it more available and accurate.
Hand gesture detection finds its use in:
1. Virtual environment control
2. Remote control of robots
3. Sign language translation

Working in Brief

Detection of Hand Gestures

First, we isolate the hand from the background. For this we separate the skin color from the input frame using HSV (or another color format) with upper and lower bounds for the skin-color range. We then refine the image by removing noise with morphological transformations and by applying filters to smooth it. A morphological transformation works on a kernel (a small matrix) and has two basic forms: erosion and dilation. In erosion, if any black element falls under the kernel, the entire region under the kernel turns black, whereas in dilation, if even a single white element comes under the kernel, the entire region under the kernel turns white. We then find the contour of the mask and its convex hull, and use convexity defects along with contour properties such as solidity, extent, and aspect ratio of the respective gesture. In particular, we calculate the ratio of the convex hull to the contour of the hand gesture to differentiate numbers that produce the same number of convexity defects. A minimal sketch of this pipeline is given below.

For Controlling the Robot

Initial trials were carried out on the TurtleBot under a ROS package. We used the ROS topic /cmd_vel to publish linear and angular velocity messages to the TurtleBot in the Gazebo simulation, according to the motion we want for each hand gesture. A sketch of the corresponding publisher node follows the detection sketch below.
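The following is a minimal sketch of the detection pipeline described above, assuming OpenCV 4. The HSV skin-color bounds, kernel size, and finger-counting rule are illustrative assumptions, not the project's exact values, and would need tuning for a real camera and lighting setup:

import cv2
import numpy as np

def skin_mask(frame):
    """Return a binary mask of skin-colored pixels in a BGR frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Assumed lower/upper HSV bounds for skin color; tune per camera and lighting.
    lower = np.array([0, 30, 60], dtype=np.uint8)
    upper = np.array([20, 150, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological transformations: erosion removes speckle noise,
    # dilation restores the eroded hand region.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.erode(mask, kernel, iterations=2)
    mask = cv2.dilate(mask, kernel, iterations=2)
    # Gaussian blur smooths jagged mask edges before contour extraction.
    return cv2.GaussianBlur(mask, (5, 5), 0)

def classify_gesture(mask):
    """Return (finger_count, solidity) for the largest contour in the mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0, 0.0
    cnt = max(contours, key=cv2.contourArea)
    # Solidity = contour area / convex hull area, used to tell apart gestures
    # that produce the same number of convexity defects.
    hull_points = cv2.convexHull(cnt)
    solidity = cv2.contourArea(cnt) / max(cv2.contourArea(hull_points), 1e-5)
    # Convexity defects are computed from hull indices, not hull points.
    hull_indices = cv2.convexHull(cnt, returnPoints=False)
    defects = cv2.convexityDefects(cnt, hull_indices)
    fingers = 0
    if defects is not None:
        for s, e, f, _ in defects[:, 0]:
            start, end, far = cnt[s][0], cnt[e][0], cnt[f][0]
            a = np.linalg.norm(end - start)
            b = np.linalg.norm(far - start)
            c = np.linalg.norm(end - far)
            if b * c == 0:
                continue
            # A valley between two extended fingers forms an angle below 90 degrees.
            angle = np.arccos((b ** 2 + c ** 2 - a ** 2) / (2 * b * c))
            if angle < np.pi / 2:
                fingers += 1
    # n small-angle defects usually means n + 1 extended fingers; zero defects can
    # mean either a fist or a single finger, which is where solidity (higher for a
    # closed fist) is used to disambiguate, as described above.
    count = fingers + 1 if fingers else 0
    return count, solidity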
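Below is a similar minimal sketch of the ROS side, assuming ROS 1 (rospy). The /cmd_vel topic matches the description above, while the node name, publish rate, and the finger-count-to-velocity mapping are hypothetical choices for illustration:

import rospy
from geometry_msgs.msg import Twist

# Hypothetical mapping from detected finger count to (linear x, angular z).
GESTURE_TO_VELOCITY = {
    0: (0.0, 0.0),    # fist: stop
    1: (0.2, 0.0),    # one finger: move forward
    2: (0.0, 0.5),    # two fingers: turn left
    3: (0.0, -0.5),   # three fingers: turn right
    5: (-0.2, 0.0),   # open palm: move backward
}

def publish_gesture_velocity(pub, fingers):
    """Publish the Twist on /cmd_vel corresponding to the detected gesture."""
    linear, angular = GESTURE_TO_VELOCITY.get(fingers, (0.0, 0.0))
    twist = Twist()
    twist.linear.x = linear
    twist.angular.z = angular
    pub.publish(twist)

if __name__ == "__main__":
    rospy.init_node("gesture_teleop")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        fingers = 0  # placeholder: in practice this comes from the OpenCV pipeline above
        publish_gesture_velocity(pub, fingers)
        rate.sleep()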

Results

  • Hand gesture Recognition
  • Hand gesture controlled robot

Future Prospects

  • There are various fields like automobiles, healthcare, virtual reality, and especially consumer electronics where non-touch-based systems and devices are under development.
  • While gesture recognition may reduce the need for handheld devices, another avenue could lead to a vast infusion of specialized input devices. In the future, there may be custom devices for virtually every type of activity in a virtual environment.

Extensions

  • The robot can be controlled with both hands to provide more flexibility and control over the robot motion.
  • Obstacle avoidance algorithms can be implemented to avoid crashes and make the robot's motion smoother and safer.
  • Other features like face detection and motion detection can be used along with hand gestures to further improve control and develop a non-touch-based control system.

Tools and Libraries used

Python, OpenCV, ROS, Gazebo

Team

Members 
Yuvraj Borade, Joel Vijo, Sahil Dharme, Sakshi Giri, Deepshikhar Gupta, Aniket Khare
Mentors
Siddharth Singh, Nachiket, Ayush Singh, Amit Sawant, Himanshu, Pratik

GitHub Repository