Hand Gesture Controlled TurtleBot: Baton

Overview
This project covers the development of a mechatronic system, named Baton, that demonstrates dynamic control of a TurtleBot using hand gesture recognition.
The system recognizes hand signals and gestures through an RGB camera or webcam and translates them into data the TurtleBot can interpret. This data is processed and sent to the TurtleBot in real time, giving the operator control over its movement, including its velocity.
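As a rough illustration of the recognition stage, the sketch below captures webcam frames and extracts hand landmarks. The use of OpenCV and MediaPipe here, and the camera index, are assumptions made for illustration only, not confirmed details of Baton's implementation.

    # Minimal sketch of webcam hand-landmark capture.
    # MediaPipe Hands and camera index 0 are illustrative assumptions.
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands

    cap = cv2.VideoCapture(0)  # default webcam
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV delivers BGR frames.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                landmarks = results.multi_hand_landmarks[0].landmark
                # A downstream classifier would turn these 21 landmarks
                # into a gesture label (e.g. "forward", "left").
    cap.release()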
The system recognizes approximately seven gestures, controlling the TurtleBot's forward, backward, left, and right movement as well as its acceleration and deceleration.
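The sketch below shows one way recognized gesture labels could be mapped to TurtleBot velocity commands published over ROS. The /cmd_vel topic name, the gesture label strings, and the speed values are assumptions chosen for illustration; the document does not specify them.

    # Sketch: mapping a recognized gesture label to a geometry_msgs/Twist command.
    # Gesture names, speeds, and the /cmd_vel topic are assumed for illustration.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('baton_gesture_controller')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)

    linear_speed = 0.2  # m/s; raised or lowered by the acceleration/deceleration gestures

    def gesture_to_twist(gesture):
        """Translate a recognized gesture label into a velocity command."""
        global linear_speed
        if gesture == 'accelerate':
            linear_speed += 0.05
        elif gesture == 'decelerate':
            linear_speed = max(linear_speed - 0.05, 0.0)
        cmd = Twist()
        if gesture == 'forward':
            cmd.linear.x = linear_speed
        elif gesture == 'backward':
            cmd.linear.x = -linear_speed
        elif gesture == 'left':
            cmd.angular.z = 0.5   # rad/s, rotate left
        elif gesture == 'right':
            cmd.angular.z = -0.5  # rad/s, rotate right
        return cmd

    # Example: publish a 'forward' command at 10 Hz until shutdown.
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        pub.publish(gesture_to_twist('forward'))
        rate.sleep()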
Approach
The software environment consists of Python, running on Ubuntu 18.04 and 20.04 with the open-source ROS Noetic framework. System modelling and parts of the testing stage were carried out using Gazebo, RViz, TurtleSim, and a GMapping (SLAM) environment, the latter fulfilling the objective of mapping interior spaces while the robot is mobile.
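Because TurtleSim accepts the same geometry_msgs/Twist messages on its /turtle1/cmd_vel topic, the velocity-command pipeline can be smoke-tested in simulation before driving the real robot. The following minimal sketch assumes ROS Noetic with the turtlesim package installed; the specific values are placeholders.

    # Sketch: exercising the velocity-command pipeline against TurtleSim.
    # Start the simulator first, e.g. `rosrun turtlesim turtlesim_node`.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('baton_sim_smoke_test')
    pub = rospy.Publisher('/turtle1/cmd_vel', Twist, queue_size=10)
    rospy.sleep(1.0)  # give the publisher time to connect

    cmd = Twist()
    cmd.linear.x = 1.0   # drive forward
    cmd.angular.z = 0.5  # while turning gently
    rate = rospy.Rate(10)
    for _ in range(50):  # roughly five seconds of motion
        pub.publish(cmd)
        rate.sleep()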
The hardware environment includes a computer that runs the program, a webcam that captures the hand gestures, and a TurtleBot that carries out the commanded movement, fitted with a mounted LIDAR and a Pi Cam.

The University of Sydney
Vishant Prasad