Aim:
To explore an unknown area using a mobile robot, avoiding all obstacles and simultaneously generating a 2D map of the area with the help of a LiDAR sensor.
Working Principle:
The algorithm depends mainly on the LiDAR data obtained from the YDLIDAR. The sensor mounted on the robot detects obstacles in the front, left and right directions, and based on this information the robot moves towards the direction in which obstacles are farther away or absent. Mapping is done with the hector_mapping package, which builds a 2D occupancy grid map of the area. Because Hector mapping estimates the current pose by matching each new scan against previous scans, any drift accumulated from the start is recorded in the map and can appear as an arbitrary rotation and translation of the map frame relative to ground-truth frames. A minimal sketch of the obstacle-avoidance logic is given below.
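The following sketch illustrates the reactive obstacle-avoidance behaviour described above as a minimal ROS (rospy) node. It is only an illustrative outline: the topic names (/scan, /cmd_vel), the sector boundaries, the speeds and the 0.5 m safety threshold are assumptions, not the exact values used on the robot.

```python
#!/usr/bin/env python
# Minimal sketch of the reactive obstacle-avoidance logic described above.
# Topic names (/scan, /cmd_vel), sector widths, speeds and thresholds are
# assumed for illustration and are not the exact values used on the robot.
import math
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

SAFE_DIST = 0.5        # metres; assumed obstacle threshold
LINEAR_SPEED = 0.15    # m/s
ANGULAR_SPEED = 0.5    # rad/s

def sector_min(scan, start_deg, end_deg):
    """Smallest valid range reading inside the angular sector [start_deg, end_deg]."""
    vals = []
    for i, r in enumerate(scan.ranges):
        angle = math.degrees(scan.angle_min + i * scan.angle_increment)
        if start_deg <= angle <= end_deg and scan.range_min < r < scan.range_max:
            vals.append(r)
    return min(vals) if vals else float('inf')

def scan_callback(scan):
    # Front, left and right sectors, as in the description above.
    front = sector_min(scan, -20, 20)
    left  = sector_min(scan,  45, 90)
    right = sector_min(scan, -90, -45)

    cmd = Twist()
    if front > SAFE_DIST:
        cmd.linear.x = LINEAR_SPEED      # path ahead is clear: keep moving forward
    elif left > right:
        cmd.angular.z = ANGULAR_SPEED    # obstacles closer on the right: turn left
    else:
        cmd.angular.z = -ANGULAR_SPEED   # obstacles closer on the left: turn right
    cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('explore_avoid')
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()
```

While this node drives the robot, hector_mapping can run alongside it, consuming the same /scan topic to build the occupancy grid map.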
We used the Gazebo simulator to test our code before deploying it on the actual hardware. Gazebo is a 3D dynamic simulator that can accurately and efficiently simulate populations of robots in complex indoor and outdoor environments. While similar to game engines, Gazebo offers physics simulation at a much higher degree of fidelity, a suite of sensors, and interfaces for both users and programs. We performed teleoperation (driving the robot manually through the keyboard) and both mapping techniques (G-Mapping and Hector Mapping) in simulation on a TurtleBot3 using Gazebo.
Software:
ROS (Robot Operating System)
Gazebo simulator
Hardware:
TurtleBot2:
TurtleBot2 consists of a Yujin Kobuki base, a 2,200 mAh battery pack, a Kinect sensor, an Asus 1215N laptop with a dual-core processor, a fast charger, a charging dock, and a hardware mounting kit that ties everything together and allows future sensors to be added. TurtleBot2 was released in October 2012.

Lidar:
YDLIDAR, the 2D laser scanner mounted on the robot, used both for obstacle detection and for building the occupancy grid map.

Media:
Team Members:
Mentors: