Autonomous Navigation for Vehicles in Rough Terrain


Jackal Robot


Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgments

About The Project

Welcome to the ROS-MATLAB project on Rough Terrain Navigation! In this project, we demonstrate how to achieve robust and efficient navigation for mobile robots in challenging terrains using MATLAB toolboxes and co-simulation with Gazebo. Navigating rough terrains is challenging due to uneven surfaces and obstacles. Autonomous systems use advanced algorithms and sensor data to adapt, plan optimised routes, and avoid hazards, making rough terrain navigation easier and more efficient. Jackal, TurtleBot, and Curiosity Mars Rover are the bots utilised for testing. Among them, the Jackal bot exhibited the best results.

Contributors:

(back to top)

Built With

  • Windows
  • Ubuntu
  • ROS Noetic
  • Gazebo
  • MATLAB
  • C++
  • Python

(back to top)

Robot Descriptions

Our experimentation used three distinct robots, each deployed in different environments, to thoroughly test the effectiveness of our algorithms. Each robot was equipped with specific sensors that allow it to navigate and interact with its environment efficiently. Below is a description of each robot, along with the sensors it was equipped with:

  1. JACKAL:

    • Environment: Office World, Inspection World
    • Sensors:
      1. Sick LMS1xx - 2D Laser Scanner
      2. Velodyne VLP-16 - 3D Laser Scanner
      3. Hokuyo UST-10 lidar - front accessory fender
      4. NovAtel satellite navigation receiver
      5. Pointgrey Flea3 camera
      6. Pointgrey Bumblebee2 camera
      7. Intel RealSense camera (realsense2_camera)
  2. Curiosity Mars Rover:

    • Environment: Rough Mars terrain world, Flat Shapes world
    • Sensors:
      1. Mast Camera
      2. RearHaz Camera
      3. FrontHaz Camera
      4. Nav Camera
      5. IMU
  3. Turtlebot3 Waffle:

    • Environment: Office World, turtlebot3_world
    • Sensors:
      1. 360 LiDAR

By using a diverse set of robots and environments, we aimed to thoroughly assess the capabilities and limitations of our algorithms, ensuring robustness and adaptability in real-world scenarios. The data collected from these experiments provided valuable insights and helped us refine and optimize our algorithms for better performance and broader applicability.
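
As a quick illustration of how the sensor data above is consumed on the MATLAB side, here is a minimal sketch that subscribes to the Jackal's 2D laser scanner through the ROS Toolbox. The master IP address and the /front/scan topic name are assumptions; adjust them to match your Gazebo setup.

    % Minimal MATLAB sketch (assumed master IP and laser topic name)
    rosinit('192.168.1.10');                                  % ROS master on the Linux/Gazebo machine
    scanSub = rossubscriber('/front/scan', 'sensor_msgs/LaserScan');
    scanMsg = receive(scanSub, 10);                           % wait up to 10 s for one scan
    plot(scanMsg.Ranges);                                     % quick look at the raw range readings
    rosshutdown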

Getting Started

For optimal performance, we recommend using two separate systems: one running Linux with ROS 1 and Gazebo for the robotics side, and another running Windows with MATLAB and its toolboxes. Running all of this resource-intensive software on a single system may lead to performance issues.

Please refer to the detailed installation instructions provided in the README files of each folder of our repository to install all the required dependencies and software. Follow the instructions carefully to set up each component accordingly.

Demo and Usage

Download the ROS simulation

(back to top)

Download the MATLAB simulation

(back to top)

  1. Set up the required bot by following the steps in the respective directory.
  2. Mount the required sensors on the bot, e.g.:
     export JACKAL_URDF_EXTRAS=$HOME/Desktop/realsense.urdf.xacro
  3. Launch the required world file and the gmapping algorithm, e.g.:
     roslaunch cpr_inspection_gazebo inspection_world.launch
     roslaunch jackal_viz view_robot.launch
     roslaunch jackal_navigation gmapping_demo.launch
  4. Create and download the .pgm file of the map of the world.
  5. Open the MATLAB scripts and establish the connection between the two systems to set up co-simulation (a minimal connection sketch follows this list).
  6. Run the .mlx files in order to view the results.
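
As a rough sketch of step 5 on the Windows/MATLAB machine, the snippet below connects to the ROS master on the Linux machine and converts the saved .pgm map into an occupancy map. The IP address, file name, and map resolution are assumptions; substitute the values from your own setup.

    % Connect to the ROS master running on the Linux/Gazebo machine (IP is an assumption)
    rosinit('192.168.1.10');               % if topics do not arrive, also pass 'NodeHost' with this PC's IP

    % Convert the .pgm map saved from gmapping into an occupancy map
    img     = imread('map.pgm');           % assuming an 8-bit .pgm export
    occProb = 1 - double(img) / 255;       % dark pixels are occupied in a .pgm map
    map     = occupancyMap(occProb, 20);   % assumed resolution of 20 cells per metre
    show(map)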

Roadmap

  • Implemented and tested various global path planners: RRT, RRT*, A*, and Hybrid A* (a minimal planning sketch follows this list).
  • Added sensors to the bot and received their data from ROS using MATLAB toolboxes.
  • Created a map of the environment from the sensor data.
  • Implemented AMCL using MATLAB scripting.
  • Waypoint navigation using a Simulink model.
  • Visualisation of PointCloud2 data from the Velodyne LiDAR.
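
To give a flavour of the global planners listed above, here is a hedged sketch of running Hybrid A* from the MATLAB Navigation Toolbox on the occupancy map built earlier. The map variable, turning radius, primitive length, and start/goal poses are illustrative assumptions, not values taken from this project.

    % Hybrid A* on the occupancy map (poses and motion parameters are illustrative)
    ss             = stateSpaceSE2;
    ss.StateBounds = [map.XWorldLimits; map.YWorldLimits; [-pi pi]];
    sv             = validatorOccupancyMap(ss);
    sv.Map         = map;
    planner        = plannerHybridAStar(sv, 'MinTurningRadius', 2, 'MotionPrimitiveLength', 3);
    startPose      = [2 2 0];                  % [x y theta], must lie in free space
    goalPose       = [18 15 pi/2];
    route          = plan(planner, startPose, goalPose);
    show(planner)                              % plot the search tree and the planned path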

RQT Graph

Jackal Robot


Future Implementation

  1. Implement dynamic obstacle avoidance.
  2. Enhance mapping of rough terrain.
  3. Change the existing control systems of the bots.
  4. Apply reinforcement learning to navigation.

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

Contact

Abhishek Nair:

Aditya Suwalka:

Tejal Uplenchwar:
