Navigation2
Source: https://docs.nav2.org/index.html
Overview
Nav2 is the professionally supported spiritual successor to the ROS Navigation Stack. The project aims to safely enable mobile robots to complete complex tasks across a wide variety of environments and robot kinematics. It not only moves a robot from point A to point B, but also handles intermediate poses and other task types, such as object tracking. Nav2 is a production-grade, high-quality, proven navigation framework trusted by more than 50 companies worldwide.
It provides perception, planning, control, localization, and visualization capabilities for building highly reliable autonomous systems. From sensor data, this enables environmental modeling, dynamic path planning, motor velocity computation, obstacle avoidance, representation of semantic regions and objects, and the construction of higher-level robot behaviors. To learn more about this project, including related projects, robots using Nav2, comparisons to ROS 1, and maintainers, see About and Contact. To learn more about navigation and ROS concepts, see Navigation Concepts.
Nav2 uses behavior trees to create customized and intelligent navigation behaviors by coordinating many independent, modular task servers. A task server may compute a path, apply a control effort, perform a recovery, or carry out any other navigation-related task. These independent servers communicate with the behavior tree (BT) over ROS interfaces such as action servers or services. A robot can utilize multiple different behavior trees to perform many types of unique tasks.
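Nav2's real behavior trees are XML files executed by BehaviorTree.CPP in C++, but the coordination idea can be sketched in a few lines of plain Python. The `Sequence` node and `TaskServer` classes below are hypothetical stand-ins, not Nav2 APIs; they only illustrate how a BT ticks independent servers in order and reacts to their results.

```python
# Toy illustration of a behavior tree coordinating modular task servers.
# These classes are hypothetical stand-ins for the concept, not Nav2 code.

SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Sequence:
    """Ticks children in order; fails as soon as one child fails."""
    def __init__(self, *children):
        self.children = children

    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

class TaskServer:
    """Stand-in for an independent server (planner, controller, recovery)."""
    def __init__(self, name, succeeds=True):
        self.name, self.succeeds = name, succeeds
        self.ticked = False

    def tick(self):
        self.ticked = True  # pretend we invoked the server's ROS action here
        return SUCCESS if self.succeeds else FAILURE

# Plan, then follow the path, as a Nav2 BT might sequence those servers.
planner = TaskServer("ComputePathToPose")
controller = TaskServer("FollowPath")
print(Sequence(planner, controller).tick())  # SUCCESS: both servers ran

# If planning fails, the sequence aborts and the controller is never ticked.
bad_planner = TaskServer("ComputePathToPose", succeeds=False)
controller2 = TaskServer("FollowPath")
print(Sequence(bad_planner, controller2).tick())  # FAILURE
print(controller2.ticked)                         # False
```

Because each server is a separate process behind a ROS interface, swapping one algorithm for another only changes which server the tree ticks, not the tree logic itself.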
The diagram below gives an overview of the Nav2 architecture. Note that each server can host multiple controller, planner, and recovery plugins, which can be matched with BT plugins to create contextual navigation behaviors. For a comparison between this project and ROS (1) Navigation, see ROS to ROS 2 Navigation.
Nav2 expects the following inputs: REP-105-compliant TF transforms, a map source if a static costmap layer is used, a BT XML file, and any relevant sensor data sources. It then provides valid velocity commands for the motors of omnidirectional or non-omnidirectional robots. We currently support all major robot types: omnidirectional, differential drive, legged, and Ackermann (car-like) base types! We also have the unique ability to support circular and arbitrarily shaped robots through SE2 collision checking.
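SE2 collision checking means testing the robot's full 2D footprint at a given (x, y, yaw) pose rather than approximating the robot as a circle. Nav2 performs this against the costmap in C++; the snippet below is a minimal, hypothetical Python sketch of the idea only: rotate and translate a polygonal footprint into the map frame and check where it lands against a set of obstacle cells.

```python
# Hypothetical sketch of SE2 footprint collision checking, not Nav2's
# implementation. Real checkers rasterize the whole polygon against the
# costmap; here we only test the transformed vertices for brevity.
import math

def transform_footprint(footprint, x, y, yaw):
    """Rotate a base-frame polygon footprint by yaw, then translate to (x, y)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in footprint]

def footprint_in_collision(footprint, x, y, yaw, obstacles, resolution=0.05):
    """True if any transformed vertex falls inside an occupied grid cell."""
    for wx, wy in transform_footprint(footprint, x, y, yaw):
        cell = (int(wx // resolution), int(wy // resolution))
        if cell in obstacles:
            return True
    return False

# A 0.6 m x 0.4 m rectangular footprint centered on the robot base.
rect = [(0.3, 0.2), (0.3, -0.2), (-0.3, -0.2), (-0.3, 0.2)]
# One occupied cell covering roughly x in [1.00, 1.05), y in [0.00, 0.05).
obstacles = {(20, 0)}

print(footprint_in_collision(rect, 0.0, 0.0, 0.0, obstacles))    # False: clear
print(footprint_in_collision(rect, 0.72, 0.22, 0.0, obstacles))  # True: corner hits
```

Because the full footprint is posed with yaw, a long rectangular robot can pass a check in one orientation and fail it in another, which a circular approximation cannot express.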
It has the following tools:
- Loading, serving, and storing maps (map server)
- Localizing the robot on the map (AMCL)
- Planning a path from A to B around obstacles (Nav2 planner)
- Controlling the robot as it follows the path (Nav2 controller)
- Smoothing path plans to be more continuous and feasible (Nav2 smoother)
- Converting sensor data into a costmap representation of the world (Nav2 Costmap 2D)
- Building complex robot behaviors using behavior trees (Nav2 behavior trees and BT navigator)
- Computing recovery behaviors in case of failure (Nav2 recovery)
- Following sequential waypoints (Nav2 Waypoint Follower)
- Managing the lifecycle and watchdog for the servers (Nav2 Lifecycle Manager)
- Enabling custom algorithms and behaviors via plugins (Nav2 core)
- Monitoring raw sensor data for imminent collisions or dangerous situations (collision monitor)
- Interacting with Nav2 in a Pythonic way via a Python3 API (Simple Commander)
- Smoothing output velocities to ensure dynamic feasibility of commands (velocity smoother)
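The Simple Commander API listed above can be driven roughly as follows. This is a sketch, not a complete application: it assumes ROS 2 and a configured Nav2 stack are already running, and the frame name and goal coordinates are placeholders chosen for illustration.

```python
# Sketch of commanding Nav2 from Python via the Simple Commander API.
# Assumes a running, configured Nav2 stack; pose values are placeholders.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

rclpy.init()
navigator = BasicNavigator()
navigator.waitUntilNav2Active()  # block until Nav2 lifecycle nodes are active

goal = PoseStamped()
goal.header.frame_id = 'map'
goal.header.stamp = navigator.get_clock().now().to_msg()
goal.pose.position.x = 2.0       # placeholder goal in the map frame
goal.pose.position.y = 1.0
goal.pose.orientation.w = 1.0

navigator.goToPose(goal)
while not navigator.isTaskComplete():
    feedback = navigator.getFeedback()  # e.g. distance remaining, ETA

if navigator.getResult() == TaskResult.SUCCEEDED:
    print('Goal reached!')

navigator.lifecycleShutdown()
rclpy.shutdown()
```

Behind the scenes this wraps the same BT navigator action interface described above, so the same behavior trees and server plugins apply.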

We also provide a set of starter plugins to help you get started. A list of all plugins can be found in Navigation Plugins - they include algorithms for common behaviors and robot platform types.
References
If you use the navigation framework, the algorithms in this repository, or the ideas therein, please cite this work in your paper!
S. Macenski, F. Martín, R. White, J. Clavero. The Marathon 2: A Navigation System. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020.
An IROS 2020 presentation on the Nav2 Marathon experiments is also available.
If you use our work comparing VSLAM methods for service robot needs, please cite the paper:
A. Merzlyakov, S. Macenski. A Comparison of Modern General-Purpose Visual SLAM Approaches. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021.
@InProceedings{vslamComparison2021,
  author    = {Merzlyakov, Alexey and Macenski, Steven},
  title     = {A Comparison of Modern General-Purpose Visual SLAM Approaches},
  booktitle = {2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year      = {2021}
}