Fast-Planner

Source: https://github.com/HKUST-Aerial-Robotics/Fast-Planner

Fast-Planner is designed to enable quadrotors to fly fast in complex, unknown environments. It contains a rich set of carefully designed planning algorithms and provides the foundational code framework and algorithms used by several popular open-source drone projects, including ego-planner, FUEL, and RACER.

News:

  • March 13, 2021: The rapid autonomous exploration code is now available! Check out this repository for more details.
  • October 20, 2020: Fast-Planner has been extended and applied to rapid autonomous exploration. See this repo for more details.

Authors: Boyu Zhou and Shaojie Shen from the Aerial Robotics Group at the Hong Kong University of Science and Technology, and Fei Gao from the FAST Lab at Zhejiang University.

Full videos: video1, video2, video3. A demonstration of this work was reported in IEEE Spectrum: page1, page2, page3 (search for _HKUST_ within the pages).

To run this project in minutes, check out the Quick Start. See the other sections for more details.

If this project helps you, please give it a star ⭐. We work hard to develop and maintain it 😁😁.

Table of Contents

  1. Quick Start
  2. Algorithms and Papers
  3. Setup and Config
  4. Run Simulations
  5. Use in Your Application
  6. Updates

1. Quick Start

The project has been tested on Ubuntu 18.04 (ROS Melodic) and 20.04 (ROS Noetic).

First, you should install nlopt v2.7.1:

git clone -b v2.7.1 https://github.com/stevengj/nlopt.git
cd nlopt
mkdir build
cd build
cmake ..
make
sudo make install

Next, you can run the following command to install other required tools:

sudo apt-get install libarmadillo-dev

Then simply clone and compile our package (using https here):

cd ${YOUR_WORKSPACE_PATH}/src
git clone https://github.com/HKUST-Aerial-Robotics/Fast-Planner.git
cd ../
catkin_make

Detailed setup instructions are given in the Setup and Config section below. After compilation completes, start the visualization:

source devel/setup.bash && roslaunch plan_manage rviz.launch

And start the simulation (run in a new terminal):

source devel/setup.bash && roslaunch plan_manage kino_replan.launch

You will find a randomly generated map and a drone in Rviz. Use the 2D Nav Goal tool to select a target for the drone to reach. An example simulation is shown here.

2. Algorithms and Papers

This project contains a series of robust and computationally efficient algorithms for quadrotors:

  • Kinodynamic path search
  • B-spline-based trajectory optimization
  • Topological path search and path-guided optimization
  • Perception-aware planning strategy (to be published)

These methods are described in detail in the papers listed below.

If you use this project in your research, please cite at least one of our papers: Bibtex.

All planning algorithms and other key modules (such as mapping) are implemented in fast_planner:

  • plan_env: An online map building algorithm that takes depth image (or point cloud) and camera pose (odometry) pairs as input, performs ray casting to update the probability volume map, and builds the Euclidean Signed Distance Field (ESDF) for the planning system.
  • path_searching: The front-end path-searching algorithms. Currently, this includes a kinodynamic path-searching algorithm that respects the quadrotor's dynamics. It also includes a sampling-based topological path-searching algorithm that generates multiple topologically distinct paths capturing the structure of the 3D environment.
  • bspline: Implementation of the B-spline-based trajectory representation (a minimal evaluation sketch is given at the end of this section).
  • bspline_opt: Gradient-based trajectory optimization using B-spline trajectories.
  • active_perception: A perception-aware planning strategy that enables the quadrotor to actively observe and avoid unknown obstacles that may appear ahead.
  • plan_manage: A high-level module for scheduling and calling mapping and planning algorithms. It contains interfaces for starting the entire system and configuration files.

In addition to the folder fast_planner, the lightweight uav_simulator is also used for testing.
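
For readers unfamiliar with the representation used by bspline, the sketch below shows how one segment of a uniform cubic B-spline can be evaluated from four control points with the standard basis matrix. It is a minimal, illustrative example (the function name evaluateSegment and the example control points are not from the repository), not the package's actual API:

// Minimal sketch: evaluate one segment of a uniform cubic B-spline.
// The function name and example control points are illustrative only,
// not Fast-Planner's bspline API.
#include <Eigen/Dense>
#include <iostream>
#include <vector>

// Position on segment i at local parameter u in [0, 1), blended from the
// four control points Q_i..Q_{i+3} by the standard cubic basis matrix M.
Eigen::Vector3d evaluateSegment(const std::vector<Eigen::Vector3d>& ctrl_pts,
                                int i, double u) {
  static const Eigen::Matrix4d M = (Eigen::Matrix4d() <<
       1,  4,  1, 0,
      -3,  0,  3, 0,
       3, -6,  3, 0,
      -1,  3, -3, 1).finished() / 6.0;

  Eigen::RowVector4d t(1.0, u, u * u, u * u * u);
  Eigen::Matrix<double, 4, 3> Q;
  for (int k = 0; k < 4; ++k) Q.row(k) = ctrl_pts[i + k].transpose();
  return (t * M * Q).transpose();
}

int main() {
  std::vector<Eigen::Vector3d> ctrl_pts = {
      Eigen::Vector3d(0, 0, 1), Eigen::Vector3d(1, 0, 1),
      Eigen::Vector3d(2, 1, 1), Eigen::Vector3d(3, 1, 2),
      Eigen::Vector3d(4, 2, 2)};
  // Sample the first segment; knot handling and derivatives are omitted.
  for (double u = 0.0; u < 1.0; u += 0.25)
    std::cout << evaluateSegment(ctrl_pts, 0, u).transpose() << std::endl;
  return 0;
}

A useful property of this representation is that the curve stays inside the convex hull of its control points, which makes it convenient to reason about feasibility and collision constraints at the control-point level.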

3. Setup and Config

Prerequisites

  1. Our software was developed and tested in Ubuntu 18.04 (ROS Melodic) and 20.04 (ROS Noetic).
  2. We use NLopt to solve nonlinear optimization problems. uav_simulator depends on the C++ linear algebra library Armadillo. Both dependencies can be installed with the commands below.

First, you should install nlopt v2.7.1:

git clone -b v2.7.1 https://github.com/stevengj/nlopt.git
cd nlopt
mkdir build
cd build
cmake ..
make
sudo make install

Next, you can run the following command to install other required tools:

sudo apt-get install libarmadillo-dev

Build on ROS

Once the prerequisites are met, you can clone this repository into your catkin workspace and catkin_make. Using a new workspace is recommended:

cd ${YOUR_WORKSPACE_PATH}/src
git clone https://github.com/HKUST-Aerial-Robotics/Fast-Planner.git
cd ../
catkin_make

If you encounter problems during this step, please refer to existing issues, pull requests, and Google before raising a new issue.

Now you can start running the simulation.

Use GPU depth rendering (optional)

This step is not required to run the simulation. However, if you want to run a more realistic depth camera in uav_simulator, you need to install the CUDA toolkit. Otherwise, a less realistic depth sensor model will be used.

The local_sensing package in uav_simulator can use either the GPU or the CPU to render depth sensor measurements. By default, the CPU version is selected in its CMakeLists.txt:

set(ENABLE_CUDA false)
# set(ENABLE_CUDA true)

However, we strongly recommend the GPU version, because it generates depth images much closer to those of a real depth camera. To enable GPU depth rendering, set ENABLE_CUDA to true, and remember to change the "arch" and "code" flags according to your graphics card. You can look up the correct values here.

set(CUDA_NVCC_FLAGS 
-gencode arch=compute_61,code=sm_61;
)

To install CUDA, visit the CUDA Toolkit website.
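
If you are unsure which compute capability your GPU has (and therefore which arch/code values to use), a small query program such as the one below can print it; the file name and program are an illustrative helper, not part of this repository. A device reporting 6.1, for example, corresponds to arch=compute_61,code=sm_61:

// query_cc.cu -- illustrative helper (not part of Fast-Planner) that prints
// each CUDA device's compute capability so the matching
// -gencode arch=compute_XY,code=sm_XY flags can be chosen.
// Build with: nvcc query_cc.cu -o query_cc
#include <cuda_runtime.h>
#include <cstdio>

int main() {
  int count = 0;
  if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
    std::printf("No CUDA device found.\n");
    return 1;
  }
  for (int i = 0; i < count; ++i) {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, i);
    std::printf("Device %d: %s, compute capability %d.%d "
                "(use arch=compute_%d%d,code=sm_%d%d)\n",
                i, prop.name, prop.major, prop.minor,
                prop.major, prop.minor, prop.major, prop.minor);
  }
  return 0;
}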

4. Run Simulations

First, run Rviz with our configuration:

<!-- go to your workspace and run: -->
source devel/setup.bash
roslaunch plan_manage rviz.launch

Then run the quadcopter simulator and Fast-Planner. A few examples are provided below:

Kinodynamic Path Search and B-spline Optimization

This method uses kinodynamic path search to find a safe, dynamically feasible, and time-efficient initial trajectory in the discretized control space, and then applies B-spline optimization to improve the trajectory's smoothness and clearance. To test this method, run:

<!-- open a new terminal, go to your workspace and run: -->
source devel/setup.bash
roslaunch plan_manage kino_replan.launch

You will find a randomly generated map and a drone model in Rviz. Use the 2D Nav Goal tool to trigger the planner: clicking a point in Rviz immediately generates a new trajectory for the drone to follow. Here's an example:

The relevant algorithms are introduced in detail in this paper.
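
To give a rough idea of what searching in the discretized control space means, the sketch below expands a node of a double-integrator model by forward-integrating a small grid of constant accelerations over a short duration, keeping only successors that respect a velocity limit. The state layout, cost terms, and parameter values are simplified illustrations, not Fast-Planner's actual implementation, which additionally performs collision checking against the map and guides the expansion with a heuristic:

// Minimal sketch of control-space expansion for a kinodynamic front end.
// The state layout, primitive set, and cost terms are simplified and
// illustrative; they are not Fast-Planner's implementation.
#include <Eigen/Dense>
#include <iostream>
#include <vector>

struct State {
  Eigen::Vector3d pos, vel;
  double g_cost;   // accumulated cost from the start node
};

// Forward-integrate a constant acceleration over duration tau
// (double-integrator model: p' = p + v*tau + 0.5*a*tau^2, v' = v + a*tau).
State propagate(const State& s, const Eigen::Vector3d& acc, double tau) {
  State n;
  n.pos = s.pos + s.vel * tau + 0.5 * acc * tau * tau;
  n.vel = s.vel + acc * tau;
  // A common cost choice: control effort plus elapsed time.
  n.g_cost = s.g_cost + (acc.squaredNorm() + 1.0) * tau;
  return n;
}

// Expand a node with a discretized grid of accelerations per axis,
// keeping only successors that respect the velocity limit. Collision
// checking against the map is omitted here.
std::vector<State> expand(const State& s, double a_max, int res, double tau,
                          double v_max) {
  std::vector<State> successors;
  for (int ix = -res; ix <= res; ++ix)
    for (int iy = -res; iy <= res; ++iy)
      for (int iz = -res; iz <= res; ++iz) {
        Eigen::Vector3d acc(ix * a_max / res, iy * a_max / res, iz * a_max / res);
        State n = propagate(s, acc, tau);
        if (n.vel.norm() <= v_max) successors.push_back(n);
      }
  return successors;
}

int main() {
  State start{Eigen::Vector3d::Zero(), Eigen::Vector3d::Zero(), 0.0};
  std::vector<State> succ = expand(start, 2.0, 2, 0.5, 3.0);
  std::cout << succ.size() << " successors generated" << std::endl;
  return 0;
}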

Topological Path Search and Path-Guided Optimization

The distinguishing feature of this method is that it searches for multiple trajectories in distinct topological classes. Thanks to this strategy, the solution space is explored more thoroughly, local minima are avoided, and better solutions are found. Similarly, run:

<!-- open a new terminal, go to your workspace and run: -->
source devel/setup.bash
roslaunch plan_manage topo_replan.launch

A random map will again be generated, and you can use the 2D Nav Goal tool to trigger the planner:

The relevant algorithms are introduced in detail in this paper.
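
As a rough intuition for "topological classes": two paths belong to the same class when one can be continuously deformed into the other without crossing an obstacle. A simplified test, which samples both paths and checks that the straight segments between corresponding samples are collision-free, is sketched below. The names and the placeholder collision check are illustrative; the equivalence test and path sampling described in the paper are more involved and use the ESDF map:

// Simplified sketch of a topological-equivalence test between two paths:
// sample both paths uniformly and verify that the straight segment between
// corresponding samples stays in free space. The collision check here is a
// placeholder lambda; it is not Fast-Planner's implementation.
#include <Eigen/Dense>
#include <algorithm>
#include <functional>
#include <iostream>
#include <vector>

using Path = std::vector<Eigen::Vector3d>;
using IsFree = std::function<bool(const Eigen::Vector3d&)>;

// Linear interpolation along a polyline path at fraction t in [0, 1].
Eigen::Vector3d samplePath(const Path& p, double t) {
  double s = t * (p.size() - 1);
  int i = std::min(static_cast<int>(s), static_cast<int>(p.size()) - 2);
  double u = s - i;
  return (1.0 - u) * p[i] + u * p[i + 1];
}

// Two paths are treated as equivalent if every connecting segment between
// corresponding samples is collision-free.
bool sameTopologyClass(const Path& a, const Path& b, const IsFree& is_free,
                       int path_samples = 20, int seg_samples = 10) {
  for (int k = 0; k <= path_samples; ++k) {
    double t = static_cast<double>(k) / path_samples;
    Eigen::Vector3d pa = samplePath(a, t), pb = samplePath(b, t);
    for (int j = 0; j <= seg_samples; ++j) {
      double u = static_cast<double>(j) / seg_samples;
      if (!is_free((1.0 - u) * pa + u * pb)) return false;
    }
  }
  return true;
}

int main() {
  // Placeholder free-space check: everything outside a unit sphere at (2, 0, 1).
  IsFree is_free = [](const Eigen::Vector3d& p) {
    return (p - Eigen::Vector3d(2, 0, 1)).norm() > 1.0;
  };
  Path left  = {Eigen::Vector3d(0, 0, 1), Eigen::Vector3d(2,  2, 1),
                Eigen::Vector3d(4, 0, 1)};
  Path right = {Eigen::Vector3d(0, 0, 1), Eigen::Vector3d(2, -2, 1),
                Eigen::Vector3d(4, 0, 1)};
  // The two paths pass the obstacle on opposite sides, so they are distinct.
  std::cout << std::boolalpha << sameTopologyClass(left, right, is_free)
            << std::endl;
  return 0;
}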

Perception-Aware Planning

The code will be released after the relevant paper is published.

5. Use in Your Application

If you have successfully run the simulation and want to use Fast-Planner in your own project, please have a look at the kino_replan.launch and topo_replan.launch files. They contain and document the important parameters that you may want to change for your use case.

Note that in our configuration, the depth image is 640x480. To improve map fusion efficiency, we downsample it (skip_pixel = 2 in kino_algorithm.xml). If you use a lower-resolution depth image (e.g., 256x144), you can disable downsampling by setting skip_pixel = 1. Additionally, the depth_scaling_factor is set to 1000; you may need to change this depending on your device.
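
To illustrate what these two parameters do, the sketch below converts a raw 16-bit depth image to 3D points in the camera frame, dividing by depth_scaling_factor to obtain meters and visiting only every skip_pixel-th pixel. It is an illustrative stand-in (the function name, intrinsics, and image buffer layout are assumptions), not the repo's fusion code:

// Illustrative sketch of depth-image preprocessing: convert raw 16-bit depth
// to meters via depth_scaling_factor and subsample by skip_pixel before the
// measurements are fused into the map. This mirrors the role of the two
// parameters; it is not Fast-Planner's actual fusion code.
#include <Eigen/Dense>
#include <cstdint>
#include <iostream>
#include <vector>

std::vector<Eigen::Vector3d> depthToPoints(const std::vector<uint16_t>& depth,
                                           int width, int height,
                                           double fx, double fy,
                                           double cx, double cy,
                                           double depth_scaling_factor,
                                           int skip_pixel) {
  std::vector<Eigen::Vector3d> pts;
  for (int v = 0; v < height; v += skip_pixel) {
    for (int u = 0; u < width; u += skip_pixel) {
      uint16_t raw = depth[v * width + u];
      if (raw == 0) continue;                      // invalid measurement
      double z = raw / depth_scaling_factor;       // e.g. 1000: millimeters -> meters
      // Back-project through the pinhole model into the camera frame.
      pts.emplace_back((u - cx) * z / fx, (v - cy) * z / fy, z);
    }
  }
  return pts;
}

int main() {
  // Fake 640x480 image reading 1.5 m everywhere; intrinsics are placeholders.
  std::vector<uint16_t> depth(640 * 480, 1500);
  auto pts = depthToPoints(depth, 640, 480, 387.0, 387.0, 320.0, 240.0,
                           /*depth_scaling_factor=*/1000.0, /*skip_pixel=*/2);
  std::cout << pts.size() << " points after downsampling" << std::endl;
  return 0;
}

Skipping every other pixel in both directions (skip_pixel = 2) reduces the number of fused points by a factor of four, which is why the default suits a 640x480 image but is unnecessary for lower-resolution sensors.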

Finally, for installation issues, such as compilation errors caused by different ROS/Eigen versions, please refer to existing issues, pull requests, and Google resources before submitting a new issue. Irrelevant issues will not receive a response.

6. Updates

  • October 20, 2020: Fast-Planner has been extended and applied to rapid autonomous exploration. See this repo for more details.
  • July 5, 2020: An implementation of the following paper will be released in the future: RAPTOR: Robust and Perception-aware Trajectory Replanning for Quadrotor Fast Flight (submitted to TRO, under review).
  • April 12, 2020: An implementation of the ICRA2020 paper "Robust Real-Time UAV Replanning Using Guided Gradient-Based Optimization and Topological Paths" is now available.
  • January 30, 2020: Volumetric mapping is integrated with our planner. It takes a depth image and camera pose pair as input, performs ray casting to fuse the measurements, and constructs a Euclidean Signed Distance Field (ESDF) for the planning module.

Acknowledgements

We use NLopt for nonlinear optimization.

License

The source code is released under the GPLv3 license.

Disclaimer

This is research code distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

More information

  1. Fast-Planner official repository: https://github.com/HKUST-Aerial-Robotics/Fast-Planner