
An autonomous drone racing system integrating vision-guided trajectory adjustments with MPC for planar control and PID for elevation. Real-time visual feedback refines gate localization and trajectory generation, ensuring efficient and accurate navigation.


muralikarteek7/DroneRacer-MPC-Vision



AirSim Drone Racing Lab: Overview

ADRL is a framework for drone racing research, built on Microsoft AirSim.
We used our framework to host a simulation-based drone racing competition at NeurIPS 2019, Game of Drones.

Currently, ADRL allows you to focus on three core research directions pertinent to autonomous drone racing - perception, trajectory planning and control, and head-to-head competition with a single competitor drone.

Project Setup and Components

  1. AirSim Setup:

    AirSim-Drone-Racing-Lab Repository

  2. NanoSAM Setup:

    Nanosam Repository

  3. State-Based Controller:

    In this project, we use a State-Based Controller that combines Model Predictive Control (MPC) for the XY plane and PID control for Yaw and Altitude.

    MPC for XY Plane Control

    MPC is applied to the drone’s position in the XY plane: it optimizes the control inputs over a prediction horizon, enabling the drone to follow the desired path while handling constraints effectively.

    PID for Yaw and Altitude Control

    • Yaw Control: A PID controller is used to control the yaw angle of the drone to ensure the desired orientation is maintained.
    • Altitude Control: Another PID controller manages the altitude, ensuring the drone stays at or moves toward the target height.

    Both control methods work together to ensure the drone can move to the desired position, maintain stable altitude, and adjust its orientation in flight.
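The combination above can be sketched as follows. This is a minimal illustration, not the repository's implementation: it assumes a discrete double-integrator model for the XY plane, an unconstrained finite-horizon MPC solved as a least-squares problem, and a scalar PID class reused for the yaw and altitude channels. The sample period `DT`, horizon length, and gains are all illustrative assumptions.

```python
import numpy as np

DT = 0.1       # control period [s] (assumed)
HORIZON = 20   # MPC prediction steps (assumed)

# Double-integrator model in the XY plane:
# state [px, py, vx, vy], input [ax, ay]
A = np.eye(4)
A[0, 2] = A[1, 3] = DT
B = np.zeros((4, 2))
B[0, 0] = B[1, 1] = 0.5 * DT**2
B[2, 0] = B[3, 1] = DT

def mpc_xy(x0, target, n=HORIZON, u_weight=0.1):
    """Unconstrained linear MPC: minimize sum ||x_k - x_ref||^2 + w*||u_k||^2.

    The predicted trajectory is X = Phi @ x0 + Gamma @ U, so the whole
    horizon reduces to one least-squares solve over the stacked inputs U.
    """
    x_ref = np.array([target[0], target[1], 0.0, 0.0])
    Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(n)])
    Gamma = np.zeros((4 * n, 2 * n))
    for k in range(n):
        for j in range(k + 1):
            Gamma[4*k:4*k+4, 2*j:2*j+2] = np.linalg.matrix_power(A, k - j) @ B
    # Augment with sqrt(w)*I rows to penalize control effort.
    A_ls = np.vstack([Gamma, np.sqrt(u_weight) * np.eye(2 * n)])
    b_ls = np.concatenate([np.tile(x_ref, n) - Phi @ x0, np.zeros(2 * n)])
    U = np.linalg.lstsq(A_ls, b_ls, rcond=None)[0]
    return U[:2]  # receding horizon: apply only the first input

class PID:
    """Scalar PID controller, usable for the yaw and altitude channels."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt=DT):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

In a control loop, `mpc_xy` would produce the planar acceleration command each tick while two `PID` instances correct yaw and altitude, mirroring the split described above.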

  4. Vision Control:

    In this project, the Vision Control system leverages NanoSAM for gate segmentation, enabling the drone to identify and navigate through gates autonomously.

    Gate Segmentation using NanoSAM

    We use NanoSAM, a lightweight semantic segmentation model, to process the drone's camera feed and segment gates in the environment. The segmented masks provide crucial visual feedback for the drone to approach the gates.
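Running NanoSAM itself requires its model weights and runtime, so the sketch below covers only the post-processing step implied above: turning a segmented gate mask into a steering correction. It assumes the model's output is available as an HxW boolean NumPy array; the function names are illustrative, not from this repository.

```python
import numpy as np

def gate_center_from_mask(mask):
    """Return the (row, col) centroid of a binary gate mask, or None if
    the mask is empty (no gate detected in this frame)."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

def image_error(mask):
    """Normalized (horizontal, vertical) offset of the gate centroid from
    the image center, each in [-1, 1]; suitable as a visual-feedback term
    for trajectory correction."""
    center = gate_center_from_mask(mask)
    if center is None:
        return None
    h, w = mask.shape
    cy, cx = center
    return (cx - w / 2) / (w / 2), (cy - h / 2) / (h / 2)
```

A positive horizontal error means the gate sits right of the image center, so the planner shifts the next waypoint accordingly.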

  5. Planner:

    The Planner utilizes a Cubic Spline Generator to calculate smooth paths between waypoints (gates) for the drone to follow. The path is continuously updated based on the real-time vision data.

    Cubic Spline Path Generation

    The Cubic Spline Generator computes a smooth, continuous path between gates. This method ensures the drone transitions smoothly and stably between waypoints.
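A cubic-spline path through gate waypoints can be sketched as below. This is an illustration rather than the repository's planner: it assumes SciPy is available and parameterizes the spline by cumulative chord length so the fit is well defined for arbitrary 3D waypoints.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def spline_path(waypoints, samples=50):
    """Fit one cubic spline per axis through the waypoints, parameterized
    by cumulative chord length, and return `samples` points along the
    resulting smooth path.

    `waypoints` is an (N, 3) array of gate positions (assumed layout).
    """
    wp = np.asarray(waypoints, dtype=float)
    seg_lengths = np.linalg.norm(np.diff(wp, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg_lengths)])  # chord-length parameter
    spline = CubicSpline(s, wp, axis=0)
    return spline(np.linspace(0.0, s[-1], samples))
```

Because the path is C2-continuous, velocity and acceleration references derived from it stay smooth, which is what keeps the transitions between gates stable; re-fitting on updated gate estimates gives the continuous replanning described above.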

  6. Implementation:

  7. Results:

    Vision Control

    MPC + PID Control

