|Go Faster Drones|
|Written by Harry Fairhead|
|Sunday, 25 July 2021|
An autonomously flying quadrotor has outperformed human pilots in a drone race. This feat was achieved by a research group led by Davide Scaramuzza at the University of Zurich, using an approach that relies solely on onboard sensing and computation.
A key requirement for drones is speed. They also need to combine speed with accuracy in order to pass through waypoints or the specific locations they are required to inspect.
Up until now, human drone pilots have always had superior performance in drone racing, but a new algorithm has changed this.
Davide Scaramuzza, who heads the Robotics and Perception Group at the University of Zurich and the Rescue Robotics Grand Challenge of the NCCR Robotics, which funded the research, claims:
“Our drone beat the fastest lap of two world-class human pilots on an experimental race track”.
For this test, the algorithm and two human pilots controlled the same quadrotor through a race circuit. The algorithm uses a sensorimotor policy that enables an autonomous quadrotor to fly extreme acrobatic maneuvers with only onboard sensing and computation. External cameras were used to precisely capture the motion of the drones and, in the case of the autonomous drone, to give the algorithm real-time information on where the drone was at any moment. To ensure a fair comparison, the human pilots were given the opportunity to train on the circuit before the race.
The outcome was that the algorithm won: all of its laps were faster than the human ones, and its performance was more consistent, because once the algorithm has found the best trajectory it can reproduce it faithfully many times, unlike human pilots.
This video includes an overview of the drone performing a Barrel Roll (the trajectory on the top left), a Power Loop (top middle), and a Matty Flip (top right). The trajectory on the bottom is the Combo.
The researchers' paper entitled Deep Drone Acrobatics concludes:
Our approach is the first to enable an autonomous flying machine to perform a wide range of acrobatics maneuvers that are highly challenging even for expert human pilots. The approach relies solely on onboard sensing and computation, and leverages sensorimotor policies that are trained entirely in simulation. We have shown that designing appropriate abstractions of the input facilitates direct transfer of the policies from simulation to physical reality. The presented methodology is not limited to autonomous flight and can enable progress in other areas of robotics.
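The "appropriate abstractions of the input" mentioned in the conclusion are central to the sim-to-real transfer: rather than feeding raw camera pixels to the policy, the system feeds it compact quantities that look similar in simulation and in the real world. The sketch below caricatures that idea in a few lines of Python. All names, dimensions, and the linear "policy" are hypothetical stand-ins, not the paper's actual networks, which operate on visual feature tracks and inertial measurements.

```python
import numpy as np

def abstract_observation(drone_pos, drone_vel, gate_pos):
    """Hypothetical input abstraction: a compact state vector instead of
    raw pixels. Quantities like relative gate position and velocity look
    the same in simulation and reality, which eases policy transfer."""
    rel = gate_pos - drone_pos                 # gate position relative to the drone
    return np.concatenate([rel, drone_vel])    # 6-dimensional observation

class LinearPolicy:
    """Toy stand-in for a trained sensorimotor policy: maps the abstracted
    observation to a low-level command [collective thrust, roll rate,
    pitch rate, yaw rate]. The real system uses learned neural networks."""
    def __init__(self, obs_dim=6, act_dim=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((act_dim, obs_dim)) * 0.1
        self.b = np.zeros(act_dim)

    def act(self, obs):
        return self.W @ obs + self.b

# One step of the control loop: observe, abstract, act.
policy = LinearPolicy()
obs = abstract_observation(drone_pos=np.zeros(3),
                           drone_vel=np.zeros(3),
                           gate_pos=np.array([5.0, 0.0, 2.0]))
cmd = policy.act(obs)   # command that would be sent to the flight controller
print(cmd.shape)        # (4,)
```

Because the observation contains no simulator-specific detail, a policy trained entirely on simulated rollouts can, in principle, be deployed unchanged on the physical quadrotor, which is the transfer result the paper demonstrates.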
The navigation algorithm has been released as open source. The code can be found at:
Deep Drone Acrobatics (pdf) by Elia Kaufmann, Antonio Loquercio, René Ranftl, Matthias Müller, Vladlen Koltun, Davide Scaramuzza
|Last Updated ( Sunday, 25 July 2021 )|