Optical flow problems solved by a learning process

guido No Comments

Optical flow for small flying robots

Flying insects heavily rely on optical flow for visual navigation and flight control. Roboticists have endowed small flying robots with optical flow control as well, since it requires just a tiny vision sensor. However, when using optical flow, the robots run into problems that insects appear to have overcome.

Fundamental problems of optical flow

Today, Nature Machine Intelligence published our article (and featured it on the cover), in which we propose a solution to two fundamental problems of optical flow. The first problem is that optical flow only provides mixed information on distances and velocities, which means that using it directly for control leads to oscillations when getting closer to obstacles. The second problem is that optical flow provides very little information on obstacles in the direction of motion. This means that the hardest obstacles to detect are the ones the robot is actually going to collide with!
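To make the first problem concrete, here is a minimal Python sketch (illustrative only, not code from the article): the flow divergence observed during a vertical descent is the ratio of velocity to height, so many different situations produce the same observation, and a fixed-gain controller acting on it has a loop gain that grows as the surface approaches.

```python
# Toy illustration: optical flow divergence during a vertical descent
# is D = v / h, so velocity and height are only observed as a ratio.
def divergence(v, h):
    """Flow divergence for vertical motion: v in m/s, h in m."""
    return v / h

# Two very different situations produce the identical observation:
fast_and_high = divergence(-2.0, 4.0)   # 2 m/s descent from 4 m
slow_and_low = divergence(-0.5, 1.0)    # 0.5 m/s descent from 1 m
assert fast_and_high == slow_and_low == -0.5

# A fixed-gain controller u = k * (D_setpoint - D) therefore has an
# effective loop gain that scales with 1/h: a gain that is well tuned
# at altitude becomes too aggressive near the surface, which is what
# triggers the oscillations.
def effective_loop_gain(k, h):
    return k / h

assert effective_loop_gain(1.0, 0.1) == 10 * effective_loop_gain(1.0, 1.0)
```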

Drone landing with the help of optical flow. The drone evaluates the distance to the landing surface in order to set the right parameters for the optical flow control.

Learning process as solution

We tackled these problems with the help of a learning process. Specifically, the robot exploits self-induced oscillations to learn what the objects in its environment look like at different distances. In this way, it can, for example, learn how fine the texture of grass looks from different heights during landing. For obstacle avoidance, it can learn how thick tree trunks appear at different distances when navigating in a forest.

After learning, drones can adapt their control to the perceived distance. This results in faster and smoother optical flow landings. It also allows them to detect obstacles in the direction of flight. This results in faster and safer flight in cluttered environments.
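As a rough sketch of what this adaptation could look like (hypothetical function and parameter names, not the controller from the article): once the drone can estimate its height from visual appearance, it can scale its control gain with that estimate, keeping the effective loop gain roughly constant during descent.

```python
def distance_adapted_gain(k_ref, h_est, h_ref=4.0, k_min=0.05):
    """Scale the divergence-control gain with the estimated height so the
    effective loop gain k/h stays roughly constant during descent.
    k_ref is the gain tuned at reference height h_ref; k_min keeps the
    controller responsive very close to the surface.
    (Illustrative sketch with assumed names and values.)"""
    return max(k_min, k_ref * h_est / h_ref)

# Well tuned at 4 m, the gain is halved at 2 m and floored near touchdown:
assert distance_adapted_gain(1.0, 2.0) == 0.5
assert distance_adapted_gain(1.0, 0.0) == 0.05
```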

Drone avoiding obstacles with the help of optical flow. The drone also evaluates the distances to obstacles it sees, so that it can detect obstacles in the flight direction and safely speed up.

Hypothesis on insect intelligence

The findings are not only relevant to robotics, but also provide a new hypothesis for insect intelligence. Insect flight control for tasks such as landing or obstacle avoidance is typically described in terms of pre-wired optical flow control laws. Our findings suggest that flying insects may learn the visual appearance of their environment to improve their flight and navigation skills over their lifetime.

More information

The article: Enhancing optical-flow-based control by learning visual appearance cues for flying robots. G.C.H.E. de Croon, C. De Wagter, and T. Seidl, Nature Machine Intelligence 3(1), 2021.

TU Delft press release

Finally, I would like to thank Knock Knock Studios for the great video, and Sarah Gluschitz for the beautiful cover design.


We won the AIRR autonomous drone race!

On December 6, 2019, the MAVLab team won the AI Robotic Racing (AIRR) world championship, taking home 1 million dollars in prize money!

The AIRR competition was an autonomous drone race season set up by Lockheed Martin and the Drone Racing League, aiming to push the boundaries of AI for robotics. Whereas autonomous drones typically fly rather slowly, the goal of AIRR was to have drones compete against each other on a ~80-meter racing track and finish it as fast as possible. High-speed autonomous flight is extremely challenging, especially on real-world tracks with very limited testing time. We won the world championship race in a thrilling match, finishing the track in 12 seconds, just 3 seconds faster than the runner-up, the Robotics and Perception Group from the University of Zürich (UZH). As the winner, our team also had to compete with one of the best human drone race pilots, “Gab707”. He still beat our drone fair and square, finishing the track in 7 seconds.

There is much more information on the AIRR competition, the world championship race, and our approach in the university’s press release and our lab’s web site.

Below, you can see the video of our winning 12-second run.

Swarm of tiny drones autonomously explores unknown environments

We have succeeded in making a swarm of tiny drones that can autonomously explore unknown environments. This achievement, published in Science Robotics on October 23, is a result of a 4-year collaboration with researchers from the University of Liverpool and Radboud University of Nijmegen.

The main challenge was that the tiny 33-gram drones need to navigate autonomously with extremely limited sensing and computational capabilities. The proposed solution draws inspiration from the relative simplicity of insect navigation. Instead of building highly detailed 3D maps, as most autonomous robots do, our drones use a novel “bug algorithm” for navigation. This algorithm makes the drones react to obstacles on the fly. It allows the drones to first fly away from the base station to explore the environment and, when their batteries start running low, to come back to a wireless beacon at the same location.
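To give a flavor of what reacting to obstacles on the fly means without a map, here is a toy reactive state machine in Python. It is only an illustrative sketch of the bug-algorithm idea, not the actual algorithm from the paper, which additionally coordinates between the drones in the swarm.

```python
from enum import Enum, auto

class Mode(Enum):
    EXPLORE = auto()
    WALL_FOLLOW = auto()
    RETURN = auto()

class BugNavigator:
    """Minimal reactive navigator in the spirit of a bug algorithm: no map,
    decisions based only on the current range reading and the strength of
    the home beacon's signal. All names and angles are illustrative."""

    def __init__(self, preferred_heading_deg):
        self.mode = Mode.EXPLORE
        self.heading = preferred_heading_deg

    def step(self, front_clear, battery_low, beacon_getting_stronger):
        if battery_low and self.mode is not Mode.RETURN:
            self.mode = Mode.RETURN          # head home while power remains
        if self.mode is Mode.EXPLORE:
            if not front_clear:
                self.mode = Mode.WALL_FOLLOW
                self.heading += 90           # turn to follow the obstacle
        elif self.mode is Mode.WALL_FOLLOW:
            if front_clear:
                self.mode = Mode.EXPLORE
                self.heading -= 90           # resume the preferred heading
        elif self.mode is Mode.RETURN:
            if not beacon_getting_stronger:
                self.heading += 30           # search for a stronger signal
        return self.heading % 360
```

In each control cycle the drone only decides how to turn, which is what makes the approach feasible on a 33-gram platform with almost no memory or compute.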

The article that now appeared in Science Robotics is the culmination of our efforts to bring swarms of tiny drones to the real world. In order to reach successful swarm exploration, we needed to tackle “low-level” problems such as how drones can localize one another, estimate their own velocity, and detect and avoid obstacles. Only once these capabilities were available could we focus on having the drones coordinate the exploration and navigate back to the base station.

For more information, see the extensive press release on the TU Delft website.
Link: https://www.tudelft.nl/en/2019/tu-delft/swarm-of-tiny-drones-explores-unknown-environments/

Article information
Minimal navigation solution for a swarm of tiny flying robots to explore an unknown environment
K.N. McGuire, C. De Wagter, K. Tuyls, H.J. Kappen, and G.C.H.E. de Croon
Science Robotics, 23 October 2019
DOI: 10.1126/scirobotics.aaw9710
Link: http://robotics.sciencemag.org/lookup/doi/10.1126/scirobotics.aaw9710

The research team
Kimberly McGuire, Christophe De Wagter, and Guido de Croon (TU Delft)
Karl Tuyls (University of Liverpool)
Bert Kappen (Radboud University Nijmegen)

Financed by the Dutch Research Council (NWO), within the Natural Artificial Intelligence programme.



The world’s smallest autonomous racing drone

Autonomous drone racing

Drone racing by human pilots is becoming a major e-sport. In its wake, autonomous drone racing has become a major challenge for artificial intelligence and control. Over the years, the speed of autonomous race drones has gradually improved. Most autonomous racing drones are equipped with high-performance processors, multiple high-quality cameras, and sometimes even laser scanners. This allows them to use state-of-the-art solutions for visual perception, such as building maps of the environment or accurately tracking how the drone moves over time. However, it also makes the drones relatively heavy and expensive.

At the Micro Air Vehicle Laboratory (MAVLab) of TU Delft, the aim is to make light-weight and cheap autonomous racing drones. Such drones could be used by many drone racing enthusiasts to train with or fly against. If the drone becomes small enough, it could even be used for racing at home.

This 72-gram drone is able to race autonomously

This 72-gram drone is an Eachine “Trashcan” modified by adding a JeVois smart camera. Together with the open-source Paparazzi autopilot running onboard, the camera allows it to race autonomously.


The main innovation underlying this feat is the creation of extremely efficient and yet still robust algorithms. “The wireless images in human drone racing can be very noisy and sometimes not even arrive at all”, says Christophe De Wagter, founder of the MAVLab. “So, human pilots rely heavily on their predictions of how the drone is going to move when they move the sticks on their remote control.”

Although the images of an autonomous drone do not have to be transmitted through the air, the interpretation of the images by small drones can sometimes be completely off. The drone can miss a gate or misjudge its position relative to the gate entirely. For this reason, a prediction model is central to our approach. Since the drone has very little processing power, the model only captures the essentials, such as the thrust and drag forces on the drone frame.


“When scaling down the drone and sensors, the sensor measurements deteriorate in quality, from the camera to the accelerometers”, says Shuo Li, PhD student at the MAVLab on the topic of autonomous drone racing. “Hence, the typical approach of integrating the accelerations measured by the accelerometers is hopeless. Instead, we have only used the estimated drone attitude in our predictive model. We correct the drift of this model over time by relying on the vision measurements.” A new robust state estimation filter was used to combine the noisy vision measurements in the best way with the model predictions.
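The following Python sketch illustrates the general predict-with-a-model, correct-with-vision idea in one dimension. All names, constants, and the simple gating rule are illustrative assumptions; the filter used in our work differs in its details.

```python
import math

class PredictiveFilter:
    """1-D sketch: predict forward position from the drone's attitude via
    a simple thrust/drag model, then correct with vision fixes, rejecting
    gross outliers. Illustrative only, not our actual filter."""

    def __init__(self, x0=0.0, drag=0.5):
        self.x = x0       # position estimate (m)
        self.v = 0.0      # velocity estimate (m/s)
        self.drag = drag  # assumed linear drag coefficient (1/s)

    def predict(self, pitch_rad, dt, g=9.81):
        # In near-hover flight, forward acceleration is roughly
        # g * tan(pitch) minus aerodynamic drag on the frame.
        a = g * math.tan(pitch_rad) - self.drag * self.v
        self.v += a * dt
        self.x += self.v * dt

    def correct(self, x_vision, gate=1.0, alpha=0.3):
        # Gate out vision fixes that disagree wildly with the prediction
        # (e.g. a misdetected gate); nudge the estimate toward the rest.
        innovation = x_vision - self.x
        if abs(innovation) > gate:
            return False
        self.x += alpha * innovation
        return True
```

The key property is that a single wildly wrong gate detection cannot throw the position estimate off, while a stream of reasonable detections steadily corrects the model's drift.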

Racing performance

The drone used the newly developed algorithms to race along a four-gate track in TU Delft’s Cyberzoo. It can fly multiple laps at an average speed of 2 m/s, which is competitive with larger, state-of-the-art autonomous racing drones. Thanks to the central role of gate detection in the drone’s algorithms, it can also cope with displacements of the gates.

“We are currently still far from the speeds obtained by expert human drone racers. The next step will require even better predictive control, state estimation and computer vision”, says Christophe De Wagter. “Efficient algorithms to achieve these capabilities will be essential, as they will allow the drone to sense and react quickly. Moreover, small drones can choose their trajectory more freely, as the racing gates are relatively larger for them.”

Beyond racing

Although racing is a quickly growing e-sport with more and more enthusiasts involved, autonomous racing drones are useful beyond drone racing alone. “For typical drones with four rotors, flying faster also simply means that they are able to cover more area. For some applications, such as search and rescue or package delivery, being quicker will be hugely beneficial”, adds Guido de Croon, scientific leader of the MAVLab. “Our focus on light weight and cheap solutions means that such fast flight capabilities will be available to a large variety of drones.”


For more in-depth information on the algorithms and motivation for our approach, please see our blog post at Robohub.

The scientific article has been uploaded to the ArXiv public repository:
Shuo Li, Erik van der Horst, Philipp Duernay, Christophe De Wagter, and Guido C.H.E. de Croon, “Visual Model-predictive Localization for Computationally Efficient Autonomous Racing of a 72-gram Drone”, arXiv preprint arXiv:1905.10110 (2019). It is available here: https://arxiv.org/abs/1905.10110

Drone racing team 2018-2019

Christophe De Wagter
Guido de Croon
Shuo Li
Philipp Dürnay
Jiahao Lin
Simon Spronk

Science paper on our new DelFly

We have developed a new, highly agile flapping-wing robot, called the DelFly Nimble. It can mimic high-speed insect escape maneuvers so accurately that its motion closely resembles that of fruit flies performing the same maneuvers.

Interestingly, the robot turned around an axis that was not controlled during the escape maneuver: it would turn towards the flight direction. A fruit fly does the same, but it was not clear whether it did this on purpose, since it is not possible to look inside an insect’s brain during flight. For the robot, however, we know exactly what happens inside its “brain”: we fully specify its control system and can log all its sensory inputs and motor actions during flight.

We found that the turn is due to an aerodynamic mechanism. When flying faster, the moments around the robot’s roll and pitch axes become coupled with the yaw axis. We modeled this effect, and the model’s predictions nicely capture both the robot and the fruit fly data.

The article can be found here: A tailless aerial robotic flapper reveals that flies use torque coupling in rapid banked turns, by Matěj Karásek, Florian T. Muijres, Christophe De Wagter, Bart D. W. Remes, and Guido C. H. E. de Croon, in Science, 14 Sep 2018, Vol. 361, Issue 6407, pp. 1089-1094.


We think that this new robot is an important step towards real-world applications of flapping-wing robots. The previous DelFly versions were very sensitive to wind gusts or even strong drafts, such as those from a zealous air-conditioning system. These older designs had airplane-like tails for steering, which was not sufficient in such situations. Steering with the wings is a solution to this problem.

Given the light weight of flapping-wing robots such as the DelFly Nimble (29 grams), they are very safe for flight in indoor spaces and around humans. Applications may include flying in greenhouses to monitor crops or in warehouses to check stock.

See our press release here.

TOP grant on self-supervised learning

The Dutch Science Foundation (NWO) has awarded me a personal grant on the topic of self-supervised learning. The grant, called a TOP grant, is intended for researchers who obtained their PhD at most 10 years ago.

A structured study on self-supervised learning

In the proposal, I put forward self-supervised learning (SSL) as a reliable mechanism for having robots learn in their own environment. In SSL, robots have a baseline perception and behavior module, which allows for initial operation. At the same time, this module provides supervised targets to a learning process that extends the robot’s capabilities. Where previous work has shown the potential of SSL in isolated case studies, I will now perform, together with the PhD student funded by the project, the first structured study of SSL. The key innovation is to extend SSL with novel elements and to identify, and propose solutions to, the fundamental challenges standing in the way of its widespread use.

This image shows a self-supervised learning setup in which a robot with a stereo vision system uses the stereo-based distance estimates to learn how to estimate distances with only one camera as well. After learning, fusing the distances from the stereo algorithm with the monocular distance estimates leads to more reliable distance estimates. This setup is described in an article to be presented at ICRA 2018 in Australia.

In the TOP grant project, we will study fundamental aspects of self-supervised learning, such as the guarantee of a baseline performance and the fusion of baseline and learned perceptual capabilities. If successful, the research will have an impact on many different types of robots, which will be able to significantly improve their perceptual capabilities over time.
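For readers who want the flavor of the mechanism, here is a toy version of the stereo-to-monocular setup in Python. The “feature” stands in for whatever appearance cue a monocular estimator would extract from the image; the linear model, the training data, and all names are illustrative assumptions, not our actual system.

```python
class MonoFromStereo:
    """Toy self-supervised learner: a stereo pipeline supplies distance
    targets that train a monocular estimator online, with no human labels.
    Illustrative sketch only."""

    def __init__(self, lr=0.1):
        self.w, self.b = 0.0, 0.0
        self.lr = lr

    def predict(self, feature):
        return self.w * feature + self.b

    def ssl_update(self, feature, stereo_distance):
        # The stereo estimate acts as the supervised target.
        error = stereo_distance - self.predict(feature)
        self.w += self.lr * error * feature
        self.b += self.lr * error
        return abs(error)

def fuse(d_mono, d_stereo, w_mono=0.3):
    # Simple convex combination; a real system would weight each cue
    # by its estimated reliability.
    return w_mono * d_mono + (1.0 - w_mono) * d_stereo

# After enough "flight", the monocular estimate approaches the stereo target:
learner = MonoFromStereo()
for _ in range(300):
    for f in (0.2, 0.5, 0.8):                # synthetic appearance features
        learner.ssl_update(f, 2.0 * f + 1.0)  # synthetic stereo distances
assert abs(learner.predict(0.5) - 2.0) < 0.1
```

The point of the fusion step is exactly the question the project studies: how to combine the baseline (stereo) and learned (monocular) capabilities while guaranteeing a baseline performance.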

Read more on the NWO web site.


First lecture at the world horti center

Today I gave the first lecture at the World Horti Center (WHC). The WHC is going to be an important international hub for innovations in the horticulture sector. During the lecture, I explained my vision of how swarms of drones will be able to contribute to precision agriculture indoors. Hopefully this is the first step towards having drones contribute to important matters such as reducing water and pesticide use in horticulture.

First lecture at the world horti center

Drone learns “to see” in zero gravity

Over the last few years, I worked together with ESA and MIT on an experiment in which a drone learns to see in zero gravity. Yesterday, ESA presented our joint work at the International Astronautical Congress in Mexico. In the experiment, the drone started out flying based on stereo vision with two cameras, but learned online to also see distances with a single camera. After just a few minutes, the drone was able to predict distances with a single camera, which would allow it to keep navigating if one camera broke down. This type of “self-supervised learning” is very promising for future robot exploration in space.

We have also extensively tested the same learning algorithms on a drone on Earth (see this article), but on Earth it is easier to see distances, since gravity provides a nice reference frame and adds a lot of structure to the environment. In zero gravity, a drone can move with a full 6 degrees of freedom, so it can look at objects in many different ways.

See the news on the ESA page.

First drone day

Yesterday I went to the first Dutch drone festival to talk about (and demonstrate) small autonomous flying robots. This weekend the festival at the Hilversum Media Park continues, for instance with FPV drone racing workshops!

Antoni van Leeuwenhoeklezing

Today I gave a lecture on smart robots at the Delft Science Center, discussing the intelligence of big robots, such as self-driving cars, and of small robots, such as lightweight drones. Great to see such a mix of young and old people interested enough in this topic to sacrifice a sunny Sunday morning for it :)