
Optical flow and a motion model suffice for attitude estimation.

Nature article reveals how flying insects and drones can discern up from down


We have developed a new theory of how flying drones and insects can estimate the direction of gravity. Whereas drones typically use accelerometers to this end, the way in which flying insects do this is shrouded in mystery, since they lack a specific sense for acceleration. In an article published today in Nature, scientists from TU Delft, the Netherlands, and CNRS / Aix-Marseille University, France, have shown that drones can estimate the gravity direction by combining visual motion sensing with a model of how they move. This new approach is an important step towards the creation of autonomous tiny drones, since it requires fewer sensors. Moreover, it provides a hypothesis for how insects control their attitude, since the theory parsimoniously explains multiple phenomena observed in biology.

The importance and difficulty of finding the gravity direction

Successful flight requires knowing the direction of gravity. As ground-bound animals, we humans typically have no trouble determining which way is down. However, this becomes more difficult when flying. Indeed, the passengers in an airplane are normally not aware of the plane being slightly tilted sideways to make a wide circle. When humans first took to the skies, pilots relied purely on visually detecting the horizon line to determine the plane’s “attitude”, that is, its body orientation with respect to gravity. However, when flying through clouds the horizon line is no longer visible, which can lead to an increasingly wrong impression of what is up and down – with potentially disastrous consequences.

Drones and flying insects also need to control their attitude. Drones typically use accelerometers to determine the gravity direction. However, no sensing organ for measuring accelerations has been found in flying insects. Hence, how insects estimate attitude is still a mystery, and some even question whether they estimate attitude at all.

Optic flow suffices for finding attitude

Although it is unknown how flying insects estimate and control their attitude, it is very well known that they visually observe motion by means of “optic flow”. Optic flow captures the relative motion between an observer and its environment. For example, when sitting in a train, trees close by seem to move very fast (have a large optic flow), while mountains in the distance seem to move very slowly (have a small optic flow).
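To make this concrete: for a point seen at a right angle to the direction of travel, the optic flow magnitude is simply the observer’s speed divided by the point’s distance. The snippet below (my own illustration in Python, not something from the article) puts numbers on the train example.

```python
# Translational optic flow of a point seen perpendicular to the direction
# of travel: angular rate (rad/s) = speed / distance.
def translational_optic_flow(speed_m_s, distance_m):
    return speed_m_s / distance_m

# A train at 30 m/s: a tree 10 m from the track vs. a mountain 10 km away.
print(translational_optic_flow(30.0, 10.0))     # 3.0 rad/s: rushes past
print(translational_optic_flow(30.0, 10000.0))  # 0.003 rad/s: barely moves
```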

“This research started with the question of whether optic flow could convey any information on attitude,” says Guido de Croon, Full Professor of Bio-inspired Micro Air Vehicles. “Intuitively, this could be the case. For instance, if a fly stops flapping its wings, gravity will make it accelerate downwards. The resulting downward motion can be picked up by means of optic flow. This can allow the fly to find the gravity direction and hence its attitude.”

Based on this initial intuition, the researchers performed a theoretical analysis, combining the physical formulas expressing optic flow measurements with a model of how an insect or drone will accelerate and rotate due to its own actions and gravity.

“We were surprised by the outcome of the theoretical analysis: it showed that when using a motion model, optic flow alone suffices for determining the gravity direction,” adds Abhishek Chatterjee, who worked on the research as an MSc student at TU Delft. “This is surprising, because optic flow itself captures only rotation rates and not attitude angles. Moreover, we feared that if this approach worked, it would only be valid in specific conditions, such as when a fly stops flapping its wings.”

In fact, the opposite is true. Finding the gravity direction with optic flow works under almost any condition, except for specific cases such as when the observer is completely still. “At first sight, this may still seem like quite a problem, since it implies that in the important condition of hovering flight the gravity direction cannot be found,” notes Guido de Croon. “Normally, engineers would add extra sensors so that attitude can also be determined in hover. However, we hypothesize that nature has simply accepted this problem. In the article we provide a theoretical proof that despite this problem, an attitude controller will still work around hover, at the cost of slight oscillations.”
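To give a flavor of the principle, here is a toy planar filter in Python. It predicts velocity with a motion model driven by the current attitude estimate and corrects the attitude using the mismatch between predicted and measured ventral optic flow. The planar dynamics, the known height and thrust, and the hand-tuned gains are all simplifying assumptions of this sketch; the article’s actual estimator is different.

```python
import numpy as np

g, dt, h = 9.81, 0.01, 10.0      # gravity, time step, known height (assumption)

theta = 0.2                      # true (constant) pitch angle, rad
vx = 0.0                         # true horizontal velocity
theta_hat, vx_hat = 0.0, 0.0     # estimates: attitude starts out wrong
k_theta, k_v = 5.0, 5.0          # feedback gains (hand-tuned for this toy)

for _ in range(2000):
    thrust = g / np.cos(theta)             # thrust that keeps altitude constant
    vx += -thrust * np.sin(theta) * dt     # true dynamics: gravity + thrust
    flow_meas = vx / h                     # measured ventral optic flow (rad/s)

    vx_hat += -thrust * np.sin(theta_hat) * dt   # motion-model prediction
    innovation = flow_meas - vx_hat / h          # flow prediction error

    # The flow mismatch corrects the attitude estimate. (The article's
    # unobservability result concerns hover; this toy sidesteps that issue
    # by assuming a constant attitude and a known thrust.)
    theta_hat -= k_theta * innovation * dt
    vx_hat += k_v * innovation * h * dt

print(f"true pitch: {theta:.3f} rad, estimated: {theta_hat:.3f} rad")
```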

Implications for robotics

The researchers confirmed the theory’s validity with robotic implementations. Most experiments were performed with a quadrotor drone, which flew completely by itself based on optic flow and gyroscopes, without using its accelerometers. As predicted, when controlling attitude based on optic flow, the robot exhibited slight oscillations when hovering. Moreover, experiments were performed with a flapping-wing drone equipped with an artificial insect compound eye. These experiments showed that the flapping motion improved the attitude estimation accuracy.


Flapping wing robot controlling its attitude with the proposed theory. It is equipped with an artificial insect compound eye, which can perceive optic flow at a high frequency. Photo by Christophe de Wagter, TU Delft.

The proposed theory is promising for the field of robotics. “Various research groups strive to design autonomous, insect-sized flying robots,” says Guido de Croon. “Tiny flapping-wing drones can be useful for tasks like search-and-rescue or pollination. Designing such drones means dealing with a major challenge that nature also had to face: how to achieve a fully autonomous system subject to extreme payload restrictions. This makes even tiny accelerometers a considerable burden. Our proposed theory will contribute to the design of tiny drones by allowing for a smaller sensor suite.”

Biological insights

The proposed theory also has the potential to give insight into various biological phenomena. “It was known that optic flow played a role in attitude control, but until now the precise mechanism was unclear,” explains Franck Ruffier, bio-roboticist and director of research at CNRS / Aix-Marseille University. “The proposed theory can explain how flying insects succeed in estimating and controlling their attitude even in difficult, cluttered environments where the horizon line is not visible. It also provides insight into other phenomena, for example, why locusts fly less well when their ocelli (simple eyes on the top of their heads) are occluded with paint.”

Honeybee flying in a tapered tunnel. The narrowing tunnel leads to lower flight speeds and, as the researchers noticed, more oscillations of the honeybee’s body angles. Photo by DGA / François Vrignaud.

Now, attention will turn to verifying that insects indeed use the proposed mechanism for attitude control. The challenge here is that the theory concerns neural processes that are hard to monitor in insects during flight. “One avenue is to study oscillations of the insects’ bodies and heads,” adds Franck Ruffier. “For the article we re-analyzed honeybee data from one of our previous research studies. We found that the honeybees’ body attitude angles varied less at higher flight speeds. Although this is in accordance with the proposed theory, it could also be explained by aerodynamic effects. We expect that novel experiments, specifically designed to test our theory, will be necessary to verify the use of the proposed mechanism in insects.”

Whatever the outcome, the article shows how the synergy between robotics and biology can lead to technological advances and novel avenues for biological research.


Article:

“Accommodating unobservability to control flight attitude with optic flow”, Nature, G.C.H.E. de Croon, J.J.G. Dupeyroux, C. De Wagter, A. Chatterjee, D.A. Olejnik, and F. Ruffier.

DOI: https://doi.org/10.1038/s41586-022-05182-2

URL: https://www.nature.com/articles/s41586-022-05182-2

Video: https://youtu.be/ugY0RTMjH1s


Additional photos and videos:

https://surfdrive.surf.nl/files/index.php/s/ncgesB1ltRVcn8C


Part of this project was funded by the Dutch Science Foundation (NWO) under grant number 15039.


The parsimony of insect intelligence stems in part from their embodiment and capabilities of sensory-motor coordination and swarming.

Insect-inspired AI for autonomous robots


Small autonomous mobile robots, such as drones, rovers, and legged robots, promise to perform a wide range of tasks, from autonomously monitoring crops in greenhouses to last-kilometer delivery. These applications require robots to operate for extended periods while performing complex tasks, often in unknown, changing, and complicated environments.

In an article published in Science Robotics on June 15, researchers from Delft University of Technology, the University of Washington, the University of Sheffield, and Opteran argue that one should draw inspiration from insects when creating the AI for small, autonomous robots. Insect intelligence is characterized by minimalistic yet robust solutions, which insects use to behave successfully in complex, dynamic environments.

In the article, the researchers explain the governing principles that underlie the efficiency and robustness of insect intelligence. They also give an overview of existing robotics research that has leveraged these principles and identify challenges and opportunities ahead. In particular, advances in biology and technology allow for ever more fine-grained investigations of insect brains. Moreover, progress in sensing and computing hardware will enable robots to approach the energy efficiency and speed of insect sensing and neural processing. These developments are accelerating the creation of insect-inspired AI for autonomous robots and have already led to start-ups in this field.

Example Studies

The article published in Science Robotics is a “review” article. Here we provide some examples of the authors’ previous research studies, to give a concrete impression of what insect-inspired AI makes possible.

Swarm of tiny drones is able to localize gas leaks

https://www.youtube.com/watch?v=hj_SBSpK5qg

(Guido de Croon)


AntBot: A walking robot that uses sky polarization for navigation

https://www.youtube.com/watch?v=lVT8qeiASX4

(Julien Dupeyroux)


The first wireless flying robotic insect takes off

https://www.youtube.com/watch?v=7DXuxGErs9k

(Sawyer Fuller)


Sophisticated Collective Foraging with Minimalist Agents: 50-Robot Swarm

https://www.youtube.com/watch?v=TqnpoldQKFI

(James Marshall)


Moreover, one of the authors, James Marshall, is co-founder and Chief Scientific Officer of Opteran – a company that focuses on bringing insect AI to autonomous robots:

Opteran technologies

Intelligence is natural: Capturing biological systems to develop hyper efficient and robust software capable of development on low-end silicon

https://opteran.com/

(James Marshall)


The published article:

Insect-inspired AI for autonomous robots, G.C.H.E. de Croon, J.J.G. Dupeyroux, S.B. Fuller, J.A.R. Marshall, Science Robotics, June 15, 2022.

DOI: 10.1126/scirobotics.abl6334

URL: http://www.science.org/doi/10.1126/scirobotics.abl6334

Featured cover image by Myrtille La Lumia and Julien Dupeyroux.

Drone flying autonomously in a greenhouse to monitor the crop.

Self-flying drones that monitor greenhouse diseases and pests


We have teamed up with Royal Brinkman and start-up Mapture to develop AI and drone technology for greenhouse monitoring. These lightweight drones are able to take off, navigate without GPS, collect critical data, and land in a box, fully autonomously. With the collected data, growers can monitor the health and growth of plants and detect diseases and pests at a very early stage, containing potential waste. This drone-technology innovation contributes to a sustainable future for precision agriculture.

Read more about our story in this TU Delft news release: https://www.tudelft.nl/en/2022/tu-delft/self-flying-drones-that-monitor-greenhouse-diseases-and-pests

Swarm of tiny drones able to localize gas leaks

Swarm of autonomous tiny drones can localize gas leaks


A gas leak in a large building or at an industrial site is difficult to find. Human firefighters cannot see the gas, so they have to use specific instruments for detecting it. Finding the gas leak may take a long time, while the firefighters are risking their lives.

We have developed a swarm of tiny drones that can autonomously localize gas leaks in indoor environments. The main challenge was to design an artificial intelligence that would fit within the drones’ tight computational and memory constraints. To tackle this challenge, we drew inspiration from nature. This was a joint study with researchers from the University of Barcelona and Harvard University.

Scientific article:

Sniffy Bug: A Fully Autonomous Swarm of Gas-Seeking Nano Quadcopters in Cluttered Environments, by B.P. Duisterhof, S. Li, J. Burgués, V.J. Reddi, and G.C.H.E. de Croon, accepted at the IEEE/RSJ International Conference on Intelligent Robots and Systems 2021 (IROS 2021). – Preprint available on arXiv: article

Video Playlist: playlist showing how the tiny drones can localize gas leaks in indoor environments.

Photos for use by media: photos

Contact: Prof. dr. Guido de Croon, email: g.c.h.e.decroon [at] tudelft.nl , telephone: +31152781402

Drone avoiding obstacles with the help of optical flow. The drone also evaluates the distances to obstacles it sees, so that it can detect obstacles in the flight direction and safely speed up.

Optical flow problems solved by a learning process


Optical flow for small flying robots

Flying insects heavily rely on optical flow for visual navigation and flight control. Roboticists have endowed small flying robots with optical flow control as well, since it requires just a tiny vision sensor. However, when using optical flow, the robots run into problems that insects appear to have overcome.



Fundamental problems of optical flow

Today, Nature Machine Intelligence published our article (featured on the cover), in which we propose a solution to two fundamental problems of optical flow. The first problem is that optical flow only provides mixed information on distances and velocities. This means that using it directly for control leads to oscillations when getting closer to obstacles. The second problem is that optical flow provides very little information on obstacles in the direction of motion. This means that the hardest obstacles to detect are the ones the robot is actually going to collide with!
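The first problem can be stated in one line: the optic flow experienced when approaching a surface is the ratio of velocity to distance, so very different situations produce identical flow. A tiny illustration (mine, not from the article):

```python
# Two very different approach situations yield the same optic flow
# divergence (velocity / distance), so flow alone cannot tell them apart.
for v, d in [(1.0, 10.0), (0.1, 1.0)]:
    print(f"{v} m/s at {d} m -> divergence {v / d:.2f} 1/s")
```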

Drone landing with the help of optical flow. The drone evaluates the distance to the landing surface in order to set the right parameters for the optical flow control.

Learning process as solution

We tackled these problems with the help of a learning process. Specifically, the robot exploits self-induced oscillations to learn what the objects in its environment look like at different distances. In this way, it can, for example, learn how fine the texture of grass looks from different heights during landing. For obstacle avoidance, it can learn how thick tree trunks appear at different distances when navigating in a forest.
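The sketch below captures the structure of this scheme; the single “texture coarseness” feature and the nearest-neighbor learner are stand-ins chosen for brevity, not the features and learner used in the article. Distance labels obtained during self-induced oscillations train a model that afterwards predicts distance from appearance alone.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

def texture_coarseness(distance_m):
    """Stand-in appearance cue: texture looks coarser (larger) up close."""
    return 1.0 / distance_m + rng.normal(0.0, 0.01)

# 1) Self-supervised data collection during induced oscillations:
#    optic-flow-derived distances serve as free training labels.
distances = rng.uniform(0.5, 10.0, size=200)
features = np.array([[texture_coarseness(d)] for d in distances])

# 2) Learn the appearance -> distance mapping.
model = KNeighborsRegressor(n_neighbors=5).fit(features, distances)

# 3) After learning: estimate distance from appearance alone.
print(model.predict([[texture_coarseness(2.0)]]))  # approximately 2 m
```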

After learning, drones can adapt their control to the perceived distance. This results in faster and smoother optical flow landings. It also allows them to detect obstacles in the direction of flight. This results in faster and safer flight in cluttered environments.

Drone avoiding obstacles with the help of optical flow. The drone also evaluates the distances to obstacles it sees, so that it can detect obstacles in the flight direction and safely speed up.

Hypothesis on insect intelligence

The findings are not only relevant to robotics, but also provide a new hypothesis for insect intelligence. Insect flight control for tasks such as landing or obstacle avoidance is typically described in terms of pre-wired optical flow control laws. Our findings suggest that flying insects may learn the visual appearance of their environment to improve their flight and navigation skills over their lifetime.

More information

The article: Enhancing optical-flow-based control by learning visual appearance cues for flying robots. G.C.H.E. de Croon, C. De Wagter, and T. Seidl, Nature Machine Intelligence 3(1), 2021.

TU Delft press release

Finally, I would like to thank Knock Knock Studios for the great video, and Sarah Gluschitz for the great image design for the cover.


We won the AIRR autonomous drone race!


On December 6, 2019, the MAVLab team won the AI Robotic Racing (AIRR) world championship, taking home 1 million dollars in prize money!

The AIRR competition was an autonomous drone race season set up by Lockheed Martin and the Drone Racing League, aiming to push the boundaries of AI for robotics. Whereas autonomous drones typically fly rather slowly, the goal of AIRR was to have drones compete against each other on a ~80-meter racing track and finish it as fast as possible. High-speed autonomous flight is extremely challenging, especially on real-world tracks with very limited testing time. We won the world championship race in a thrilling match, finishing the track in 12 seconds, just 3 seconds faster than the runner-up, the Robotics and Perception Group from the University of Zurich. As the winner, our team also had to compete against one of the best human drone race pilots, “Gab707”. He still beat our drone fair and square, finishing the track in 7 seconds.

There is much more information on the AIRR competition, the world championship race, and our approach in the university’s press release and on our lab’s website.

Below, please see the video of our winning 12-second run.

A swarm of tiny drones autonomously explores an unknown environment.

Swarm of tiny drones autonomously explores unknown environments


We have succeeded in making a swarm of tiny drones that can autonomously explore unknown environments. This achievement, published in Science Robotics on October 23, is the result of a 4-year collaboration with researchers from the University of Liverpool and Radboud University Nijmegen.

The main challenge was that the tiny 33-gram drones need to navigate autonomously with extremely limited sensing and computational capabilities. The proposed solution draws inspiration from the relative simplicity of insect navigation. Instead of building highly detailed 3D maps as most autonomous robots do, our drones use a novel “bug algorithm” for navigation. This algorithm makes the drones react to obstacles on the fly, as illustrated by the sketch below. It allows the drones to first fly away from the base station to explore the environment and, when their batteries start running low, to come back to a wireless beacon at that same location.
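For readers curious about what a bug algorithm boils down to, here is a deliberately stripped-down controller step in Python. This is a generic sketch: the swarm algorithm in the article additionally uses the wireless beacon’s signal strength and coordination between the drones, which are omitted here.

```python
def bug_step(pos, goal, blocked):
    """One bug-style control step: head for the goal; when an obstacle
    blocks the way, deflect along it instead of planning on a map.

    pos, goal: (x, y) tuples; blocked(pos, direction) -> True if an
    obstacle lies immediately ahead in that direction.
    """
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    norm = max((dx * dx + dy * dy) ** 0.5, 1e-9)
    desired = (dx / norm, dy / norm)
    if not blocked(pos, desired):
        return desired                    # free path: fly toward the goal
    right = (desired[1], -desired[0])     # otherwise follow the contour
    return right if not blocked(pos, right) else (-desired[0], -desired[1])
```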

The article that has now appeared in Science Robotics is the culmination of our efforts to bring swarms of tiny drones to the real world. To reach successful swarm exploration, we needed to tackle “low-level” problems, such as how drones can localize other drones, estimate their own velocity, and detect and avoid obstacles. Only when these capabilities were available could we focus on having the drones coordinate the exploration and navigate back to the base station.

For more information, see the extensive press release on the TU Delft website.
Link: https://www.tudelft.nl/en/2019/tu-delft/swarm-of-tiny-drones-explores-unknown-environments/

Article information
Minimal navigation solution for a swarm of tiny flying robots to explore an unknown environment
K.N. McGuire, C. De Wagter, K. Tuyls, H.J. Kappen, and G.C.H.E. de Croon
Science Robotics, 23 October 2019
DOI: 10.1126/scirobotics.aaw9710
Link: http://robotics.sciencemag.org/lookup/doi/10.1126/scirobotics.aaw9710

The research team
Kimberly McGuire, Christophe De Wagter, and Guido de Croon (TU Delft)
Karl Tuyls (University of Liverpool)
Bert Kappen (Radboud University Nijmegen)

Financed by the Dutch Research Council (NWO), within the Natural Artificial Intelligence programme.

Videos
YouTube playlist:
https://www.youtube.com/playlist?list=PL_KSX9GOn2P9okvQGaGMP7KdXmF-7_iSV
Additional videos:
https://surfdrive.surf.nl/files/index.php/s/gt1kdiOZI8VeaRj

Photos
https://surfdrive.surf.nl/files/index.php/s/EKAwnvW2eALbvfI

This 72-gram drone is able to race autonomously

The world’s smallest autonomous racing drone


Autonomous drone racing

Drone racing by human pilots is becoming a major e-sport. In its wake, autonomous drone racing has become a major challenge for artificial intelligence and control. Over the years, the speed of autonomous race drones has gradually improved. Most autonomous racing drones are equipped with high-performance processors, multiple high-quality cameras, and sometimes even laser scanners. This allows them to use state-of-the-art solutions to visual perception, like building maps of the environment or accurately tracking how the drone moves over time. However, it also makes the drones relatively heavy and expensive.

At the Micro Air Vehicle Laboratory (MAVLab) of TU Delft, the aim is to make lightweight and cheap autonomous racing drones. Such drones could be used by many drone racing enthusiasts to train with or fly against. If the drone becomes small enough, it could even be used for racing at home.


This 72-gram drone is an Eachine “Trashcan” modified by adding a JeVois smart camera. The camera and the onboard open-source Paparazzi autopilot allow it to race autonomously.

Algorithms

The main innovation underlying this feat is the creation of extremely efficient and yet still robust algorithms. “The wireless images in human drone racing can be very noisy and sometimes do not arrive at all,” says Christophe De Wagter, founder of the MAVLab. “So, human pilots rely heavily on their predictions of how the drone is going to move when they move the sticks on their remote control.”

Although the images of an autonomous drone do not have to be transmitted through the air, the interpretation of the images by small drones can sometimes be completely off. The drone can miss a gate or completely misjudge its position relative to the gate. For this reason, a prediction model is central to the approach. Since the drone has very little processing power, the model only captures the essentials, such as the thrust and drag forces on the drone frame.

Sensors

“When scaling down the drone and sensors, the sensor measurements deteriorate in quality, from the camera to the accelerometers,” says Shuo Li, PhD student at the MAVLab working on autonomous drone racing. “Hence, the typical approach of integrating the accelerations measured by the accelerometers is hopeless. Instead, we have only used the estimated drone attitude in our predictive model. We correct the drift of this model over time by relying on the vision measurements.” A new, robust state-estimation filter was used to optimally combine the noisy vision measurements with the model predictions.
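As a rough illustration of this predict/correct loop (my own sketch with invented numbers, not the MAVLab code): between gate detections, the position estimate is propagated with a simple thrust-and-drag model driven by the estimated attitude, and each vision measurement then pulls the estimate back.

```python
import math

g, dt, drag = 9.81, 0.02, 0.5              # gravity, time step, drag coefficient
pitch = 0.1                                # assumed attitude estimate (rad)

x, vx = 0.0, 0.0                           # "true" state, only to generate data
x_hat, vx_hat = 0.0, -1.0                  # estimate starts with a velocity error

for step in range(500):                    # 10 seconds of flight at 50 Hz
    ax = g * math.tan(pitch) - drag * vx   # thrust/drag model (true dynamics)
    vx += ax * dt
    x += vx * dt

    ax_hat = g * math.tan(pitch) - drag * vx_hat   # same model for prediction
    vx_hat += ax_hat * dt
    x_hat += vx_hat * dt

    if step % 25 == 24:                    # a vision-based gate detection (2 Hz)
        x_meas = x + 0.2                   # stand-in measurement with an error
        x_hat += 0.5 * (x_meas - x_hat)    # correction pulls the estimate back

print(f"true: {x:.1f} m, estimated: {x_hat:.1f} m")
```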

Racing performance

The drone used the newly developed algorithms to race along a 4-gate race track in TU Delft’s Cyberzoo. It can fly multiple laps at an average speed of 2 m/s, which is competitive with larger, state-of-the-art autonomous racing drones. Thanks to the central role of gate detection in the drone’s algorithms, the drone can cope with displacements of the gates.

“We are currently still far from the speeds obtained by expert human drone racers. The next step will require even better predictive control, state estimation and computer vision”, says Christophe De Wagter. “Efficient algorithms to achieve these capabilities will be essential, as they will allow the drone to sense and react quickly. Moreover, small drones can choose their trajectory more freely, as the racing gates are relatively larger for them.”

Beyond racing

Although racing is a quickly growing e-sport with more and more enthusiasts involved, autonomous racing drones are useful beyond drone racing alone. “For typical drones with four rotors, flying faster also simply means that they are able to cover more area. For some applications, such as search and rescue or package delivery, being quicker will be hugely beneficial”, adds Guido de Croon, scientific leader of the MAVLab. “Our focus on light weight and cheap solutions means that such fast flight capabilities will be available to a large variety of drones.”

Article

For more in-depth information on the algorithms and motivation for our approach, please see our blog post at Robohub.

The scientific article has been uploaded to the arXiv public repository:
Shuo Li, Erik van der Horst, Philipp Duernay, Christophe De Wagter, Guido C.H.E. de Croon, “Visual Model-predictive Localization for Computationally Efficient Autonomous Racing of a 72-gram Drone”, arXiv preprint arXiv:1905.10110 (2019). It is available here: https://arxiv.org/abs/1905.10110

Drone racing team 2018-2019

Christophe De Wagter
Guido de Croon
Shuo Li
Philipp Dürnay
Jiahao Lin
Simon Spronk

DelFly Nimble

Science paper on our new DelFly


We have developed a new, highly agile flapping-wing robot, called the DelFly Nimble. It can mimic high-speed insect escape maneuvers so accurately that its motion closely matches that of fruit flies performing the same maneuvers.

Interestingly, the robot turned around an axis that was not controlled during the escape maneuver; in particular, it turned towards the flight direction. A fruit fly does the same, but it was unclear whether it does this on purpose, since it is not possible to look inside an insect’s brain during flight. For the robot, however, we know exactly what happens inside its “brain”: we completely determine its control system and can log all its senses and motor actions during flight.

We found that the turn is due to an aerodynamic mechanism. When flying faster, the moments around the robot’s roll and pitch axes become coupled with the yaw axis. We modeled this effect, and the predictions of the model nicely capture both the robot and the fruit fly data.

The article can be found here: A tailless aerial robotic flapper reveals that flies use torque coupling in rapid banked turns, by Matěj Karásek, Florian T. Muijres, Christophe De Wagter, Bart D. W. Remes, and Guido C. H. E. de Croon, in Science, 14 Sep 2018, Vol. 361, Issue 6407, pp. 1089-1094.

Applications


We think that this new robot is an important step towards real-world applications of flapping-wing robots. The previous DelFly versions were very sensitive to wind gusts or even strong drafts, such as from a zealous air-conditioning system. These older designs steered with an airplane-like tail, which was not sufficient in such situations. Steering with the wings is a solution to this problem.

Given the light weight of flapping-wing robots such as the DelFly Nimble (29 grams), they are very safe for flight in indoor spaces and around humans. Applications may include having them fly in greenhouses to monitor crops or in warehouses to check stock.

See our press release here.

self-supervised learning of depth estimation

TOP grant on self-supervised learning


The Dutch Science Foundation (NWO) has awarded me a personal grant on the topic of self-supervised learning. The grant, named TOP grant, is intended for researchers who obtained their PhD at most 10 years ago.

A structured study on self-supervised learning

In the proposal, I put forward self-supervised learning (SSL) as a reliable mechanism for having robots learn in their own environment. In SSL, robots have a baseline perception and behavior module, which allows for initial operation. At the same time, this module provides supervised targets to a learning process that extends the robot’s capabilities. Where previous work has shown the potential of SSL in isolated case studies, I will now perform – together with the PhD student funded by the project – the first structured study of SSL. The key innovation is to extend SSL with novel elements and to identify, and propose solutions to, the fundamental challenges standing in the way of its widespread use.

This image shows a self-supervised learning setup in which a robot with a stereo vision system uses the stereo-based distance estimates to learn how to estimate distances with only one camera. After learning, fusing the distances from the stereo algorithm with the monocular distance estimates leads to more reliable distance estimates. This setup is described in an article to be presented at ICRA 2018, in Australia.
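Below is a schematic of this setup in Python, with an invented monocular cue and learner (the real system is described in the ICRA 2018 article): stereo distances act as free training labels for the monocular estimator, after which the two estimates can be fused.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)

# Stereo geometry provides the labels: distance = focal * baseline / disparity.
f_px, baseline_m = 300.0, 0.06
disparity_px = rng.uniform(5.0, 60.0, size=500)
stereo_dist = f_px * baseline_m / disparity_px       # supervised targets (m)

# Invented monocular cue that correlates with distance (e.g. texture density).
mono_cue = 1.0 / stereo_dist + rng.normal(0.0, 0.02, size=500)

# Learn the monocular cue -> distance mapping from the stereo labels.
model = KNeighborsRegressor(n_neighbors=7).fit(mono_cue.reshape(-1, 1),
                                               stereo_dist)

# After learning, fuse stereo and monocular estimates (equal weights here).
d_stereo = 1.5
d_mono = model.predict([[1.0 / d_stereo + rng.normal(0.0, 0.02)]])[0]
print(f"fused distance estimate: {0.5 * (d_stereo + d_mono):.2f} m")
```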

In the TOP grant project, we will study fundamental aspects of self-supervised learning, such as guaranteeing a baseline performance and fusing baseline and learned perceptual capabilities. If successful, the research will have an impact on many different types of robots, which will be able to significantly improve their perceptual capabilities over time.

Read more on the NWO website.