Animal Brain-Inspired AI Game Changer for Autonomous Robots


A team of researchers from Delft University of Technology has developed a drone that flies autonomously using neuromorphic image processing and control inspired by the workings of animal brains. Animal brains use less data and energy than current deep neural networks running on GPUs (graphics chips). Neuromorphic processors are therefore well suited to small drones, because they do not require heavy, bulky hardware and batteries. The results are extraordinary: in flight, the drone's deep neural network processes data up to 64 times faster and consumes three times less energy than when running on a GPU. Further advances in this technology could enable drones to become as small, agile, and smart as flying insects or birds. The results were recently published in Science Robotics.

Learning from animal brains: spiking neural networks

Artificial intelligence has great potential to provide autonomous robots with the intelligence needed for real-world applications. However, current AI relies on deep neural networks that require considerable computing power. The processors designed to run deep neural networks (graphics processing units, GPUs) consume significant amounts of energy. This is especially a problem for small robots such as flying drones, which can only carry very limited resources for sensing and computing.

Animal brains process information in a way that is very different from neural networks running on GPUs. Biological neurons process information asynchronously and communicate mostly through electrical pulses called spikes. Because sending such spikes costs energy, the brain minimizes spiking, leading to sparse processing.

Inspired by these characteristics of animal brains, scientists and tech companies are developing new, neuromorphic processors. These processors allow spiking neural networks to run and promise to be much faster and more energy efficient.

“Computations performed by spiking neural networks are much simpler than those in standard deep neural networks,” says Jesse Hagenaars, PhD candidate and one of the authors of the article. “Whereas digital spiking neurons only need to add integers, standard neurons have to multiply and add floating-point numbers. This makes spiking neural networks quicker and more energy efficient. To see why, consider that humans also find it much easier to calculate 5 + 8 than 6.25 x 3.45 + 4.05 x 3.45.”
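The arithmetic difference described in the quote can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: it assumes a simple integrate-and-fire style digital spiking neuron with integer weights, contrasted with a standard neuron's floating-point multiply-accumulate.

```python
def standard_neuron(inputs, weights):
    """Standard artificial neuron: one floating-point multiply
    plus one add per input, every time it is evaluated."""
    return sum(x * w for x, w in zip(inputs, weights))

def spiking_neuron_step(spikes, weights, potential, threshold=10):
    """Digital spiking neuron (toy integrate-and-fire): adds an
    integer weight only where an input spiked (additions only,
    no multiplications), then compares against a threshold."""
    potential += sum(w for s, w in zip(spikes, weights) if s)
    if potential >= threshold:
        return 1, 0          # emit a spike and reset the potential
    return 0, potential      # stay silent and keep accumulating

# A silent input (no spikes) costs essentially no arithmetic,
# which is where the sparse, energy-efficient processing comes from.
out, v = spiking_neuron_step([0, 0, 0], [4, 2, 5], potential=0)
```

The key point of the sketch is that the spiking path performs only integer additions, and does nothing at all when no spikes arrive.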

This energy efficiency is further enhanced if neuromorphic processors are used with neuromorphic sensors such as neuromorphic cameras. Such cameras do not take pictures at a fixed time interval. Instead, each pixel only sends a signal when it gets brighter or darker. The advantages of such cameras are that they can detect motion more quickly, are more energy efficient, and work well in both dark and bright environments. Additionally, signals from neuromorphic cameras can feed directly into spiking neural networks running on neuromorphic processors. Together, they could be a huge enabler for autonomous robots, especially small, agile robots like flying drones.
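The per-pixel behavior of such a camera can be illustrated with a short sketch. This is a simplified model, not any specific camera's firmware: it assumes events are generated when the log-brightness of a pixel changes by more than a contrast threshold, which is the standard event-camera principle.

```python
import math

def brightness_events(prev, curr, threshold=0.2):
    """Emit (pixel_index, polarity) events where log-brightness
    changed by more than a contrast threshold. Unchanged pixels
    produce no data at all, which is why event cameras are fast
    and energy efficient, and why their output is sparse."""
    events = []
    for i, (p, c) in enumerate(zip(prev, curr)):
        dlog = math.log(c) - math.log(p)
        if dlog >= threshold:
            events.append((i, +1))   # pixel got brighter: ON event
        elif dlog <= -threshold:
            events.append((i, -1))   # pixel got darker: OFF event
    return events

# Only the two pixels that changed produce events; the static
# parts of the scene stay completely silent.
evts = brightness_events([1.0, 1.0, 1.0, 1.0], [1.5, 1.0, 0.6, 1.0])
```

Because the output is already a sparse stream of discrete events, it maps naturally onto the input spikes of a spiking neural network.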

The first neuromorphic vision and control of flying drones

In an article published in Science Robotics on May 15, 2024, researchers at Delft University of Technology in the Netherlands demonstrated for the first time a drone that uses neuromorphic vision and control for autonomous flight. Specifically, they developed a spiking neural network that processes signals from a neuromorphic camera and outputs control commands determining the drone's pose and thrust. They deployed this network on Intel's Loihi neuromorphic research chip on board the drone. Thanks to the network, the drone can perceive and control its own motion in all directions.

Federico Paredes-Vallés, one of the researchers who worked on the study, says: “We faced many challenges, but the hardest was to imagine how we could train a spiking neural network so that training would be fast enough and the trained network would work well on the real robot. In the end, we designed a network consisting of two modules. The first module learns to visually perceive motion from the signals of a moving neuromorphic camera. It does this completely by itself, in a self-supervised way, based only on the camera's data. This is similar to how animals learn to perceive the world by themselves. The second module learns, in a simulator, to map the estimated motion to control commands. This learning relied on artificial evolution, in which networks that were better at controlling the drone were more likely to produce offspring. Over generations of artificial evolution, the spiking neural networks got better and better at control, and were eventually able to fly in any direction at different speeds. We trained both modules and developed a way to merge them together. We were pleased to see that the merged network immediately worked well on the real robot.”
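The artificial-evolution idea for the control module can be sketched with a toy loop. This is a minimal illustration of the selection principle described in the quote, not the authors' training code: the genome, population size, mutation scale, and fitness function below are all hypothetical.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=30, seed=0):
    """Toy evolutionary loop: parameter vectors ('networks') that
    score higher on the fitness function are more likely to produce
    mutated offspring in the next generation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 4]       # keep the fittest quarter
        pop = [[g + rng.gauss(0, 0.1)           # mutate a chosen parent
                for g in rng.choice(parents)]
               for _ in range(pop_size)]
    return max(pop, key=fitness)

# Hypothetical fitness: controllers whose parameters are closer to
# some target vector score higher (stands in for "flies better").
target = [0.5] * 8
best = evolve(lambda g: -sum((x - t) ** 2 for x, t in zip(g, target)))
```

In the real study the fitness would come from simulated flight performance; the loop structure (evaluate, select, mutate, repeat) is what the quote describes.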

With its neuromorphic vision and control, the drone is able to fly at different speeds under varying lighting conditions, from dark to bright. It can even cope with flashing lights, which cause the neuromorphic camera's pixels to send large numbers of signals to the network that have nothing to do with motion.

Improved energy efficiency and speed through neuromorphic AI

“Importantly, our measurements confirm the potential of neuromorphic AI. The network runs between 274 and 1,600 times per second. If we run the same network on a small, embedded GPU, it runs on average only 25 times per second, a difference of a factor of roughly 10 to 64. Moreover, when running the network, Intel's Loihi neuromorphic research chip consumes 1.007 watts, of which 1 watt is idle power, so only 7 milliwatts are spent on running the network. In contrast, when running the same network, the embedded GPU consumes 3 watts, of which 1 watt is idle power and 2 watts are spent on running the network. The neuromorphic approach results in AI that runs faster and more efficiently, allowing deployment on much smaller autonomous robots,” says Stef Stroobants, a PhD candidate in the field of neuromorphic drones.
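A quick back-of-the-envelope check shows how the quoted figures fit together. All values below come from the quote itself; only the variable names are ours.

```python
# Execution rates quoted in the text (network evaluations per second).
loihi_hz_min, loihi_hz_max = 274, 1600   # on Intel's Loihi chip
gpu_hz = 25                              # same network on the embedded GPU

speedup_min = loihi_hz_min / gpu_hz      # roughly 11x
speedup_max = loihi_hz_max / gpu_hz      # exactly 64x

# Power figures quoted in the text (watts).
loihi_total_w, loihi_idle_w = 1.007, 1.0
gpu_total_w, gpu_idle_w = 3.0, 1.0

# Subtracting idle power gives what each device spends on the network:
loihi_net_w = loihi_total_w - loihi_idle_w   # about 0.007 W (7 mW)
gpu_net_w = gpu_total_w - gpu_idle_w         # 2.0 W
```

The 10-to-64x speed range and the roughly 3x total-power difference in the quote follow directly from these numbers.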

Future applications of neuromorphic AI for miniature robots

“Neuromorphic AI will enable all autonomous robots to become more intelligent, but it is an absolute enabler for tiny autonomous robots,” says Guido de Croon, professor of bio-inspired drones at Delft University of Technology's Faculty of Aerospace Engineering. “At our lab, we work on tiny autonomous drones that can be used for applications ranging from crop monitoring in greenhouses to stock tracking in warehouses. Tiny drones can fly between rows of plants, and they can be so cheap that they can be deployed in swarms, which is useful for covering an area more quickly, as we have shown in exploration and gas-source localization settings.”

“The present work is a great step in this direction. However, realizing these applications will depend on further miniaturizing the neuromorphic hardware and expanding its capabilities toward more complex tasks such as navigation.”

