AI Unleashed: Revolutionizing Autonomous Drone Navigation


University of Missouri researchers are advancing drone autonomy using AI, focusing on navigation and environmental interaction without reliance on GPS. Credit: SciTechDaily.com

A pioneering project led by researchers at the University of Missouri aims to equip drones with autonomous visual navigation capabilities, potentially changing the way drones operate and assist in critical situations such as natural disasters.

AI algorithms are being developed to allow drones to navigate autonomously and perform complex tasks, particularly in GPS-denied environments, drawing on advances in sensor technology and high-performance computing.

University of Missouri students spent a month at Arizona's Yuma Proving Ground, one of the world's largest military installations, collecting visible and infrared video data using custom-built drones. Their work helped form the basis of a two-year project supported by the U.S. Army Engineer Research and Development Center. Credit: U.S. Department of Defense

AI-Powered Drone Navigation

Thanks to the power of smart artificial intelligence (AI) algorithms, drones could one day pilot themselves, no humans needed, using visual cues to help them navigate from one location to another. That's the goal of a two-year project led by University of Missouri researchers and supported by a $3.3 million grant from the U.S. Army Engineer Research and Development Center (ERDC), an arm of the U.S. Army Corps of Engineers.

Autonomy in critical situations

The ability to operate autonomously becomes critical in situations where GPS navigation is interrupted or the signal is lost, such as after a natural disaster or in military operations, according to investigators on the project.

“GPS signal loss usually occurs after natural disasters, as a result of disruptions to the built environment and terrain, or from human intervention,” Palaniappan said. “Most drones in operation today require GPS navigation to fly, so when they lose that signal, they can't find their way and usually end up landing wherever they are. Unlike ground-based GPS navigation apps, which can reroute you if you miss a turn, there is currently no rerouting option for airborne drones in these situations.”


Augmenting Drones with Smart Technology

Currently, a pilot must fly a drone manually with a high level of situational awareness, keeping it away from surrounding obstacles such as buildings, trees, mountains, bridges and other prominent structures while staying within line of sight. Now, through a combination of visual sensors and algorithms, Palaniappan and his team are developing software that will allow drones to fly autonomously, independently sensing and interacting with their environment while pursuing specific goals or objectives.

Kannappan Palaniappan. Credit: University of Missouri-Columbia

“We want to take the skills, attributes, contextual knowledge, mission planning and other capabilities that drone pilots have and incorporate them, along with awareness of weather conditions, into the drone's software so that it can make all of these decisions independently,” Palaniappan said.

Advancing the intelligent scene concept

In recent years, advances in visual sensor technology, such as light detection and ranging (lidar) and thermal imaging, have allowed drones to perform limited advanced tasks such as object detection and visual recognition. When combined with the team's algorithms, powered by deep learning and machine learning, a subset of AI, these sensors can help drones produce advanced 3D or 4D imagery for mapping and monitoring applications.
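The article does not detail the team's algorithms, but the basic step behind turning depth sensing into 3D mapping can be illustrated with a minimal, hypothetical sketch: back-projecting a depth image into a 3D point cloud using the standard pinhole camera model. All parameter values below are made up for illustration and are not from the MU project.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud
    using the pinhole camera model: x = (u - cx) * z / fx, etc."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # One 3D point per pixel, flattened to an (N, 3) array.
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Tiny synthetic example: a flat wall 2 m in front of a 4x4 sensor.
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=2.0, fy=2.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

In a real mapping pipeline, clouds like this from successive frames would be registered into a common coordinate system to build the kind of 3D model the article describes.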

“As humans, we have been using our visual system to build dynamic knowledge of 3D models and movement patterns in our surroundings since we were young,” Palaniappan said. “Now, we are trying to decode the salient features of the human visual system and build these capabilities into autonomous vision-based aerial and terrestrial navigation algorithms.”

Overcoming technical limitations

Developing sophisticated imaging capabilities requires computational resources such as processing power, memory and time, which exceed what the software systems typically available on drones can provide. The MU-led team is therefore investigating how to leverage cloud, high-performance and edge computing approaches for potential solutions.

“After a severe storm or natural disaster, buildings, waterways and other types of infrastructure will be damaged,” Palaniappan said. “A 3D reconstruction of an area can help government officials and first responders understand the extent of the damage. By letting drones collect raw data and transmit that information to the cloud, high-performance computing software running in the cloud can complete the analysis and generate a 3D digital twin model, without any additional software needing to be physically installed and accessed on the drone.”

The MU team includes Prasad Calyam, Filiz Bunyak and Joshua Fraser. The team also includes researchers from Saint Louis University, the University of California, Berkeley, and the University of Florida.
