Ant insights lead to robot navigation breakthrough

In a recent study published in Science Robotics, researchers at TU Delft have developed an ant-inspired autonomous navigation strategy for tiny, lightweight robots. The approach allows the robots to return home after long journeys while requiring minimal computation and memory: just 0.65 kilobytes per 100 meters of travel.

Scientists have long marveled at ants’ remarkable navigational skills, despite their relatively simple sensory and neural systems. Previous research, such as a study conducted at the Universities of Edinburgh and Sheffield, produced an artificial neural network that, by mimicking ants’ navigational prowess, helps robots recognize and remember routes in complex natural environments.

In the recent study, the researchers focused on tiny robots, weighing from a few tens to a few hundred grams, which have enormous potential for various applications. Their lightweight design ensures safety even if they accidentally collide with something, and their small size allows them to maneuver easily in tight spaces. Furthermore, if they can be produced cheaply, such robots could be deployed in large numbers, quickly covering large areas such as greenhouses to detect pests or plant diseases early.

However, enabling these tiny robots to operate autonomously poses significant challenges due to their limited resources compared to larger robots. A major hurdle is their ability to navigate independently. While robots can utilize external infrastructure like GPS satellites outdoors or wireless communication beacons indoors, relying on such infrastructure is often undesirable. GPS signals are unavailable indoors and can be inaccurate in cluttered environments like urban areas. Installing and maintaining beacons can be expensive or impractical, especially in search-and-rescue scenarios.

To overcome these challenges, the researchers turned to nature. Insects, particularly ants, operate over distances relevant to many real-world applications while using minimal sensing and computing resources. They combine odometry (tracking their own motion) with visually guided behaviors that rely on view memory: stored views from their low-resolution yet omnidirectional visual system. This combination has inspired researchers to develop new navigation systems.
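To make the odometry half of that combination concrete, here is a minimal dead-reckoning sketch in Python. Everything in it (the class name, the update rule) is a simplified assumption for exposition, not code from the study; real platforms fuse optical-flow and inertial measurements, and the estimate inevitably drifts as distance accumulates.

```python
import math

class DeadReckoning:
    """Minimal odometry sketch: integrate velocity over time.

    Illustrative only -- a real drone fuses optical flow, IMU, and
    heading sensors. The key property is that the position error
    (drift) grows with distance travelled, which is exactly what the
    view-memory behaviors must correct.
    """

    def __init__(self):
        self.x = 0.0          # estimated position, metres
        self.y = 0.0
        self.travelled = 0.0  # total distance, metres

    def update(self, vx, vy, dt):
        """Fold one velocity measurement (m/s) over timestep dt (s)."""
        self.x += vx * dt
        self.y += vy * dt
        self.travelled += math.hypot(vx, vy) * dt
```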

One of the theories of insect navigation, the “snapshot” model, suggests that insects occasionally capture snapshots of their environment. Later, they compare their current visual perception to these snapshots to navigate home, correcting the drift that accumulates with odometry alone. The researchers’ main insight was that snapshots could be spaced much further apart if the robot traveled between them based on odometry. Guido de Croon, professor in bio-inspired drones and co-author of the study, explained that homing works as long as the robot ends up close enough to the snapshot location, i.e., as long as the robot’s odometry drift falls within the snapshot’s “catchment area.” Spacing snapshots further apart also lets the robot travel much further overall, since it flies much slower when homing in on a snapshot than when flying from one snapshot to the next on odometry alone.
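The sketch below simulates this insight under stated assumptions: the drift rate, catchment radius, step sizes, and all function names are illustrative placeholders, and the visual-homing step is reduced to “creep toward the snapshot location,” standing in for the actual comparison of camera views in the published system.

```python
import numpy as np

CATCHMENT_RADIUS = 2.0   # m; assumed: homing only locks on within this radius
HOMING_STEP = 0.25       # m per homing iteration (slow, careful flight)
ODOMETRY_DRIFT = 0.02    # assumed drift: ~2% of each leg's length

def fly_leg_by_odometry(pos, target, rng):
    """Fast odometry-based leg: arrive near the target, with drift
    proportional to leg length -- the error that homing must absorb."""
    leg = np.linalg.norm(target - pos)
    return target + rng.normal(0.0, ODOMETRY_DRIFT * leg, size=2)

def home_to_snapshot(pos, snapshot_pos):
    """Slow visual homing: creep toward the snapshot location.
    Stands in for comparing the current view with the stored image;
    it only works if odometry delivered us inside the catchment area."""
    if np.linalg.norm(pos - snapshot_pos) > CATCHMENT_RADIUS:
        raise RuntimeError("outside catchment area: homing cannot lock on")
    while True:
        error = snapshot_pos - pos
        dist = np.linalg.norm(error)
        if dist < 0.05:                       # residual error after homing
            return pos
        pos = pos + min(HOMING_STEP, dist) * error / dist

rng = np.random.default_rng(0)
# Snapshots recorded on the way out, spaced far apart.
snapshots = [np.array([0.0, 0.0]), np.array([20.0, 0.0]), np.array([40.0, 15.0])]
pos = snapshots[-1]                           # return trip starts at the far end
for snap in reversed(snapshots[:-1]):
    pos = fly_leg_by_odometry(pos, snap, rng)  # fast leg; drift accumulates
    pos = home_to_snapshot(pos, snap)          # homing cancels the drift
print("returned to", pos)                      # close to (0, 0): home
```

The alternation is the point of the design: cheap but drifting odometry covers the long legs quickly, while expensive, slow visual homing runs only briefly at each snapshot to reset the accumulated error to near zero.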

The proposed navigation strategy was tested on a 56-gram “Crazyflie” drone equipped with an omnidirectional camera. The drone successfully covered distances of up to 100 meters using only 0.65 kilobytes of memory. All visual processing was handled by a tiny computer called a microcontroller, commonly found in inexpensive electronic devices.
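For perspective, a quick back-of-envelope calculation of that memory figure (assuming binary kilobytes, i.e. 1 KB = 1024 bytes; with decimal kilobytes the numbers barely change):

```python
route_length_m = 100   # distance covered in the experiments
memory_kb = 0.65       # reported route-memory footprint

bytes_total = memory_kb * 1024
print(f"{bytes_total:.0f} bytes total, "
      f"{bytes_total / route_length_m:.2f} bytes per meter of route")
# -> 666 bytes total, 6.66 bytes per meter of route
```

Even a small 64×64 grayscale camera frame occupies 4,096 bytes, so the entire 100-meter route representation is several times smaller than a single such image, which is what makes the strategy feasible on a microcontroller.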

According to Guido de Croon, this new insect-inspired navigation strategy is an important step towards applying tiny autonomous robots in the real world. While the strategy’s functionality is more limited than that of modern navigation methods, it can suffice for many applications. For example, drones could be used for stock tracking in warehouses or crop monitoring in greenhouses. They could fly out, gather data, and return to a base station, storing mission-relevant images on a small SD card for post-processing by a server; the images themselves would not be needed for navigation.

In related research and development, QuData has also made significant strides in autonomous navigation systems for drones in GPS-denied environments. Our innovative approach leverages advanced AI algorithms, computer vision, and onboard sensors to enable drones to navigate and operate effectively without relying on external GPS signals. This technology is particularly useful for indoor environments, urban and rural areas, and other challenging settings where traditional GPS navigation fails.

These advancements mark a step forward in the deployment of tiny autonomous robots and drones, expanding their potential uses and enhancing their operational efficiency in real-world scenarios.
