Collision avoidance system transforms drone navigation
A team of scientists led by Professor Julián Estévez, an industrial engineer at the University of the Basque Country (UPV/EHU), has developed an innovative, low-cost collision avoidance system that uses only onboard sensors and cameras to prevent mid-air collisions between unmanned aerial vehicles (UAVs). This new technology is set to play a critical role in the future of drone operations, where the airspace is expected to become increasingly congested with UAVs performing various services.
The new system is based on the principles of computer vision and color identification. Unlike many existing anti-collision systems that require drones to communicate with each other, this approach relies solely on onboard sensors and cameras. The key innovation lies in the simplicity and cost-effectiveness of the solution. Each drone is equipped with a camera that detects the presence of other drones by identifying red cards attached to them. The camera's image is split into two halves, and by analyzing which side the red color dominates, the drone autonomously decides which direction to take to avoid a collision.
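To make that mechanism concrete, here is a minimal sketch of such a split-frame color check, assuming an OpenCV-based pipeline in Python. The HSV thresholds, function names, and steering labels are illustrative assumptions, not the team's published implementation.

```python
import cv2
import numpy as np

# Illustrative HSV bounds for a red marker; the study's actual
# thresholds and color space are not reproduced here.
LOWER_RED_1 = np.array([0, 120, 70])
UPPER_RED_1 = np.array([10, 255, 255])
LOWER_RED_2 = np.array([170, 120, 70])
UPPER_RED_2 = np.array([180, 255, 255])

def red_fractions(frame_bgr):
    """Return the fraction of red pixels in the left and right halves of a frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two masks.
    mask = cv2.inRange(hsv, LOWER_RED_1, UPPER_RED_1) | \
           cv2.inRange(hsv, LOWER_RED_2, UPPER_RED_2)
    h, w = mask.shape
    left, right = mask[:, : w // 2], mask[:, w // 2 :]
    return left.mean() / 255.0, right.mean() / 255.0

def avoidance_direction(frame_bgr):
    """Steer away from the half of the image where the red marker dominates."""
    left_red, right_red = red_fractions(frame_bgr)
    if left_red > right_red:
        return "turn_right"   # intruder seen on the left, so veer right
    if right_red > left_red:
        return "turn_left"
    return "hold_course"
```

Comparing the two halves rather than tracking the marker keeps the per-frame computation light, which fits the low-cost hardware the system is aimed at.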
In the developed system, when the percentage of red in the image increases, it means that the drones are approaching each other head-on. Once a certain threshold is exceeded, the drone knows it needs to perform an avoidance maneuver. All of this happens autonomously, without any human intervention. It is a simple way to prevent collisions and can be implemented with low-cost sensors and equipment.
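The threshold logic could sit on top of the same red-coverage measurement. The sketch below reuses the hypothetical red_fractions() helper from above; the threshold value and command names are placeholders, not figures from the study.

```python
# Illustrative threshold: the published value is not reproduced here.
APPROACH_THRESHOLD = 0.05  # fraction of red pixels treated as "too close"

def collision_step(frame_bgr, send_command):
    """One control-loop iteration: measure red coverage and, if needed,
    issue an avoidance maneuver via the caller-supplied send_command()."""
    left_red, right_red = red_fractions(frame_bgr)
    total_red = (left_red + right_red) / 2.0  # overall red coverage of the frame

    if total_red < APPROACH_THRESHOLD:
        send_command("hold_course")  # the other drone is still far away
        return

    # Threshold exceeded: the drones are closing head-on, so veer away
    # from the image half where the red marker dominates.
    send_command("turn_right" if left_red > right_red else "turn_left")
```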
This technology has been validated in laboratory settings and is ready for real-world applications. The research team tested their system using AR Drone 2.0 drones from Parrot, a French manufacturer known for producing lightweight and affordable drones. These tests demonstrated that the system could effectively prevent collisions, even under challenging conditions, such as uncontrolled lighting and drones flying from different directions.
The simplicity and effectiveness of this approach could have far-reaching implications for the drone industry. As the drone market continues to expand, the need for fully autonomous navigation systems becomes more pressing. Professor Estévez's work represents a step in this direction: fully autonomous navigation, without human intervention, in which drones decide which maneuver to perform and which direction to take, preventing collisions with each other or with other obstacles.
The potential applications for this technology are vast. Beyond commercial drone operations, it could be used in various sectors, including public safety, logistics, agriculture, and even recreational drone flying. The development of such collision avoidance systems is crucial for ensuring that drones can safely navigate complex environments, unlocking their full potential across different industries.
The integration of QuData's advanced drone navigation systems can further enhance the benefits of this technology. By utilizing onboard visual assessment, inertial measurement units, and long-distance map navigation, QuData's solution enables drones to navigate different environments with high accuracy, even in the absence of GPS signals. This capability is particularly valuable in conflict zones, disaster-stricken regions, and rural areas, where traditional navigation methods often fall short.
As drones become more integral to modern society, innovations like these will be essential for maintaining safety and efficiency in the skies. With continued research and development, we are moving closer to a future where UAVs can operate autonomously and harmoniously, paving the way for the widespread adoption of drone technology across multiple sectors.