Mutualistic Drones

Mutualistic Drones for Individual Identification

State of the Art

For large African landscapes, manual aerial surveys with light aircraft are still considered the best method of counting large mammals. However, high costs and logistical constraints mean that such surveys either do not take place at all or are so infrequent that catastrophic population declines can occur between them. Uncrewed aerial systems (UAS) have been highlighted as potentially valuable in several key areas of conservation, e.g., real-time land-cover mapping, monitoring of illegal deforestation, detection of poaching activity, and wildlife surveying. Where drones are currently used for conservation, however, it can be difficult to capture the required multi-viewpoint visual data from a single vehicle.

Innovations and Impact

Larger fixed-wing drones will release and recover smaller mutualistic aerial robots for efficient multi-viewpoint capture and high-resolution close-up photography of individual animals. This promises radical improvements over single-aircraft analysis, since active multi-view image capture at source would counteract species false positives and reduce false negatives.

Objectives

To employ larger fixed-wing drones (UAS) for surveys, releasing and recovering smaller aerial robots for efficient multi-viewpoint capture and high-resolution close-up photography of individual animals. These autonomous drones will collaborate, actively and dynamically changing flight paths to optimize data capture and delivering error-resilient animal recognition and monitoring from the air. Such an AI-integrated, multi-aircraft approach is completely novel in conservation. It promises radical improvements compared to post-flight analysis, since active multi-view image interpretation at source would effectively counteract species false positives and reduce false-negative detections.
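The core of the error-resilience argument is that independent viewpoints can be fused before a detection is accepted. A minimal sketch of one such fusion rule follows; the function, its parameters, and the confidence-score format are illustrative assumptions, not the project's actual recognition pipeline:

```python
from collections import defaultdict

def fuse_multiview(detections, min_views=2, threshold=0.6):
    """Fuse per-viewpoint species detections by averaging confidence.

    detections: list of (species, confidence) pairs, one per viewpoint.
    A species is accepted only if it is seen from at least `min_views`
    viewpoints AND its mean confidence exceeds `threshold`: agreement
    across angles suppresses single-view false positives, while a
    second viewpoint can confirm an animal one angle barely caught.
    (Hypothetical fusion rule for illustration only.)
    """
    scores = defaultdict(list)
    for species, conf in detections:
        scores[species].append(conf)
    return {
        species: sum(confs) / len(confs)
        for species, confs in scores.items()
        if len(confs) >= min_views and sum(confs) / len(confs) > threshold
    }

# Three viewpoints of the same animal: two agree on "elephant";
# a single weak spurious "rhino" detection is rejected.
views = [("elephant", 0.9), ("elephant", 0.8), ("rhino", 0.4)]
print(fuse_multiview(views))
```

A single aircraft sees only one `(species, confidence)` pair per pass, so it must either accept weak detections (false positives) or discard them (false negatives); the multi-drone setup makes this cross-view check possible at source.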

Expected Results

The key results of this work will be the control systems and aerial platforms required for in-flight remote deployment, multi-vehicle operations, and image capture. Core elements of the mission and their associated control behaviours will be tested individually, including manoeuvring for subject capture and the airborne deployment and recovery of small reconnaissance drones. Critically, this capability will be enabled by linking multi-vehicle real-time navigation and control to onboard visual animal recognition and tracking.
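The link between recognition and navigation can be illustrated by the simplest possible retasking step: when the onboard recogniser reports an animal, splice a low-altitude close-up waypoint into the small drone's route. Everything here (the `Waypoint` type, altitudes, and the flat-earth distance approximation) is a hypothetical sketch, not the project's actual control system:

```python
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float
    alt: float  # metres above ground

def retask_for_closeup(route, detection, closeup_alt=30.0):
    """Divert the route for a close-up pass over a detected animal.

    route: list of Waypoint, the drone's remaining survey plan.
    detection: (lat, lon) of the animal from onboard recognition.
    The close-up waypoint is spliced in before the nearest remaining
    waypoint, so the vehicle diverts and then resumes its survey.
    Distances use a flat-earth approximation, adequate at survey scale.
    """
    lat, lon = detection
    def dist(wp):
        # Scale longitude difference by cos(latitude) before comparing.
        return math.hypot(wp.lat - lat,
                          (wp.lon - lon) * math.cos(math.radians(lat)))
    i = min(range(len(route)), key=lambda k: dist(route[k]))
    return route[:i] + [Waypoint(lat, lon, closeup_alt)] + route[i:]

survey = [Waypoint(0.0, 0.0, 120.0), Waypoint(0.0, 0.1, 120.0)]
replanned = retask_for_closeup(survey, (0.001, 0.001))
```

In the real system this replanning would run continuously and across vehicles, but even this toy version shows why recognition must run onboard: a post-flight pipeline cannot alter a route that has already been flown.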

Project Facts

University of Bristol (UK).

Professor Thomas Richardson, University of Bristol.

University of Münster (DE): Integration with computer vision subsystem. 

Avy (NL): Experiments and integration with the Avy platform.
