From Kenya to Ohio – the inspiration for WildWing: an open-source, autonomous and affordable unmanned aerial system for animal behavioural video monitoring

Post provided by Jenna Kline, PhD Candidate, Department of Computer Science and Engineering, The Ohio State University, Columbus, OH, USA

The story of the WildWing project began in 2022 when I enrolled in the Experiential Introduction to Imageomics course. For the fieldwork component of the course, I travelled to the Mpala Research Centre in Laikipia, Kenya. My course project advisors, Dr Tanya Berger-Wolf and Dr Dan Rubenstein, were interested in exploring how drones could collect large-scale video datasets to train machine-learning models for automatic behaviour recognition. This approach could help alleviate the tedious, time-consuming work required to gather fine-grained behaviour observations in the field. Through my research with my co-advisor, Dr Chris Stewart, I had previous experience flying drones to collect ecological observations, so I led the drone missions in Kenya.

Our project team was tasked with collecting drone video imagery of giraffes, plains zebras, and the endangered Grevy’s zebra to study their behaviour. We found that the drones allowed us to collect much clearer video footage of the animals than hand-held cameras, as shown in Figure 1. I could easily manoeuvre the drones around obstructions and follow the animals as they moved through the landscape. By the end of the three weeks, I had flown over 50 missions, gathering extensive footage while refining my technique for collecting behaviour data with drones and minimising animal disturbance.

Figure 1: Plains zebras at Mpala Research Centre. Top: a group of plains zebras photographed from our field vehicle, with only five individuals visible because of occlusion by vegetation and other animals. Bottom: the same herd photographed using the drone, where all ten individuals are clearly visible.

After returning from Kenya, Maksim Kholiavchenko and I worked with our team to annotate the drone videos with behaviour labels. This effort produced the Kenyan Animal Behavior Recognition (KABR) dataset (Kholiavchenko et al., 2024). Rewatching the videos to annotate the animals’ behaviours made me realise that my decisions as a pilot in the field were not always ideal for downstream analysis. I hadn’t fully considered how the data collection would affect the computer vision models’ ability to extract usable behavioural data. Only two-thirds of the videos I collected were suitable for inferring behaviour: in the rest, the animals were captured in too few pixels, occluded, or out of sight.

Could there be a better way to collect video behaviour data using drones?

I knew from previous studies using drones for digital agriculture that autonomous navigation missions tend to produce more reliable, consistent, and replicable datasets, which are ideal for downstream computer vision analysis (Boubin et al., 2019). The drones I used in Kenya came equipped with an automatic follow-me function – allowing the drone to automatically follow people or vehicles (but not zebras, unfortunately). If I redefined the ‘object of interest’ as the group of animals, I could use this tracking-by-detection approach to track herds automatically and more consistently. I designed and tested this approach in simulation, using the KABR videos and flight logs to check whether my algorithm could track herds (Kline et al., 2023). The results improved further once I added parameters to keep the drone at the ideal altitude and distance for inferring behaviours (Kline et al., 2024).
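To make the idea concrete, here is a minimal sketch of that tracking-by-detection loop, assuming a generic detector that returns bounding boxes. The function names, thresholds, and command convention are my own illustrative choices here, not WildWing’s actual implementation:

```python
def herd_centroid(detections):
    """Average the centres of the animal bounding boxes (x, y, w, h)."""
    xs = [x + w / 2 for (x, y, w, h) in detections]
    ys = [y + h / 2 for (x, y, w, h) in detections]
    return sum(xs) / len(xs), sum(ys) / len(ys)


def track_step(detections, frame_w, frame_h, min_box_h=40, max_box_h=120):
    """Map one frame's detections to (yaw, pitch, forward) commands that
    keep the herd centred and the animals within a pixel-size band large
    enough for behaviour classification."""
    if not detections:
        return 0.0, 0.0, 0.0  # no animals detected: hover and wait

    cx, cy = herd_centroid(detections)
    # Normalised offset of the herd from the image centre, in [-1, 1].
    yaw = (cx - frame_w / 2) / (frame_w / 2)
    pitch = (cy - frame_h / 2) / (frame_h / 2)

    # Mean box height is a proxy for standoff distance: too small and the
    # animals cover too few pixels (move closer); too large and the drone
    # risks disturbing them (back off). This stands in for the altitude
    # and distance parameters added in Kline et al. (2024).
    mean_h = sum(h for (_, _, _, h) in detections) / len(detections)
    if mean_h < min_box_h:
        forward = 1.0
    elif mean_h > max_box_h:
        forward = -1.0
    else:
        forward = 0.0
    return yaw, pitch, forward
```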

Figure 2: Field tests at The Wilds. Left: a Parrot drone surveying giraffes; right: a video still of Grevy’s zebras captured with WildWing.

Once I was confident in the navigation’s performance in simulation, the next step was to test it in real life. I spent my summer building and testing the software infrastructure to track herds autonomously with my Parrot Anafi drone. I performed the first few tests on myself with help from my friends, running around a nearby park to see how responsively the drone tracked my movements. For the next phase of tests, I deployed the WildWing system at The Wilds, a 10,000-acre conservation centre in Ohio. Working with the centre’s animal welfare experts, I used WildWing to autonomously collect video behaviour data of giraffes, Grevy’s zebras, and Przewalski’s horses. The percentage of usable frames (those suitable for behaviour studies) approached 100%, demonstrating the effectiveness of designing autonomous drone missions tailored to the downstream analysis.
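For readers curious what the control side looks like, here is a rough sketch of closing the loop with Parrot’s Olympe SDK, the Python SDK Parrot provides for the Anafi. This is illustrative only, not WildWing’s actual control code: `get_frame` and `detect` are hypothetical stand-ins for the video pipeline and detector, and `track_step` is the sketch from earlier in this post:

```python
import olympe
from olympe.messages.ardrone3.Piloting import TakeOff, moveBy, Landing

DRONE_IP = "192.168.42.1"  # the Anafi's default address over Wi-Fi

drone = olympe.Drone(DRONE_IP)
drone.connect()
assert drone(TakeOff()).wait().success()

for _ in range(100):  # run the tracker for a fixed number of steps
    frame = get_frame()  # hypothetical: grab the latest video frame
    # detect() is a hypothetical stand-in for any bounding-box detector.
    yaw, pitch, forward = track_step(detect(frame), frame.w, frame.h)
    # moveBy takes metres forward/right/down and a yaw rotation in radians.
    drone(moveBy(forward, 0.0, 0.0, 0.5 * yaw)).wait()

assert drone(Landing()).wait().success()
drone.disconnect()
```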

I designed WildWing to be modular so that other users can easily build their own navigation models and plug in the computer vision models best suited to their research. I am very excited to see how this tool will be applied to others’ research projects in the future.
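One way that modularity can look in practice is a small policy interface, so that swapping in a new detector or navigation strategy is a one-class change. The names below are illustrative rather than WildWing’s actual API, and `track_step` is again the earlier sketch:

```python
from abc import ABC, abstractmethod


class NavigationPolicy(ABC):
    """Anything that maps a video frame to a flight command."""

    @abstractmethod
    def command(self, frame):
        """Return a (yaw, pitch, forward) command for this frame."""


class HerdFollowPolicy(NavigationPolicy):
    """Follow a herd using any detector that returns bounding boxes."""

    def __init__(self, detector, frame_w, frame_h):
        self.detector = detector
        self.frame_w, self.frame_h = frame_w, frame_h

    def command(self, frame):
        # Delegate to the tracking-by-detection sketch from earlier.
        return track_step(self.detector(frame), self.frame_w, self.frame_h)
```

Swapping the detector (say, a model fine-tuned on giraffes rather than zebras) or writing a new `NavigationPolicy` subclass leaves the rest of the system untouched.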

If you want to use WildWing, the code and documentation are available here! And be sure to check out our paper, too!

References:

Boubin, J., Chumley, J., Stewart, C., & Khanal, S. (2019). Autonomic computing challenges in fully autonomous precision agriculture. In 2019 IEEE International Conference on Autonomic Computing (ICAC) (pp. 11–17). https://doi.org/10.1109/ICAC.2019.00012

Kholiavchenko, M., Kline, J., Ramirez, M., Stevens, S., Sheets, A., Babu, R., Banerji, N., Campolongo, E., Thompson, M., Van Tiel, N., Miliko, J., Bessa, E., Duporge, I., Berger-Wolf, T., Rubenstein, D., & Stewart, C. (2024). KABR: In-situ dataset for Kenyan animal behavior recognition from drone videos. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (pp. 31–40). https://doi.org/10.1109/WACVW60836.2024.00011

Kline, J., Stewart, C., Berger-Wolf, T., Ramirez, M., Stevens, S., Ramesh Babu, R., Banerji, N., Sheets, A., Balasubramaniam, S., Campolongo, E., Thompson, M., Stewart, C. V., Kholiavchenko, M., Rubenstein, D. I., Van Tiel, N., & Miliko, J. (2023). A framework for autonomic computing for in situ imageomics. In 2023 IEEE International Conference on Autonomic Computing and Self-Organizing Systems (ACSOS). https://doi.org/10.1109/ACSOS58161.2023.00018

Kline, J., Kholiavchenko, M., Brookes, O., Berger-Wolf, T., Stewart, C. V., & Stewart, C. (2024). Integrating biological data into autonomous remote sensing systems for in situ imageomics: A case study for Kenyan animal behavior sensing with unmanned aerial vehicles (UAVs). arXiv. https://doi.org/10.48550/arXiv.2407.16864
