Post provided by Marianna Chimienti
My name is Dr Marianna Chimienti, and I am a lecturer in Marine Top Predator Ecology at the School of Ocean Sciences at Bangor University (UK). I’m fascinated by animal movements. My main research focuses on understanding how, where, when, and why animals move, using bio-logging technology (devices attached to animals that can record location, depth, acceleration, orientation, environmental conditions, and even images). This approach is often referred to as “animal tracking”.
Collecting underwater pictures and videos of animals and their surroundings, using devices carried by the animals themselves, has developed more slowly than other types of bio-logging data. This was mainly because the equipment long remained large and heavy, so only large marine animals could carry it. But rapid advances in technology, especially the miniaturization of image sensors, are changing the game. Small, lightweight devices can now capture a wide range of underwater visuals, including still images, video footage, and sonar readings of everything animals do, see, and encounter as they go about their daily lives.
These tools offer exciting new ways to understand marine animals and how they interact with their environments. For example, micro-sonar sensors have helped researchers study the hunting tactics of seals and how their prey try to escape in response. Footage recorded from animals such as seals, seabirds, and sharks has revealed a wealth of knowledge on foraging, flight dynamics, and social behaviour. Beyond behaviour, image-based data can help measure physical features of the ocean, such as sea ice, seafloor and benthic habitats, or the pelagic zones, in ways that are directly relevant to the animals' daily lives. In short, these devices are opening new windows into marine ecosystem dynamics.

Despite this tremendous progress, one thing stood out to my team and me: although Artificial Intelligence (AI) and computer vision tools (like those used in facial recognition or self-driving cars) can dramatically improve and speed up image analysis, they remain largely under-used in marine science and virtually absent from bio-logging research. We needed to understand why, and to help our community get on board with these new methods, which have the potential to quickly unlock vast knowledge about our oceans, something particularly important in the current context of climate change.
The idea behind our recent review began to take shape in autumn 2023. At the time, I was a postdoctoral researcher at the Centre d’Études Biologiques de Chizé (CEBC) and the Laboratoire Informatique, Image, Interaction (L3i) at La Rochelle University in France, working on image data collected from wild animals. I had looked at this kind of data before and was struck by its potential. We quickly realised there was no standardised method for analysing these data across species or study systems. Together with my collaborators (Dr Akiko Kato and Dr Tiphaine Jeanniard du Dot), we began reaching out to other researchers working with similar data from a range of marine species.
All authors came together to write a comprehensive review, aiming for a resource that brings together shared research questions, challenges, and opportunities to advance this emerging field. Our goal wasn’t just to highlight the potential of image-based bio-logging, but to offer practical solutions for analysing these data. That meant building an interdisciplinary team that included statisticians, modellers, and computer scientists.
Our review starts by covering two areas: (i) how AI is currently being used to process underwater imagery, and (ii) how image-based bio-logging is applied in marine environments. We identify what’s working, where the gaps are, and how ecology and computer vision could be more effectively integrated. We also propose a step-by-step framework to guide researchers in analysing image data, with a hands-on example in a Jupyter notebook.
Looking ahead, we’re excited about aligning image data with other bio-logging data streams (like depth, movement, and location), which are often recorded at different resolutions. We also see potential in developing lightweight models that could process images on board the device while the animal is still roaming free in the wild.
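To illustrate the kind of alignment problem this involves, here is a minimal sketch (not the authors' pipeline) of matching a low-rate stream of image timestamps against a higher-resolution depth record. The column names, sampling rates, and tolerance are hypothetical, chosen only for the example; pandas' `merge_asof` is one common way to do this nearest-timestamp join.

```python
import pandas as pd

# Hypothetical depth record sampled at 1 Hz for 5 minutes.
depth = pd.DataFrame({
    "time": pd.date_range("2023-01-01 10:00:00", periods=300, freq="1s"),
})
depth["depth_m"] = [t % 60 for t in range(300)]  # placeholder dive profile

# Hypothetical camera firing roughly every 30 seconds.
images = pd.DataFrame({
    "time": pd.date_range("2023-01-01 10:00:05", periods=10, freq="30s"),
    "image_file": [f"img_{i:03d}.jpg" for i in range(10)],
})

# Attach to each image the nearest depth reading within 2 seconds;
# images with no depth sample close enough would get NaN.
aligned = pd.merge_asof(
    images.sort_values("time"),
    depth.sort_values("time"),
    on="time",
    direction="nearest",
    tolerance=pd.Timedelta("2s"),
)
print(aligned[["time", "image_file", "depth_m"]])
```

The same pattern extends to acceleration or GPS streams: sort each stream by time, then join on the nearest timestamp within a tolerance that reflects the sensor's sampling rate.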
At its core, this work calls for a collaborative research community at the intersection of ecology and AI. By sharing data, tools, and knowledge across disciplines, we can accelerate discovery and drive more innovative science. These efforts go beyond academic interest: they can help conservationists and policymakers better understand and protect marine life. Applying AI to image-based bio-logging in a systematic way could transform how we study marine ecosystems, advance ecological theory, and support conservation efforts at a critical time.
Read the full article here.