Post provided by René Steinmann
Hi, I’m a geophysicist by training, but I’ve recently shifted my focus from studying the solid Earth to some of its living inhabitants. I now work at the intersection of geophysics, wildlife biology, and machine learning. My recent research brings together these seemingly distant worlds in a project that’s all about listening—not to singing birds, but to the subtle vibrations of footsteps in the ground.
Seismic Footsteps
Can we monitor wildlife through ground vibrations? It may sound like a strange question, but the idea is surprisingly intuitive. When large animals like elephants or giraffes walk, their footsteps generate tiny ground vibrations—you could even call them mini-earthquakes, though we decided to call them footfall signals. These footfall signals can be picked up by sensitive instruments called geophones. While geophones are typically used to study earthquakes and other geological processes, they are rarely applied in biology. But what if we could use them to monitor animals, silently and non-invasively, from below the surface?
This concept isn’t brand new. The first studies describing seismic footfall signals and their potential in biology and conservation appeared in the early 2000s. But back then, the data was limited, and machine learning wasn’t as accessible or powerful as it is today.
As humans, we can easily identify a giraffe or an elephant in a camera trap photo. But could we tell them apart just from the vibrations of their footsteps? Probably not—unless we had heard them over and over again. Think of how you might recognize a friend or family member by the sound of their walk. You’ve heard it enough to form a mental model. Similarly, with enough training data, a machine learning algorithm can learn to do just that.
Thanks to technological advances, we’ve been able to build on those early ideas and use the power of AI to explore the rich diversity of these footfall signals—and identify wildlife accordingly.
From the Field to the CPUs
In 2019, a team of Kenyan and British scientists, including some of my collaborators, installed a small network of seismic sensors near a waterhole at the Mpala Research Centre in Kenya. They also deployed camera traps to detect passing animals. This gave us a unique dataset: the seismic sensors recorded the ground vibrations of the footsteps, while the cameras provided a visual record of who was passing by—perfect training data for machine learning.
Before diving into model training, we spent time looking at the data ourselves. We observed distinct patterns in the seismograms and their spectrograms, depending on the animal. Giraffes, in particular, showed a fascinating signature: roughly once per second, a double impulse appears—one from the hind feet, one from the front feet, landing with a slight delay (see Fig. 1). Sometimes, a smaller impulse follows, caused by the feet lifting off the ground at the same time. These patterns reveal not just the presence of an animal, but something about its biomechanics—how it moves.
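To give a feel for how such a double-impulse signature shows up in a spectrogram, here is a minimal, self-contained sketch. It is not the project's actual processing pipeline: it synthesizes a giraffe-like footfall train (pairs of impulses about one second apart) and computes a magnitude spectrogram with a plain short-time Fourier transform. All parameters (sampling rate, step interval, hind-to-front delay, wavelet shape) are invented for illustration.

```python
import numpy as np

def synthetic_footfalls(fs=250, duration=8.0, step_interval=1.0, delay=0.15):
    """Synthesize a giraffe-like footfall train: pairs of impulses
    (hind feet, then front feet ~0.15 s later) roughly once per second.
    All parameter values are illustrative, not measured."""
    n = int(fs * duration)
    x = np.zeros(n)
    t = step_interval
    while t + delay < duration:
        for offset in (0.0, delay):          # the double impulse
            x[int((t + offset) * fs)] = 1.0
        t += step_interval
    # Convolve the impulses with a short decaying wavelet to mimic
    # the ground response recorded by a geophone.
    tw = np.arange(0, 0.2, 1.0 / fs)
    wavelet = np.sin(2 * np.pi * 30 * tw) * np.exp(-tw / 0.05)
    return np.convolve(x, wavelet)[:n]

def spectrogram(x, win=64, hop=32):
    """Magnitude spectrogram via a plain short-time Fourier transform."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T  # (freq, time)

fs = 250
trace = synthetic_footfalls(fs)
S = spectrogram(trace)
print(S.shape)  # (frequency bins, time frames)
```

Plotted as an image, the columns of `S` show broadband energy bursts in closely spaced pairs, repeating once per second—the synthetic analogue of the pattern described above.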

Encouraged by these observations, we trained a machine learning model to classify the seismic recordings into different wildlife categories: elephants (specifically African bush elephants), giraffes (specifically northern giraffes), zebras (both plains and Grevy’s zebras), and hyenas (spotted and striped hyenas). The sensors even picked up footfalls from a leopard, but we didn’t have enough data to include them in the training set.
The model performed remarkably well, achieving 77–87% balanced accuracy depending on how far the animal was from the sensor. That’s pretty exciting for a system that doesn’t “see” the animal at all—it only feels its steps.
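Balanced accuracy is worth a quick illustration: it averages the per-class recall, so a model can’t score well by simply favoring the most common animal. Here is a minimal numpy sketch using a nearest-centroid classifier on entirely synthetic two-dimensional “footfall features.” The features, class centers, and noise levels are all invented; this is not the classifier or data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D "footfall features" (say, step rate and dominant
# frequency) for four classes -- purely illustrative values.
classes = ["elephant", "giraffe", "zebra", "hyena"]
centers = np.array([[1.0, 10.0], [1.0, 30.0], [2.0, 25.0], [3.0, 40.0]])
X_train = np.vstack([c + 0.3 * rng.standard_normal((50, 2)) for c in centers])
y_train = np.repeat(np.arange(4), 50)
X_test = np.vstack([c + 0.3 * rng.standard_normal((20, 2)) for c in centers])
y_test = np.repeat(np.arange(4), 20)

# Nearest-centroid classifier: assign each sample to the closest class mean.
centroids = np.array([X_train[y_train == k].mean(axis=0) for k in range(4)])
pred = np.argmin(((X_test[:, None, :] - centroids) ** 2).sum(-1), axis=1)

# Balanced accuracy = mean of per-class recall.
recalls = [np.mean(pred[y_test == k] == k) for k in range(4)]
print(f"balanced accuracy: {np.mean(recalls):.2f}")
```

Because the synthetic classes are well separated, the sketch scores near 1.0; on real seismic data the classes overlap far more, which is why the 77–87% reported above is a strong result.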
What This Could Mean
The implications go well beyond academic curiosity. To develop evidence-based conservation strategies, we need data—ideally collected in ways that don’t disturb the animals. Camera traps and acoustic sensors are widely used, capturing visual and auditory information. Seismic sensors add a new, vibrational layer to that toolbox.
Buried underground, they operate silently and non-invasively, offering a novel way to monitor wildlife activity with minimal impact on animals or their habitats. For conservationists and biologists, this means more options for monitoring and studying wildlife, providing additional data for evidence-based conservation strategies.
Looking Ahead
This research is just the beginning. We returned to the Mpala Research Centre earlier this year to collect more data, building on lessons from our first deployment and data analysis. This time, we installed a dense array of seismic sensors to improve resolution and recorded video footage to better correlate seismic data with above-ground behavior.
Our goal is to refine the classification models down to the species level (e.g., Grevy’s vs. plains zebra) and eventually detect different behaviors. Is the zebra grazing peacefully or running from a lion? With enough data, we may be able to tell. Another important question is whether we can transfer this methodology to other ecosystems with different wildlife. What signals would we pick up in the Amazonian rainforest, where we can’t see very far? This emerging field between biology and seismology offers exciting research opportunities, plenty of room for creativity, and a chance to help protect our beautiful planet and its inhabitants.
Read the full article here.