Collective Behavior

Mission

Coordinating behavior with others requires communication, and for many animals, including meerkats, vocal communication is key. Our research aims to understand how meerkats coordinate with one another across different behavioral contexts, and to elucidate the role of vocalizations in mediating these behaviors. We combine direct behavioral observations, multi-sensor tracking collars, and field experiments to address this topic.

Main Research Questions

From staying together while on the move to banding together against common threats, many group behaviors require coordination. How do animals achieve such feats? And what role does vocal communication play in mediating these collective behaviors?

We are investigating this topic across different behavioral contexts, including how meerkats come to consensus about when and where to move; how they coordinate vigilance; and how they respond to threats such as predators. Our data collection involves deploying multi-sensor tracking collars on entire groups of meerkats, allowing us to record the movements, vocalizations, and behaviors of all members of the group simultaneously. We combine this bio-logging approach with direct behavioral observations as well as targeted audio playback experiments. We also develop machine learning and other computational approaches to process and analyze the resulting large, multimodal tracking datasets. 
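
To make the whole-group bio-logging approach concrete, the sketch below shows one way the synchronized collar streams could be organized in code. It is a minimal illustration only: the class name, field layout, and sampling rates are assumptions made for this example, not the actual schema used in our pipeline.

```python
from dataclasses import dataclass

import numpy as np

# Hypothetical container for one individual's synchronized collar streams.
# Field names, units, and sampling rates are illustrative assumptions,
# not the lab's actual data schema.
@dataclass
class CollarRecord:
    individual_id: str
    gps_time: np.ndarray   # seconds since session start, ~1 Hz fixes
    gps_xy: np.ndarray     # (n, 2) easting/northing in meters
    audio_sr: int          # audio sampling rate in Hz
    audio: np.ndarray      # raw waveform from the collar microphone
    acc_sr: int            # accelerometer sampling rate in Hz
    acc: np.ndarray        # (m, 3) tri-axial acceleration

    def position_at(self, t: float) -> np.ndarray:
        """Linearly interpolate the GPS track to an arbitrary time t."""
        x = np.interp(t, self.gps_time, self.gps_xy[:, 0])
        y = np.interp(t, self.gps_time, self.gps_xy[:, 1])
        return np.array([x, y])

# A group is then simply a list of CollarRecord objects sharing one clock,
# which is what makes simultaneous whole-group analyses possible.
```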

Overall, we aim to understand how animals in groups integrate acoustic and spatial information when making decisions, how information flows through groups, and how the interactions of individuals give rise to group-level outcomes.

Findings So Far

Our work so far has yielded large datasets on the movements, vocalizations, and behaviors of entire meerkat groups; new tools for the analysis of bioacoustic and other data; and insights into meerkat communication and collective behavior. In the future, we plan to continue leveraging these datasets, tools, and insights to delve deeper into the mechanisms of collective behavior (Demartsev et al. 2022).

Tracking datasets:

Below is an example of a short sequence of tracking data, taken from a group of seven meerkats in 2017. Each trajectory represents the path of an individual meerkat over the course of 20 minutes, with time represented by color. The inset at left shows vocalizations produced by the meerkats during the same time period, including calls associated with maintaining group cohesion (red), initiating group departures (blue), and running or sentinel behavior (green).
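
For readers who want to reproduce this style of figure, the following minimal sketch plots per-individual trajectories with time mapped to color. The random-walk tracks are simulated stand-ins, not our field data.

```python
import matplotlib.pyplot as plt
import numpy as np

# Illustrative sketch: plot each individual's path with time encoded as
# color, in the style of the figure described above. The random walks
# below stand in for real GPS tracks.
rng = np.random.default_rng(0)
n_steps, n_meerkats = 1200, 7            # ~20 min at 1 Hz, 7 individuals
t = np.arange(n_steps)

fig, ax = plt.subplots(figsize=(6, 6))
for i in range(n_meerkats):
    xy = np.cumsum(rng.normal(scale=0.5, size=(n_steps, 2)), axis=0)
    xy += rng.normal(scale=5.0, size=2)  # offset each individual's start
    sc = ax.scatter(xy[:, 0], xy[:, 1], c=t, cmap="viridis", s=2)

fig.colorbar(sc, ax=ax, label="time (s)")
ax.set_xlabel("x (m)")
ax.set_ylabel("y (m)")
ax.set_title("Simulated group trajectories, time as color")
plt.show()
```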

Machine learning tools:

Making use of our large multimodal datasets requires first identifying signals and behaviors of interest in the raw sensor data. We are therefore developing machine learning approaches to identify vocalizations from audio data and behaviors from accelerometer data. For audio data, our team developed and publicly released animal2vec, a self-supervised, transformer-based framework for the detection and classification of animal vocalizations from raw audio (https://arxiv.org/abs/2406.01253). In addition, we publicly released our hand-labeled meerkat audio dataset (https://doi.org/10.17617/3.0J0DYB) to enable its future use as a reference dataset in bioacoustics and machine learning. This dataset consists of over 1,000 hours of audio containing over 250,000 hand-labeled acoustic events.
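
As a rough illustration of the detection-and-classification framing that such tools operate in, the sketch below slides a fixed window over raw audio and labels each frame. The sampling rate, window sizes, label set, and classify_frame stub are placeholder assumptions for this example; this is not the animal2vec API.

```python
import numpy as np

# Minimal sketch of detection-as-classification: slide a fixed window over
# raw audio and assign each frame a call-type label. The classify_frame
# stub stands in for a trained model; none of this is the animal2vec API.
SR = 8000                 # assumed collar audio sampling rate in Hz
WIN = int(0.2 * SR)       # 200 ms analysis window
HOP = int(0.05 * SR)      # 50 ms hop between windows

CALL_TYPES = ["close", "move", "alarm", "noise"]  # illustrative label set

def classify_frame(frame: np.ndarray) -> str:
    """Placeholder classifier: a real system would run a model here."""
    energy = float(np.mean(frame ** 2))
    return "noise" if energy < 1e-4 else "close"

def detect_calls(audio: np.ndarray) -> list[tuple[float, str]]:
    """Return (onset_seconds, label) for every non-noise frame."""
    events = []
    for start in range(0, len(audio) - WIN, HOP):
        label = classify_frame(audio[start:start + WIN])
        if label != "noise":
            events.append((start / SR, label))
    return events

# Demo on 10 s of synthetic noise standing in for collar audio.
audio = np.random.default_rng(1).normal(scale=0.02, size=SR * 10)
print(detect_calls(audio)[:5])
```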

Mechanisms of collective behavior:

In ongoing work, we are using our tracking datasets to reveal the mechanisms underlying collective behavior. For example, our recent investigations into collective movement have shown that dominant females play a central role in guiding meerkat groups, influencing both the timing and direction of movement (Averly et al. 2022). An individual meerkat’s influence over group direction can also be modulated by the calls it produces, such as “move” calls. In addition, by analyzing the spatiotemporal dynamics of vocal interactions, we identified instances of both call-and-response (i.e. signal exchange) and synchronous calling (i.e. signal broadcast), linked to different call types (Demartsev et al. 2024).
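
The distinction between exchange and broadcast can be illustrated with call onset times: a short latency between two individuals’ calls suggests call-and-response, while near-simultaneous onsets suggest synchronous calling. The toy sketch below applies this logic with made-up thresholds and call times; these are not the values or methods from Demartsev et al. (2024).

```python
import numpy as np

# Toy sketch: label each of A's calls by its latency to B's nearest call.
# Thresholds and call times are illustrative only.
SYNC_MAX = 0.1        # s: near-simultaneous onsets suggest broadcast
EXCHANGE_MAX = 0.5    # s: a reply within this window suggests exchange

def classify_pairs(calls_a, calls_b):
    """For each call onset by A, label the nearest call onset by B."""
    labels = []
    for t in calls_a:
        dt = np.min(np.abs(np.asarray(calls_b) - t))
        if dt <= SYNC_MAX:
            labels.append("synchronous")
        elif dt <= EXCHANGE_MAX:
            labels.append("exchange")
        else:
            labels.append("independent")
    return labels

calls_a = [1.0, 5.0, 9.0]   # call onsets (s) for individual A
calls_b = [1.05, 5.4, 12.0] # call onsets (s) for individual B
print(classify_pairs(calls_a, calls_b))
# ['synchronous', 'exchange', 'independent']
```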

Collaborations