Behaviour 2019
Automatically classifying animal calls from acoustic collars – meerkat case study  
Kiran L Dhanjal-Adams1,2, Mathieu Duteil1,2, Baptiste Averly1,2,4, Vlad Demartsev1,2,4, Marta B. Manser3,4, Dan Stowell5, Ariana Strandburg-Peshkin1,2,4,7, Marie A. Roch6,7. 1University of Konstanz, Konstanz, Baden-Württemberg, Germany; 2Max Planck Institute of Animal Behavior, Konstanz, Baden-Württemberg, Germany; 3University of Zurich, Zurich, Zürich, Switzerland; 4Kalahari Meerkat Project, Van Zylsrus, Northern Cape, South Africa; 5Tilburg University, Tilburg, Netherlands; 6San Diego State University, San Diego, California, United States; 7shared last author

Understanding group decision-making in mobile species is complex. Integrating both GPS and acoustic sensors into tags deployed on group members can help us systematically quantify communication between individuals as well as their movement responses, based on who is calling, the context of the call, and how the group reacts. However, the resulting datasets are very large, and the manual identification and classification of calls is both time-consuming and labor-intensive. Here, we present a supervised machine learning tool for detecting call types in field recordings with variable signal-to-noise ratios. We trained a convolutional neural network to extract features from spectrograms; bidirectional gated recurrent units then learned temporal structure from these features, and a final feed-forward network predicted call type. We also used data augmentation to increase the number of training examples for underrepresented call types. We illustrate the method on recordings from entire meerkat groups. The method allows us to view animal communication in a continuous fashion, revealing the social context in which calls are made and ultimately informing our understanding of collective decision-making.
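To make the pipeline concrete, the following is a minimal PyTorch sketch of the architecture described above: a convolutional feature extractor over the spectrogram, a bidirectional GRU over the time axis, and a feed-forward classification head. All layer sizes (n_mels, hidden, n_call_types) and the exact convolutional stack are illustrative placeholders, not the configuration used in the study.

    import torch
    import torch.nn as nn

    class CallTypeClassifier(nn.Module):
        """Sketch: CNN feature extraction from spectrograms, bidirectional
        GRU for temporal structure, feed-forward head for call type."""

        def __init__(self, n_mels=64, n_call_types=9, hidden=128):
            super().__init__()
            # CNN: extracts local spectro-temporal features; pooling
            # halves the frequency axis while keeping time resolution.
            self.cnn = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(kernel_size=(2, 1)),
                nn.Conv2d(32, 64, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(kernel_size=(2, 1)),
            )
            feat_dim = 64 * (n_mels // 4)
            # Bidirectional GRU: learns temporal structure in the
            # per-frame feature sequence, reading it in both directions.
            self.gru = nn.GRU(feat_dim, hidden,
                              batch_first=True, bidirectional=True)
            # Feed-forward head: per-frame call-type logits.
            self.head = nn.Linear(2 * hidden, n_call_types)

        def forward(self, spec):
            # spec: (batch, 1, n_mels, n_frames)
            x = self.cnn(spec)                   # (batch, 64, n_mels//4, n_frames)
            x = x.flatten(1, 2).transpose(1, 2)  # (batch, n_frames, feat_dim)
            x, _ = self.gru(x)                   # (batch, n_frames, 2*hidden)
            return self.head(x)                  # per-frame call-type logits

    # Usage on a batch of 8 single-channel spectrograms, 200 frames long:
    model = CallTypeClassifier()
    spec = torch.randn(8, 1, 64, 200)
    logits = model(spec)  # shape (8, 200, n_call_types)

Pooling only along the frequency axis preserves frame-level time resolution, so the GRU can assign a call type to every spectrogram frame rather than to a fixed-length clip. The data augmentation step mentioned above (for example, mixing background noise into examples of underrepresented call types) would be applied to the training spectrograms before they reach this model; it is not shown here.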