Institute of Neuroinformatics

Computational Ethology

[Image: Small colony of birds that interact freely in a naturalistic environment]

What principles govern the natural behavior of humans and other animals? How do we make sense of stimuli from the external world, and how do we generate the motor commands that elicit appropriate actions, so that we obtain rewards and avoid costly mistakes? These and related questions can be tackled by combining knowledge from the fields of ethology (the biological study of animal behavior, Tinbergen 1951), cognitive science, mathematics, statistics, and computer science. Together, these topics encompass the field of computational ethology.

One of our group’s main objectives is to use recent advances in embedded systems, computer vision, and machine learning to perform longitudinal observations of a small colony of birds that interact freely in a naturalistic environment. This approach reflects the intuition that vocal production and learning strongly depend on social context. While the problem of tracking individual animals and their movements has essentially been solved in recent years, discriminating the individual vocalizations of rapidly moving, sometimes simultaneously vocalizing individuals has remained an obstacle to investigating vocal interactions in such a social setting.
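As a minimal illustration of the vision side of such a pipeline (a toy sketch, not our actual system), the code below detects birds in a video as moving foreground blobs and links detections across frames by nearest centroid. The file name, blob-area threshold, and matching distance are made-up placeholders.

```python
# Toy sketch: track moving birds in a video via background subtraction and
# greedy nearest-centroid linking. All parameters are illustrative only.
import cv2
import numpy as np

cap = cv2.VideoCapture("colony.mp4")          # hypothetical recording
bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=32)
tracks = {}                                    # track id -> last centroid
next_id = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                     # foreground = moving birds
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)   # drop shadows
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    detections = [tuple(centroids[i]) for i in range(1, n)
                  if stats[i, cv2.CC_STAT_AREA] > 200]           # ignore tiny blobs
    # Greedy assignment: each detection updates the nearest existing track,
    # or starts a new one if no track is close enough.
    for det in detections:
        if tracks:
            tid, prev = min(tracks.items(),
                            key=lambda kv: np.hypot(kv[1][0] - det[0],
                                                    kv[1][1] - det[1]))
            if np.hypot(prev[0] - det[0], prev[1] - det[1]) < 50:
                tracks[tid] = det
                continue
        tracks[next_id] = det
        next_id += 1
cap.release()
```

Real multi-animal trackers use learned detectors and identity models rather than this greedy heuristic; the sketch only shows the overall structure of detection followed by frame-to-frame association.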

To analyze individual vocalizations in groups of songbirds, we have developed ultra-miniature, back-attached sound and acceleration recorders. Using these recorders, we have designed a method for detecting the brief copulation events between males and females, and we are in the process of analyzing the vocal interactions surrounding such events. We are also developing new methods for the semi-supervised segmentation and classification of dense vocal interactions. Furthermore, to allow for bidirectional signal transfer, as would be required for experiments involving optogenetics, we are developing a sensor node system based on Bluetooth Low Energy technology.
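One reason such on-body recorders help is that an accelerometer is dominated by the wearer’s own body vibrations. The sketch below (illustrative only, not our published pipeline) segments candidate vocalizations of the wearer by thresholding the band-passed vibration envelope; the filter band, threshold, and minimum duration are placeholder values, not calibrated ones.

```python
# Illustrative sketch: segment the wearer's own vocalizations from a
# back-attached accelerometer trace by thresholding its band-passed envelope.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def segment_own_vocalizations(acc, fs, band=(50.0, 1000.0),
                              thresh_sd=3.0, min_dur=0.02):
    """Return (onset, offset) times in seconds of supra-threshold epochs.

    acc: 1-D accelerometer trace; fs: sampling rate in Hz.
    band, thresh_sd, min_dur are illustrative defaults.
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, acc)
    envelope = np.abs(hilbert(filtered))          # vibration amplitude
    thresh = envelope.mean() + thresh_sd * envelope.std()
    above = envelope > thresh
    # Rising and falling edges of the boolean mask mark segment boundaries.
    edges = np.diff(above.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if above[0]:
        onsets = np.r_[0, onsets]
    if above[-1]:
        offsets = np.r_[offsets, above.size]
    return [(s / fs, e / fs) for s, e in zip(onsets, offsets)
            if (e - s) / fs >= min_dur]
```

Segments found this way can then be cross-referenced with the sound channel and with other birds’ recorders to resolve who vocalized when, even during overlapping calls.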

Statistical Inference of Vocal Learning Strategies

We recently developed a new method to analyze high-dimensional vocalizations, revealing fast and slow components of vocal learning as birds approach the template (Kollmorgen 2020). Our method is based on nearest-neighbor statistics that describe, in a non-parametric way, how behavior evolves relative to itself. Because of its generality and its applicability to very large amounts of data, we expect our method to be well suited for comparing learning across many behaviors and species, and thus to become an important tool for data analysis in the field of computational ethology (more here: Multimodal action recognition).
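The sketch below conveys the kind of nearest-neighbor statistic involved, in simplified form and on synthetic data rather than the analyses of Kollmorgen (2020): for each rendition, we look up its nearest neighbors in feature space among all renditions and summarize when those neighbors were produced. Systematic shifts in the neighbors’ production times reveal how the behavior drifts over time. The feature vectors and the neighbor count k are placeholders.

```python
# Simplified sketch of a nearest-neighbour statistic on behavioural data:
# for each rendition, summarize the production times of its k nearest
# neighbours in feature space. Features and k are placeholders.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def neighbour_production_times(features, times, k=20):
    """For each rendition, return the production times of its k nearest
    neighbours (excluding itself) in feature space.

    features: (n, d) array of per-rendition feature vectors.
    times:    (n,) array of production times (e.g. days).
    """
    nn = NearestNeighbors(n_neighbors=k + 1).fit(features)
    _, idx = nn.kneighbors(features)          # column 0 is the point itself
    return times[idx[:, 1:]]                  # (n, k) neighbour times

# Synthetic example: renditions drift slowly in feature space over 10 "days".
rng = np.random.default_rng(0)
features = rng.normal(size=(500, 10)) + np.linspace(0, 3, 500)[:, None]
times = np.repeat(np.arange(10), 50)          # 10 days, 50 renditions each
nbr_times = neighbour_production_times(features, times)
for day in range(10):
    # Mean neighbour day per day: tracks the slow drift of the behaviour.
    print(day, nbr_times[times == day].mean())
```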

In a sensory substitution study, we found that songbirds have a very basic need: to experience reliable feedback from their singing. Exploiting this need, we were able to incite birds to make targeted changes to their vocal repertoire in the absence of auditory feedback, showing that evaluation of auditory performance is not necessary for vocal plasticity in adulthood. This finding raises the question of whether a template needs to be auditory, as is commonly assumed, or could also come from a different sensory modality, e.g. a visual one.