Automatic beak tracking to monitor movement. From a continuous video stream (30 frames/s) of the cage interior, we extract the position of the beak (red circle) based on its hue. By tracking the beak position we obtain a position map showing the spatial density of the animal, and a movement map showing the movement paths taken by the animal. These maps can be generated from data ranging from 5 minutes of recording up to 1 hour. The position map is useful for detecting longer periods of immobility of the bird, and the movement path map can provide information about movement stereotypy. Shown are movement data from a 10 h period obtained in a singly housed bird, tethered from the center of the cage.
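The hue-based tracking described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the hue range, saturation threshold, frame representation (nested lists of HSV tuples), and grid resolution are all illustrative assumptions.

```python
def detect_beak(frame_hsv, hue_lo=0, hue_hi=15, sat_min=50):
    """Return the centroid (x, y) of pixels whose hue falls in
    [hue_lo, hue_hi] (assumed red beak), or None if no pixel matches.
    frame_hsv is a list of rows of (h, s, v) tuples."""
    xs, ys = [], []
    for y, row in enumerate(frame_hsv):
        for x, (h, s, v) in enumerate(row):
            if hue_lo <= h <= hue_hi and s > sat_min:  # saturated red only
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # beak occluded or out of view in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def accumulate_position_map(positions, width, height):
    """Build an occupancy (spatial density) map by counting how often
    the tracked beak position falls into each grid cell."""
    grid = [[0] * width for _ in range(height)]
    for p in positions:
        if p is not None:
            x, y = int(p[0]), int(p[1])
            grid[y][x] += 1
    return grid
```

A movement map could then be derived from the same position sequence by connecting centroids of consecutive frames, which would reveal the paths (and any stereotyped loops) the bird takes through the cage.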

Manipulations in birdsong research: their benefits and welfare implications

Over the past 50 years, songbirds have become a valuable model organism for scientists studying vocal communication from its behavioral, hormonal, neuronal, and genetic perspectives. Many advances in our understanding of vocal learning result from research using the zebra finch, a closed-ended vocal learner. We study the welfare impact of some of the common manipulations used in zebra finch research, such as isolate housing, transient/reversible impairment of hearing or the vocal organs, implantation of small devices for chronic electrophysiology, head fixation for imaging, aversive song conditioning using sound playback, and mounting of miniature backpacks for behavioral monitoring. We weigh the benefits of these manipulations for science and health, and try to estimate their impact on animal welfare based on the literature and on data from our own work. The assessment of such benefit-welfare tradeoffs is a legal prerequisite for animal research in Switzerland. We find that a diverse set of known stressors reliably leads to a suppressed singing rate, and that, by contraposition, an elevated singing rate can be used as an indicator of welfare. We hope that our study can contribute to answering some of the most pressing questions about zebra finch welfare in research environments.


© 2019 Institut für Neuroinformatik