Publication: Geometric Deep Learning Enables 3D Kinematic Profiling Across Species and Environments
Date
2021-04-19
Publisher
Springer Science and Business Media LLC
The Harvard community has made this article openly available.
Citation
Dunn, Timothy W., Jesse D. Marshall, Kyle S. Severson, Diego E. Aldarondo, David G. C. Hildebrand, Selmaan N. Chettih, William L. Wang, et al. 2021. “Geometric Deep Learning Enables 3D Kinematic Profiling Across Species and Environments.” Nature Methods 18 (5): 564–73.
Abstract
Comprehensive descriptions of animal behavior require precise measurements of 3D whole-body movements. Although 2D approaches can track visible landmarks in restrictive environments, performance drops significantly in freely moving animals, where occlusions and appearance changes are ubiquitous. To enable robust 3D tracking, we designed DANNCE, a method using projective geometry to construct inputs to a convolutional neural network that leverages learned 3D geometric reasoning to track anatomical landmarks across species and behaviors. We trained and benchmarked DANNCE using a new 7-million-frame dataset relating color videos and rodent 3D poses. In rats and mice, DANNCE robustly tracked dozens of landmarks on the head, trunk, and limbs of freely moving animals in naturalistic settings, achieving over an order of magnitude better accuracy than prior techniques. We extended DANNCE to rat pups, marmosets, and chickadees, and demonstrated a novel ability to quantitatively profile behavioral lineage over development. DANNCE offers unprecedented analytical access to animal behavior across species and environments.
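The abstract's phrase "using projective geometry to construct inputs to a convolutional neural network" refers to building a 3D volumetric representation from multiple calibrated camera views. The sketch below is a minimal, hypothetical illustration of that idea, not code from the DANNCE repository: the grid size, the 3x4 projection-matrix inputs, the known volume center, and the helper name build_voxel_volume are all illustrative assumptions. Voxel centers placed around the animal are projected into each camera image, and the sampled pixel values are stacked into a volume that a 3D network could then process.

```python
# Hypothetical sketch of multi-view unprojection into a voxel volume.
# Assumes calibrated cameras with 3x4 projection matrices P = K [R | t]
# and a known 3D point (e.g. the animal's center of mass) to center the grid on.
import numpy as np


def build_voxel_volume(images, projection_matrices, center, side_len=0.24, n_vox=64):
    """Unproject multi-view images into a 3D feature volume around `center`.

    images              : list of (H, W, C) float arrays, one per camera
    projection_matrices : list of (3, 4) camera matrices
    center              : (3,) world-space point the volume is centered on
    side_len, n_vox     : illustrative defaults for the cube size and resolution
    Returns a (n_vox, n_vox, n_vox, C * n_cameras) volume of sampled pixel values.
    """
    # Regular 3D grid of voxel centers around the animal.
    half = side_len / 2.0
    axis = np.linspace(-half, half, n_vox)
    gx, gy, gz = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.stack([gx, gy, gz], axis=-1).reshape(-1, 3) + np.asarray(center)  # (V, 3)
    grid_h = np.concatenate([grid, np.ones((grid.shape[0], 1))], axis=1)        # homogeneous (V, 4)

    channels = []
    for img, P in zip(images, projection_matrices):
        h, w, _ = img.shape
        uvw = grid_h @ P.T                      # project voxel centers: (V, 3)
        uv = uvw[:, :2] / uvw[:, 2:3]           # perspective divide (assumes points in front of camera)
        u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
        v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
        channels.append(img[v, u])              # nearest-neighbour sampling, (V, C)

    vol = np.concatenate(channels, axis=1)      # stack views along the channel axis
    return vol.reshape(n_vox, n_vox, n_vox, -1)
```

In a DANNCE-style pipeline, a volume like this would be passed to a 3D convolutional network that outputs per-landmark confidence volumes, which is what the abstract describes as learned 3D geometric reasoning; the sampling scheme and architecture used in the published method may differ from this sketch.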
Keywords
Cell Biology, Molecular Biology, Biochemistry, Biotechnology
Terms of Use
Metadata Only