Fast animal pose estimation using deep neural networks

Pereira, Talmo D.; Aldarondo, Diego E.; Willmore, Lindsay; Kislin, Mikhail; Wang, Samuel S.-H.; Murthy, Mala; Shaevitz, Joshua W.
Issue date: 30 May 2018
Cite as:
Pereira, Talmo D., Aldarondo, Diego E., Willmore, Lindsay, Kislin, Mikhail, Wang, Samuel S.-H., Murthy, Mala, & Shaevitz, Joshua W. (2018). Fast animal pose estimation using deep neural networks [Data set]. https://doi.org/10.34770/2jce-gm62
@electronic{pereira_talmo_d_2018,
  author      = {Pereira, Talmo D. and
                Aldarondo, Diego E. and
                Willmore, Lindsay and
                Kislin, Mikhail and
                Wang, Samuel S.-H. and
                Murthy, Mala and
                Shaevitz, Joshua W.},
  title       = {{Fast animal pose estimation using deep neural networks}},
  year        = 2018,
  url         = {https://doi.org/10.34770/2jce-gm62}
}
Abstract:

Recent work quantifying postural dynamics has attempted to define the repertoire of behaviors performed by an animal. However, a major drawback to these techniques has been their reliance on dimensionality reduction of images, which destroys information about which parts of the body are used in each behavior. To address this issue, we introduce a deep learning-based method for pose estimation, LEAP (LEAP Estimates Animal Pose). LEAP automatically predicts the positions of animal body parts using a deep convolutional neural network with as little as 10 frames of labeled data for training. This framework consists of a graphical interface for interactive labeling of body parts and software for training the network and fast prediction on new data (1 hr to train, 185 Hz predictions). We validate LEAP using videos of freely behaving fruit flies (Drosophila melanogaster) and track 32 distinct points on the body to fully describe the pose of the head, body, wings, and legs with an error rate of <3% of the animal's body length. We recapitulate a number of reported findings on insect gait dynamics and show LEAP's applicability as the first step in unsupervised behavioral classification. Finally, we extend the method to more challenging imaging situations (pairs of flies moving on a mesh-like background) and movies from freely moving mice (Mus musculus) where we track the full conformation of the head, body, and limbs.
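The abstract reports accuracy as an error rate below 3% of the animal's body length, i.e. the Euclidean distance between a predicted and a ground-truth landmark, normalized by body length. A minimal sketch of that metric (the coordinates and body length below are made up for illustration, not taken from the dataset):

```python
import math

def normalized_error(pred, truth, body_length):
    """Euclidean distance between a predicted and a ground-truth (x, y)
    landmark position, expressed as a fraction of the body length."""
    dist = math.hypot(pred[0] - truth[0], pred[1] - truth[1])
    return dist / body_length

# One hypothetical predicted vs. true point on a fly ~100 px long.
err = normalized_error((52.0, 40.0), (50.0, 41.5), body_length=100.0)
print(f"{err:.1%}")  # 2.5% -- an error under 3% matches the reported accuracy
```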

Description:

This dataset contains videos of freely moving fruit flies, as well as trained networks and body position estimates for all ~21 million frames. Download the README.txt file for a detailed description of this dataset's content. See the code repository (https://github.com/talmo/leap) for usage examples of these files.
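The actual file formats are documented in the dataset's README.txt and the linked code repository. As a rough sketch of how the position estimates might be handled once loaded, assuming (this is an assumption, not the documented layout) they form an array of shape (n_frames, 32, 2) holding (x, y) coordinates for the 32 tracked points described in the abstract:

```python
import numpy as np

# Synthetic stand-in data; the real estimates would be loaded from the
# dataset files per README.txt. Assumed layout: (n_frames, n_parts, 2).
rng = np.random.default_rng(0)
poses = rng.uniform(0, 192, size=(1000, 32, 2))

part_trajectory = poses[:, 0, :]   # one (hypothetical) body part over time
frame_pose = poses[500]            # all 32 points in a single frame
print(part_trajectory.shape, frame_pose.shape)  # (1000, 2) (32, 2)
```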
