Quantifying behavior using deep learning
A core goal of neuroscience is to understand how the brain adaptively orchestrates movements to execute complex behaviors. Quantifying behavioral dynamics, however, has historically been prohibitively laborious or technically intractable, particularly for the unconstrained and naturalistic behaviors that the brain evolved to produce. Driven by advances in computer vision and deep learning, new methods are being developed to overcome these limitations and enable precise, automated quantification of behavior from conventional video across species and experimental settings. In this talk we will: introduce the problem of pose tracking for behavioral quantification; show how deep learning can be employed to achieve markerless motion capture; and highlight examples of how our work on making this technology accessible through tools like SLEAP (sleap.ai) is enabling studies across domains and application areas, ranging from social and motor neuroscience in flies, rodents, and primates, to ecology, human dance, and even plant biology to tackle climate change.