Dataset

Data for Sehara et al., 2021 eNeuro (the real-time DeepLabCut project)

Keisuke Sehara, Paul Zimmer-Harwood, Julien Colomb, Matthew E. Larkum, Robert N. S. Sachdev
  1. Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.
  2. Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3PT, United Kingdom.

Archive download: ZIP, 147 GiB

Published 06 Feb 2021 | License: Creative Commons Attribution 4.0


Description

Computer vision approaches have made significant inroads into offline tracking of behavior and estimation of animal pose. In particular, because of their versatility, deep-learning approaches have been gaining attention for markerless behavioral tracking. Here we developed an approach using DeepLabCut for real-time estimation of movement. We trained a deep neural network offline with high-speed video data of a mouse whisking, then transferred the trained network to track the same mouse whisking in real time. With this approach, we tracked the tips of three whiskers in an arc and converted their positions into a TTL output within behavioral time scales, i.e., 10.5 milliseconds. With this approach it is possible to trigger an output based on the movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one we have deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity.
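
As an illustration only, the closed-loop idea described above (per-frame pose estimation of whisker tips, converted into a TTL trigger) can be sketched with the DeepLabCut-Live Python package (dlclive). This is not the authors' code, which is provided in the repository; the model directory, camera source, keypoint indices, threshold, and the send_ttl() helper below are hypothetical placeholders.

    # Hypothetical sketch: estimate whisker-tip positions frame by frame and
    # emit a TTL pulse when a trigger condition is met. Not the authors'
    # implementation; model path, camera, keypoints, threshold, and send_ttl()
    # are placeholder assumptions.
    import numpy as np
    import cv2                      # frame acquisition (any camera API would do)
    from dlclive import DLCLive     # real-time DeepLabCut inference

    MODEL_DIR = "exported-dlc-model"   # hypothetical exported DLC model directory
    WHISKER_TIPS = [0, 1, 2]           # indices of the three tracked whisker-tip keypoints
    X_THRESHOLD = 320.0                # hypothetical protraction threshold (pixels)
    LIKELIHOOD_MIN = 0.6               # ignore low-confidence estimates

    def send_ttl():
        """Placeholder: drive a digital output line (e.g. via a DAQ or GPIO library)."""
        pass

    cap = cv2.VideoCapture(0)          # placeholder camera
    ok, frame = cap.read()

    dlc = DLCLive(MODEL_DIR)
    dlc.init_inference(frame)          # warm up the network on the first frame

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pose = dlc.get_pose(frame)     # array of (x, y, likelihood) per keypoint
        tips = pose[WHISKER_TIPS]
        confident = tips[:, 2] > LIKELIHOOD_MIN
        # Trigger when any confidently tracked whisker tip crosses the threshold.
        if np.any(confident & (tips[:, 0] > X_THRESHOLD)):
            send_ttl()

    cap.release()

The sketch only illustrates the per-frame decision logic; the timing-critical implementation that achieves the reported ~10.5 ms closed-loop latency is the one archived in this repository.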

Keywords

| Neuroscience | Behavioral tracking | Closed-loop experiment system |

Funding

  • EU.670118
  • EU.327654276
  • EU.720270
  • EU.785907
  • EU.945539
  • DFG.250048060
  • DFG.246731133
  • DFG.267823436

Citation

Sehara K, Zimmer-Harwood P, Colomb J, Larkum ME, Sachdev RN (2021) Data for Sehara et al., 2021 eNeuro (the real-time DeepLabCut project). G-Node. https://doi.org/10.12751/g-node.lgu2m0