Published June 28, 2021 | Version 1.0
Dataset | Open Access

The Mouse Action Recognition System (MARS): behavior annotation data

Description

The study of naturalistic social behavior requires quantification of animals' interactions. This is generally done through manual annotation, a highly time-consuming and tedious process. Recent advances in computer vision enable tracking of the pose (posture) of freely behaving animals; however, automatically and accurately classifying complex social behaviors remains technically challenging. We recently introduced the Mouse Action Recognition System (MARS), an automated pipeline for pose estimation and behavior quantification in pairs of freely interacting mice (Segalin et al, 2020). This dataset contains the training and test data used to build MARS's behavior classifiers: pose estimates (the posture of each of the two mice, expressed as a set of anatomical keypoints), pose features extracted from those estimates, and frame-by-frame manual annotations, all derived from video recordings of pairs of mice freely interacting in a standard home cage.

Files

test_2.zip

Files (134.9 GB)

Size     MD5
17.2 GB  md5:a88009a1bc3863886e3782c70c031afe
57.3 GB  md5:7f7db823e7abb33958449f4071f56577
33.8 GB  md5:1d5f94e6ee086769ec4b23ff45b36ee0
32.4 kB  md5:8fa0960e7105e574a0e98d7d378e6792
16.4 kB  md5:dee6289786194101a8660b2cfaccc077
1.8 kB   md5:05a32cc1c591879a1511c10e1af37ae3
26.6 GB  md5:2c532163cffa26164e301246833ac444

Methods

Data Acquisition. Experimental mice ("residents") were transported in their homecage (with cagemates removed) to a behavioral testing room and acclimatized for 5-15 minutes. Homecages were then inserted into a custom-built hardware setup (Hong et al, 2015), with infrared video captured at 30 fps from top- and front-view cameras (Point Grey Grasshopper3) at 1024x570 (top) and 1280x500 (front) pixel resolution using StreamPix video software (NorPix). After two further minutes of acclimatization, an unfamiliar group-housed male or female BALB/c mouse ("intruder") was introduced to the cage, and the animals were allowed to interact freely for approximately 10 minutes. BALB/c mice are used as intruders for their white coat color, which simplifies identity tracking, and for their relatively submissive behavior, which reduces the likelihood of intruder-initiated aggression. In some videos, mice are implanted with a cranial cannula, or with a head-mounted miniaturized microscope (nVista, Inscopix) or an optical fiber for optogenetics or fiber photometry, attached to a cable of varying color and thickness. Surgical procedures for these implantations can be found in Karigo et al, 2020. The raw behavior videos are currently in preparation for upload and will be added to this repository shortly.

Behavior Annotation. Behaviors were annotated on a frame-by-frame basis by a trained human expert in the Anderson lab. Annotators were provided with simultaneous top- and front-view video of the interacting mice and scored every frame for close investigation, attack, and mounting (for full criteria, see the Methods of Segalin et al, 2020). In some videos, additional behaviors were also annotated; for the purpose of training classifiers, each of these was assigned to one of close investigation, attack, mounting, or "other". Annotation was performed either in BENTO (Segalin et al, 2020) or in a previously developed custom MATLAB interface.
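The relabeling step described above, collapsing any additional annotated behaviors into the four classifier classes, can be sketched as follows. Only the four target classes come from the annotation protocol; the raw label strings and their specific assignments in the lookup table are hypothetical illustrations, not the lab's actual mapping:

```python
# Target classes are from the annotation protocol; the raw labels and their
# assignments below are illustrative assumptions.
CLASS_MAP = {
    "close_investigation": "close_investigation",
    "attack": "attack",
    "mount": "mount",
    "sniff_face": "close_investigation",  # hypothetical additional behavior
    "chase": "attack",                    # hypothetical additional behavior
}

def to_training_label(raw_label):
    """Collapse a raw per-frame annotation into one of the four classes;
    anything not in the lookup table falls into "other"."""
    return CLASS_MAP.get(raw_label, "other")

# Applied per frame to a sequence of raw annotations:
labels = [to_training_label(l) for l in ("attack", "sniff_face", "groom")]
```

Defaulting unmapped labels to "other" matches the protocol's catch-all class.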
Pose Estimation. The poses of mice in the top-view recordings are estimated using the Mouse Action Recognition System (MARS, Segalin et al, 2020), a computer vision tool that identifies seven anatomically defined keypoints on the body of each mouse: the nose, the left and right ears, the base of the neck, the left and right hips, and the tail. For details of the pose estimation process, please refer to the MARS manuscript. Note that although front-view video was acquired, front-view pose information is not included in this dataset, as it was not found to improve MARS classifier performance. This is likely due to the poor quality of front-view pose estimates, which suffer frequent occlusion while the mice are interacting.

Pose Feature Extraction. To facilitate behavior classification, a large set of features is computed from the poses of the two interacting mice, capturing the animals' relative positions, velocities, distances to the cage walls, and other socially informative quantities. For each feature, we additionally compute its mean, standard deviation, maximum, and minimum within windows of +/-1, 5, and 10 frames around the current frame, to capture changes in feature values over time. A full list of features is included in the MARS manuscript; code to compute features from animals' poses is available in the MARS GitHub repository.
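The windowed statistics described above can be sketched as follows. This is a plain re-implementation of the stated rule (mean, standard deviation, maximum, and minimum over windows of +/-1, 5, and 10 frames, with windows clipped at the video edges), not MARS's actual feature code:

```python
import numpy as np

def windowed_stats(x, half_widths=(1, 5, 10)):
    """For a 1-D per-frame feature x, return an (n_frames, 4*len(half_widths))
    array holding the mean, std, max, and min of x over a window of +/- w
    frames around each frame, for each half-width w."""
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    blocks = []
    for w in half_widths:
        stats = np.empty((n, 4))
        for t in range(n):
            seg = x[max(0, t - w): t + w + 1]  # window clipped at the edges
            stats[t] = seg.mean(), seg.std(), seg.max(), seg.min()
        blocks.append(stats)
    return np.hstack(blocks)
```

In practice this would be applied to every feature column; a rolling-window routine from pandas or SciPy would compute the same quantities more efficiently over long videos.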

Other

Related Publication: Segalin, Cristina (Caltech); Williams, Jalani (Caltech); Karigo, Tomomi (Caltech); Hui, May (Caltech); Zelikowsky, Moriel (University of Utah); Sun, Jennifer J. (Caltech); Perona, Pietro (Caltech); Anderson, David J. (Caltech); Kennedy, Ann (Northwestern). "The Mouse Action Recognition System (MARS): a software pipeline for automated analysis of social behaviors in mice." bioRxiv, 2020-07-27. https://doi.org/10.1101/2020.07.26.222299

Additional details

Created:
September 14, 2022
Modified:
November 18, 2022