LIVIVO - The Search Portal for Life Sciences


Search results

Results 1–4 of 4


  1. Article: ArMo: An Articulated Mesh Approach for Mouse 3D Reconstruction.

    Bohnslav, James P / Osman, Mohammed Abdal Monium / Jaggi, Akshay / Soares, Sofia / Weinreb, Caleb / Datta, Sandeep Robert / Harvey, Christopher D

    bioRxiv : the preprint server for biology

    2023  

    Abstract Characterizing animal behavior requires methods to distill 3D movements from video data. Though keypoint tracking has emerged as a widely used solution to this problem, it only provides a limited view of pose, reducing the body of an animal to a sparse set of experimenter-defined points. To more completely capture 3D pose, recent studies have fit 3D mesh models to subjects in image and video data. However, despite the importance of mice as a model organism in neuroscience research, these methods have not been applied to the 3D reconstruction of mouse behavior. Here, we present ArMo, an articulated mesh model of the laboratory mouse, and demonstrate its application to multi-camera recordings of head-fixed mice running on a spherical treadmill. Using an end-to-end gradient based optimization procedure, we fit the shape and pose of a dense 3D mouse model to data-derived keypoint and point cloud observations. The resulting reconstructions capture the shape of the animal’s surface while compactly summarizing its movements as a time series of 3D skeletal joint angles. ArMo therefore provides a novel alternative to the sparse representations of pose more commonly used in neuroscience research.
    Language English
    Publishing date 2023-02-18
    Publishing country United States
    Document type Preprint
    DOI 10.1101/2023.02.17.526719
    Database MEDLINE (MEDical Literature Analysis and Retrieval System OnLINE)


  2. Article ; Online: A faithful internal representation of walking movements in the Drosophila visual system.

    Fujiwara, Terufumi / Cruz, Tomás L / Bohnslav, James P / Chiappe, M Eugenia

    Nature neuroscience

    2016  Volume 20, Issue 1, Page(s) 72–81

    Abstract The integration of sensorimotor signals to internally estimate self-movement is critical for spatial perception and motor control. However, which neural circuits accurately track body motion and how these circuits control movement remain unknown. We found that a population of Drosophila neurons that were sensitive to visual flow patterns typically generated during locomotion, the horizontal system (HS) cells, encoded unambiguous quantitative information about the fly's walking behavior independently of vision. Angular and translational velocity signals were integrated with a behavioral-state signal and generated direction-selective and speed-sensitive graded changes in the membrane potential of these non-spiking cells. The nonvisual direction selectivity of HS cells cooperated with their visual selectivity only when the visual input matched that expected from the fly's movements, thereby revealing a circuit for internally monitoring voluntary walking. Furthermore, given that HS cells promoted leg-based turning, the activity of these cells could be used to control forward walking.
    MeSH term(s) Animals ; Behavior/physiology ; Drosophila ; Drosophila melanogaster/physiology ; Locomotion/physiology ; Motion Perception/physiology ; Neurons/physiology ; Photic Stimulation/methods ; Vision, Ocular/physiology ; Walking/physiology
    Language English
    Publishing date 2016-10-31
    Publishing country United States
    Document type Journal Article ; Research Support, Non-U.S. Gov't
    ZDB-ID 1420596-8
    ISSN (online) 1546-1726
    ISSN (print) 1097-6256
    DOI 10.1038/nn.4435
    Database MEDLINE


  3. Article ; Online: DeepEthogram, a machine learning pipeline for supervised behavior classification from raw pixels.

    Bohnslav, James P / Wimalasena, Nivanthika K / Clausing, Kelsey J / Dai, Yu Y / Yarmolinsky, David A / Cruz, Tomás / Kashlan, Adam D / Chiappe, M Eugenia / Orefice, Lauren L / Woolf, Clifford J / Harvey, Christopher D

    eLife

    2021  Volume 10

    Abstract Videos of animal behavior are used to quantify researcher-defined behaviors of interest to study neural function, gene mutations, and pharmacological therapies. Behaviors of interest are often scored manually, which is time-consuming, limited to few behaviors, and variable across researchers. We created DeepEthogram: software that uses supervised machine learning to convert raw video pixels into an ethogram, the behaviors of interest present in each video frame. DeepEthogram is designed to be general-purpose and applicable across species, behaviors, and video-recording hardware. It uses convolutional neural networks to compute motion, extract features from motion and images, and classify features into behaviors. Behaviors are classified with above 90% accuracy on single frames in videos of mice and flies, matching expert-level human performance. DeepEthogram accurately predicts rare behaviors, requires little training data, and generalizes across subjects. A graphical interface allows beginning-to-end analysis without end-user programming. DeepEthogram's rapid, automatic, and reproducible labeling of researcher-defined behaviors of interest may accelerate and enhance supervised behavior analysis. Code is available at: https://github.com/jbohnslav/deepethogram.
    MeSH term(s) Animals ; Drosophila melanogaster ; Female ; Grooming ; Humans ; Image Processing, Computer-Assisted ; Kinetics ; Male ; Mice, Inbred C57BL ; Motor Activity ; Neural Networks, Computer ; Pattern Recognition, Automated ; Reproducibility of Results ; Social Behavior ; Supervised Machine Learning ; Video Recording ; Walking ; Mice
    Language English
    Publishing date 2021-09-02
    Publishing country England
    Document type Journal Article ; Research Support, N.I.H., Extramural ; Research Support, Non-U.S. Gov't ; Research Support, U.S. Gov't, Non-P.H.S. ; Video-Audio Media
    ZDB-ID 2687154-3
    ISSN (online) 2050-084X
    DOI 10.7554/eLife.63377
    Database MEDLINE


  4. Article ; Online: Automated preclinical detection of mechanical pain hypersensitivity and analgesia.

    Zhang, Zihe / Roberson, David P / Kotoda, Masakazu / Boivin, Bruno / Bohnslav, James P / González-Cano, Rafael / Yarmolinsky, David A / Turnes, Bruna Lenfers / Wimalasena, Nivanthika K / Neufeld, Shay Q / Barrett, Lee B / Quintão, Nara L M / Fattori, Victor / Taub, Daniel G / Wiltschko, Alexander B / Andrews, Nick A / Harvey, Christopher D / Datta, Sandeep Robert / Woolf, Clifford J

    Pain

    2022  Volume 163, Issue 12, Page(s) 2326–2336

    Abstract The lack of sensitive and robust behavioral assessments of pain in preclinical models has been a major limitation for both pain research and the development of novel analgesics. Here, we demonstrate a novel data acquisition and analysis platform that provides automated, quantitative, and objective measures of naturalistic rodent behavior in an observer-independent and unbiased fashion. The technology records freely behaving mice, in the dark, over extended periods for continuous acquisition of 2 parallel video data streams: (1) near-infrared frustrated total internal reflection for detecting the degree, force, and timing of surface contact and (2) simultaneous ongoing video graphing of whole-body pose. Using machine vision and machine learning, we automatically extract and quantify behavioral features from these data to reveal moment-by-moment changes that capture the internal pain state of rodents in multiple pain models. We show that these voluntary pain-related behaviors are reversible by analgesics and that analgesia can be automatically and objectively differentiated from sedation. Finally, we used this approach to generate a paw luminance ratio measure that is sensitive in capturing dynamic mechanical hypersensitivity over a period and scalable for high-throughput preclinical analgesic efficacy assessment.
    MeSH term(s) Mice ; Animals ; Pain/diagnosis ; Pain/drug therapy ; Analgesia ; Pain Management ; Analgesics/pharmacology ; Analgesics/therapeutic use ; Pain Measurement
    Chemical Substances Analgesics
    Language English
    Publishing date 2022-05-11
    Publishing country United States
    Document type Journal Article ; Research Support, U.S. Gov't, Non-P.H.S. ; Research Support, Non-U.S. Gov't ; Research Support, N.I.H., Extramural
    ZDB-ID 193153-2
    ISSN (online) 1872-6623
    ISSN (print) 0304-3959
    DOI 10.1097/j.pain.0000000000002680
    Database MEDLINE

