New from us: Markerless kinematic analysis using #DeepLearning #DeepLabCut. Many exciting applications to come: https://www.sciencedirect.com/science/article/pii/S0021929019301551?via%3Dihub
#DeepLabCut is awesome even for tracking tiny fly legs! @TrackingActions pic.twitter.com/x1jH7iFWev
Deeplabcutting! Thanks to @mwmathislab for this awesome tool, we (@dann_benjamin, Timo Hueser, and myself) are having fun while building a markerless hand tracking system! A preview of precise hand tracking from #deeplabcut even as I attempt to simulate swift monkey movements. pic.twitter.com/mJjxcRJDVr
This week, I started playing around with #deeplabcut from @TrackingActions, tracking my fingers and a bottle. Well, I am already hooked. pic.twitter.com/lNCwsF94jj
Upon popular demand, our (@dann_benjamin, Timo, and Hans Scherberger) 3D reconstruction toolbox, written in MATLAB, is available for download at https://github.com/SwathiSheshadri/pose3d. Below is a demo 3D reconstruction of cube corners from 2D data tracked using #deeplabcut. pic.twitter.com/SKQPORV9lH
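The pose3d toolbox linked above does full calibrated multi-camera reconstruction in MATLAB. As a rough illustration of the underlying idea only, here is a minimal Python sketch that recovers a 3D point from the same landmark tracked in a rectified stereo pair; the focal length, baseline, and pixel coordinates are made-up values, not taken from the toolbox.

```python
# Minimal rectified-stereo triangulation: recover a 3D point from the
# same landmark tracked in two horizontally aligned cameras.
# Assumes rectified, identical cameras; all numbers are illustrative.

def triangulate_rectified(xl, yl, xr, f, baseline):
    """Return (X, Y, Z) in the left-camera frame.

    xl, yl   : pixel coordinates of the landmark in the left image
               (relative to the principal point)
    xr       : x pixel coordinate of the landmark in the right image
    f        : focal length in pixels
    baseline : distance between camera centers (same unit as the output)
    """
    disparity = xl - xr           # horizontal shift between the two views
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or behind infinity")
    Z = f * baseline / disparity  # depth, by similar triangles
    X = xl * Z / f                # back-project the pixel into 3D
    Y = yl * Z / f
    return X, Y, Z

# Example: a cube corner seen at x=120 px (left) and x=100 px (right),
# f=700 px, 10 cm baseline -> depth = 700 * 0.10 / 20 = 3.5 m.
print(triangulate_rectified(120.0, 40.0, 100.0, f=700.0, baseline=0.10))
```

Real setups (including pose3d) use full camera calibration and DLT-style triangulation rather than this rectified special case, but the depth-from-disparity relation above is the core geometric idea.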
@DeepLabCut | Markerless 3D Pose Estimation | Animal Tracking | Deep Learning Toolbox | Free & Open Source Code | #deeplabcut #dlcProTip | https://alexemg.github.io/DeepLabCut/
3D tracking based on #deeplabcut! We are a fun team: Timo, hardware expert with a ton of LEGO training; @dann_benjamin, mentor with immense experiment design expertise; and myself, the worker bee. Special thanks to @LittlePrimate and Andrej Filippow for their pep talks/advice! pic.twitter.com/SErRA6OHKu
We can detect rearing in mice (also head direction) by analyzing the magnitude and direction of vectors made from body part identifiers tracked using #DeepLabCut in a social task. Blue dots indicate rearing behavior. @balajisriram #DeepLearning #neuroscience pic.twitter.com/kEQWsFqfK1
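The tweet above does not give the actual criteria, but the idea of classifying posture from the magnitude and direction of a body-part vector can be sketched as follows. The landmark names, the top-down-view assumption, and the threshold are all hypothetical.

```python
import math

# Sketch of the idea in the tweet: build a vector between two tracked
# body parts (e.g. tail base -> head) per frame, then flag frames where
# that vector foreshortens as candidate rearing events (in a top-down
# view, the apparent body length collapses when the mouse stands up).
# Landmark names and the shrink threshold are hypothetical.

def body_vector(tail, head):
    """Vector from tail base to head; each point is (x, y) in pixels."""
    return (head[0] - tail[0], head[1] - tail[1])

def is_rearing(tail, head, baseline_len, shrink_ratio=0.6):
    """Flag a frame as rearing when the tail->head vector is much
    shorter than its typical flat-posture length."""
    vx, vy = body_vector(tail, head)
    return math.hypot(vx, vy) < shrink_ratio * baseline_len

# Toy trace: the apparent body length collapses in the middle frame.
frames = [((0, 0), (40, 0)),   # walking: full-length body vector
          ((0, 0), (15, 5)),   # rearing: strongly foreshortened vector
          ((0, 0), (38, 4))]   # walking again
flags = [is_rearing(t, h, baseline_len=40.0) for t, h in frames]
print(flags)  # one boolean per frame
```

The same per-frame vectors also give head direction directly via `math.atan2(vy, vx)`, which is presumably how the direction component mentioned in the tweet is obtained.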
1st vids using @DeepLabCut for zombie ant #biomechanics! Wasn’t going to pass up a chance to collaborate with @CharissaB and learn to use #DeepLabCut! Thanks @shirazi_en for these 1st tries. Hope to get some $ to devote more effort into this! #UCF @UCFCECS @UCFSciences pic.twitter.com/59Qk9fLJHU
This animation of a #gecko gecko walking, made in #R #Rstudio and built from a #deeplabcut ML algorithm, took way too long to make this morning, so it had better get me at least one heart. pic.twitter.com/JJ8CEZmknp
Sunday excitement: getting some figures without tracking error from #DeepLabCut. #animalbehavior pic.twitter.com/UqzFfgQjlp
It goes really well!! #deeplabcut trained with 200 frames and 160,000 iterations (Google GPU can only be used for 12 hrs... but it's not too bad). Now time for behavior analysis! pic.twitter.com/GOnx9kGn5i
Testing out #DeepLabCut on the most important video I could think of. pic.twitter.com/PGIk4Nzkzi
Procrastinating from writing my PhD thesis by training #DeepLabCut to follow my friend's cat's nose around. Tempted to start a cat behavioural lab from home. #neuroscience #CatsOfTwitter pic.twitter.com/aqXvq7eZej
Using #deeplabcut to track mouse pupil and eyelids. Pretty impressive given how few frames I annotated. It even tracks when there is motion blur/jittering, and when the eyelids are completely closed. pic.twitter.com/IlBdSrKGUz
Need to sync videos + another signal (perhaps for #3D #deeplabcut)? We released our custom Python GUI for acquisition from *multiple* @ImagingSource USB3 cameras + recording system timestamps & optionally a @labview TTL. #CameraControl Check it out: https://github.com/AdaptiveMotorControlLab/Camera_Control
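Once each camera (or a TTL log) has per-frame system timestamps like the GUI above records, the streams can be aligned offline by nearest-timestamp matching. The following is a generic sketch of that alignment step under that assumption, not code from the Camera_Control repository.

```python
import bisect

# Align two timestamped streams by matching each frame of stream A to
# the closest-in-time frame of stream B. Timestamps are in seconds and
# must be sorted ascending.

def match_nearest(ts_a, ts_b):
    """For each timestamp in ts_a, return the index of the nearest
    timestamp in ts_b."""
    matches = []
    for t in ts_a:
        i = bisect.bisect_left(ts_b, t)
        # candidate neighbours are ts_b[i-1] and ts_b[i]; keep the closer
        best = min((j for j in (i - 1, i) if 0 <= j < len(ts_b)),
                   key=lambda j: abs(ts_b[j] - t))
        matches.append(best)
    return matches

# Toy example: two ~30 fps cameras with a small start offset.
cam_a = [0.000, 0.033, 0.066, 0.100]
cam_b = [0.010, 0.043, 0.080, 0.112]
print(match_nearest(cam_a, cam_b))  # index into cam_b for each cam_a frame
```

In practice one would also reject matches whose time gap exceeds half a frame period, so dropped frames in either stream are detected rather than silently paired.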
Oh, you evolved for 600M years to disrupt visual detection? No problem! Thanks to the hard work of @TrackingActions and @TrackingPlumes on @DeepLabCut, we can track octopus! PS: Notice the siphon switching sides at the end, insane details! #deeplabcut #cephalopod @SimonGingins pic.twitter.com/6vRRMQ4hp3
It’s a #FeatureFriday with @DeepLabCut analyzing our #leafcutter #ant from @LibertySciCtr! Note how only one tooth moves during cutting. #ants #leaf #postdoc #science #deeplabcut #MachineLearning #DeepLearning #sciencetwitter #scifri #sciencefriday #sciencedaily pic.twitter.com/p1ZvAMJaTW (at Smilow Center for Translational Research)
Simultaneous behavioural tracking, visual stimulation, and functional imaging is finally working... This is a max projection of tail position over the course of a swim bout. #zebrafish #Neuroscience, tracked with #deeplabcut. pic.twitter.com/VZeYa23wMS
We are hiring! Are you a full-stack software engineer who wants to work on an open source project? #deeplabcut is calling... Contact us (emails at http://deeplabcut.org) for more information! #machinelearning #deeplearning #poseEstimation pic.twitter.com/aPGvtfPFTN