batched inference by n-poulsen · Pull Request #2708 · DeepLabCut/DeepLabCut
Merged
batched inference#2708
MMathisLab merged 9 commits into pytorch_dlc from niels/batched_video_analysis
Conversation
n-poulsen (Contributor) commented on Aug 7, 2024
Implemented batched inference for video analysis.
- updated the `InferenceRunner` classes to support batched inference
- added a `detector_batch_size` attribute to the project `config.yaml` (with default value 1)
- added `batch_size` and `detector_batch_size` parameters to `video_inference_superanimal`
- added `batch_size` and `detector_batch_size` parameters to `analyze_videos` (with default value `None`, meaning the batch sizes defined in the `config.yaml` are used)
- added tests to ensure all frames are processed for top-down and bottom-up models
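The core idea behind the change, running inference on chunks of frames at a time and still covering a partial final batch, can be sketched in plain Python. This is a hypothetical illustration, not DeepLabCut's actual `InferenceRunner` code; the function and parameter names below are invented for the example.

```python
def iterate_batches(frames, batch_size):
    """Yield successive chunks of at most batch_size frames."""
    for i in range(0, len(frames), batch_size):
        yield frames[i:i + batch_size]

def run_inference(frames, model, batch_size=1):
    """Run `model` over all frames, batch_size at a time, and collect results.

    `model` takes a list of frames and returns one prediction per frame,
    so the last (possibly smaller) batch is handled like any other.
    """
    predictions = []
    for batch in iterate_batches(frames, batch_size):
        predictions.extend(model(batch))
    return predictions

# Usage with a stand-in "model" that doubles each frame value:
frames = list(range(10))
preds = run_inference(frames, lambda batch: [f * 2 for f in batch], batch_size=4)
assert len(preds) == len(frames)  # every frame processed, including the partial last batch
```

This mirrors the guarantee the PR's tests check: the number of predictions equals the number of input frames regardless of batch size. Passing `batch_size=None` in `analyze_videos` defers to the values in `config.yaml`, so existing projects keep their previous behavior.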
- n-poulsen added 8 commits on August 2, 2024 at 12:55
- n-poulsen added the DLC3.0🔥 label
- n-poulsen requested a review from AlexEMG
- MMathisLab merged commit 8a294c6 into pytorch_dlc
- n-poulsen deleted the niels/batched_video_analysis branch
AlexEMG reviewed Aug 13, 2024
AlexEMG (Member) left a comment:
Perfect!