`get_detector_inference_runner`: resolve the device from the model config when None by nattse · Pull Request #2841 · DeepLabCut/DeepLabCut
Merged: n-poulsen merged 1 commit into DeepLabCut:pytorch_dlc on Jan 21, 2025
Conversation
nattse (Contributor) commented on Jan 20, 2025:
Fixes #2840, where `get_detector_inference_runner` was unable to read the device from `model_cfg`, leaving all detector inference to run on the CPU.
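The fix described above follows a common fallback pattern: use the explicitly requested device if one was given, otherwise resolve it from the model configuration instead of silently defaulting to CPU. The sketch below is a hypothetical illustration of that pattern, not the actual DeepLabCut code; the function name `resolve_device` and the `"device"` config key are assumptions for the example.

```python
from typing import Optional


def resolve_device(model_cfg: dict, device: Optional[str] = None) -> str:
    """Return the inference device, falling back to the model config.

    Illustrative sketch only: in the real fix, the device stored in the
    DeepLabCut model config is consulted when the caller passes None,
    so GPU-trained models no longer fall back to CPU inference.
    """
    if device is not None:
        # An explicitly requested device always wins.
        return device
    # Hypothetical config key; the real lookup lives in DeepLabCut's
    # PyTorch utilities.
    return model_cfg.get("device", "cpu")


# Device taken from the config when none is passed:
print(resolve_device({"device": "cuda:0"}))  # -> cuda:0
# Explicit argument overrides the config:
print(resolve_device({"device": "cuda:0"}, device="cpu"))  # -> cpu
```

Before the fix, the config lookup step was effectively skipped, so a `None` device always resolved to CPU.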
n-poulsen approved these changes Jan 21, 2025
n-poulsen (Contributor) left a comment:
Thanks @nattse!
n-poulsen added the DLC3.0🔥 label
n-poulsen merged commit db23d67 into DeepLabCut:pytorch_dlc
n-poulsen changed the title from "Update utils.py" to "get_detector_inference_runner: resolve the device from the model config when None"
n-poulsen added the 3.0.0rc7 label
This was referenced on Jan 22, 2025