fix broken `train_network` argument `display_iters` in the GUI by n-poulsen · Pull Request #2865 · DeepLabCut/DeepLabCut
Hello,
Sorry this is indeed confusing, I agree with you.
Basically:
- When TensorFlow was the only deep learning engine (before DLC 3.0), the naming was not consistent: the `train_network()` method exposed this argument as `displayiters`, but inside the method itself (and in the model configuration file, `pose_cfg.yaml`), the variable is named `display_iters`.
- When PyTorch support was added (DLC 3.0), the variable was consistently named `display_iters` inside the `pose_estimation_pytorch` submodule. This means that inside `pytorch_config.yaml` it is named `display_iters`, and if you call the PyTorch-specific `train_network()` implementation (`deeplabcut.pose_estimation_pytorch.train_network()`), you specify `display_iters`.
- To route API-level method calls to the implementation for the right deep learning engine, there is the `compat.py` module. For example, when you call `deeplabcut.train_network()`, you actually call `deeplabcut.compat.train_network()`, which itself then calls either `deeplabcut.pose_estimation_tensorflow.train_network.py` if you have a TensorFlow-based model, or `deeplabcut.pose_estimation_pytorch.train_network.py` if you have a PyTorch-based model. And to stay backwards-compatible with previous DeepLabCut versions, we had to name the argument `displayiters` in `deeplabcut.compat.train_network()` (so that a user who had a script calling `deeplabcut.train_network(displayiters=9999)` with DLC 2.0 can still run it after updating to DLC 3.0).
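The routing pattern described above can be sketched in a few lines. This is an illustrative stand-in, not the actual DeepLabCut source: the helper names and the `engine` parameter are assumptions, but the key idea matches — the compat entry point keeps the legacy `displayiters` spelling and renames it to `display_iters` when forwarding to the PyTorch backend.

```python
# Hypothetical sketch of the compat-layer pattern (names are illustrative).

def _tensorflow_train_network(config, displayiters=1000):
    """TensorFlow backend: historically exposed `displayiters`."""
    return f"tensorflow: logging every {displayiters} iterations"

def _pytorch_train_network(config, display_iters=1000):
    """PyTorch backend (DLC 3.0+): consistently uses `display_iters`."""
    return f"pytorch: logging every {display_iters} iterations"

def train_network(config, displayiters=1000, engine="pytorch"):
    """Compat entry point: keeps the legacy `displayiters` spelling so
    pre-3.0 user scripts keep working, then routes to the right engine."""
    if engine == "pytorch":
        # Rename the argument when crossing into the PyTorch submodule.
        return _pytorch_train_network(config, display_iters=displayiters)
    return _tensorflow_train_network(config, displayiters=displayiters)
```

With this shape, an old script calling `train_network(config, displayiters=9999)` keeps working unchanged, whichever engine the model uses.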
To summarize:
- When you call `deeplabcut.train_network()`, pass `displayiters`.
- In your `pytorch_config.yaml`, specify `display_iters`.
- If you use the PyTorch module directly (`import deeplabcut.pose_estimation_pytorch as dlc_torch`, then `dlc_torch.train_network()`), pass `display_iters`.
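The three surfaces in the summary can be shown side by side. The two functions below are minimal stand-ins for illustration only (their bodies and the config dict are assumptions, not the real DeepLabCut implementation); the point is just which spelling each surface expects.

```python
# Stand-ins for the three surfaces; names and bodies are illustrative.

def train_network(config, displayiters=1000):
    # 1. API-level entry point: legacy spelling `displayiters`.
    return displayiters

def pytorch_train_network(config, display_iters=1000):
    # 3. Direct PyTorch-submodule call: consistent spelling `display_iters`.
    return display_iters

# 2. The pytorch_config.yaml side, shown as the parsed dict it becomes:
#    the key there is `display_iters`.
pytorch_cfg = {"display_iters": 500}

train_network("config.yaml", displayiters=500)        # legacy spelling
pytorch_train_network("config.yaml", **pytorch_cfg)   # consistent spelling
```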
Thanks for spotting the error in the `deeplabcut.train_network()` docstring; I opened a PR to fix it, and sorry for the confusing naming. Unfortunately, we cannot refactor to make the naming consistent everywhere, because preserving full backwards compatibility is a strict requirement of DeepLabCut.