Fix logic in `accelerator.prepare` + IPEX for 2+ `nn.Models` and/or `optim.Optimizers` by mariusarvinte · Pull Request #3517 · huggingface/accelerate

@mariusarvinte

What does this PR do?

Fixes #3516.

Fixes the logic in `accelerator._prepare_ipex` so that:

  • Multiple `nn.Module` objects can now be prepared in a single `prepare` call, without silent failures.
  • An error is raised if a user attempts to prepare a combination whose pairing is non-trivial to infer (e.g., two modules and two optimizers at the same time).
  • A cautionary note and a suggested workaround for this limitation (which stems from `intel-extension-for-pytorch`) are added to the documentation.
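To illustrate the second point, here is a minimal, library-free sketch of the kind of ambiguity check described above. The class and function names are stand-ins for illustration only, not the actual `accelerate` internals: with IPEX, each optimizer must be matched to exactly one model, so receiving several models together with several optimizers is ambiguous and should raise rather than silently mis-pair them.

```python
class FakeModule:
    """Stand-in for torch.nn.Module (illustration only)."""
    pass


class FakeOptimizer:
    """Stand-in for torch.optim.Optimizer (illustration only)."""
    pass


def check_ipex_prepare_args(*args):
    """Sketch of the pairing check: raise when the model/optimizer
    pairing cannot be inferred unambiguously."""
    models = [a for a in args if isinstance(a, FakeModule)]
    optimizers = [a for a in args if isinstance(a, FakeOptimizer)]
    if len(models) > 1 and len(optimizers) > 1:
        raise ValueError(
            "Cannot infer model/optimizer pairing for IPEX; "
            "call prepare() separately for each (model, optimizer) pair."
        )
    return models, optimizers
```

The suggested workaround follows the same idea: instead of one call with everything, call `prepare` once per (model, optimizer) pair so the pairing is explicit.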

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@yao-matrix @BenjaminBossan @SunMarc @zach-huggingface

SunMarc


Thanks a lot! Just a nit

Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>


SunMarc


Thanks!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.