[WIP] Fix gradient scaling bug in emd by rflamary · Pull Request #310 · PythonOT/POT
Merged
Conversation
Collaborator
Types of changes
- Fixes Issue #309 (Constants do not backpropagate through function emd2 in torch) about a problem in the scaling of gradients
- Correct a documentation problem in the example Intro_OT
- Fix upload of documentation on release
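The gradient-scaling problem from issue #309 can be illustrated with a toy stand-in (this is not POT's actual implementation, just a minimal sketch of the bug class): a custom autograd function whose backward drops `grad_output`, so a constant multiplied outside the function never reaches the gradient of the inputs.

```python
import torch

class LossBuggy(torch.autograd.Function):
    """Toy stand-in mimicking the pre-fix behaviour: backward ignores
    grad_output, so constants applied to the output do not scale the
    gradient of the inputs."""
    @staticmethod
    def forward(ctx, a):
        ctx.save_for_backward(a)
        return (a * a).sum()

    @staticmethod
    def backward(ctx, grad_output):
        (a,) = ctx.saved_tensors
        return 2 * a  # BUG: grad_output is dropped

class LossFixed(torch.autograd.Function):
    """Same forward, but backward applies the chain rule correctly."""
    @staticmethod
    def forward(ctx, a):
        ctx.save_for_backward(a)
        return (a * a).sum()

    @staticmethod
    def backward(ctx, grad_output):
        (a,) = ctx.saved_tensors
        return grad_output * 2 * a  # chain rule: scale by grad_output

a = torch.tensor([1.0, 2.0], requires_grad=True)
(3.0 * LossBuggy.apply(a)).backward()
print(a.grad)  # tensor([2., 4.]) -- the factor 3 is lost

b = torch.tensor([1.0, 2.0], requires_grad=True)
(3.0 * LossFixed.apply(b)).backward()
print(b.grad)  # tensor([6., 12.]) -- correctly scaled by 3
```

With the fix, `c * ot.emd2(...)` backpropagates the constant `c` into the gradients of the input histograms, as the chain rule requires.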
rflamary added 3 commits on November 16, 2021 09:43
rflamary mentioned this pull request
codecov bot commented Nov 16, 2021

Codecov Report
Merging #310 (94a3868) into master (0c58991) will increase coverage by 0.04%. The diff coverage is 100.00%.
@@            Coverage Diff             @@
##           master     #310      +/-   ##
==========================================
+ Coverage   93.39%   93.43%   +0.04%
==========================================
  Files          21       21
  Lines        4888     4888
==========================================
+ Hits         4565     4567       +2
+ Misses        323      321       -2
rflamary merged commit f4b363d into master

rflamary deleted the bug_grad_emd branch