Commit
move out zero grad logic into separate function (#969)
Summary:
Pull Request resolved: #969

# Context
Currently it isn't possible to log gradients from `AutoUnit`, as they are zeroed out before `on_train_step_end()` is reached.

# This Diff
Moves the gradient-zeroing logic out of `_update_weights` and into its own `zero_grad()` function. It can be overridden, e.g.

```
class MyAutoUnit(AutoUnit):
    ...
    def zero_grad(self) -> None:
        self.logger.log(self.module.grad)
        super().zero_grad()
```

to log the gradients prior to zeroing them out.

Reviewed By: galrotem, diego-urgell

Differential Revision: D68983117

fbshipit-source-id: 744b72c5634d8b6979ef1145fc3254ddde93d743
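For illustration, a minimal self-contained sketch of the pattern described above, assuming the `zero_grad()` hook from this diff and the standard `AutoUnit` abstract methods (`compute_loss`, `configure_optimizers_and_lr_scheduler`). The `Batch` alias, the toy model/optimizer, and the gradient-norm `print` are illustrative stand-ins for whatever logger you actually use; none of them are part of this diff.

```
from typing import Optional, Tuple

import torch
from torchtnt.framework.auto_unit import AutoUnit
from torchtnt.framework.state import State

# Hypothetical batch type for this sketch: (inputs, targets).
Batch = Tuple[torch.Tensor, torch.Tensor]


class GradLoggingAutoUnit(AutoUnit[Batch]):
    def configure_optimizers_and_lr_scheduler(
        self, module: torch.nn.Module
    ) -> Tuple[torch.optim.Optimizer, Optional[object]]:
        # Plain SGD, no LR scheduler, purely for illustration.
        return torch.optim.SGD(module.parameters(), lr=0.01), None

    def compute_loss(
        self, state: State, data: Batch
    ) -> Tuple[torch.Tensor, torch.Tensor]:
        inputs, targets = data
        outputs = self.module(inputs)
        loss = torch.nn.functional.mse_loss(outputs, targets)
        return loss, outputs

    def zero_grad(self) -> None:
        # Gradients are still populated at this point; inspect or log them
        # before the base class zeroes them out.
        grads = [p.grad for p in self.module.parameters() if p.grad is not None]
        if grads:
            total_norm = torch.linalg.vector_norm(
                torch.stack([torch.linalg.vector_norm(g) for g in grads])
            )
            # Swap this print for your metric logger of choice.
            print(f"grad norm before zero_grad: {total_norm.item():.4f}")
        super().zero_grad()


# Illustrative wiring (dataloader/trainer setup is not part of this diff), e.g.:
#   unit = GradLoggingAutoUnit(module=torch.nn.Linear(4, 1))
#   train(unit, train_dataloader=my_dataloader, max_epochs=1)
```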