Add best_k_metrics parameter to the ModelCheckpoint #20457
base: master
Conversation
Thanks for the contribution! Happy to merge after we ensure full backward compatibility and test coverage
from lightning.pytorch.utilities.exceptions import MisconfigurationException
from lightning.pytorch.utilities.rank_zero import WarningCache, rank_zero_info, rank_zero_warn
from lightning.pytorch.utilities.types import STEP_OUTPUT
import pytorch_lightning as pl
this needs to be `lightning.pytorch as pl` in the PR (it will become `pytorch_lightning as pl` automatically in the `pytorch_lightning` package)
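For illustration, a minimal sketch of the import convention this comment describes (both packages must be installed for the file to run):

```python
# Write the lightning.pytorch import path in the PR's source tree:
import lightning.pytorch as pl

# The mirrored pytorch_lightning package is generated from those sources,
# so in that package the same line automatically becomes:
import pytorch_lightning as pl
```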
from lightning.pytorch.utilities.types import STEP_OUTPUT
import pytorch_lightning as pl
from lightning_fabric.utilities.cloud_io import (
    _is_dir,
we probably have different formatting options; please run `pre-commit run --all-files` from the base directory to make sure the PR conforms
    get_filesystem,
)
from lightning_fabric.utilities.types import _PATH
from pytorch_lightning.callbacks import Checkpoint
change to `from lightning.pytorch.callbacks ...` and same for the following lines
@@ -241,9 +249,10 @@ def __init__(
         self._last_global_step_saved = 0  # no need to save when no steps were taken
         self._last_time_checked: Optional[float] = None
         self.current_score: Optional[Tensor] = None
-        self.best_k_models: Dict[str, Tensor] = {}
+        self.best_k_models: Dict[str, Dict[str, Tensor | Dict[str, Tensor]]] = {}
this may be easier to read: `Dict[str, Dict[str, Tensor]] | Dict[str, Dict[str, Dict[str, Tensor]]]`, but ultimately we'd be better off defining a type alias.

More importantly, we need to avoid breaking backward compatibility here: whatever code relies on `best_k_models` being `Dict[str, Tensor]` today needs to keep working. I suggest we just limit ourselves to tracking `best_model_metrics` and not mess with `best_k_models`, or use a separate private attribute.
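A minimal sketch of that separate-attribute approach. The names `_best_k_metrics` and `best_model_metrics` are illustrative, not the PR's actual code:

```python
from typing import Dict, Optional

from torch import Tensor


class ModelCheckpointSketch:
    """Hypothetical sketch: keep best_k_models backward compatible and
    track full metric dicts in a separate private attribute."""

    def __init__(self) -> None:
        self.best_model_path: str = ""
        # Unchanged public type: existing user code that relies on
        # Dict[str, Tensor] keeps working.
        self.best_k_models: Dict[str, Tensor] = {}
        # New and private: all logged metrics per checkpoint path.
        self._best_k_metrics: Dict[str, Dict[str, Tensor]] = {}

    @property
    def best_model_metrics(self) -> Optional[Dict[str, Tensor]]:
        # Read-only view of the metrics recorded for the best checkpoint.
        return self._best_k_metrics.get(self.best_model_path)
```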
@@ -523,7 +534,9 @@ def check_monitor_top_k(self, trainer: "pl.Trainer", current: Optional[Tensor] =
             return True

         monitor_op = {"min": torch.lt, "max": torch.gt}[self.mode]
-        should_update_best_and_save = monitor_op(current, self.best_k_models[self.kth_best_model_path])
+        should_update_best_and_save = monitor_op(
+            current, cast(Tensor, self.best_k_models[self.kth_best_model_path]["score"])
this will stay as in the original if we avoid changing `best_k_models`
@@ -706,6 +706,7 @@ def test_model_checkpoint_save_last_none_monitor(tmp_path, caplog):
     assert checkpoint_callback.best_model_path == str(tmp_path / "epoch=1-step=20.ckpt")
     assert checkpoint_callback.last_model_path == str(tmp_path / "last.ckpt")
     assert checkpoint_callback.best_model_score is None
+    assert checkpoint_callback.best_model_metrics is None
we need to add tests that exercise the new code
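For example, a pytest sketch of the kind of coverage meant here. The `best_model_metrics` attribute name is assumed from this PR and not verified against the final implementation:

```python
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import ModelCheckpoint
from lightning.pytorch.demos.boring_classes import BoringModel


class LoggingModel(BoringModel):
    def training_step(self, batch, batch_idx):
        out = super().training_step(batch, batch_idx)
        # Log the monitored metric so the checkpoint callback can see it.
        self.log("train_loss", out["loss"])
        return out


def test_best_model_metrics_tracked(tmp_path):
    # Sketch only: assumes the PR exposes `best_model_metrics` on the callback.
    callback = ModelCheckpoint(dirpath=tmp_path, monitor="train_loss", save_top_k=2)
    trainer = Trainer(
        default_root_dir=tmp_path,
        max_epochs=2,
        limit_train_batches=2,
        limit_val_batches=0,
        callbacks=[callback],
        logger=False,
    )
    trainer.fit(LoggingModel())

    # The metrics dict for the best checkpoint should contain the monitored key.
    assert callback.best_model_metrics is not None
    assert "train_loss" in callback.best_model_metrics
```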
@gonzachiar I'm wrapping up the last few PRs for the release, do you have time to push this through in the next couple of days?
Hey, I will work on this during the weekend. Hope that helps!
What does this PR do?
Adds a parameter to `ModelCheckpoint` that saves all the metrics from the best model (see the usage sketch below).
Fixes #20321
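For context, a minimal usage sketch. The parameter name `best_k_metrics` comes from the PR title; treating it as a boolean toggle and the `best_model_metrics` attribute are assumptions, not confirmed API:

```python
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import ModelCheckpoint

# Assumed API from this PR: `best_k_metrics` asks the callback to record all
# logged metrics for the top-k checkpoints, not just the monitored score.
checkpoint_callback = ModelCheckpoint(
    monitor="val_loss",
    save_top_k=3,
    best_k_metrics=True,  # assumption: boolean toggle
)

trainer = Trainer(callbacks=[checkpoint_callback])
# After trainer.fit(...), the metrics of the best checkpoint would be read as
# checkpoint_callback.best_model_metrics (attribute name assumed).
```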
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--20457.org.readthedocs.build/en/20457/