
Add best_k_metrics parameter to the ModelCheckpoint #20457

Open · gonzachiar wants to merge 3 commits into master

Conversation


@gonzachiar commented on Nov 27, 2024

What does this PR do?

Adds a best_k_metrics parameter to ModelCheckpoint that saves all the metrics from the best model.

Fixes #20321
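
For context, hypothetical usage of the proposed parameter might look like the sketch below (the parameter name follows the PR title; whether it is a boolean flag, and the exact attribute it populates, are assumptions pending review):

```python
from lightning.pytorch.callbacks import ModelCheckpoint

# best_k_metrics is the parameter proposed by this PR; treating it as a
# boolean flag is an assumption about the final API.
checkpoint_cb = ModelCheckpoint(monitor="val_loss", save_top_k=3, best_k_metrics=True)

# After trainer.fit(...), the callback would expose the logged metrics of the
# best model, e.g. via checkpoint_cb.best_model_metrics (see review below).
```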

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following checklist:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--20457.org.readthedocs.build/en/20457/

github-actions bot added the pl (Generic label for PyTorch Lightning package) label on Nov 27, 2024
@lantiga (Collaborator) left a comment:


Thanks for the contribution! Happy to merge after we ensure full backward compatibility and test coverage.

from lightning.pytorch.utilities.exceptions import MisconfigurationException
from lightning.pytorch.utilities.rank_zero import WarningCache, rank_zero_info, rank_zero_warn
from lightning.pytorch.utilities.types import STEP_OUTPUT
import pytorch_lightning as pl
lantiga (Collaborator):

this needs to be lightning.pytorch as pl in the PR (it will become pytorch_lightning as pl automatically in the pytorch_lightning package)
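
The suggested form:

```python
# in the PR source tree; mirrored to `import pytorch_lightning as pl` on release
import lightning.pytorch as pl
```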

from lightning.pytorch.utilities.types import STEP_OUTPUT
import pytorch_lightning as pl
from lightning_fabric.utilities.cloud_io import (
    _is_dir,
lantiga (Collaborator):

we probably have different formatting options, please run

pre-commit run --all-files

from the base directory to make sure the PR conforms

    get_filesystem,
)
from lightning_fabric.utilities.types import _PATH
from pytorch_lightning.callbacks import Checkpoint
lantiga (Collaborator):

change to from lightning.pytorch.callbacks ...
and same for the following lines
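
A sketch of the import block after applying both suggestions (lightning_fabric likewise maps to lightning.fabric in the source tree):

```python
from lightning.fabric.utilities.cloud_io import _is_dir, get_filesystem
from lightning.fabric.utilities.types import _PATH
from lightning.pytorch.callbacks import Checkpoint
```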

@@ -241,9 +249,10 @@ def __init__(
         self._last_global_step_saved = 0  # no need to save when no steps were taken
         self._last_time_checked: Optional[float] = None
         self.current_score: Optional[Tensor] = None
-        self.best_k_models: Dict[str, Tensor] = {}
+        self.best_k_models: Dict[str, Dict[str, Tensor | Dict[str, Tensor]]] = {}
lantiga (Collaborator):

this may be easier to read
Dict[str, Dict[str, Tensor]] | Dict[str, Dict[str, Dict[str, Tensor]]]
but ultimately we'd be better off defining a type alias

more importantly, we need to avoid breaking backward compatibility here
so whatever code relies on best_k_models being Dict[str, Tensor] today needs to keep working

I suggest we just limit ourselves to tracking best_model_metrics and not mess with best_k_models, or use a separate private attribute
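
A minimal sketch of that separate-attribute approach (attribute names follow this discussion; _best_k_metrics is a hypothetical private companion store, not the actual implementation):

```python
from typing import Dict, Optional

from torch import Tensor

# Type alias, as suggested above, to keep the annotations readable.
_METRIC_DICT = Dict[str, Tensor]


class _CheckpointStateSketch:
    def __init__(self) -> None:
        # Unchanged public state (path -> monitored score), so existing code
        # that relies on best_k_models being Dict[str, Tensor] keeps working.
        self.best_k_models: Dict[str, Tensor] = {}
        # New, additive state: all logged metrics of the current best model.
        self.best_model_metrics: Optional[_METRIC_DICT] = None
        # Hypothetical private companion store: path -> metrics per top-k model.
        self._best_k_metrics: Dict[str, _METRIC_DICT] = {}
```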

@@ -523,7 +534,9 @@ def check_monitor_top_k(self, trainer: "pl.Trainer", current: Optional[Tensor] =
             return True

         monitor_op = {"min": torch.lt, "max": torch.gt}[self.mode]
-        should_update_best_and_save = monitor_op(current, self.best_k_models[self.kth_best_model_path])
+        should_update_best_and_save = monitor_op(
+            current, cast(Tensor, self.best_k_models[self.kth_best_model_path]["score"])
lantiga (Collaborator):

this will stay as in the original if we avoid changing best_k_models

@@ -706,6 +706,7 @@ def test_model_checkpoint_save_last_none_monitor(tmp_path, caplog):
     assert checkpoint_callback.best_model_path == str(tmp_path / "epoch=1-step=20.ckpt")
     assert checkpoint_callback.last_model_path == str(tmp_path / "last.ckpt")
     assert checkpoint_callback.best_model_score is None
+    assert checkpoint_callback.best_model_metrics is None
lantiga (Collaborator):

we need to add tests that exercise the new code
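
For example, a test along these lines could exercise the new attribute (a sketch only: best_model_metrics is the attribute assumed from this PR, and BoringModel is the demo helper that ships with Lightning):

```python
import torch

from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import ModelCheckpoint
from lightning.pytorch.demos.boring_classes import BoringModel


class _MetricModel(BoringModel):
    def validation_step(self, batch, batch_idx):
        output = self.layer(batch)
        loss = torch.nn.functional.mse_loss(output, torch.zeros_like(output))
        self.log("val_loss", loss)  # metric for the callback to monitor
        return loss


def test_best_model_metrics(tmp_path):
    callback = ModelCheckpoint(dirpath=tmp_path, monitor="val_loss", save_top_k=1)
    trainer = Trainer(
        default_root_dir=tmp_path,
        max_epochs=2,
        limit_train_batches=2,
        limit_val_batches=2,
        logger=False,
        callbacks=[callback],
    )
    trainer.fit(_MetricModel())
    # Assumed behavior from this PR: the callback exposes the metrics that
    # were logged when the best checkpoint was saved.
    assert callback.best_model_metrics is not None
    assert "val_loss" in callback.best_model_metrics
```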

@lantiga added the waiting on author (Waiting on user action, correction, or update) label on Dec 5, 2024
@lantiga (Collaborator) commented on Dec 10, 2024

@gonzachiar I'm wrapping up the last few PRs for the release, do you have time to push this through in the next couple of days?

mergify bot removed the has conflicts label on Dec 10, 2024
@gonzachiar (Author) replied:

> @gonzachiar I'm wrapping up the last few PRs for the release, do you have time to push this through in the next couple of days?

Hey, I will work on this during the weekend. Hope that helps!

Labels
pl (Generic label for PyTorch Lightning package) · waiting on author (Waiting on user action, correction, or update)

Projects
None yet

Development
Successfully merging this pull request may close these issues:
best-k-metrics in ModelCheckpoint

2 participants