
Bump version of Python, torch and pytorch-lightning #58

Closed
wants to merge 5 commits

Conversation

marcovarrone
Contributor

I would like to extend the compatibility with Python 3.11, torch 2.1, and pytorch-lightning 2.1.3.

One fix was easy to make: renaming the training_epoch_end hook to on_train_epoch_end.
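To illustrate the rename: in Lightning >= 2.0 the `training_epoch_end(self, outputs)` hook was removed, and its replacement `on_train_epoch_end(self)` no longer receives the step outputs, so they must be accumulated manually. The sketch below shows the new-style pattern in plain Python (the class name and methods are illustrative, not this project's code):

```python
class EpochMetricAccumulator:
    """Illustrative stand-in for a LightningModule using the 2.x hook style."""

    def __init__(self):
        self._losses = []

    def record_step_loss(self, loss):
        # Accumulate per-step values yourself, since the `outputs`
        # argument of the old `training_epoch_end` hook is gone.
        self._losses.append(loss)

    def on_train_epoch_end(self):
        # New-style hook: takes no outputs argument.
        epoch_loss = sum(self._losses) / len(self._losses)
        self._losses.clear()
        return epoch_loss
```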

Another fix addresses dtype-mismatch errors in the parameters of masked_scatter and scatter_add, which are probably related to the open issues #81876 and #115821.
For now my solution is not very elegant: force the data to float32 at the two points where it is passed to those functions.
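A minimal sketch of that workaround, assuming the mismatch arises because the source and destination tensors end up with different floating-point dtypes (the helper name is hypothetical; the actual call sites are inside this project's code):

```python
import torch

def scatter_add_float32(dest, index, src):
    # torch's scatter_add_ requires `src` and `dest` to share a dtype;
    # under mixed dtypes the call raises a runtime error (see pytorch
    # issues #81876, #115821). Casting both sides to float32 before the
    # call side-steps the mismatch, at the cost of elegance.
    dest32 = dest.to(torch.float32)
    src32 = src.to(torch.float32)
    return dest32.scatter_add_(0, index, src32)

# Mixed-dtype inputs that would otherwise trip the mismatch:
dest = torch.zeros(3, dtype=torch.float64)
src = torch.tensor([1.0, 2.0], dtype=torch.float16)
index = torch.tensor([0, 2])
out = scatter_add_float32(dest, index, src)
```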

Finally, this PR is still a draft because of another error I cannot track down:

RuntimeError: Early stopping conditioned on metric `frobenius_norm_change` which is not available. Pass in or modify your `EarlyStopping` callback to use any of the following: `inertia`

I cannot find any definition of frobenius_norm_change in the code, nor in PyTorch or PyTorch Lightning, so I don't know how to fix the error.
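For context on where the message comes from: Lightning's EarlyStopping callback raises it when its `monitor` key is absent from the metrics the model has logged via `self.log(...)`, and it lists the keys that are available (here only `inertia`). So the string `frobenius_norm_change` is most likely a `monitor` value passed to the EarlyStopping callback somewhere, not a metric defined in library code. A pure-Python sketch of that validation logic (function and variable names are illustrative):

```python
def validate_monitor(monitor, callback_metrics):
    # Mimics the check EarlyStopping performs against logged metrics:
    # the monitored key must be present, otherwise it raises with the
    # list of keys that were actually logged.
    if monitor not in callback_metrics:
        available = "`, `".join(callback_metrics)
        raise RuntimeError(
            f"Early stopping conditioned on metric `{monitor}` which is "
            f"not available. Pass in or modify your `EarlyStopping` "
            f"callback to use any of the following: `{available}`"
        )
    return callback_metrics[monitor]

logged = {"inertia": 0.42}  # what the model actually logs each epoch
validate_monitor("inertia", logged)  # fine
# validate_monitor("frobenius_norm_change", logged)  # reproduces the error
```

The practical fix is therefore either to log the monitored metric during training or to point the callback's `monitor` at one that is logged, such as `inertia`.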
