Add optimizers to unified API #14

Merged
4 commits merged into main from the add-opt branch on Feb 28, 2024

Conversation

@SamDuffield (Contributor)

This PR allows the user to swap out the UQ transforms for plain optimisation if they like.

Nothing sophisticated, it just gives a bit more flexibility to the API.

Also open to discussion as to whether this is within scope or not.

@dmitrisaberi left a comment

I see how this makes everything more functional and makes the use of optimizers consistent with the unified API, and it's very nice in that way, but I don't get what added flexibility / functionality this actually buys you. Could you give me a bit more intuition for this? No problems with the code at all though, all LGTM.

@SamDuffield (Contributor, Author) commented Feb 12, 2024

> I see how this makes everything more functional and makes the use of optimizers consistent with the unified API, and it's very nice in that way, but I don't get what added flexibility / functionality this actually buys you. Could you give me a bit more intuition for this? No problems with the code at all though, all LGTM.

I'm not expecting these to replace torch.optim optimisation in general; it just adds support in case the user wants to write a script that can run any uqlib method, where simple MAP optimisation is likely a good baseline.

Essentially it just makes it easy to swap from a UQ method to pure optimisation.

E.g. in the lightning_autoencoder.py example we could have

method, config_args = uqlib.vi.diag, {"optimizer": torchopt.adam(lr=1e-3)}

or

method, config_args = uqlib.optim, {"optimizer": torch.optim.Adam, "lr": 1e-3}

And all of the code below this stays the same (i.e. it is method agnostic).
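To make that concrete, here is a rough sketch of what the shared, method-agnostic part of such a script could look like. The build/init/update pattern, the log_posterior helper, and the toy model/data are illustrative assumptions for this sketch, not the verified uqlib API.

```python
import torch
import torchopt
import uqlib

# Toy stand-ins for the model, data, and log-posterior from the
# lightning_autoencoder.py example (illustrative only).
model = torch.nn.Linear(10, 10)
dataloader = [torch.randn(8, 10) for _ in range(5)]

def log_posterior(params, batch):
    # Hypothetical helper: Gaussian reconstruction log-likelihood with a flat prior.
    preds = torch.func.functional_call(model, params, (batch,))
    return -torch.nn.functional.mse_loss(preds, batch), preds

# Only this line changes when swapping between a UQ method and pure optimisation.
method, config_args = uqlib.vi.diag, {"optimizer": torchopt.adam(lr=1e-3)}
# method, config_args = uqlib.optim, {"optimizer": torch.optim.Adam, "lr": 1e-3}

# Assumed unified interface: build the transform, initialise state, update per batch.
transform = method.build(log_posterior, **config_args)
state = transform.init(dict(model.named_parameters()))

for batch in dataloader:
    state = transform.update(state, batch)
```

Everything after the method/config_args line is identical regardless of which method is chosen, which is the flexibility the PR is aiming for.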

@johnathanchiu (Contributor) left a comment

lgtm

@SamDuffield merged commit cb47f7d into main on Feb 28, 2024
2 checks passed
@SamDuffield deleted the add-opt branch on February 28, 2024 at 09:53