Optimizers

Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.

Description

Optimizers is a GitHub repository of PyTorch optimization algorithms. It is designed for external collaboration and development.

It currently includes the following optimizers:

  • Distributed Shampoo

See the CONTRIBUTING file for how to help out.

License

Optimizers is BSD licensed, as found in the LICENSE file.

Installation and Dependencies

This code requires python>=3.10 and torch>=2.5.0. Install distributed_shampoo with all dependencies:

git clone git@github.com:facebookresearch/optimizers.git
cd optimizers
pip install .

If you also want to try the examples, replace the last line with pip install ".[examples]".

Usage

After installation, basic usage looks like:

import torch
from distributed_shampoo import AdamGraftingConfig, DistributedShampoo

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamGraftingConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)

For more details, please see the additional documentation, especially the How to Use section.
