
# neptune.ai examples
Neptune is an experiment tracker purpose-built for foundation model training.
With Neptune, you can monitor thousands of per-layer metrics (losses, gradients, and activations) at any scale. Visualize them with no lag and no missed spikes. Drill down into logs and debug training issues fast. Keep your model training stable while reducing wasted GPU cycles.
In this repo, you'll find examples of using Neptune to log and retrieve your ML metadata.
You can run every example with zero setup (no registration needed).
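Below is a minimal sketch of what logging and retrieving run metadata looks like with the `neptune` Python client. It uses the anonymous API token, which is what makes the "no registration needed" setup possible; the project name and metric paths are illustrative, and the examples in this repo may use different projects or the newer Neptune client.

```python
import neptune

# Connect anonymously to a public example project (no registration needed).
# The project name is illustrative; replace it with your own project.
run = neptune.init_run(
    project="common/quickstarts",
    api_token=neptune.ANONYMOUS_API_TOKEN,
)

# Log single values and metric series.
run["parameters"] = {"lr": 1e-3, "batch_size": 32}
for epoch in range(10):
    run["train/loss"].append(0.95 ** epoch)

# Retrieve the logged metadata back from the run.
params = run["parameters"].fetch()
loss_history = run["train/loss"].fetch_values()

run.stop()
```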
|  | Docs | Neptune | GitHub | Colab |
| --- | --- | --- | --- | --- |
| Quickstart |  |  |  |  |
| Track and organize runs |  |  |  |  |
| Monitor runs live |  |  |  |  |
|  | Docs | Neptune | GitHub | Colab |
| --- | --- | --- | --- | --- |
| Re-run failed training |  |  |  |  |
| Log from sequential pipelines |  |  |  |  |
| DDP training experiments |  |  |  |  |
| Use multiple integrations together |  |  |  |  |
|  | Neptune | GitHub | Colab |
| --- | --- | --- | --- |
| Text classification using fastText |  |  |  |
| Text classification using Keras |  |  |  |
| Text summarization |  |  |  |
| Time series forecasting |  |  |  |
|  | GitHub |
| --- | --- |
| Import runs from Weights & Biases |  |
| Copy runs from one Neptune project to another |  |
| Copy models and model versions from model registry to runs |  |
| Back up run metadata from Neptune |  |
|  | GitHub | Colab |
| --- | --- | --- |
| Get Neptune storage per project and user |  |  |
| Get runs with most fields logged |  |  |