Planned Agenda

ML: Focus on LLM Training, Fine-Tuning, and Inference; Profiling

HPC: Basics, primitives, and a sample real application; Profiling
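A minimal sketch of a kernel-timing harness the profiling segments could start from. Plain Python/NumPy only, no vendor SDK assumed; the function names and sizes are illustrative, and each testbed's own profiler would replace the simple timer.

```python
# Minimal timing harness for the "profiling basics" item.
# NumPy baseline only; vendor tools (Cerebras/SambaNova/Graphcore profilers)
# would replace the wall-clock timer used here.
import time
import numpy as np

def time_kernel(fn, *args, warmup=3, iters=10):
    """Return mean wall-clock seconds per call after a few warmup runs."""
    for _ in range(warmup):
        fn(*args)
    start = time.perf_counter()
    for _ in range(iters):
        fn(*args)
    return (time.perf_counter() - start) / iters

if __name__ == "__main__":
    n = 1024
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    secs = time_kernel(np.matmul, a, b)
    flops = 2 * n**3  # ~2*N^3 FLOPs for an N x N GEMM
    print(f"GEMM: {secs * 1e3:.2f} ms, {flops / secs / 1e9:.1f} GFLOP/s")
```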

Cerebras

ML

Pre-train: LLaMA 7B: https://docs.alcf.anl.gov/ai-testbed/cerebras/example-programs/#:~:text=in%20910.883781998%20seconds.-,Llama%2D7B,-The%20Cerebras%20llama7B

Fine-Tune: https://docs.cerebras.net/en/latest/wsc/Getting-started/Quickstart-for-fine-tune.html

Inference:

HPC

GEMV, GEMM, MC (baseline sketch below)
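NumPy reference versions of the listed primitives, useful as correctness baselines before porting to the accelerator. Reading "MC" as a simple Monte Carlo estimate is an assumption about the agenda item; all names here are illustrative.

```python
# Host-side NumPy baselines for the HPC primitives listed above.
# "MC" is assumed to mean a Monte Carlo example (pi estimate).
import numpy as np

def gemv(A, x):
    """Matrix-vector product y = A @ x."""
    return A @ x

def gemm(A, B):
    """Matrix-matrix product C = A @ B."""
    return A @ B

def mc_pi(n_samples=1_000_000, seed=0):
    """Monte Carlo estimate of pi from uniform samples in the unit square."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n_samples, 2))
    inside = (pts ** 2).sum(axis=1) <= 1.0
    return 4.0 * inside.mean()

if __name__ == "__main__":
    A = np.random.rand(512, 512)
    x = np.random.rand(512)
    print(gemv(A, x).shape, gemm(A, A).shape, mc_pi())
```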

Sambanova

Pre-train:

Fine-tune: https://docs.sambanova.ai/developer/latest/hf-compile-run.html#_fine_tune_the_model

Inference: modelbox?

HPC

AI4HPC?

Graphcore

Pre-training: https://github.com/graphcore/examples/tree/master/nlp/llama/popxl

Fine-tuning:

Inference: https://github.com/graphcore/examples/tree/master/nlp/llama/popxl

ML

HPC

Groq

Habana