# ggml-playground

ML model inference in Zig using ggml

## How to run

1. Start and get inside the Docker container:

   ```bash
   cd infra-dev/
   docker-compose up -d
   docker-compose exec -it ggml-playground bash
   ```

2. Convert models:

   ```bash
   cd ./models/
   python3 ./convert.py gpt_neox "rinna/japanese-gpt-neox-small"
   ```

3. Run inference (a minimal sketch of calling ggml from Zig follows the list):

   ```bash
   cd ../ggml-playground/
   zig build test
   ```
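
The `zig build test` step runs the project's Zig tests. As a rough illustration of what driving ggml from Zig looks like, here is a minimal sketch of such a test; it is not taken from this repository. It assumes `ggml.h` is reachable through `@cImport`, that `build.zig` links the ggml library, and that the vendored ggml revision provides `ggml_new_graph` and `ggml_graph_compute_with_ctx` (names vary between ggml revisions).

```zig
const std = @import("std");

// Assumes build.zig adds ggml's include path and links libggml (not shown here).
const c = @cImport({
    @cInclude("ggml.h");
});

test "add two scalars with ggml" {
    // Small scratch arena for tensor metadata and data (size chosen arbitrarily).
    const params = c.ggml_init_params{
        .mem_size = 16 * 1024 * 1024,
        .mem_buffer = null,
        .no_alloc = false,
    };
    const ctx = c.ggml_init(params);
    defer c.ggml_free(ctx);

    // Build a tiny compute graph: sum = a + b.
    const a = c.ggml_new_f32(ctx, 2.0);
    const b = c.ggml_new_f32(ctx, 3.0);
    const sum = c.ggml_add(ctx, a, b);

    // ggml_new_graph / ggml_graph_compute_with_ctx are assumed to exist in the
    // linked ggml revision; older revisions use ggml_build_forward + ggml_graph_compute.
    const graph = c.ggml_new_graph(ctx);
    c.ggml_build_forward_expand(graph, sum);
    _ = c.ggml_graph_compute_with_ctx(ctx, graph, 1); // single-threaded

    try std.testing.expectEqual(@as(f32, 5.0), c.ggml_get_f32_1d(sum, 0));
}
```

A real model test follows the same shape: load the converted weights, build the forward graph for the architecture, run the graph, and compare the resulting logits against reference values.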