A web interface built with Nuxt.js for interacting with the Llama 3.2 language model locally via Ollama.
## Prerequisites

- Docker
- Docker Compose
- Node.js (for local development)
- pnpm
- Ollama (for local development)
## Getting Started

- Clone the repository:

```bash
git clone <repository-url>
cd iLlama
```

- Start the application using Docker Compose:

```bash
docker compose up -d
```

- Stop the application using Docker Compose:

```bash
docker compose down
```

The application will be available at http://localhost:3000.
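For reference, a Compose setup along these lines could run the app alongside Ollama. This is a hypothetical sketch, not the repository's actual file: the service names, build context, port mappings, and the `OLLAMA_HOST` variable are all assumptions; consult the `docker-compose.yml` shipped in the repository.

```yaml
# Hypothetical sketch; the repository's own docker-compose.yml is authoritative.
services:
  illama:                     # assumed service name for the Nuxt app
    build: .
    ports:
      - "3000:3000"           # Nuxt server
    environment:
      # assumed variable: where the app reaches the Ollama API
      - OLLAMA_HOST=http://ollama:11434
  ollama:
    image: ollama/ollama      # official Ollama image
    ports:
      - "11434:11434"         # Ollama's default API port
    volumes:
      - ollama:/root/.ollama  # persist pulled models across restarts

volumes:
  ollama:
```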
## Local Development

- Install dependencies and start the dev server:

```bash
pnpm install
pnpm dev
```

- Install Ollama
- Pull and run the required model:

```bash
ollama pull llama3.2
ollama run llama3.2
```

The development server will be available at http://localhost:3000.
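Once the model is running, the app talks to Ollama over its local REST API (by default on port 11434). A minimal TypeScript sketch of such a call is shown below; the endpoint and fields follow Ollama's documented `/api/generate` API, while the helper names here are illustrative, not part of this repository:

```typescript
// Request body shape for Ollama's /api/generate endpoint.
type GenerateRequest = {
  model: string;   // model name as pulled, e.g. "llama3.2"
  prompt: string;  // user prompt
  stream: boolean; // false = return a single JSON object
};

// Build the JSON payload for a one-shot (non-streaming) completion.
function buildGenerateRequest(prompt: string): GenerateRequest {
  return { model: "llama3.2", prompt, stream: false };
}

// Usage sketch (requires a running Ollama instance on localhost:11434):
async function generate(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(prompt)),
  });
  const data = await res.json();
  return data.response; // Ollama puts the completion text in `response`
}
```

Setting `stream: false` keeps the example simple; Ollama streams newline-delimited JSON chunks by default, which is what a chat UI like this one would typically consume.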