iLlama - A web app built with Nuxt for Llama 3.2

A web interface built with Nuxt.js for interacting with the Llama 3.2 language model locally.

Prerequisites

  • Docker
  • Docker Compose
  • Node.js (for local development)
  • pnpm
  • Ollama (for local development)

Docker Setup

  1. Clone the repository:

     git clone <repository-url>
     cd iLlama

  2. Start the application with Docker Compose:

     docker compose up -d

  3. Stop the application when you are done:

     docker compose down

The application will be available at http://localhost:3000
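For orientation, a Compose file backing the commands above might look roughly like the sketch below. This is an assumption for illustration, not the repository's actual `docker-compose.yml`; the service names, image tag, and volume are hypothetical, though 11434 is Ollama's default API port and 3000 is Nuxt's default dev/serve port:

```yaml
services:
  ollama:
    image: ollama/ollama           # official Ollama image
    ports:
      - "11434:11434"              # Ollama's default HTTP API port
    volumes:
      - ollama-data:/root/.ollama  # persist pulled models across restarts
  web:
    build: .                       # the Nuxt app in this repository
    ports:
      - "3000:3000"                # Nuxt served on localhost:3000
    depends_on:
      - ollama                     # start the model backend first
volumes:
  ollama-data:
```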

Local Development Setup

  1. Install Ollama.

  2. Pull and run the required model:

     ollama pull llama3.2
     ollama run llama3.2

  3. Install dependencies and start the dev server:

     pnpm install
     pnpm dev

The development server will be available at http://localhost:3000
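Once the model is running, you can smoke-test the Ollama backend directly, independent of the UI. Ollama serves an HTTP API on port 11434 by default; this request is a quick sanity check, not part of the repository's setup:

```shell
# Ask the local Ollama server for a one-off, non-streaming completion.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Say hello in one sentence.",
  "stream": false
}'
```

If this returns a JSON response with a `response` field, the model is up and the Nuxt UI should be able to reach it.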
