OllaSeek is a lightweight, offline, local AI chat application designed for seamless interactions without an internet connection. Powered by Ollama, this app allows users to have conversations with an AI assistant while keeping all data private and running locally on their machine.


OllaSeek AI Local Chat Application

A simple web-based chat application that uses the Ollama API to interact with the DeepSeek-R1 AI model.

Prerequisites

Before running the application, make sure you have:

  1. Node.js installed on your system
  2. Ollama installed and running locally
  3. The DeepSeek-R1 model installed in Ollama
  4. Git installed (verify in a terminal with: git --version)

Ollama Installation

Windows

  1. Download Ollama for Windows from the official website: https://ollama.com/download

  2. After installation:

    • Ollama will start automatically
    • You should see the Ollama icon in your system tray
    • The Ollama service will be running at http://127.0.0.1:11434
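Regardless of platform, you can confirm the service is reachable at http://127.0.0.1:11434 before moving on. A minimal sketch using Node's built-in fetch (Node 18+; the helper name is illustrative, not part of this repository):

```javascript
// Quick reachability check for the local Ollama service.
// Returns true if the /api/version endpoint answers, false otherwise.
async function ollamaIsRunning() {
  try {
    const res = await fetch("http://127.0.0.1:11434/api/version");
    return res.ok;
  } catch {
    // Connection refused, timeout, etc. — the service is not up.
    return false;
  }
}
```

If this returns false, start Ollama (system tray on Windows, `ollama serve` on macOS/Linux) before launching the app.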

macOS

  1. Install using Homebrew:
brew install ollama
  2. Start Ollama:
ollama serve

Linux

  1. Install using curl:
curl -fsSL https://ollama.ai/install.sh | sh
  2. Start the Ollama service:
ollama serve

Installing DeepSeek-R1 Model

After installing Ollama:

  1. Open a terminal (Command Prompt or PowerShell on Windows)

  2. Pull the DeepSeek-R1 model:

ollama pull deepseek-r1:7b
  3. Wait for the download to complete (this may take several minutes depending on your internet connection)

  4. Verify the installation:

ollama list

You should see 'deepseek-r1:7b' in the list of models

  5. Run the model:
ollama run deepseek-r1:7b

This will start an interactive chat session with the model in your terminal. Type '/bye' or press Ctrl+D to quit the session.
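Beyond the interactive terminal session, Ollama exposes an HTTP API at http://127.0.0.1:11434, which is what the chat app talks to. A minimal sketch of a non-streaming request to the /api/generate endpoint (the field names `model`, `prompt`, and `stream` follow the Ollama API; the helper functions are illustrative and not part of this repository):

```javascript
// Build the JSON body for a non-streaming request to Ollama's /api/generate.
function buildGeneratePayload(prompt) {
  return {
    model: "deepseek-r1:7b", // must match a model shown by `ollama list`
    prompt: prompt,
    stream: false,           // request a single JSON response instead of a stream
  };
}

// Send the payload to a locally running Ollama instance and return the reply.
async function askModel(prompt) {
  const res = await fetch("http://127.0.0.1:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeneratePayload(prompt)),
  });
  const data = await res.json();
  return data.response; // the model's reply text
}
```

Note that `stream: false` trades responsiveness for simplicity; a chat UI would typically leave streaming on and render tokens as they arrive.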

Application Installation

  1. Clone the repository:
git clone https://github.com/IsaacTalb/OllaSeek-Chat-Local.git
cd OllaSeek-Chat-Local
  2. Install the required dependencies:
npm install

Running the Application

  1. Make sure Ollama is running:

    • On Windows: Check the system tray for the Ollama icon
    • On macOS/Linux: Run ollama serve if not already running
  2. Start the application server:

npm start
  3. Open your web browser and navigate to:
http://localhost:3000

Features

  • AI model and API endpoint run locally on your machine
  • Clean user interface, responsive design
  • Real-time chat interaction with the DeepSeek-R1 AI model
  • Error handling and status messages

File Structure

  • index.html - The main HTML file containing the chat interface
  • styles.css - CSS styles for the chat application
  • script.js - Client-side JavaScript code
  • server.js - Node.js server that handles API proxying
  • package.json - Project dependencies and scripts

Technical Details

Troubleshooting

  1. If you see a "model not found" error:

    • Make sure Ollama is running:
      • Windows: Check system tray
      • macOS/Linux: Run ps aux | grep ollama to check if the process is running
    • Run ollama pull deepseek-r1:7b to install the model
    • Verify with ollama list that the model is installed
    • Try running ollama run deepseek-r1:7b to test the model directly
  2. If the server won't start:

    • Check if port 3000 is already in use
    • Make sure all dependencies are installed
    • Try running npm install again
  3. If you get CORS errors:

    • Access the app through http://localhost:3000 so the Node server can proxy requests to Ollama; opening index.html directly from disk bypasses the proxy
    • Make sure the server (npm start) is running alongside Ollama

  4. If Ollama won't start:

    • Windows:
      • Check Windows Services
      • Try restarting the Ollama service
    • macOS/Linux:
      • Check system logs: journalctl -u ollama
      • Try restarting: sudo systemctl restart ollama

License

This project is open source and available under the MIT License.
