Streamlit with Ollama Integration

Overview

This repository contains a simple boilerplate application that integrates Ollama into Streamlit to enable chat functionality using models provided by Ollama.

The app features a sidebar that lets users switch between the models available through Ollama.
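
Below is a minimal sketch of what such an app can look like. It is an assumed structure, not necessarily the repository's exact main.py: the sidebar selectbox is filled from the models Ollama reports locally, and each prompt is streamed back through ollama.chat.

```python
# Minimal Streamlit + Ollama chat sketch (assumed layout of main.py).
import ollama
import streamlit as st

st.title("Streamlit with Ollama Integration")

# Sidebar: list locally available Ollama models and let the user pick one.
model_names = [m["name"] for m in ollama.list()["models"]]
model = st.sidebar.selectbox("Model", model_names)

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# Handle a new user prompt and stream the model's reply.
if prompt := st.chat_input("Ask something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    with st.chat_message("assistant"):
        stream = ollama.chat(
            model=model,
            messages=st.session_state.messages,
            stream=True,
        )
        reply = st.write_stream(
            chunk["message"]["content"] for chunk in stream
        )
    st.session_state.messages.append({"role": "assistant", "content": reply})
```

Storing the message history in st.session_state is what keeps the conversation intact across Streamlit's script reruns.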

Running the Application

To run this application, use the following command:

streamlit run main.py

This will start the Streamlit server locally and open the application in your default web browser.

Requirements

This application requires the following Python packages:

- ollama 0.1.8
- streamlit 1.33.0

You can install these packages using pip:

pip install ollama==0.1.8 streamlit==1.33.0
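
Equivalently, the pinned versions can be kept in a requirements file (a requirements.txt is assumed here; the repository may not ship one):

```text
# requirements.txt (hypothetical; install with `pip install -r requirements.txt`)
ollama==0.1.8
streamlit==1.33.0
```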
