This project provides a lightweight chat interface for interacting with Ollama chat models using Streamlit. It lets users select a model, configure sampling parameters, and hold a conversation with the chatbot.
- Model selection: Choose from a list of available Ollama models.
- Parameter configuration: Adjust the sampling temperature and random seed for the conversation.
- Conversation history: View the history of the conversation with the chatbot.
- Clear chat history: Reset the conversation history.
REF: https://github.com/ollama/ollama
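Under the hood, the model, temperature, and seed selections map onto Ollama's `/api/chat` endpoint (documented in the repository referenced above). As a rough sketch of how such a request could be built and sent — assuming a local Ollama server on its default port 11434, with function names of my own choosing:

```python
import json
import urllib.request


def build_chat_payload(model, messages, temperature=0.7, seed=0):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": messages,  # e.g. [{"role": "user", "content": "..."}]
        "stream": False,
        "options": {"temperature": temperature, "seed": seed},
    }


def chat(payload, host="http://localhost:11434"):
    """POST the payload to a running Ollama server and return the reply text."""
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

The `options` dictionary is where Ollama accepts generation parameters such as `temperature` and `seed`, which is how the sidebar controls can reach the model.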
To set up the chatbot interface, follow these steps:
- Clone this repository:
git clone https://github.com/your-repository/ollama-chatbot-interface.git
- Navigate to the project directory:
cd ollama-chatbot-interface
- Install the required dependencies:
pip install -r requirements.txt
- Run the Streamlit app:
streamlit run app.py
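The contents of requirements.txt are not shown here; at a minimum, the app described above depends on Streamlit plus a way to talk to Ollama, so a hypothetical requirements.txt might look like:

```
streamlit
ollama
```

Ollama itself must also be installed and running locally for the app to reach a model.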
Once the app is running, you can interact with it through the web interface:
- Select a model from the sidebar.
- Adjust the temperature and seed parameters as desired.
- Type your message in the input field and press Enter to send it.
- The chatbot's response will appear in the chat window.
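Ollama's chat API represents a conversation as a list of role/content messages, and clearing the chat amounts to resetting that list. A framework-agnostic sketch of the history handling (class and method names are my own, not taken from this project's code):

```python
class ChatHistory:
    """Minimal conversation store in the role/content format Ollama expects."""

    def __init__(self):
        self.messages = []

    def add_user(self, content):
        self.messages.append({"role": "user", "content": content})

    def add_assistant(self, content):
        self.messages.append({"role": "assistant", "content": content})

    def clear(self):
        """Reset the conversation, mirroring the app's clear-chat-history action."""
        self.messages = []
```

In the Streamlit app, this list would typically live in session state so it survives reruns between messages.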
Contributions are welcome! Here are some features that could be added:
- Render the full conversation history in the GUI.
- Support for streaming responses.
Feel free to fork the repository and submit a pull request with your changes.
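On streaming support: when `"stream": true` is set on `/api/chat`, the Ollama server returns newline-delimited JSON chunks, each carrying a fragment of the reply in `message.content`, with a final chunk marked `"done": true`. A sketch of accumulating such a stream (the chunk shape follows the Ollama API; the function name is my own):

```python
import json


def accumulate_stream(lines):
    """Join the content fragments from an Ollama streaming response.

    `lines` is an iterable of NDJSON strings, one per chunk, as sent by
    /api/chat when "stream": true is requested.
    """
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```

In a Streamlit UI, each fragment could be appended to a placeholder element as it arrives instead of being joined at the end.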