API needed for ollama #3
Interesting use case. I just checked the library that I use for Ollama. It should be possible. Will try to implement it within the next week.
Thank you!
@abdessalaam I created a custom Docker image.
@icereed thank you!
@abdessalaam it seems the hook that I used wasn't working as expected. I have now implemented a custom HTTP client to inject the bearer token, and I added a unit test for that use case. Please pull the same image again and kindly rerun your test. 🚀
I think the bearer token call works, but perhaps the response is not formatted correctly? When I run it with OpenAI it works fine (with the same Docker image).
Can you explain your setup a bit more? Is there some reverse proxy like NGINX in front of Ollama? Maybe I can replicate the setup locally to test.
I have set up llamatunnel: I run Ollama with OpenWebUI on my local Mac and expose them over the internet through a Cloudflare tunnel, so I can connect to my local models from anywhere, as long as my Mac is on.
Interesting. As a workaround: can you run paperless-gpt on your machine and let it interface with the paperless instance on your server?
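The suggested workaround could look roughly like this docker-compose fragment: run paperless-gpt next to the local Ollama and point it at the remote paperless instance. The service name, image tag, hosts, and environment variable names below are assumptions for illustration; check the paperless-gpt README for the real ones.

```yaml
# Hypothetical sketch of the workaround; variable names are illustrative.
services:
  paperless-gpt:
    image: icereed/paperless-gpt:latest
    environment:
      PAPERLESS_BASE_URL: https://paperless.example.com    # remote paperless server
      PAPERLESS_API_TOKEN: your-paperless-token
      LLM_PROVIDER: ollama
      OLLAMA_HOST: http://host.docker.internal:11434       # Ollama on the local Mac
```

Because paperless-gpt then talks to Ollama over the local network, no bearer token is needed at all; only the paperless API traffic crosses the internet.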
Ok, I will try that!
@abdessalaam did it work out for you? I don't want to leave you hanging.
Thank you for checking 😀 It works this way! |
Awesome 👏
My Ollama is running remotely, so it is secured by an API key (in the format `Authorization: Bearer YOUR_KEY`). Would it be possible to add it in a similar way to how the OpenAI key is specified in .env?
Thanks!
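The requested configuration might look something like the following .env entries, mirroring how the OpenAI key is configured. The variable names here are hypothetical, chosen for illustration only:

```
# Hypothetical .env entries; actual variable names may differ.
LLM_PROVIDER=ollama
OLLAMA_HOST=https://ollama.example.com
OLLAMA_API_KEY=YOUR_KEY   # would be sent as "Authorization: Bearer YOUR_KEY"
```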