Ollama-compatible endpoints for OpenWebUI integration #4695
bioshazard
started this conversation in Ideas
I am testing out Ollama with OpenWebUI and I like the convenience of its model management, specifically model download. It seems to me that with a few alias endpoints wrapping the existing schema for /apply, LocalAI could make itself compatible with the model management integration OpenWebUI already offers for Ollama. I am interested in working on this, but am unable to commit to any scope at this time. Figured I would share the idea in case it is interesting.
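To sketch what such an alias might look like, here is a minimal translation of an Ollama-style pull request body into a LocalAI apply request body. This is only an illustration of the idea: the exact payload fields (Ollama's `name`/`model` key, LocalAI's `id` field, and the `localai` default gallery) are assumptions, not a finished design.

```python
def ollama_pull_to_localai_apply(pull_body: dict) -> dict:
    """Map an Ollama-style POST /api/pull payload (e.g. {"name": "<model>"})
    onto a LocalAI-style POST /models/apply payload (e.g. {"id": "<gallery>@<model>"}).

    Hypothetical mapping for illustration; field names are assumptions.
    """
    name = pull_body.get("name") or pull_body.get("model")
    if not name:
        raise ValueError("pull request must include a model name")
    # Assume model names without an explicit gallery prefix come from a
    # default gallery (here arbitrarily called "localai").
    gallery_id = name if "@" in name else f"localai@{name}"
    return {"id": gallery_id}
```

An alias endpoint could then accept the Ollama-shaped request from OpenWebUI, run it through a translation like this, and forward the result to the existing /apply handler, so no new download logic is needed on the LocalAI side.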