Is it possible to use a self-hosted LLM model? #2810
Closed
marcin-github started this conversation in General

Hi,
I see that OctoBot supports AI. Is it possible to connect to my own LLM model hosted on Ollama? Does it make sense to use a local LLM instead of GPT?

Replies: 2 comments, 1 reply
-
Hi, yes. Here is a guide: https://www.octobot.cloud/en/guides/octobot-interfaces/chatgpt#custom-llm-base-url-for-prediction
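
In practice, the "custom LLM base URL" idea in that guide amounts to pointing an OpenAI-compatible client at a different endpoint. Here is a minimal sketch of the concept (not OctoBot's actual internals), assuming Ollama is running locally on its default port and a model named `llama3` has been pulled; both names are illustrative:

```python
# Minimal sketch: talk to a local Ollama server through its
# OpenAI-compatible API by overriding the client's base URL.
# Assumes Ollama's default endpoint and a pulled model named
# "llama3" -- adjust to whatever `ollama list` shows.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # Ollama ignores the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

Ollama ignores the API key entirely, but the `openai` client refuses to start without one, hence the placeholder value.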
-
My question comes from reading this guide and the fact that I was unable to get the connection to my own LLM model to work. I defined:

but there is no traffic to the LLM. I expected to see OctoBot at least verify the connection to the LLM, but I see nothing in tcpdump. So I assumed that only OpenAI is supported.
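
One way to narrow this down is to confirm the Ollama endpoint answers at all, independently of OctoBot. A stdlib-only sanity check, assuming Ollama's default host and port (adjust to your setup):

```python
# Quick sanity check: is the local Ollama server reachable?
# Uses only the standard library; change OLLAMA_URL if Ollama
# is bound to another host or port.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

# /api/tags lists the locally pulled models on a running Ollama server.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    data = json.load(resp)

print("Reachable. Models:", [m["name"] for m in data.get("models", [])])
```

If this fails, the problem sits between the machine and Ollama rather than in OctoBot's configuration; if it succeeds but tcpdump still shows no traffic from OctoBot, the base-URL setting is the likelier suspect.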