feat: change illm-client ollama url
ivynya committed Dec 30, 2023
1 parent 3d6e2ea commit 179ba71
Showing 3 changed files with 3 additions and 1 deletion.
1 change: 1 addition & 0 deletions README.md
@@ -52,6 +52,7 @@ services:
 - ILLM_SCHEME=<ws|wss>
 - ILLM_HOST=illm.example.com
 - ILLM_PATH=/aura/provider
+- OLLAMA_URL=http://host.docker.internal:11434
 ```
Run the server first, then the client; the client should log that it is connected. If you don't want to write your own user interface, set up [Aura](https://github.com/ivynya/aura) as described in the README. Make sure to pull models before using the user interface: the client will not auto-pull them for you and will simply return an error.
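For orientation, the four environment variables shown above describe two endpoints: ILLM_SCHEME, ILLM_HOST, and ILLM_PATH form the websocket address of the iLLM server the client connects to, while OLLAMA_URL points at the Ollama instance it generates against. Below is a minimal, illustrative Go sketch (not code from this repository) that assembles and prints both:

```go
package main

import (
	"fmt"
	"net/url"
	"os"
)

func main() {
	// Websocket endpoint of the iLLM server, built from the same variables
	// the client reads: ILLM_SCHEME (ws|wss), ILLM_HOST, ILLM_PATH.
	endpoint := url.URL{
		Scheme: os.Getenv("ILLM_SCHEME"),
		Host:   os.Getenv("ILLM_HOST"),
		Path:   os.Getenv("ILLM_PATH"),
	}
	fmt.Println("provider endpoint:", endpoint.String())

	// Local Ollama server, e.g. http://host.docker.internal:11434 when
	// Ollama runs on the Docker host.
	fmt.Println("ollama server:", os.Getenv("OLLAMA_URL"))
}
```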
1 change: 1 addition & 0 deletions client/client.go
@@ -18,6 +18,7 @@ var (
 	illm_scheme = os.Getenv("ILLM_SCHEME")
 	illm_host = os.Getenv("ILLM_HOST")
 	illm_path = os.Getenv("ILLM_PATH")
+	ollama_url = os.Getenv("OLLAMA_URL")
 )

 func main() {
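Worth noting: ollama_url is read straight from the environment with no fallback, so an unset OLLAMA_URL yields an empty server URL downstream. A possible hardening, sketched here as an illustration only (getenvDefault is a hypothetical helper, not part of this commit), would be to default to the previously hardcoded address:

```go
package main

import (
	"fmt"
	"os"
)

// getenvDefault is a hypothetical helper: it returns the value of key, or
// fallback when the variable is unset or empty.
func getenvDefault(key, fallback string) string {
	if v := os.Getenv(key); v != "" {
		return v
	}
	return fallback
}

func main() {
	// Default to the address that was hardcoded before this commit.
	ollamaURL := getenvDefault("OLLAMA_URL", "http://host.docker.internal:11434")
	fmt.Println("using Ollama at:", ollamaURL)
}
```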
2 changes: 1 addition & 1 deletion client/generate.go
@@ -11,7 +11,7 @@ import (
 )

 func generate(c *websocket.Conn, req *internal.Request) ([]*llms.Generation, error) {
-	llm, err := ollama.New(ollama.WithModel(req.Generate.Model), ollama.WithServerURL("http://host.docker.internal:11434"))
+	llm, err := ollama.New(ollama.WithModel(req.Generate.Model), ollama.WithServerURL(ollama_url))
 	if err != nil {
 		log.Fatal(err)
 	}
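With this change, the server URL passed to ollama.New comes from the OLLAMA_URL environment variable instead of a hardcoded address. Below is a standalone sketch of the same construction, assuming the langchaingo ollama package this client appears to use; the model name is only a placeholder, since the real code takes it from the incoming request:

```go
package main

import (
	"log"
	"os"

	"github.com/tmc/langchaingo/llms/ollama"
)

func main() {
	// Build the Ollama-backed LLM the same way generate() does, but with a
	// placeholder model; generate() uses req.Generate.Model instead.
	llm, err := ollama.New(
		ollama.WithModel("llama2"),
		ollama.WithServerURL(os.Getenv("OLLAMA_URL")),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm // the real client goes on to run generations over the websocket
}
```

As in the original code, a construction failure here calls log.Fatal and exits the process; returning the error to the caller would be a softer failure mode, but that is a separate design choice from this commit.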
