
Failure to connect to locally running Ollama instance on WSL #3435

Open · Patrick-Erichsen opened this issue Dec 18, 2024 · 0 comments

Labels: area:chat (Relates to chat interface) · ide:vscode (Relates specifically to VS Code extension) · kind:bug (Indicates an unexpected problem or unintended behavior)


Before submitting your bug report

Relevant environment info

OS: Windows 11 with WSL2
Continue version: tried multiple, including v0.8.50, v0.8.60, v0.8.63 and v0.9.245
IDE version: VS Code v1.96.0
Ollama version: tried v0.3.12 and v0.5.1

Description

I've had this setup for a few months and it has worked great. I'm not sure what has changed, but Continue no longer works with Ollama (hosted locally under WSL2). Ollama is still accessible from the host Windows machine. I don't currently have access to other providers to check whether those would work.
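
To narrow down whether the problem is the extension itself or the WSL2-to-Windows networking path, it may help to probe the Ollama HTTP API directly from both a WSL shell and the Windows host. Below is a minimal TypeScript sketch (run with ts-node or tsx under Node 18+); the default URL and the OLLAMA_HOST override are assumptions for a stock install, adjust them for your setup:

    // probe-ollama.ts — minimal connectivity check against Ollama's HTTP API.
    // The base URL assumes a default install listening on port 11434.
    const base = process.env.OLLAMA_HOST ?? "http://localhost:11434";

    async function probe(): Promise<void> {
      try {
        // GET /api/tags lists locally pulled models and confirms the server is reachable
        const res = await fetch(`${base}/api/tags`);
        console.log(`GET ${base}/api/tags -> HTTP ${res.status}`);
        console.log(await res.text());
      } catch (err) {
        console.error(`Could not reach ${base}:`, err);
      }
    }

    probe();

If this probe succeeds from WSL but fails from Windows (or vice versa), the failure is more likely in the localhost port mapping between WSL2 and Windows than in Continue itself.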

Getting this error if I try to chat:

Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai.

Relevant stack trace:

[Extension Host] Error: Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai.
    at customFetch2 (c:\Users\morland\.vscode\extensions\continue.continue-0.8.63-win32-x64\out\extension.js:108369:23)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async withExponentialBackoff (c:\Users\morland\.vscode\extensions\continue.continue-0.8.63-win32-x64\out\extension.js:104224:26)
    at async Ollama2._streamChat (c:\Users\morland\.vscode\extensions\continue.continue-0.8.63-win32-x64\out\extension.js:116838:26)
    at async Ollama2.streamChat (c:\Users\morland\.vscode\extensions\continue.continue-0.8.63-win32-x64\out\extension.js:108621:32)
    at async llmStreamChat (c:\Users\morland\.vscode\extensions\continue.continue-0.8.63-win32-x64\out\extension.js:602632:23)
    at async kh.value (c:\Users\morland\.vscode\extensions\continue.continue-0.8.63-win32-x64\out\extension.js:609610:29)
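
For context on the trace above: the extension appears to wrap its request in a retry helper (withExponentialBackoff) before the connection error from customFetch2 is surfaced. A rough sketch of that pattern follows; the names and defaults are assumptions for illustration, not the extension's actual implementation:

    // Hypothetical retry-with-exponential-backoff wrapper; names and defaults
    // are assumptions, not Continue's real code.
    async function withBackoff<T>(
      fn: () => Promise<T>,
      retries = 3,
      baseDelayMs = 500,
    ): Promise<T> {
      for (let attempt = 0; ; attempt++) {
        try {
          return await fn();
        } catch (err) {
          if (attempt >= retries) throw err; // retries exhausted: error propagates to the caller
          // back off 500 ms, 1 s, 2 s, ... before the next attempt
          await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
        }
      }
    }

    // usage: await withBackoff(() => fetch("http://localhost:11434/api/tags"));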

At least Continue v0.8.0 (I did not try every version in between) is still able to talk to Ollama and I get some response, although a rather garbled one.

To reproduce

No response

Log output

No response

@dosubot (bot) added the area:chat, ide:vscode, and kind:bug labels on Dec 18, 2024