OLLAMA models support? v1.1.0 #135
Comments
+1
@saibalz2ai @showkeyjar It is supported; please check this document: https://microsoft.github.io/UFO/supported_models/ollama/
I set API_TYPE: "Ollama", API_BASE: "http://localhost:11434", API_MODEL: "llava-llama3", and then got "Creating an experience indexer..."
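For reference, the settings quoted above map onto UFO's `config.yaml` roughly as follows. This is a sketch assembled from the values in this thread and the linked Ollama document; the `HOST_AGENT`/`APP_AGENT` section names are assumptions and may differ in your UFO version:

```yaml
# config.yaml fragment (illustrative; verify key names against the UFO docs)
HOST_AGENT:
  API_TYPE: "Ollama"
  API_BASE: "http://localhost:11434"   # default Ollama port
  API_MODEL: "llava-llama3"
APP_AGENT:
  API_TYPE: "Ollama"
  API_BASE: "http://localhost:11434"
  API_MODEL: "llava-llama3"
```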
@linmiao5 This is only a warning and will not affect the main workflow. Do you encounter any other issues?
Which Ollama model did you use to run this project successfully? |
@linmiao5 At the beginning, llava:7b was used to get a taste of UFO, but because of its context-length limitation, the example and other prompts had to be removed to satisfy llava's own input restriction. It is recommended to try a model that supports a long context; of course, GPT is still the best choice.
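One concrete way to ease the context-length limitation mentioned above is to build a variant of the model with a larger context window using Ollama's standard Modelfile syntax. The model name and the 8192 value below are illustrative, not from this thread:

```
# Modelfile -- derive a longer-context variant of llava:7b (illustrative)
FROM llava:7b
PARAMETER num_ctx 8192
```

Build and use it with, e.g., `ollama create llava-longctx -f Modelfile`, then point API_MODEL at `llava-longctx`.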
I extended the timeout to 3 minutes and the timeout issue is gone, but I still have the issue you mentioned in the first post. I use deepseek-r1:14b served with Ollama, by the way.
Yes, Ollama is okay. This is what I wrote to you; not sure who can help. I am getting this from all the Ollama models.
That doesn't matter with the timeout. A host timeout means Ollama is not reachable at the given ip:port.
I have provided you the example location; try that.
Regarding DeepSeek, I am not sure whether UFO supports this model.
First, resolve the timeout issue.
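As noted above, a host timeout usually means the Ollama endpoint is simply not reachable at the configured ip:port. A minimal Python sketch to verify reachability before debugging UFO itself (the URL is the default from this thread; adjust to your setup):

```python
import json
import urllib.request
import urllib.error


def ollama_reachable(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    /api/tags lists installed models and is cheap to call.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
            return "models" in data
    except (urllib.error.URLError, OSError, ValueError):
        # Connection refused, DNS failure, timeout, or non-JSON reply.
        return False


if __name__ == "__main__":
    base = "http://localhost:11434"  # default Ollama port
    if ollama_reachable(base):
        print("Ollama is reachable")
    else:
        print("Ollama is NOT reachable: check the ip:port in API_BASE")
```

If this prints "NOT reachable", fix the API_BASE host/port (or start `ollama serve`) before looking at UFO's own timeout settings.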
The curl command seems OK. Below is what I get when using deepseek-r1:14b as both the host agent and the app agent.
image.png (screenshot attachment)
Which local model supports running UFO with Ollama? I have tried the latest models, but not a single one worked.
python -m ufo --task ollama1
Welcome to use UFO🛸, A UI-focused Agent for Windows OS Interaction.
[UFO ASCII art banner]
Please enter your request to be completed🛸:
open word, bold word breakup
Round 1, Step 1, HostAgent: Analyzing the user intent and decomposing the request...
Observations👀: None
Thoughts💡: None
Plans📚: (1) None
(2)
Next Selected application📲: [The required application needs to be opened.]
Messages to AppAgent📩:
Status📊: None
Comment💬: None
Creating an experience indexer...
Warning: Failed to load experience indexer from vectordb/experience/experience_db.
Creating an demonstration indexer...
Warning: Failed to load demonstration indexer from vectordb/demonstration/demonstration_db.
Please enter your new request. Enter 'N' for exit.