This repository has been archived by the owner on Oct 30, 2024. It is now read-only.

Ty/ollama local models #124

Merged
merged 21 commits into from
Aug 28, 2024
Conversation

AlephNotation
Contributor

Adds Ollama models.

@matildepark matildepark changed the base branch from main to dev August 16, 2024 18:35
@matildepark
Contributor

If we can't get tiny models to .browse(), do we feel ok about them doing .get() and .do()? It would be nice to incorporate this, even with limitations.

@matildepark
Contributor

Current status (over phone):

  • Works okay with page.get(); not so sure about anything else.
  • Test, document, merge into dev.

@matildepark
Contributor

I've been doing some experiments with examples/findEmail.ts and node-llama-cpp, and I needed to set the context and batch size in generateObjectLocal.ts:

  const context = new LlamaContext({ model, contextSize: 32168, batchSize: 32168 });

Otherwise, it would throw:

/Users/maru/git/hdr/nolita/node_modules/node-llama-cpp/llama/llama.cpp/src/llama.cpp:14466: GGML_ASSERT(n_tokens_all <= cparams.n_batch) failed

The model I picked was capybarahermes-2.5-mistral-7b. Once I set contextSize and batchSize, it did get the emails off the page. If you try to get it to browse, though, it won't generate a command:

  issues: [
    {
      code: 'too_small',
      minimum: 1,
      type: 'array',
      inclusive: true,
      exact: false,
      message: 'Array must contain at least 1 element(s)',
      path: [ 'command' ]
    }
  ],
  addIssue: [Function (anonymous)],
  addIssues: [Function (anonymous)],
  errors: [
    {
      code: 'too_small',
      minimum: 1,
      type: 'array',
      inclusive: true,
      exact: false,
      message: 'Array must contain at least 1 element(s)',
      path: [ 'command' ]
    }
  ]
}
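The GGML assertion above fires because llama.cpp requires every submitted batch to fit within n_batch, so a full-page prompt overflows the default batch size. A minimal sketch of the sizing logic behind the fix (the contextOptions helper is hypothetical, not part of node-llama-cpp):

```typescript
// Hypothetical helper sketching the fix above: size batchSize to the
// full context window so llama.cpp's GGML_ASSERT(n_tokens_all <= n_batch)
// cannot fire when an entire page's worth of tokens is submitted at once.
function contextOptions(
  promptTokens: number,
  contextSize: number = 32168
): { contextSize: number; batchSize: number } {
  if (promptTokens > contextSize) {
    throw new Error(
      `Prompt (${promptTokens} tokens) exceeds context window (${contextSize})`
    );
  }
  // Matching batchSize to contextSize is the safe upper bound; a smaller
  // batch only saves memory when prompts are known to be short.
  return { contextSize, batchSize: contextSize };
}

// Usage, roughly as in generateObjectLocal.ts:
//   const context = new LlamaContext({ model, ...contextOptions(tokenCount) });
```

This trades memory for safety: every batch the context ever sees is guaranteed to fit, at the cost of allocating batch buffers for the worst case.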

@AlephNotation
Contributor Author

@matildepark good to go?

@matildepark matildepark merged commit 385c36e into dev Aug 28, 2024
3 checks passed
@matildepark matildepark deleted the ty/ollama-local-models branch August 28, 2024 18:28