
chore(llama-ggml): drop deprecated backend #4775

Merged: 1 commit merged from drop-ggml into master on Feb 6, 2025
Conversation

@mudler (Owner) commented Feb 6, 2025

Description

The legacy llama.cpp/ggml format is now dead. Since the next version of LocalAI already brings many breaking compatibility changes, we take the occasion to also drop support for the pre-GGUF ggml format (see the sketch below for how GGUF and legacy ggml files can be told apart).
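A minimal sketch, not part of this PR, assuming only that GGUF model files begin with the ASCII magic bytes "GGUF": it shows how one could check whether a local model file is GGUF (still supported) or something else, such as a legacy pre-GGUF ggml file, which is the class of models this change stops loading. The helper name isGGUF and the path model.bin are hypothetical.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"os"
)

// isGGUF reports whether the file at path begins with the GGUF magic bytes.
// It does not inspect legacy ggml magics; anything that is not GGUF is
// simply reported as "not GGUF".
func isGGUF(path string) (bool, error) {
	f, err := os.Open(path)
	if err != nil {
		return false, err
	}
	defer f.Close()

	magic := make([]byte, 4)
	if _, err := io.ReadFull(f, magic); err != nil {
		return false, err
	}
	return bytes.Equal(magic, []byte("GGUF")), nil
}

func main() {
	// "model.bin" is a placeholder path, not a file shipped by LocalAI.
	ok, err := isGGUF("model.bin")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	if ok {
		fmt.Println("GGUF model: still supported")
	} else {
		fmt.Println("not GGUF (possibly legacy ggml): convert or re-download the model")
	}
}
```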

Notes for Reviewers

Signed commits

  • Yes, I signed my commits.


netlify bot commented Feb 6, 2025

Deploy Preview for localai ready!

Name                  Link
🔨 Latest commit       695935c
🔍 Latest deploy log   https://app.netlify.com/sites/localai/deploys/67a4e291a73c37000903d15b
😎 Deploy Preview      https://deploy-preview-4775--localai.netlify.app

The GGML format is now dead, since in the next version of LocalAI we
already bring many breaking compatibility changes, taking the occasion
also to drop ggml support (pre-gguf).

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
@mudler merged commit 7f90ff7 into master on Feb 6, 2025
25 checks passed
@mudler deleted the drop-ggml branch on February 6, 2025 at 17:36