
models3: fix Llama 3.2 chat template #3251

Merged: 1 commit into main on Dec 10, 2024
Conversation

cebtenzzre
Member

The custom Llama 3 chat template had a mistake in the way it was slicing the messages array: it used `[1]` instead of the intended `[1:]`. Slicing is not actually supported by Jinja2Cpp anyway, so the template cannot simply be corrected to use `[1:]`. This was fixed for the other models by using loop_start, and the same is done here.
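A minimal sketch of the workaround described above, using Python's jinja2 for illustration. The template text, message format, and the `loop_start` variable name are assumptions based on the description, not the actual GPT4All template: instead of slicing `messages[1:]`, the loop iterates over all messages and skips entries before a computed start index.

```python
from jinja2 import Template

# Hypothetical illustration of the loop_start pattern: rather than
# slicing (messages[1:]), which Jinja2Cpp does not support, compute a
# start index and skip earlier messages inside the loop.
template = Template(
    "{% set loop_start = 1 if messages[0].role == 'system' else 0 %}"
    "{% for m in messages %}"
    "{% if loop.index0 >= loop_start %}"
    "<{{ m.role }}>{{ m.content }}</{{ m.role }}>"
    "{% endif %}"
    "{% endfor %}"
)

messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
]

# The leading system message is skipped; only the user turn is rendered.
print(template.render(messages=messages))  # → <user>Hi</user>
```

The same `loop.index0 >= loop_start` guard works in Jinja2Cpp because it avoids slice syntax entirely.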

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
@cebtenzzre cebtenzzre changed the title models3: fix Llama 3 chat template models3: fix Llama 3.2 chat template Dec 10, 2024
@cebtenzzre cebtenzzre marked this pull request as ready for review December 10, 2024 17:17
@cebtenzzre cebtenzzre requested a review from manyoso December 10, 2024 17:17
@manyoso manyoso merged commit 663ea61 into main Dec 10, 2024
4 of 8 checks passed
cebtenzzre added a commit that referenced this pull request Dec 10, 2024
Signed-off-by: Jared Van Bortel <jared@nomic.ai>

2 participants