Sparse 2:4 + FP8 Quantization e2e vLLM tests #1073

Merged
merged 2 commits into main from sparse2of4_e2e on Jan 14, 2025
Conversation

@dsikka (Collaborator) commented on Jan 14, 2025

SUMMARY:

  • Add 2:4 Sparsity + FP8 Quantization e2e tests (the kind of compression recipe these tests exercise is sketched below)
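
For context, here is a minimal sketch of how a checkpoint like this might be produced with llm-compressor's oneshot flow, using a SparseGPTModifier + QuantizationModifier recipe. The modifier arguments, calibration dataset, and hyperparameters below are assumptions for illustration, not the exact configuration used by the e2e tests in this PR.

```python
# Hedged sketch: 2:4 sparsity followed by FP8 dynamic quantization via oneshot.
# The calibration settings and dataset choice are illustrative assumptions.
from llmcompressor.transformers import oneshot
from llmcompressor.modifiers.obcq import SparseGPTModifier
from llmcompressor.modifiers.quantization import QuantizationModifier

recipe = [
    # Prune weights to a 2:4 semi-structured sparsity pattern.
    SparseGPTModifier(sparsity=0.5, mask_structure="2:4"),
    # Quantize weights to FP8 with dynamic per-token activation scales,
    # leaving the lm_head in higher precision.
    QuantizationModifier(targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"]),
]

oneshot(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    dataset="open_platypus",          # illustrative calibration dataset
    recipe=recipe,
    output_dir="TinyLlama-1.1B-Chat-v1.0-sparse2of4_fp8_dynamic",
    max_seq_length=2048,
    num_calibration_samples=512,
)
```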

TEST PLAN:

  • Models produced by the tests:
    nm-testing/TinyLlama-1.1B-Chat-v1.0-sparse2of4_fp8_dynamic-e2e
    nm-testing/TinyLlama-1.1B-Chat-v1.0-sparse2of4_only-e2e
  • Verified to run e2e with vLLM (a minimal smoke test is sketched below)
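
A minimal sketch of the kind of vLLM smoke test implied by this step: load one of the checkpoints listed above and generate a short completion. The model id comes from the test plan; the prompt and sampling settings are illustrative only.

```python
# Hedged sketch of an e2e load-and-generate check in vLLM.
from vllm import LLM, SamplingParams

# Checkpoint produced by the e2e test (2:4 sparse + FP8 dynamic).
llm = LLM(model="nm-testing/TinyLlama-1.1B-Chat-v1.0-sparse2of4_fp8_dynamic-e2e")

outputs = llm.generate(
    ["The capital of France is"],                    # illustrative prompt
    SamplingParams(temperature=0.0, max_tokens=32),
)
print(outputs[0].outputs[0].text)
```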


👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

@dsikka marked this pull request as ready for review on January 14, 2025 at 22:06
@dsikka changed the title from "Sparse2of4 e2e" to "Sparse 2:4 + FP8 Quantization e2e vLLM tests" on Jan 14, 2025
@dsikka requested review from rahul-tuli and horheynm on January 14, 2025 at 22:07
@dsikka added the ready label (When a PR is ready for review) on Jan 14, 2025
@dsikka merged commit 0755398 into main on Jan 14, 2025 (6 of 7 checks passed)
@dsikka deleted the sparse2of4_e2e branch on January 14, 2025 at 23:54
kylesayrs pushed a commit that referenced this pull request Jan 15, 2025