Serialize Config from Model #7
Merged

Commits (14)
9aae8e8  Apply quantization config implementation
8465015  add TODO
24e04b6  integrate full lifecycle support, QuantizationStatus updates, add tin…
b5a07c4  fix comment
7142a71  initial implementation
23e9ae8  add unit test
dd77890  Merge branch 'main' into serialize_config
b9c9530  cleanup is_quantized
845bfb9  clean up targets and ignore lists
1a7984c  global compression ratio and docstrings
faa93c9  make sure scale/zp on correct device
caeab7d  helper for model quantization
e7e6f43  Merge branch 'fix_device_mismatch' into serialize_config
ec2ef84  Merge branch 'main' into serialize_config
@@ -0,0 +1,113 @@
# Copyright (c) 2021 - present / Neuralmagic, Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import re
from collections import OrderedDict
from typing import Iterable, Optional

from sparsetensors.quantization.lifecycle.calibration import set_module_for_calibration
from sparsetensors.quantization.lifecycle.frozen import freeze_module_quantization
from sparsetensors.quantization.lifecycle.initialize import (
    initialize_module_for_quantization,
)
from sparsetensors.quantization.quant_config import (
    QuantizationConfig,
    QuantizationStatus,
)
from sparsetensors.quantization.quant_scheme import QuantizationScheme
from sparsetensors.quantization.utils import iter_named_leaf_modules
from torch.nn import Module


__all__ = [
    "apply_quantization_config",
    "apply_quantization_status",
]


def apply_quantization_config(model: Module, config: QuantizationConfig):
    """
    Initializes the model for quantization in-place based on the given config

    :param model: model to apply quantization config to
    :param config: quantization config
    """
    # build mapping of targets to schemes for easier matching
    # use ordered dict to preserve target ordering in config
    target_to_scheme = OrderedDict()
    for scheme in config.config_groups.values():
        for target in scheme.targets:
            target_to_scheme[target] = scheme

    # build list of layers to target to avoid mutating submodule dict during iteration
    layer_quant_scheme_pairs = []
    for name, submodule in iter_named_leaf_modules(model):
        if _find_first_name_or_class_match(name, submodule, config.ignore):
            continue  # layer matches ignore list, continue
        target = _find_first_name_or_class_match(name, submodule, target_to_scheme)
        if target is not None:
            # target matched - add layer and scheme to target list
            layer_quant_scheme_pairs.append((submodule, target_to_scheme[target]))

    # apply current quantization status for each matched pair
    for layer, scheme in layer_quant_scheme_pairs:
        apply_quantization_status(
            module=layer,
            scheme=scheme,
            status=config.quantization_status,
        )


def apply_quantization_status(
    module: Module, scheme: QuantizationScheme, status: QuantizationStatus
):
    """
    Applies in place the quantization lifecycle up to the given status

    :param module: module to apply quantization to
    :param scheme: quantization scheme to apply
    :param status: status to update the module to
    """
    if status >= QuantizationStatus.INITIALIZED:
        initialize_module_for_quantization(module, scheme)
    if status >= QuantizationStatus.CALIBRATION:
        set_module_for_calibration(module)
    if status >= QuantizationStatus.FROZEN:
        freeze_module_quantization(module)


def _find_first_name_or_class_match(
    name: str,
    module: Module,
    targets: Iterable[str],
) -> Optional[str]:
    # first element of targets that matches the given name
    # if no name matches, returns the first target that matches the class name
    # returns None otherwise
    return _find_first_match(name, targets) or _find_first_match(
        module.__class__.__name__, targets
    )


def _find_first_match(value: str, targets: Iterable[str]) -> Optional[str]:
    # returns the first element of targets that matches value, either
    # exactly or as a regex after the 're:' prefix
    for target in targets:
        if target.startswith("re:"):
            pattern = target[3:]
            if re.match(pattern, value):
                return target
        elif target == value:
            return target
    return None
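The target-matching logic above accepts both exact module or class names and regex patterns marked with a `re:` prefix. A standalone sketch of that matching behavior (the helper mirrors `_find_first_match` from the diff; the target strings are illustrative examples, not values from this PR):

```python
import re
from typing import Iterable, Optional


def find_first_match(value: str, targets: Iterable[str]) -> Optional[str]:
    # Returns the first target that matches `value`: targets prefixed
    # with "re:" are treated as regexes, all others as exact strings.
    for target in targets:
        if target.startswith("re:"):
            if re.match(target[3:], value):
                return target
        elif target == value:
            return target
    return None


targets = ["re:.*input_layernorm$", "Linear", "lm_head"]
print(find_first_match("model.layers.0.input_layernorm", targets))  # re:.*input_layernorm$
print(find_first_match("Linear", targets))  # Linear
print(find_first_match("Conv2d", targets))  # None
```

Because matching stops at the first hit, earlier entries in a config's target list take precedence over later ones, which is why the config's ordering is preserved with an `OrderedDict` above.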
@@ -0,0 +1,36 @@
# Copyright (c) 2021 - present / Neuralmagic, Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from typing import Tuple

from torch.nn import Module


__all__ = ["is_module_quantized", "iter_named_leaf_modules", "module_type"]


def is_module_quantized(module: Module) -> bool:
    return hasattr(module, "quantization_scheme")


def module_type(module: Module) -> str:
    return type(module).__name__


def iter_named_leaf_modules(model: Module) -> Tuple[str, Module]:
    # yields modules that do not have any submodules
    # TODO: potentially expand to add list of allowed submodules such as observers
    for name, submodule in model.named_modules():
        if len(list(submodule.children())) == 0:
            yield name, submodule
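`iter_named_leaf_modules` walks `named_modules()` and yields only modules with no children, so container modules are skipped and only the innermost layers are considered for quantization. A runnable sketch on a toy model (the helper is copied from the diff; the toy model is an illustrative assumption):

```python
import torch.nn as nn


def iter_named_leaf_modules(model):
    # Yields (name, module) pairs for modules with no child modules,
    # mirroring the helper in the diff.
    for name, submodule in model.named_modules():
        if len(list(submodule.children())) == 0:
            yield name, submodule


model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Sequential(nn.Linear(4, 2)))
for name, module in iter_named_leaf_modules(model):
    print(name, type(module).__name__)  # "0 Linear", "1 ReLU", "2.0 Linear"
```

Note that the outer `Sequential` containers are never yielded, which is why `apply_quantization_config` matches targets against leaf names like `"2.0"` or class names like `"Linear"`.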
(Three additional new files in this PR contain only the standard license header.)
See the TODO comment about allowing exceptions for leaf-node submodules such as observers. This will be relevant for non-frozen quantized models.
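The lifecycle cascade this review thread refers to is the chain of `if status >= ...` checks in `apply_quantization_status`: applying a later status also applies every earlier stage. A minimal sketch of that ordering, assuming the statuses are comparable as an ordered enum (the real `QuantizationStatus` in the PR may be defined differently; the stage names recorded here are illustrative stand-ins for the real initialize/calibrate/freeze calls):

```python
from enum import IntEnum


class QuantizationStatus(IntEnum):
    # Ordered so that later lifecycle stages compare greater than earlier ones.
    INITIALIZED = 1
    CALIBRATION = 2
    FROZEN = 3


def apply_quantization_status(applied, status):
    # Mirrors the cascading checks in the diff: every stage up to and
    # including `status` is applied, in lifecycle order.
    if status >= QuantizationStatus.INITIALIZED:
        applied.append("initialize")
    if status >= QuantizationStatus.CALIBRATION:
        applied.append("calibrate")
    if status >= QuantizationStatus.FROZEN:
        applied.append("freeze")
    return applied


print(apply_quantization_status([], QuantizationStatus.CALIBRATION))
# ['initialize', 'calibrate']
```

The cascade means a config serialized with status `FROZEN` can be re-applied to a fresh model in one call, replaying initialization and calibration setup before freezing.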