
Commit 3e7d212

improves modelbuilder wrapper (#409)

* improves modelbuilder wrapper
* fix
* doc
* fix urls

1 parent: b99ccbf

4 files changed: 7 additions & 5 deletions


.github/workflows/check-urls.yml

Lines changed: 2 additions & 2 deletions

@@ -30,8 +30,8 @@ jobs:
       print_all: false
       timeout: 2
       retry_count# : 2
-      exclude_urls: https://github.com/pytorch/pytorch/pull/117009,https://github.com/huggingface/transformers/pull/29285,https://github.com/pytorch/pytorch/blob/a44f8894fa6d973693aab44a3dda079a168b05c1/torch/_decomp/decompositions.py#L1475,https://github.com/huggingface/transformers/pull/36652
-      exclude_patterns: https://dumps.wikimedia.org/,https://github.com/,https://huggingface.co/,https://huggingface.co/
+      exclude_urls: https://github.com/pytorch/pytorch/pull/117009,https://github.com/huggingface/transformers/pull/29285,https://github.com/pytorch/pytorch/blob/a44f8894fa6d973693aab44a3dda079a168b05c1/torch/_decomp/decompositions.py#L1475,https://github.com/huggingface/transformers/pull/36652,https://www.ilankelman.org/stopsigns/australia.jpg
+      exclude_patterns: https://dumps.wikimedia.org/,https://github.com/,https://huggingface.co/,https://huggingface.co/,https://www.ilankelman.org/stopsigns/australia.jpg
       # force_pass : true

    - name: urls-checker-docs
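For context on the two options edited in this hunk: URL checkers commonly treat `exclude_urls` as an exact-match skip list and `exclude_patterns` as a substring skip list, which would explain adding the same image URL to both. A minimal Python sketch of that assumed behavior (the `is_excluded` helper is illustrative, not the action's actual code):

```python
# Hypothetical sketch of the two exclusion mechanisms; the real
# urls-checker action may differ in details.
exclude_urls = {
    "https://www.ilankelman.org/stopsigns/australia.jpg",
}
exclude_patterns = [
    "https://dumps.wikimedia.org/",
    "https://huggingface.co/",
]


def is_excluded(url: str) -> bool:
    """Return True if the checker should skip this URL."""
    # Exact match against exclude_urls, substring match against patterns.
    return url in exclude_urls or any(p in url for p in exclude_patterns)


assert is_excluded("https://www.ilankelman.org/stopsigns/australia.jpg")
assert is_excluded("https://huggingface.co/some/model")
assert not is_excluded("https://example.com/page")
```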

.github/workflows/documentation.yml

Lines changed: 2 additions & 2 deletions

@@ -109,9 +109,9 @@ jobs:
           grep ERROR doc.txt | grep -v 'l-plot-tiny-llm-export' | grep -v 'Unexpected section title or transition.'
           exit 1
         fi
-        if [[ $(grep WARNING doc.txt | grep -v 'l-plot-tiny-llm-export' | grep -v 'Inline emphasis start-string' | grep -v 'Definition list ends without a blank line' | grep -v 'Unexpected section title or transition' | grep -v 'Inline strong start-string' | grep -v 'MambaCache' ) ]]; then
+        if [[ $(grep WARNING doc.txt | grep -v 'l-plot-tiny-llm-export' | grep -v 'Inline emphasis start-string' | grep -v 'Definition list ends without a blank line' | grep -v 'Unexpected section title or transition' | grep -v 'Inline strong start-string' | grep -v 'MambaCache' | grep -v 'duplicate label patch-_patched_patch_transformers-common' ) ]]; then
           echo "Documentation produces warnings."
-          grep WARNING doc.txt | grep -v 'l-plot-tiny-llm-export' | grep -v 'Inline emphasis start-string' | grep -v 'Definition list ends without a blank line' | grep -v 'Unexpected section title or transition' | grep -v 'Inline strong start-string' | grep -v 'MambaCache'
+          grep WARNING doc.txt | grep -v 'l-plot-tiny-llm-export' | grep -v 'Inline emphasis start-string' | grep -v 'Definition list ends without a blank line' | grep -v 'Unexpected section title or transition' | grep -v 'Inline strong start-string' | grep -v 'MambaCache' | grep -v 'duplicate label patch-_patched_patch_transformers-common'
           exit 1
         fi
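The step edited above fails the build when doc.txt still contains warnings after every allow-listed message has been removed with chained `grep -v`; the commit simply adds one more allow-listed message. A minimal sketch of that pattern, using a hypothetical doc.txt:

```shell
# Hypothetical build log: one allow-listed warning, one real one.
cat > doc.txt <<'EOF'
WARNING: duplicate label patch-_patched_patch_transformers-common
WARNING: undefined label some-broken-reference
INFO: build finished
EOF

# Keep only WARNING lines, then strip every allow-listed pattern.
remaining=$(grep WARNING doc.txt | grep -v 'duplicate label patch-_patched_patch_transformers-common')

# Non-empty output means an unexpected warning: report it, as the CI job does.
if [[ -n "$remaining" ]]; then
    echo "Documentation produces warnings."
    echo "$remaining"
fi
```

Without the new `grep -v`, the benign "duplicate label" warning would survive the filter and fail the job.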

CHANGELOGS.rst

Lines changed: 1 addition & 0 deletions

@@ -4,6 +4,7 @@ Change Logs
 0.9.1
 +++++

+* :pr:`409`: improves ModelBuilder wrapper
 * :pr:`408`: fix torch_deepcopy for empty DynamicCache and transformers==5.1.0, 5.2.0 (see https://github.com/huggingface/transformers/pull/43765/)

 0.9.0

onnx_diagnostic/helpers/model_builder_helper.py

Lines changed: 2 additions & 1 deletion

@@ -310,7 +310,8 @@ def _post(onnx_model):
         if not hasattr(config, key):
             setattr(config, key, getattr(text_config, key))
     elif config.architectures[0] == "GptOssForCausalLM":
-        delattr(config, "quantization_config")
+        if hasattr(config, "quantization_config"):
+            delattr(config, "quantization_config")
     elif (
         config.architectures[0] == "PhiMoEForCausalLM"
         and config.max_position_embeddings != config.original_max_position_embeddings
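The change above replaces an unconditional `delattr` with a guarded one: `delattr` raises `AttributeError` when the attribute is absent, so the old code broke on GptOss configs that carried no `quantization_config`. A small self-contained sketch of the difference (`Config` here is a stand-in for a transformers model config, not the real class):

```python
# Config is a hypothetical stand-in for a transformers model config.
class Config:
    pass


cfg = Config()

# Before the fix: an unconditional delete fails on configs that were
# never quantized.
failed = False
try:
    delattr(cfg, "quantization_config")
except AttributeError:
    failed = True
assert failed

# After the fix: the hasattr guard makes the delete a no-op when the
# attribute is missing, and still removes it when present.
cfg.quantization_config = {"bits": 4}
if hasattr(cfg, "quantization_config"):
    delattr(cfg, "quantization_config")
assert not hasattr(cfg, "quantization_config")
```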
