
Conversation

justinchuby
Collaborator

I have seen graphs like `Add(bias, 1)` in gemma3 where bias is an initializer. (Why?) This PR increases the default constant-folding input size limit so these bias initializers can be folded.
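For context, a minimal sketch of the kind of graph this targets; the names, shapes, and opset are illustrative, and whether the `Add` actually folds depends on the optimizer's constant-folding input size limit:

```python
import numpy as np
import onnx
from onnx import helper, numpy_helper
from onnxscript import optimizer

# A large-ish initializer "bias" consumed by Add(bias, 1). The whole Add is a
# constant expression and can be folded into a single precomputed initializer
# once the input-size limit permits initializers of this size.
bias = numpy_helper.from_array(np.ones(4096, dtype=np.float32), name="bias")
one = numpy_helper.from_array(np.array(1.0, dtype=np.float32), name="one")
add = helper.make_node("Add", ["bias", "one"], ["bias_plus_one"])
mul = helper.make_node("Mul", ["x", "bias_plus_one"], ["y"])
graph = helper.make_graph(
    [add, mul],
    "add_bias_example",
    [helper.make_tensor_value_info("x", onnx.TensorProto.FLOAT, [4096])],
    [helper.make_tensor_value_info("y", onnx.TensorProto.FLOAT, [4096])],
    initializer=[bias, one],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 18)])

# With a large enough limit, constant folding replaces Add(bias, 1) with a
# precomputed initializer feeding Mul directly.
optimized = optimizer.optimize(model)
```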

codecov bot commented Aug 28, 2025

❌ 11 Tests Failed:

| Tests completed | Failed | Passed | Skipped |
| --- | --- | --- | --- |
| 12890 | 11 | 12879 | 1889 |
View the top 3 failed test(s) by shortest run time
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1182_test_sqrt_example
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:132: in extract_functions
    mod = importlib.import_module(import_name)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_sqrt_example'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:266: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:134: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_sqrt_example' (e=No module named 'tests.onnx_backend_test_code.test_sqrt_example') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_sqrt_example.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_sqrt_example.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy as np
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_sqrt_example(x: FLOAT[3]) -> (FLOAT[3]):
E       y = opset13.Sqrt(x)
E       return y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1250_test_tril_out_pos
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:132: in extract_functions
    mod = importlib.import_module(import_name)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_tril_out_pos'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:266: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:134: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_tril_out_pos' (e=No module named 'tests.onnx_backend_test_code.test_tril_out_pos') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_tril_out_pos.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_tril_out_pos.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy as np
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import INT64
E   from onnxscript.onnx_opset import opset14
E   
E   @script()
E   def bck_test_tril_out_pos(x: INT64[4,5], k: INT64) -> (INT64[4,5]):
E       y = opset14.Trilu(x, k, upper=0)
E       return y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_1309_test_softmax_large_number
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:132: in extract_functions
    mod = importlib.import_module(import_name)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_softmax_large_number'

The above exception was the direct cause of the following exception:
.nox\test_torch_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:266: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:134: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_softmax_large_number' (e=No module named 'tests.onnx_backend_test_code.test_softmax_large_number') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_softmax_large_number.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_softmax_large_number.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy as np
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_softmax_large_number(x: FLOAT[2,4]) -> (FLOAT[2,4]):
E       y = opset13.Softmax(x)
E       return y


@justinchuby justinchuby added this to the 0.4.1 milestone Aug 28, 2025
@justinchuby justinchuby enabled auto-merge (squash) August 28, 2025 02:05
@xadupre
Member

xadupre commented Aug 28, 2025

Maybe we should make this an option we can change if needed.

@justinchuby
Collaborator Author

It is already an option in the optimize() interface. This PR changes the default.
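For reference, a minimal sketch of overriding the limit explicitly at a call site instead of relying on the default; the keyword name here is an assumption based on the constant-folding options and may differ between versions:

```python
from onnxscript import optimizer

# Hypothetical per-call override of the constant-folding input-size limit
# (keyword name assumed), rather than relying on the library default.
optimized = optimizer.optimize(model, input_size_limit=16384)
```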

@gramalingam
Collaborator

Sounds fine to me as long as it doesn't have any other unexpected impact on standard models. The one place where this may be relevant is the ConstantOfShape operator (already discussed in an issue/PR elsewhere), because I think it is common to see ConstantOfShape nodes that generate tensors with on the order of 500 or 1000 values. I think we keep them as ConstantOfShape instead of folding them, so there may be no impact there.

@justinchuby justinchuby merged commit 0433e04 into main Aug 28, 2025
26 of 32 checks passed
@justinchuby justinchuby deleted the justinchu/increase-limit branch August 28, 2025 17:38