
Conversation

@justinchuby justinchuby commented May 23, 2025

Fix pow.Tensor_Scalar type promotion by accounting for the different combinations of input dtypes. This change ensures that both inputs to Pow always have the same type, for compatibility with downstream tools.

Also fixes #2213.
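
For context, here is a minimal sketch of the promotion rule this describes, written with numpy dtypes; the helper name promote_pow_tensor_scalar and the float32 default are illustrative assumptions, not the torchlib implementation:

```python
# Illustrative only: the dtype-matching rule implied by the description,
# expressed with numpy dtypes. Not the actual aten_pow_tensor_scalar code.
import numpy as np

def promote_pow_tensor_scalar(self_dtype: np.dtype, exponent) -> np.dtype:
    """Pick a single dtype so both inputs fed to ONNX Pow match."""
    if np.issubdtype(self_dtype, np.floating):
        # Float tensor: keep its dtype and cast the Python scalar to it.
        return self_dtype
    if isinstance(exponent, float):
        # Integer/bool tensor with a float scalar: promote the tensor to a
        # float dtype (float32 is assumed as the default here).
        return np.dtype(np.float32)
    # Integer tensor with an integer scalar: keep the tensor's dtype.
    return self_dtype

assert promote_pow_tensor_scalar(np.dtype(np.float16), 2) == np.float16
assert promote_pow_tensor_scalar(np.dtype(np.int64), 0.5) == np.float32
assert promote_pow_tensor_scalar(np.dtype(np.int32), 3) == np.int32
```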


@Copilot Copilot AI left a comment


Pull Request Overview

This PR fixes type promotion in the pow.Tensor_Scalar operation by ensuring that both inputs to the Pow operator have matching dtypes. In addition, it updates related tests and improves IR string representation and enum capabilities.

  • Updated tests in torch_lib to cover various tensor-scalar pow operations.
  • Added a new is_floating_point method to the data type enum (a sketch of the assumed semantics follows this list).
  • Modified the aten_pow_tensor_scalar implementation to handle type promotion for both floating point and integer cases.
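
A minimal sketch of what such an is_floating_point helper might look like on an IntEnum of ONNX data type codes (assumed semantics and a reduced member list, not the actual onnxscript/ir/_enums.py code):

```python
# Minimal sketch with assumed semantics; not the actual onnxscript.ir._enums code.
import enum

class DataType(enum.IntEnum):
    # A small subset of the ONNX TensorProto data type codes.
    FLOAT = 1
    UINT8 = 2
    INT32 = 6
    INT64 = 7
    FLOAT16 = 10
    DOUBLE = 11
    BFLOAT16 = 16

    def is_floating_point(self) -> bool:
        # True only for the floating-point members listed above.
        return self in {
            DataType.FLOAT,
            DataType.FLOAT16,
            DataType.DOUBLE,
            DataType.BFLOAT16,
        }

assert DataType.FLOAT16.is_floating_point()
assert not DataType.INT64.is_floating_point()
```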

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| tests/function_libs/torch_lib/e2e_ops_tests.py | Updated test cases to assert ONNX program outputs for various pow scenarios. |
| onnxscript/ir/_enums.py | Added a new is_floating_point method for better data type handling. |
| onnxscript/ir/_core.py | Refactored the usage of _short_tensor_str for cleaner IR string representation. |
| onnxscript/function_libs/torch_lib/ops/core.py | Revised aten_pow_tensor_scalar to include explicit type promotion logic. |
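
For illustration, an end-to-end check of the int-tensor/float-scalar case from #2213 could be shaped roughly like the following; this is a hypothetical sketch assuming the torch.onnx dynamo exporter and ONNX Runtime are available, and is not copied from e2e_ops_tests.py:

```python
# Hypothetical sketch, not the actual test: export pow(int tensor, float scalar)
# with the dynamo exporter and compare the ONNX output against eager PyTorch.
import torch

class PowModel(torch.nn.Module):
    def forward(self, x):
        # An integer tensor raised to a float scalar should yield a float tensor.
        return torch.pow(x, 0.7)

model = PowModel()
x = torch.tensor([1, 2, 3])
onnx_program = torch.onnx.export(model, (x,), dynamo=True)
# ONNXProgram is callable and runs the exported model with ONNX Runtime.
(onnx_result,) = onnx_program(x)
torch.testing.assert_close(onnx_result, model(x))
```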

@justinchuby justinchuby added the module: torchlib (Related to the torch/aten function lib in development) and module: IR (Intermediate representation) labels May 23, 2025
@justinchuby justinchuby mentioned this pull request May 23, 2025

codecov bot commented May 23, 2025

❌ 4 Tests Failed:

| Tests completed | Failed | Passed | Skipped |
| --- | --- | --- | --- |
| 16936 | 4 | 16932 | 2345 |
View the top 3 failed test(s) by shortest run time
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0175_test_cast_FLOAT16_to_DOUBLE
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.12.10\x64\Lib\importlib\__init__.py:90: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_cast_FLOAT16_to_DOUBLE'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_cast_FLOAT16_to_DOUBLE' (e=No module named 'tests.onnx_backend_test_code.test_cast_FLOAT16_to_DOUBLE') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_cast_FLOAT16_to_DOUBLE.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_cast_FLOAT16_to_DOUBLE.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import DOUBLE, FLOAT16
E   from onnxscript.onnx_opset import opset21
E   
E   @script()
E   def bck_test_cast_FLOAT16_to_DOUBLE(input: FLOAT16[3,4]) -> (DOUBLE[3,4]):
E       output = opset21.Cast(input, to=11)
E       return output
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0393_test_gather_negative_indices
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.12.10\x64\Lib\importlib\__init__.py:90: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_gather_negative_indices'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gather_negative_indices' (e=No module named 'tests.onnx_backend_test_code.test_gather_negative_indices') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gather_negative_indices.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_gather_negative_indices.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_gather_negative_indices(data: FLOAT[10], indices: INT64[3]) -> (FLOAT[3]):
E       y = opset13.Gather(data, indices, axis=0)
E       return y
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0772_test_prelu_broadcast
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.12.10\x64\Lib\importlib\__init__.py:90: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_prelu_broadcast'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_prelu_broadcast' (e=No module named 'tests.onnx_backend_test_code.test_prelu_broadcast') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_prelu_broadcast.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_prelu_broadcast.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset16
E   
E   @script()
E   def bck_test_prelu_broadcast(x: FLOAT[3,4,5], slope: FLOAT[5]) -> (FLOAT[3,4,5]):
E       y = opset16.PRelu(x, slope)
E       return y

To view more test analytics, go to the Test Analytics Dashboard

@justinchuby justinchuby modified the milestones: 0.2.7, 0.3 May 23, 2025
@justinchuby justinchuby marked this pull request as draft May 27, 2025 20:35
@justinchuby justinchuby marked this pull request as ready for review May 27, 2025 20:35
@justinchuby justinchuby removed this from the 0.2.7 milestone May 28, 2025
@justinchuby justinchuby added this to the 0.3 milestone May 28, 2025
justinchuby added a commit to onnx/ir-py that referenced this pull request May 28, 2025
Migration of changes in microsoft/onnxscript#2335

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>

@titaiwangms titaiwangms left a comment


Have we already deprecated type promotion in the exporter? I don't remember the status of type promotion.

justinchuby added a commit to onnx/ir-py that referenced this pull request May 28, 2025
Migration of changes in microsoft/onnxscript#2335

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>
@justinchuby justinchuby enabled auto-merge (squash) May 28, 2025 21:37
@justinchuby justinchuby merged commit 61d4ab5 into main May 28, 2025
24 of 29 checks passed
@justinchuby justinchuby deleted the justinchu/fix-pow branch May 28, 2025 21:37
Successfully merging this pull request may close these issues.

[torchlib] Pow(int, float) isn't converted correctly