
Conversation


@Copilot Copilot AI commented Jun 14, 2025

This PR implements the missing torch.ops.prims.broadcast_in_dim.default operation that appears in BERT_pytorch and other PyTorch models.

Overview

The broadcast_in_dim operation is a primitive that broadcasts a tensor to a target shape by explicitly specifying which output dimensions correspond to the input tensor's dimensions. Unlike standard NumPy-style broadcasting, which right-aligns shapes, the input-to-output dimension mapping is given explicitly.

Implementation Details

Function signature:

def prims_broadcast_in_dim(
    a: TensorType, shape: INT64, broadcast_dimensions: Sequence[int]
) -> TensorType:

Parameters:

  • a: Input tensor to broadcast
  • shape: Target output shape
  • broadcast_dimensions: Specifies which dimensions of the output shape correspond to the input tensor dimensions

Example:

# Input tensor: [3, 4]
# Target shape: [2, 3, 5, 4] 
# broadcast_dimensions: [1, 3]
# Result: Input dimension 0 (size 3) maps to output dimension 1
#         Input dimension 1 (size 4) maps to output dimension 3
#         Output dimensions 0 and 2 are new axes, broadcast from size 1
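
A quick runnable check of these semantics (assuming the prim is callable as torch.ops.prims.broadcast_in_dim; the manual reshape-plus-expand line is only an illustration of the mapping, not the implementation):

import torch

a = torch.arange(12, dtype=torch.float32).reshape(3, 4)  # input of shape [3, 4]

# Input dim 0 -> output dim 1, input dim 1 -> output dim 3.
out = torch.ops.prims.broadcast_in_dim(a, [2, 3, 5, 4], [1, 3])
print(out.shape)  # torch.Size([2, 3, 5, 4])

# Same result from plain tensor ops: size-1 axes at the unmapped output
# positions (0 and 2), then broadcast.
manual = a.reshape(1, 3, 1, 4).expand(2, 3, 5, 4)
print(torch.equal(out, manual))  # True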

Fixes #2218. Fixes pytorch/pytorch#135343

Copilot AI and others added 2 commits June 14, 2025 03:51
@Copilot Copilot AI changed the title from [WIP] [torchlib] Implement torch.ops.prims.broadcast_in_dim.default to [torchlib] Implement torch.ops.prims.broadcast_in_dim.default Jun 14, 2025
@Copilot Copilot AI requested a review from justinchuby June 14, 2025 03:55
Copilot finished work on behalf of justinchuby June 14, 2025 03:55
Copilot finished work on behalf of justinchuby June 14, 2025 14:30

codecov bot commented Jun 14, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 70.33%. Comparing base (8ed3521) to head (690aa95).
⚠️ Report is 1 commit behind head on main.
✅ All tests successful. No failed tests found.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2382      +/-   ##
==========================================
+ Coverage   70.31%   70.33%   +0.01%     
==========================================
  Files         222      222              
  Lines       26647    26657      +10     
  Branches     2664     2666       +2     
==========================================
+ Hits        18738    18749      +11     
+ Misses       6993     6992       -1     
  Partials      916      916              


Copilot finished work on behalf of justinchuby June 14, 2025 14:43

@github-advanced-security github-advanced-security bot left a comment


lintrunner found more than 20 potential problems in the proposed changes. Check the Files changed tab for more details.

Copilot finished work on behalf of justinchuby June 14, 2025 14:55
Copilot finished work on behalf of justinchuby June 14, 2025 15:00
@justinchuby

What does broadcast_in_dim do?

@justinchuby justinchuby marked this pull request as ready for review June 14, 2025 22:10
@titaiwangms titaiwangms self-requested a review September 11, 2025 18:00

xadupre commented Sep 11, 2025

The proposed implementation is equivalent to the decomposition of broadcast_in_dim in PyTorch.
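
For reference, a minimal Python sketch of that decomposition (it mirrors the idea only, not the upstream code verbatim): place each input dimension at the output position given by broadcast_dimensions, fill the remaining positions with size 1, then expand.

import torch

def broadcast_in_dim_decomp(a, shape, broadcast_dimensions):
    # Intermediate shape: input dims at their mapped output positions, 1 elsewhere.
    intermediate = [1] * len(shape)
    for input_dim, output_dim in enumerate(broadcast_dimensions):
        intermediate[output_dim] = a.shape[input_dim]
    return a.reshape(intermediate).expand(*shape)

x = torch.randn(3, 4)
y = broadcast_in_dim_decomp(x, [2, 3, 5, 4], [1, 3])
assert y.shape == (2, 3, 5, 4)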

@justinchuby justinchuby enabled auto-merge (squash) September 11, 2025 18:33
@justinchuby

Will find ways to test as a follow-up.

@justinchuby justinchuby added the module: torchlib Related to the torch/aten function lib in development label Sep 11, 2025
@justinchuby justinchuby added this to the 0.5.0 milestone Sep 11, 2025

@titaiwangms titaiwangms left a comment


Is there a test case?

@justinchuby

This comment was marked as resolved.

@justinchuby

Added tests

@justinchuby justinchuby merged commit 39f1015 into main Sep 12, 2025
32 checks passed
@justinchuby justinchuby deleted the copilot/fix-2218 branch September 12, 2025 16:13
Development

Successfully merging this pull request may close these issues.

  • [torchlib] Implement torch.ops.prims.broadcast_in_dim.default
  • Problem with conversion of torch model (htdemucs) to onnx
4 participants