Commit 90e79de

change | for Optional
1 parent 8de7b92 commit 90e79de

File tree

1 file changed: +2 −2 lines changed


src/diffusers/models/transformers/transformer_photon.py

Lines changed: 2 additions & 2 deletions
@@ -383,7 +383,7 @@ def __init__(
         hidden_size: int,
         num_heads: int,
         mlp_ratio: float = 4.0,
-        qk_scale: float | None = None,
+        qk_scale: Optional[float] = None,
     ):
         super().__init__()

@@ -424,7 +424,7 @@ def forward(
         encoder_hidden_states: Tensor,
         temb: Tensor,
         image_rotary_emb: Tensor,
-        attention_mask: Tensor | None = None,
+        attention_mask: Optional[Tensor] = None,
         **kwargs: dict[str, Any],
     ) -> Tensor:
         r"""
