↔ [Converter] Add support for aten.gelu and aten.tanh in the FX aten path #1713
aten.gelu + aten.tanh
Function Schema:
torch.ops.aten.gelu.default: ((torch.float32,), {})
torch.ops.aten.tanh.default: ((torch.float32,), {})
Original PyTorch API: torch.nn.functional.gelu, torch.tanh
Relevant TensorRT Documentation: IActivationLayer
Add support for gelu and tanh as aten converters.
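
A minimal sketch of what these converters could look like, assuming the tensorrt_converter registration decorator from torch_tensorrt.fx.converter_registry and the (network, target, args, kwargs, name) converter signature used by the existing FX aten converters; exact module paths and helper names may differ across releases, and the _scalar_const helper here is hypothetical. tanh maps directly onto IActivationLayer via trt.ActivationType.TANH, while gelu can be composed from elementwise and unary layers using the erf formulation 0.5 * x * (1 + erf(x / sqrt(2))):

```python
# Sketch only: assumes the tensorrt_converter decorator and the converter
# signature used by the existing FX aten converters in torch_tensorrt.
import math

import numpy as np
import tensorrt as trt
import torch

from torch_tensorrt.fx.converter_registry import tensorrt_converter


def _scalar_const(network, value, rank, name):
    # Hypothetical helper: scalar constant shaped (1, ..., 1) so it
    # broadcasts against a rank-`rank` input in elementwise layers.
    shape = (1,) * rank
    arr = np.full(shape, value, dtype=np.float32)
    layer = network.add_constant(shape, trt.Weights(arr))
    layer.name = name
    return layer.get_output(0)


@tensorrt_converter(torch.ops.aten.tanh.default)
def aten_ops_tanh(network, target, args, kwargs, name):
    # tanh maps directly onto IActivationLayer.
    layer = network.add_activation(args[0], trt.ActivationType.TANH)
    layer.name = name
    return layer.get_output(0)


@tensorrt_converter(torch.ops.aten.gelu.default)
def aten_ops_gelu(network, target, args, kwargs, name):
    # Exact (erf-based) GELU: 0.5 * x * (1 + erf(x / sqrt(2))).
    # The approximate='tanh' variant is omitted here for brevity.
    x = args[0]
    rank = len(x.shape)
    sqrt2 = _scalar_const(network, math.sqrt(2.0), rank, f"{name}_sqrt2")
    one = _scalar_const(network, 1.0, rank, f"{name}_one")
    half = _scalar_const(network, 0.5, rank, f"{name}_half")

    scaled = network.add_elementwise(x, sqrt2, trt.ElementWiseOperation.DIV).get_output(0)
    erf = network.add_unary(scaled, trt.UnaryOperation.ERF).get_output(0)
    one_plus_erf = network.add_elementwise(one, erf, trt.ElementWiseOperation.SUM).get_output(0)
    half_x = network.add_elementwise(x, half, trt.ElementWiseOperation.PROD).get_output(0)
    out_layer = network.add_elementwise(half_x, one_plus_erf, trt.ElementWiseOperation.PROD)
    out_layer.name = name
    return out_layer.get_output(0)
```

Newer TensorRT releases also expose GELU directly as activation types (GELU_ERF / GELU_TANH), which would let the gelu converter collapse to a single add_activation call like the tanh one, if targeting a version that provides them.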