
↔ [Converter] Add support for amax in Torch-TensorRT #2095

Closed

narendasan opened this issue Jul 10, 2023 · 1 comment
Assignees

Labels
component: converters · feature request · Story: ATen Op Support

Comments

@narendasan (Collaborator)

  • Function Schema: torch.ops.aten.amax.default

  • Original PyTorch API: https://pytorch.org/docs/stable/generated/torch.amax.html

  • Relevant TensorRT Documentation:

cc: @zewenli98

Alternatives

Additional context
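
A minimal sketch, assuming the standard TensorRT Python API (`INetworkDefinition.add_reduce` with `trt.ReduceOperation.MAX`), of how `aten.amax` could be lowered onto a reduce layer. The helper `add_amax` and its signature are hypothetical; the actual Torch-TensorRT converter registration, validation, and dynamo plumbing are omitted and may differ.

```python
# Minimal sketch (assumption): map torch.amax(input, dim, keepdim) onto a
# TensorRT IReduceLayer with ReduceOperation.MAX. The Torch-TensorRT
# converter registration / validation plumbing is intentionally omitted.
import tensorrt as trt


def add_amax(
    network: trt.INetworkDefinition,
    input_tensor: trt.ITensor,
    dims,                      # int or iterable of ints: dimensions to reduce
    keepdim: bool = False,
) -> trt.ITensor:
    if isinstance(dims, int):
        dims = (dims,)

    # TensorRT encodes the reduction dimensions as a bitmask over axes;
    # negative dims are normalized against the input rank first.
    rank = len(input_tensor.shape)
    axes = 0
    for d in dims:
        axes |= 1 << (d % rank)

    layer = network.add_reduce(
        input_tensor,
        trt.ReduceOperation.MAX,
        axes,
        keep_dims=keepdim,
    )
    return layer.get_output(0)
```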

narendasan added the feature request and component: converters labels on Jul 10, 2023
narendasan changed the title from "↔ [Converter] Add support for my_op in Torch-TensorRT" to "↔ [Converter] Add support for amax in Torch-TensorRT" on Jul 10, 2023
@apbose (Collaborator) commented Sep 25, 2023

Completed

@apbose apbose closed this as completed Sep 25, 2023