[PHI decoupling] layer_norm_kernel => phi #49671

Closed. Wants to merge 13 commits.

Commits on Jan 9, 2023

  1. layer_norm_kernel => phi
     DrRyanHuang committed Jan 9, 2023 (cdba1ca)
  2. fix mutable_data
     DrRyanHuang committed Jan 9, 2023 (d6d388d)
  3. fix CudnnDataType
     DrRyanHuang committed Jan 9, 2023 (2341a50)
  4. fix LayerNormBackward
     DrRyanHuang committed Jan 9, 2023 (94066d0)
  5. 97c017a (commit message not shown)
  6. fix LayerNormParamType
     DrRyanHuang committed Jan 9, 2023 (29e539a)

Commits on Jan 10, 2023

  1. 83c91a9 (commit message not shown)
  2. roll back
     DrRyanHuang committed Jan 10, 2023 (95e414b)
  3. fix platform
     DrRyanHuang committed Jan 10, 2023 (90cbfd0)
  4. fix LayerNormParamType
     DrRyanHuang committed Jan 10, 2023 (094db38)
  5. fix LayerNormBackward
     DrRyanHuang committed Jan 10, 2023 (dfe7d16)
  6. fix memory
     DrRyanHuang committed Jan 10, 2023 (fcdb458)

Commits on Feb 3, 2023

  1. ed9cafe (commit message not shown)
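
For context on what a "layer_norm_kernel => phi" migration and the "fix mutable_data" commit typically involve, below is a minimal, self-contained sketch of the phi-style kernel convention: the device context is passed as the first parameter of a free kernel function templated on `<T, Context>`, and output memory is allocated through `dev_ctx.template Alloc<T>(...)` rather than the fluid-style `tensor->mutable_data<T>(place)`. The `MockTensor` and `MockContext` types and all names here are stand-ins for illustration, not code taken from this PR's diff.

```cpp
// Hypothetical sketch of the phi kernel convention this migration targets.
// MockTensor / MockContext are stand-ins, NOT Paddle's real classes.
#include <cstddef>
#include <vector>

struct MockTensor {
  std::vector<char> buf;  // raw storage
  std::size_t numel = 0;  // number of elements
};

struct MockContext {
  // phi kernels allocate outputs through the device context
  // (dev_ctx.template Alloc<T>(out)) instead of the old
  // fluid-style out->mutable_data<T>(ctx.GetPlace()).
  template <typename T>
  T* Alloc(MockTensor* t) const {
    t->buf.resize(t->numel * sizeof(T));
    return reinterpret_cast<T*>(t->buf.data());
  }
};

// phi convention: a free function templated on <T, Context>,
// taking the device context as its first argument.
template <typename T, typename Context>
void LayerNormKernel(const Context& dev_ctx,
                     const MockTensor& x,
                     float epsilon,
                     MockTensor* out) {
  out->numel = x.numel;
  T* out_data = dev_ctx.template Alloc<T>(out);
  // ... the actual normalization math is elided in this sketch ...
  (void)out_data;
  (void)epsilon;
}

int main() {
  MockContext dev_ctx;
  MockTensor x;
  x.numel = 8;
  x.buf.resize(x.numel * sizeof(float));
  MockTensor y;
  LayerNormKernel<float>(dev_ctx, x, 1e-5f, &y);
  return 0;
}
```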