
ENH: Dropout in transformer layers in DeepMIL #590

Merged

harshita-s merged 8 commits into transfer_main from hsharma/transformer_dropout on Sep 5, 2022

Conversation

@harshita-s (Contributor) commented Sep 2, 2022

In this PR:

  • A `transformer_dropout` parameter is added to the `TransformerPooling` and `TransformerPoolingBenchmark` pooling layers (see the sketch after this list).
  • Tests are updated to cover the `transformer_dropout` parameter.

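For context, here is a minimal sketch of how such a dropout parameter is typically threaded through a transformer pooling layer. It is illustrative only: the class skeleton, the constructor arguments (`num_layers`, `num_heads`, `dim_representation`), and the CLS-token pooling are assumptions rather than the repository's exact code; the one fact taken from this PR is that a `transformer_dropout` value is forwarded into the transformer layers.

```python
import torch
from torch import nn


class TransformerPooling(nn.Module):
    """Illustrative transformer-based pooling layer with configurable dropout.

    Hypothetical skeleton; the actual hi-ml implementation may differ.
    """

    def __init__(self, num_layers: int, num_heads: int,
                 dim_representation: int, transformer_dropout: float = 0.0) -> None:
        super().__init__()
        # Learnable class token prepended to the bag of instance features.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim_representation))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim_representation,
            nhead=num_heads,
            dropout=transformer_dropout,  # the new parameter lands here
            batch_first=True,
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (batch, num_instances, dim_representation)
        cls = self.cls_token.expand(features.shape[0], -1, -1)
        out = self.encoder(torch.cat([cls, features], dim=1))
        # The transformed class token serves as the pooled bag representation.
        return out[:, 0]


# Usage sketch: dropout is active in training mode and disabled under .eval().
pooling = TransformerPooling(num_layers=2, num_heads=4,
                             dim_representation=128, transformer_dropout=0.1)
bag = torch.randn(1, 50, 128)  # one bag of 50 instance embeddings
pooled = pooling(bag)          # shape: (1, 128)
```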
codecov bot commented Sep 2, 2022

Codecov Report

Merging #590 (bc096c4) into transfer_main (763dc7f) will increase coverage by 0.20%.
The diff coverage is 85.71%.

Impacted file tree graph

| Flag        | Coverage Δ                   |
|-------------|------------------------------|
| hi-ml       | 82.45% <100.00%> (+0.59%) ⬆️ |
| hi-ml-azure | 25.47% <ø> (ø)               |
| hi-ml-cpath | 76.66% <66.66%> (+<0.01%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files                                         | Coverage Δ                    |
|--------------------------------------------------------|-------------------------------|
| ...i-ml-cpath/src/health_cpath/utils/deepmil_utils.py  | 59.70% <66.66%> (+0.61%) ⬆️   |
| .../src/health_ml/networks/layers/attention_layers.py | 96.33% <100.00%> (+14.08%) ⬆️ |

@harshita-s harshita-s marked this pull request as ready for review September 2, 2022 14:36
@harshita-s harshita-s requested a review from dccastro September 2, 2022 14:36
@harshita-s harshita-s merged commit f7c21a6 into transfer_main Sep 5, 2022
@harshita-s harshita-s deleted the hsharma/transformer_dropout branch September 5, 2022 10:20