
feat: refactor head layers #130

Merged: 23 commits merged into main from feat-new-head-layers on Oct 17, 2021

Conversation

@tadejsv (Contributor) commented Oct 14, 2021

No description provided.

@github-actions github-actions bot added size/l and removed size/m labels Oct 15, 2021
@hanxiao (Member) commented Oct 17, 2021

I think the overall risk of this design is that it is strongly biased toward PyTorch & Paddle and breaks some previous efforts on unifying the Keras, PyTorch & Paddle implementations. It is true that supporting PyTorch & Paddle is easy, but a pattern that fits all three frameworks is tricky, and that's what this project is about.
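For readers outside the thread: the design question is whether a single head-layer abstraction can serve PyTorch, Keras and Paddle at once. Below is a minimal sketch of what such a framework-neutral contract could look like; the names `BaseHead`, `arity` and `forward` are illustrative assumptions and are not taken from this PR.

```python
# Hypothetical sketch only; none of these names come from the PR itself.
import abc


class BaseHead(abc.ABC):
    """Framework-neutral contract a head layer would have to satisfy in all
    three backends (PyTorch, Keras and Paddle)."""

    def __init__(self, arity: int, input_dim: int):
        self.arity = arity          # e.g. 2 for a siamese head, 3 for a triplet head
        self.input_dim = input_dim  # embedding size produced by the body network

    @abc.abstractmethod
    def forward(self, *inputs):
        """Map `arity` embedding tensors to the head output.

        Each backend plugs in its own tensor type here, which is where a
        single pattern gets tricky: PyTorch and Paddle modules define an
        imperative `forward`, while Keras layers implement `call` and take
        part in graph building.
        """
```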

# Conflicts:
#	finetuner/tuner/base.py
#	finetuner/tuner/paddle/__init__.py
#	finetuner/tuner/pytorch/__init__.py
@tadejsv (Contributor, Author) commented Oct 17, 2021

@hanxiao wait, my PR is not finished yet. I think the same thing can be done in Keras (TensorFlow) as in PyTorch.
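As an illustration of that claim, here is a minimal sketch of the same dense projection head written once as a PyTorch module and once as a Keras layer; the class names and dimensions are made up for the example and do not reflect the actual layers added in this PR.

```python
# Hypothetical parity example: an equivalent projection head in both backends.
import tensorflow as tf
import torch


class TorchProjectionHead(torch.nn.Module):
    def __init__(self, in_dim: int = 128, out_dim: int = 32):
        super().__init__()
        self.proj = torch.nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Project and L2-normalise, as metric-learning heads commonly do.
        return torch.nn.functional.normalize(self.proj(x), dim=-1)


class KerasProjectionHead(tf.keras.layers.Layer):
    def __init__(self, out_dim: int = 32, **kwargs):
        super().__init__(**kwargs)
        self.proj = tf.keras.layers.Dense(out_dim)

    def call(self, x):
        # The same computation expressed with the Keras layer API.
        return tf.math.l2_normalize(self.proj(x), axis=-1)
```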

@hanxiao (Member) commented Oct 17, 2021

Yes, let me try on this branch; I think I can finish this PR today.

@hanxiao hanxiao marked this pull request as ready for review October 17, 2021 09:51
@hanxiao hanxiao merged commit 84585be into main Oct 17, 2021
@hanxiao hanxiao deleted the feat-new-head-layers branch October 17, 2021 11:30