Confusion regarding the implementation of Federated Transfer Learning #5098
Unanswered
PaulKMandal asked this question in Q&A
I have questions about the specific implementation of Federated Transfer Learning (FTL) that FATE provides. First and foremost, I have reviewed the Liu et al. paper on FTL (available here), as well as the Liu et al. paper on Vertical Federated Learning (VFL) (available here), which also briefly covers FTL.
My confusion is the following: in both of these papers, FTL is described as an inherently vertical process. However, looking at both the example file available here and the implementation of HeteroFTL here, there is no bottom model, interactive layer, or top model. The NUS_Wide example appears to simply train the same neural network on different data, and I don't understand how that constitutes FTL.
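To make my mental model concrete: as I read the Liu et al. FTL paper, each party keeps its own local network mapping its own feature space into a common latent dimension, and training signal flows only through the overlapping samples, with no interactive layer or top model as in FATE's hetero-NN. Here is a minimal sketch of that reading (plain NumPy, toy linear "networks", finite-difference gradients; all names and hyperparameters are my own hypothetical choices, not FATE's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: party A and party B hold different feature
# spaces but share a small set of overlapping sample IDs.
d_a, d_b, d_latent = 8, 12, 4
n_overlap = 16

X_a = rng.normal(size=(n_overlap, d_a))            # A's features (overlap)
X_b = rng.normal(size=(n_overlap, d_b))            # B's features (overlap)
y_a = rng.integers(0, 2, size=n_overlap) * 2 - 1   # labels in {-1,+1}, held by A

# Each party has its OWN network (here a single linear layer) into a
# shared latent space -- no shared bottom model, interactive layer,
# or top model.
W_a = rng.normal(scale=0.1, size=(d_a, d_latent))
W_b = rng.normal(scale=0.1, size=(d_b, d_latent))

def loss(W_a, W_b, gamma=1.0):
    u_a = X_a @ W_a                  # A's embeddings of the overlap
    u_b = X_b @ W_b                  # B's embeddings of the same samples
    scores = np.sum(u_a * u_b, axis=1)
    pred = np.mean(np.logaddexp(0.0, -y_a * scores))  # logistic surrogate
    align = np.sum((u_a - u_b) ** 2) / n_overlap      # alignment on overlap
    return pred + gamma * align

# Crude finite-difference gradients, just to show the objective is
# trainable with each party only ever updating its own weights.
def grad(f, W, eps=1e-5):
    g = np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        W2 = W.copy()
        W2[idx] += eps
        g[idx] = (f(W2) - f(W)) / eps
    return g

l0 = loss(W_a, W_b)
for _ in range(50):
    W_a -= 0.02 * grad(lambda W: loss(W, W_b), W_a)
    W_b -= 0.02 * grad(lambda W: loss(W_a, W), W_b)
print(f"loss: {l0:.4f} -> {loss(W_a, W_b):.4f}")
```

In this reading, each party's network is different (different input dimensions), so I don't see how training the same network on different data, as the example seems to do, fits the paper.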
I have also reviewed the implementation here, but I need additional clarity on how the code actually works, especially since it appears to work quite differently from how VFL is implemented.
Any help is much appreciated!