About implementing a reversible MLP Network #95
Hi, thanks for your questions.
Thanks for your reply, I have solved this problem as follows. The network I want to implement is an MLP with 8 FC layers; the code is as follows. I found that when I set the permute_soft parameter to False, the problem disappears entirely. I have two questions here.
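The code referred to above is not included in the thread, but the kind of network under discussion can be sketched in plain NumPy. This is an illustrative RealNVP-style affine coupling stack, not FrEIA's actual API; all names below are hypothetical, and the half-swap stands in for a hard permutation (the analogue of permute_soft=False):

```python
import numpy as np

class AffineCoupling:
    """One invertible block: split the features in half, transform one half
    conditioned on the other, then swap the halves (a hard permutation)."""
    def __init__(self, dim, rng):
        h = dim // 2
        # Tiny random linear "subnetworks" for scale and shift
        # (stand-ins for the FC subnetworks of a real coupling block).
        self.Ws = rng.normal(scale=0.1, size=(h, h))
        self.Wt = rng.normal(scale=0.1, size=(h, h))

    def forward(self, x):
        x1, x2 = np.split(x, 2, axis=-1)
        s = np.tanh(x1 @ self.Ws)        # bounded log-scale for stability
        t = x1 @ self.Wt
        y2 = x2 * np.exp(s) + t
        return np.concatenate([y2, x1], axis=-1)   # swap halves

    def inverse(self, y):
        y2, x1 = np.split(y, 2, axis=-1)
        s = np.tanh(x1 @ self.Ws)
        t = x1 @ self.Wt
        x2 = (y2 - t) * np.exp(-s)
        return np.concatenate([x1, x2], axis=-1)

rng = np.random.default_rng(0)
blocks = [AffineCoupling(10, rng) for _ in range(8)]  # 8-layer reversible "MLP"

x = rng.normal(size=(4, 10))
y = x
for b in blocks:                 # forward through all 8 blocks
    y = b.forward(y)
x_rec = y
for b in reversed(blocks):       # invert in reverse order
    x_rec = b.inverse(x_rec)
print(np.max(np.abs(x - x_rec)))  # reconstruction error should be tiny
```

Because each block is invertible in closed form, running the blocks backwards recovers the input up to floating-point error.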
Great!
Thank you for your reply. I have another question for you. While training this reversible MLP, I found that as training progresses, the reversible structure degrades: the gap between the input and the inverse of the output gradually grows. Why is this, and is there a solution? Looking forward to your reply.
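One plausible cause of a growing input-versus-inverse gap is accumulated floating-point error through many layers, which is amplified as the learned scale coefficients grow during training. A minimal NumPy sketch (hypothetical, not the poster's network) that isolates the precision effect by round-tripping a chain of elementwise scalings:

```python
import numpy as np

def roundtrip_error(dtype, n_layers=8, log_scale=3.0):
    """Apply n_layers of y = exp(s)*x + t, then invert the chain, and
    return the max |x - reconstructed x| at the given precision."""
    rng = np.random.default_rng(1)
    x = rng.normal(size=64).astype(dtype)
    s = np.asarray(log_scale, dtype=dtype)
    t = np.asarray(0.5, dtype=dtype)
    y = x
    for _ in range(n_layers):
        y = y * np.exp(s) + t        # forward pass of one "layer"
    for _ in range(n_layers):
        y = (y - t) * np.exp(-s)     # exact analytic inverse, same precision
    return float(np.max(np.abs(y - x)))

err32 = roundtrip_error(np.float32)
err64 = roundtrip_error(np.float64)
print(err32, err64)   # float32 error is much larger than float64
```

Common mitigations in practice are bounding the log-scale inside each coupling block (e.g. a tanh-style clamp on the exponent) and checking inversion error in double precision; whether that fully explains the behaviour here depends on the actual training setup.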
Hi, sorry for the late reply to your question. It looks like you are using a single …
Thank you for sharing.
I have some questions about this framework.
Question 1
Can this INN framework implement an MLP network with different input and output dimensions? For example, the input dimension is (batch_size, 10) and the output dimension is (batch_size, 2).
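For context on this question: a bijective map must preserve total dimension, so a literal 10 → 2 invertible network is impossible. A common workaround (used in INN-based inverse problems) is to build a 10 → 10 invertible map and split its output into the 2 predicted values y plus 8 latent variables z; the inverse then needs both y and z. A minimal NumPy sketch of that idea (hypothetical names, not FrEIA's API):

```python
import numpy as np

rng = np.random.default_rng(0)
# One invertible (orthogonal) "layer" via QR of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(10, 10)))

x = rng.normal(size=(5, 10))          # input:  (batch, 10)
out = x @ Q                           # invertible forward, still (batch, 10)
y, z = out[:, :2], out[:, 2:]         # y: (batch, 2) prediction, z: (batch, 8) latent

# The exact inverse requires both y and z; y alone is not enough.
x_rec = np.concatenate([y, z], axis=1) @ Q.T
print(np.max(np.abs(x - x_rec)))
```

At inference time, z is typically sampled from a prior rather than stored, which is what makes the "10 in, 2 out" usage possible while keeping the map invertible.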
Question 2
Using the reversible MLP design from your demo, I found that when the input dimension becomes very large (in the thousands), the program hangs when running. How can this be solved?
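A likely factor (hedged, since the code is not shown): a soft permutation applies a dense random orthogonal matrix, whose construction and application scale roughly as O(N³) and O(N²) in the feature dimension, whereas a hard permutation is just an O(N) index shuffle. Both are exactly invertible, which this NumPy sketch illustrates:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 512   # stand-in for a large feature dimension

# "Soft" permutation: a dense random orthogonal matrix (QR of a Gaussian
# matrix). Building it costs O(N^3); applying it costs O(N^2) per sample.
Q, _ = np.linalg.qr(rng.normal(size=(N, N)))

# "Hard" permutation: an index shuffle. O(N) to build and to apply.
perm = rng.permutation(N)
inv_perm = np.argsort(perm)

x = rng.normal(size=(8, N))
soft_err = np.max(np.abs((x @ Q) @ Q.T - x))        # inverse = transpose
hard_ok = np.array_equal(x[:, perm][:, inv_perm], x)  # inverse = argsort
print(soft_err, hard_ok)
```

This is consistent with the earlier observation in the thread that setting permute_soft to False avoids the problem: at very large widths, the cheap hard permutation is usually the practical choice.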