
Test results, the acceptance rate is always 1 #5

Open
lpj12121 opened this issue Jun 3, 2024 · 6 comments

Comments

@lpj12121 commented Jun 3, 2024

[Screenshot 2024-06-03 112232]

After training the model and running several rounds of tests, I found that the acceptance rate was always 1. The validation acceptance rate during training was about 0.9, which differs considerably from the test results. I did not modify the core code, yet I cannot get results close to those in the paper. Why is this?

@lpj12121 (Author) commented Jun 4, 2024

When I change the parameters of the virtual network, the acceptance rate sometimes becomes 0.999 instead. Such a high acceptance rate does not seem normal.
[Screenshot]

@GeminiLight (Owner) commented

Hi, could you provide more details so I can better understand the problem?

  1. Which algorithm are you currently running?
  2. How did you set p_net_setting.yaml and v_sim_setting.yaml?
  3. Could you run other baseline algorithms and report their results?

@lpj12121 (Author) commented Jun 4, 2024

> Hi, could you provide me with more details so I can better understand the problems?
>
>   1. Which algorithm do you currently run?
>   2. How do you set the p_net_setting.yaml and v_sim_setting.yaml?
>   3. Could you select other baseline algorithms and report their results?

1. I am verifying HRL-ACRA, and I have trained both the upper-level and lower-level agents.

2. p_net_setting.yaml is left at its default values; v_sim_setting.yaml follows the comparative experiments in the paper. I am very curious why the acceptance rate is always 1 or 0.999 during verification, while during training (with a validation run every ten rounds) it finally converges to about 0.89.

3. I have trained the A3C-GCN model before, and its results are more normal and closer to the data in the paper.

This problem seems to have been mentioned in issue #1, and it looks similar to mine.

Repository owner deleted a comment from neflibata688 Jun 14, 2024
@GeminiLight (Owner) commented

Hi, my own test runs show that the AC rate is not always 1. Could you provide more details on any modifications you made to the original code?

@MerrillLi commented
I may know what happens: if you set the decoding strategy to "beam", the acceptance rate will very likely be 1. Setting it to "greedy" reproduces the results reported in the paper.
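For readers unfamiliar with the distinction: a minimal sketch of the two decoding strategies (this is a generic illustration, not the repository's actual code; the function names, toy log-probabilities, and the independent-step assumption are mine). Greedy decoding commits to the single highest-probability action at each step, while beam search keeps the top-k partial sequences and can recover from a locally bad choice, so it is far more likely to find some feasible mapping, which can push the acceptance rate toward 1.

```python
# Hypothetical sketch: greedy vs. beam-search decoding over a sequence
# of per-step action log-probabilities. In a real embedding policy the
# per-step distributions would depend on earlier choices and on resource
# constraints; here the steps are independent to keep the example short.
import math


def greedy_decode(step_log_probs):
    """Pick the argmax action at every step."""
    return [max(range(len(lp)), key=lambda a: lp[a]) for lp in step_log_probs]


def beam_decode(step_log_probs, beam_width=3):
    """Keep the top-`beam_width` partial sequences by cumulative log-prob."""
    beams = [([], 0.0)]  # (action sequence, cumulative log-probability)
    for lp in step_log_probs:
        candidates = [
            (seq + [a], score + lp[a])
            for seq, score in beams
            for a in range(len(lp))
        ]
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams[0][0]


# Toy example: two decision steps, three candidate actions each.
log_probs = [
    [math.log(0.5), math.log(0.3), math.log(0.2)],
    [math.log(0.1), math.log(0.6), math.log(0.3)],
]
print(greedy_decode(log_probs))   # -> [0, 1]
print(beam_decode(log_probs, 3))  # -> [0, 1] (beam_width=1 always matches greedy)
```

With a beam, even if the greedy choice leads to an infeasible placement, one of the other kept candidates often still succeeds, which is consistent with the observation above that beam decoding inflates the acceptance rate relative to greedy.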

@GeminiLight (Owner) commented

Thanks for your feedback. I will check this issue this week.
