
Durabletask Tests #36

Open
3 tasks
saadsheralam opened this issue Jun 15, 2023 · 8 comments

@saadsheralam
Member

The developers rely on Azurite to test their code, knowing that the tests will fail on the cloud but pass on Azurite. According to Tianyin, this can be a good introductory example.

  • Debug FileNotFound error for Azurite
  • Re-run tests on cloud and emulator
  • Update table with test count and discrepant tests in the Paper Structure doc.
@tianyin
Member

tianyin commented Jul 1, 2023

@saadsheralam What are the results of the three sub-tasks?

Also, could you provide more information/pointers and thoughts about the Durable Tasks?

@saadsheralam
Member Author

@tianyin I got occupied with debugging discrepancies, so these tasks got a bit sidelined on my end. Let me provide more info about this, along with my thoughts.

  • Currently, we have run durabletask on both the cloud and the emulator. For the test cases with sensible results (the first 10), we observe 9 discrepancies.
  • The problem was that durabletask has many more tests, and a lot of them did not run properly, giving a 'FileNotFound' error (probably a setup issue while trying to run on Azurite).
  • While 9 discrepancies might be fine in comparison to other applications we have tested, I believe we can find many more discrepancies if we can get the other tests to run properly, which is why Anna and I need to debug the 'FileNotFound' error.
  • In the README for durabletask, the developers mention that many tests fail with a 409 conflict error if they are run on the cloud, so it is recommended to run the tests on Azurite. This 409 conflict error arises from one of the root causes we have identified (I can talk about this in more detail in the meeting as well). This means the developers were aware of discrepancies between cloud and emulator and recommended testing only on Azurite. We discussed this in one of the meetings, where you mentioned that such a scenario might be a good introductory example.
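To make the 409 scenario above concrete, here is a small self-contained sketch, not durabletask's actual code: the `FakeTableService` class and the `setup_passes` test are hypothetical stand-ins. It models how a test written against a lenient emulator can pass locally but fail on the cloud, where re-creating an existing table is rejected with 409 Conflict.

```python
class ConflictError(Exception):
    """Models an HTTP 409 Conflict response from the storage service."""


class FakeTableService:
    def __init__(self, strict=True):
        # strict=True models cloud behavior (duplicate create -> 409);
        # strict=False models an emulator that silently ignores duplicates.
        self.strict = strict
        self.tables = set()

    def create_table(self, name):
        if name in self.tables:
            if self.strict:
                raise ConflictError(f"409: table {name!r} already exists")
            return  # emulator-style: silently succeed
        self.tables.add(name)


def setup_passes(service):
    """A test setup that naively re-creates its table on every run."""
    try:
        service.create_table("TaskHubHistory")
        service.create_table("TaskHubHistory")  # second create: 409 on cloud
        return True
    except ConflictError:
        return False


print(setup_passes(FakeTableService(strict=False)))  # emulator: True
print(setup_passes(FakeTableService(strict=True)))   # cloud: False
```

The same test thus yields a pass/fail discrepancy purely from the two backends' differing conflict behavior.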

If I have not explained anything clearly, or if there are more questions, I would love to have a detailed discussion on Slack/Zoom.

@tianyin
Member

tianyin commented Jul 1, 2023

Thanks for the explanation. It's great.

So my understanding is that debugging "FileNotFound" is the main task here, right?

It's completely fine that you prioritize other tasks such as debugging discrepancies of the fuzzing results.

@saadsheralam
Member Author

Yes. Once we fix that, we can re-run tests on cloud and emulator and update the discrepancy count in the Paper Structure doc. For now though, I can add the current result that we have for durabletask. I have requested edit access for the doc.

@anna-mazhar
Collaborator

Emulator run on Windows VM: emulator_log.txt

Many tests fail due to timeouts; I believe this is because of the VM. I will ask @xinze-zheng to run it on his Windows machine; if he runs into issues building it, then I will do the cloud run and conclude with these results.

@anna-mazhar
Collaborator

@xinze-zheng will summarize the results from his local Windows machine run.
@anna-mazhar will summarize the results from the VM run.

TO-DOs:

  • Filter candidate tests
  • Separate the discrepant tests and find root causes.
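The two TO-DOs above can be sketched as follows, assuming per-test verdicts have already been extracted into dicts (the function name and sample data here are hypothetical, not the project's actual tooling):

```python
def find_discrepant(emulator, cloud):
    """Return tests whose verdicts differ between emulator and cloud.

    Only tests that ran in both environments are candidates; tests
    missing from either run (e.g. due to the FileNotFound failures)
    are excluded rather than counted as discrepancies.
    """
    candidates = set(emulator) & set(cloud)
    return sorted(t for t in candidates if emulator[t] != cloud[t])


emulator = {"TestA": "pass", "TestB": "pass", "TestC": "fail"}
cloud    = {"TestA": "pass", "TestB": "fail", "TestD": "pass"}
print(find_discrepant(emulator, cloud))  # ['TestB']
```

Filtering to the intersection first keeps setup failures from inflating the discrepancy count; the resulting list is what would feed the root-cause analysis.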

@anna-mazhar
Collaborator

@xinze-zheng

Here are the durabletask logs from my VM (emulator run). It also got stuck, like in your case. Please compare and finalize the results.

@xinze-zheng
Collaborator

@anna-mazhar
I've added the comparison result to GitHub. Your log does not have the passed tests printed, so I used mine. According to my log, there seem to be two runs of each test case, and some of the tests are flaky (e.g., TestScalingDownToOneWorkers passed in the first run but failed in the second). Currently my script keeps only the latest result if a test runs more than once.
https://github.com/xlab-uiuc/cloudtest/blob/main/app_tests/durabletask/compare_result.txt
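The "keep only the latest result" policy described above can be sketched like this (the log format is hypothetical; the real script and logs live in the repo linked above). It also flags tests whose verdict flips between runs, i.e. the flaky ones:

```python
def latest_results(lines):
    """Keep only the last verdict per test; collect tests that flip."""
    latest, flaky = {}, set()
    for line in lines:
        test, verdict = line.split()  # assumed format: "TestName verdict"
        if test in latest and latest[test] != verdict:
            flaky.add(test)
        latest[test] = verdict
    return latest, sorted(flaky)


log = [
    "TestScalingDownToOneWorkers pass",
    "TestOrchestration pass",
    "TestScalingDownToOneWorkers fail",  # second run flips the verdict
]
results, flaky = latest_results(log)
print(results["TestScalingDownToOneWorkers"])  # fail
print(flaky)                                   # ['TestScalingDownToOneWorkers']
```

One caveat of this policy: a flip from pass to fail is silently recorded as a plain fail unless the flaky set is also reported alongside the final verdicts.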
