Add E2E test to test checkout performance #4344
Conversation
tests/e2e/utils/performance.js
Outdated
@@ -0,0 +1,37 @@
export async function getLoadingDurations() {
Similar to the function from woocommerce-blocks: https://github.com/woocommerce/woocommerce-blocks/blob/trunk/tests/utils/performance.ts. The difference is that I can't destructure the performance.getEntriesByType(..) assignment or use Array.prototype.find. I get the following errors:

ReferenceError: _slicedToArray2
ReferenceError: _find

It could be a Puppeteer problem, since these are used within page.evaluate(). Our Puppeteer version is 2 and it relies on puppeteer-utils. Blocks uses jest-environment-puppeteer, where Puppeteer is at 14, and it doesn't seem to have these issues. I think it would be good to upgrade to the newer Puppeteer version.

In any case, I worked around the ReferenceError and this function works in this branch.
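A sketch of the kind of workaround described, with illustrative helper names (this is not the actual content of tests/e2e/utils/performance.js). Because the Babel helpers _slicedToArray and _find are unavailable inside page.evaluate() under Puppeteer 2, plain indexing and a loop can stand in for array destructuring and Array.prototype.find:

```javascript
// Instead of: const [ nav ] = performance.getEntriesByType( 'navigation' );
// use plain indexing, which needs no Babel helper:
function firstEntry( entries ) {
	return entries.length > 0 ? entries[ 0 ] : null;
}

// Instead of: entries.find( ( entry ) => entry.name === name );
// use a plain loop, which needs no _find helper:
function findEntryByName( entries, name ) {
	for ( let i = 0; i < entries.length; i++ ) {
		if ( entries[ i ].name === name ) {
			return entries[ i ];
		}
	}
	return null;
}
```

Inside getLoadingDurations() these patterns would operate on the arrays returned by performance.getEntriesByType() within the serialized page.evaluate() callback.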
Instead of using test:e2e, use test:e2e-performance. The performance test isn't needed by everyone, and splitting it up can speed up development.
👋 @achyuthajoy @htdat I am adding an E2E test to measure checkout performance.
Hey @harriswong I'll take a look at the PR today :). Just had a couple of questions though.
We can look into upgrading Puppeteer to a higher version. We're also looking into upgrading the Node version, so wp-scripts and the e2e dependencies should be a part of it.
No, there isn't any specific reason to store it in a file. As long as we can get the metrics in our local dev/CI, that should be enough for our needs. I saw woocommerce-blocks' PR (woocommerce/woocommerce-blocks#6119), in which they stored the output in a file. I thought that was a pretty good idea, so I took that direction as well.
I think it will be good to run this in GitHub Actions in the future, but maybe not for every PR. I was thinking it could be run manually from a trigger phrase in a comment, or per release candidate. In this case, would rendering output to the terminal be a better approach?
Ok thanks! For now, I don't think we are running it in CI and we don't need the historic data. If this changes, I will pursue a different approach so that we can preserve the results. Let me know what you think.
@harriswong It would be much cleaner if we keep a separate jest config for performance tests. That way, you don't need to exclude the test using environment variables, and the NPM script can run just the performance test. What are your thoughts?
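A minimal sketch of such a standalone config. The file path, directory layout, and matcher below are assumptions for illustration, not the PR's actual structure:

```javascript
// Hypothetical tests/e2e/config/jest.performance.config.js:
// a dedicated config that only matches performance specs, so the regular
// test:e2e run never picks them up and no environment variable is needed.
module.exports = {
	// Only collect specs under a dedicated performance directory.
	testMatch: [ '<rootDir>/tests/e2e/specs/performance/**/*.spec.js' ],
};
```

The NPM script would then point jest at it, e.g. "test:e2e-performance": "jest --config tests/e2e/config/jest.performance.config.js".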
Sure. If historic data is not needed, writing to a file or outputting to the terminal works. I also noticed that terminal logs are removed after a while. @harriswong Do you plan to set up a GitHub workflow for running performance tests periodically or with a manual trigger, or just keep this for local testing? For more accurate results I would suggest setting up a GitHub Actions workflow, because running locally on a Mac with Docker can be very slow and might not reflect the real performance.
@achyuthajoy To clarify, do you mean it is better to keep this commit a326f98? |
@harriswong Yes, it would be best to keep the performance test separate from the current E2E suite. Was this commit removed based on our previous discussion? If yes, I'm so sorry, I might've missed something while checking the implementation.
@achyuthajoy Thanks for the suggestion! I wasn't planning on setting up a GitHub Actions workflow yet. I was planning to run this locally so that we can measure a before-and-after effect on this other PR (#3968). We also don't know if we will be going forward with #3968. But you are right that the results may vary from machine to machine, so creating a manual trigger is probably better. Even though #3968 is in draft, I can see some benefits in triggering it manually from the PR's comments. This way, the metric will be more consistent than on each of our individual machines. I will look into adding a GitHub Action for this. Can this be a follow-up, or would you prefer to have it all in the same PR?
The GH workflow can be another PR.
No problem at all!
Added follow up here: #4389 |
Done. @achyuthajoy please take another look.
@@ -490,4 +490,62 @@ export const merchantWCP = {
		await checkbox.click();
	}
},

activateWooPay: async () => {
Ideally we shouldn't write flows for toggling the Dev Tools flag and should use WP-CLI instead. This can be ignored for now, since I've opened an issue to address these: #4402
WP-CLI would be great! It will make it a lot faster! Thanks for the note.
	recreatePerformanceFile();
} );

describe( 'Stripe element', () => {
@harriswong It would be best if the test spec contains only 1 describe group and multiple it groups. Or you could split the tests into multiple files, which I don't think is required in this case. Let me know your thoughts.
The reason I did this was because I wanted to have a clean file per performance run. At the same time, I wanted each scenario (Stripe element, UPE, WooPay without UPE, etc.) to have its own setup. As a result, I did:

describe // first level, to always wipe the file before we start a run
  describe // Stripe element setup
  describe // UPE setup
  describe // WooPay without UPE setup

Looking at this again, I wonder if I can mix beforeAll and beforeEach with 1 layer of describe, such that wiping the file executes only once. 👀
@harriswong I don't have a strong opinion here. I'm good to go with your approach.
The other thing I could do is get rid of all the beforeEach and afterEach hooks. Because there is only 1 test case for now, all the setup and teardown can be part of the it(). Then I can have just 1 describe. Is this preferred, or should I keep the nested describe code here?
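The flattened structure under discussion can be sketched like this. The jest globals are stubbed with minimal stand-ins so the sketch runs under plain node, and the file wipe and scenario setup are simulated with counters (the real spec would call recreatePerformanceFile and the gateway setup helpers):

```javascript
// Minimal stand-ins for the jest globals, for illustration only;
// in the real spec these are provided by the jest environment.
const suite = { beforeAll: [], tests: [] };
const describe = ( name, fn ) => fn();
const beforeAll = ( fn ) => suite.beforeAll.push( fn );
const it = ( name, fn ) => suite.tests.push( fn );

let fileWipes = 0; // stands in for recreatePerformanceFile()
let setups = 0; // stands in for per-scenario setup inside it()

// One top-level describe: the report file is wiped exactly once,
// and the scenario's own setup/teardown lives inside its it().
describe( 'Checkout performance', () => {
	beforeAll( () => fileWipes++ );

	it( 'measures the Stripe element checkout', () => {
		setups++;
		// ...activate the gateway, visit /checkout, record metrics...
	} );
} );

// Emulate a jest run: beforeAll hooks once, then each test.
suite.beforeAll.forEach( ( fn ) => fn() );
suite.tests.forEach( ( fn ) => fn() );
```

With this shape, adding the UPE and WooPay scenarios later means adding more it() blocks, each carrying its own setup, while the wipe still happens once per run.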
@harriswong I've tested the changes locally and can confirm that it's working as expected. Awesome work on this one.
I've left a few comments. Once they're cleared, it's good to merge 👍
@harriswong I noticed that the report file created is not added to gitignore. Can you add it?
Also, I tested the changes on a Linux machine against a local server instance. The results are much faster than running on a Mac.
Hi @achyuthajoy, I merged the latest changes.
Thanks for giving this a test! The power of Linux 💪
LGTM 🚢
Thanks for your awesome work on this 👍
@achyuthajoy The "PHP testing (7.1)" job is stuck, but this recent PR https://github.com/Automattic/woocommerce-payments/pull/4425/files removed 7.1. Should we ignore the stuck test and merge?
@harriswong Yes, please ignore the 7.1 test. It was removed when the WP & WC versions were bumped as part of the next release.
Changes proposed in this Pull Request

We would like to create an E2E test to measure metrics such as first paint, first contentful paint, etc. The test will run 3 times to take an average. It measures 3 things on the /checkout page. The PR uses a similar technique as woocommerce-blocks: https://github.com/woocommerce/woocommerce-blocks/blob/trunk/tests/utils/performance.ts.

Testing instructions

- Make sure the tests/e2e/deps/wcp-dev-tools folder is up-to-date. You can do git pull to make sure it is the latest.
- Run npm run test:e2e-performance
- Check tests/e2e/reports/checkout-performance.txt. Each line of the file is a JSON record of a performance metric. It should look something like this:
- Run npm run changelog to add a changelog file, choose patch to leave it empty if the change is not significant. You can add multiple changelog files in one PR by running this command a few times.

Post merge