Adds a CI/CD workflow to detect performance regression. #264
Conversation
* Adds a `--results-file` option.
* Adds a `compare` command.
```python
for field in relative_difference_score:
    value_diff = relative_difference_score[field]
    # TODO simply set the threshold to 1. Need optimization.
    if value_diff > REGRESSION_THRESHOLD:
```
Is 0 a safer value to choose in the meantime? To exceed 1, wouldn't the new version have to be twice as slow?
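To make the threshold semantics concrete, here is a minimal sketch, assuming the score is computed as `(new - old) / old` (which matches the reviewer's reading; the function name is illustrative, not the CLI's actual implementation):

```python
def relative_difference(old_time, new_time):
    """Relative slowdown of the new commit versus the old one.

    0.0 means identical performance; 1.0 means the new run took
    twice as long as the old one.
    """
    return (new_time - old_time) / old_time

# A 2x slowdown yields exactly 1.0, so a threshold of 1 only flags
# regressions *worse* than "twice as slow".
assert relative_difference(10.0, 20.0) == 1.0
# A 10% slowdown yields 0.1; a threshold near 0 would catch it.
assert relative_difference(10.0, 11.0) == 0.1
```

Under this reading, a threshold of 1 tolerates anything up to a 2x slowdown, while a threshold of 0 flags any slowdown at all, including benchmark noise.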
The above commit addressed the feedback, except for the threshold value.
Approving, acknowledging that there is still work to do to refine the threshold value.
Okay, I'll open a GH issue for the threshold value, and discuss it with you later.
Opened an issue here. I'll merge this PR first.
Description
This PR enables GitHub Actions to detect performance regression using ion-python-benchmark-cli.
This PR adds:

* a `--results-file` option to specify the benchmark results destination, and
* a `compare` command to compare perf results between the previous commit and the current one.

Important Notes

The `compare` and `output-results` steps currently target this PR's branch. The target branch needs to be changed back to the main branch here after merging this PR.

Details
The `--results-file` Option

The `--results-file` option writes the benchmark results to the specified destination in Ion. Without it, the tool prints the result table to stdout. An example result is shown below. After running the command, you can find the result in the file `my_output`.
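The option's behavior can be sketched roughly like this (the function and parameter names are illustrative assumptions, not the actual ion-python-benchmark-cli implementation):

```python
import sys
from typing import Optional


def emit_results(results_ion_text: str, results_file: Optional[str] = None) -> None:
    """Write benchmark results (already serialized as Ion text) to
    `results_file` when given; otherwise fall back to printing the
    result table on stdout."""
    if results_file is not None:
        with open(results_file, "w") as f:
            f.write(results_ion_text)
    else:
        sys.stdout.write(results_ion_text + "\n")
```

With `emit_results(text, "my_output")` the results land in the file `my_output`; with no second argument they go to stdout, mirroring the two behaviors described above.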
The `compare` Command

The `compare` command compares two commits and identifies whether a regression has occurred. The results are written to the specified destination. An example comparison looks like the following. Run the command below; note that both `previous_result` and `current_result` are generated just like `my_output` in the section above, with one additional flag, `--format ion_text`. You will then see the result in the file `my_compare_result`.

(Edited to pretty-print for a better reading experience.) Both -0.2 and -0.3 are smaller than the threshold +0.6, so no regression is detected!
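The comparison logic can be sketched as follows. This is a simplified illustration of the idea behind the snippet quoted in the review above, not the actual `compare` implementation; the result structure (a flat mapping from field to execution time) is a hypothetical stand-in:

```python
REGRESSION_THRESHOLD = 1  # raised from 0.6 per the review discussion


def detect_regressions(previous, current, threshold=REGRESSION_THRESHOLD):
    """Compare two benchmark result dicts (field -> execution time) and
    return the fields whose relative slowdown exceeds the threshold."""
    regressions = {}
    for field, old_time in previous.items():
        new_time = current[field]
        value_diff = (new_time - old_time) / old_time
        if value_diff > threshold:
            regressions[field] = value_diff
    return regressions


# Negative scores such as -0.2 and -0.3 mean the new commit is faster,
# so nothing is flagged, matching the example result above.
assert detect_regressions({"write": 10.0, "read": 10.0},
                          {"write": 8.0, "read": 7.0}) == {}
```

A positive score only triggers a failure once it passes the threshold, so with the threshold at 1 a run must be more than twice as slow before the workflow fails.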
The CI/CD workflow
The workflow generates sample data, benchmarks the write/read performance of both the previous and the new commit, compares the results, and identifies any regressions.
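The sequence of steps above can be sketched as an ordered list of CLI invocations. The subcommand names and arguments here are assumptions for illustration only, not the verified ion-python-benchmark-cli interface, and nothing is executed:

```python
def build_benchmark_pipeline(prev_commit, curr_commit):
    """Return the ordered list of (hypothetical) CLI invocations the
    workflow would run; this only builds the command lists."""
    return [
        # 1. Generate sample data to benchmark against.
        ["generate-sample-data"],
        # 2. Benchmark the previous and current commits, writing each
        #    result to its own file via --results-file.
        ["benchmark", prev_commit, "--results-file", "previous_result"],
        ["benchmark", curr_commit, "--results-file", "current_result"],
        # 3. Compare the two result files; a detected regression
        #    fails the workflow.
        ["compare", "previous_result", "current_result"],
    ]
```

In the real workflow each step would be a GitHub Actions step invoking the CLI; the point here is only the ordering: benchmark both commits first, then compare.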
Example Output
A good example workflow run, including both ✅ and ❌ summaries.

Below is the detailed pipeline log when a regression is detected. After downloading the benchmark results, we can see that the relative execution time difference for `command:"write",options:["load_dump","ion_binary","buffer"]` exceeds the threshold (0.6 at the time), so the workflow fails. Note that the current commit was compared against itself, so nothing actually affected performance; I have since increased the threshold to 1.

Recommended Review Order
I recommend starting with the GitHub Actions workflow to see the big picture, then looking into the benchmark-cli implementation to learn the technical details.
Test
See the CI/CD run below. A follow-up PR will change the targeted comparison commit back to the main branch.