Unit test timings - worst offenders #3678

Closed
ltalirz opened this issue Dec 16, 2019 · 9 comments
ltalirz (Member) commented Dec 16, 2019

One rule of thumb for writing unit tests is that they should test one unit of code and run in the blink of an eye.

While this is not always possible, we have a large number of tests that violate this rule quite drastically; they should probably be refactored to save everybody's time.

Here are the worst offenders in terms of runtime as of 2019-12-16:

21.01s call     aiida/backends/tests/tools/importexport/orm/test_links.py::TestLinks::test_high_level_workflow_links
9.69s call     aiida/backends/tests/tools/importexport/orm/test_links.py::TestLinks::test_complex_workflow_graph_export_sets
7.42s call     aiida/backends/tests/tools/importexport/orm/test_links.py::TestLinks::test_link_flags
7.17s call     aiida/backends/tests/test_nodes.py::TestNodeDeletion::test_delete_cases
5.73s call     aiida/backends/tests/engine/test_rmq.py::TestProcessControl::test_launch_with_inputs
5.46s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDuplicateNodeUuidMigration::test_deduplicated_uuids
5.43s call     aiida/backends/tests/engine/test_futures.py::TestWf::test_calculation_future_broadcasts
5.42s call     aiida/backends/tests/engine/test_rmq.py::TestProcessControl::test_exception_process
5.40s call     aiida/backends/tests/engine/test_rmq.py::TestProcessControl::test_submit_simple
5.05s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestUuidMigration::test_uuid_untouched
4.11s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestCalcAttributeKeysMigration::test_attribute_key_changes
3.87s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDbLogMigrationBackward::test_objpk_objname
3.76s call     aiida/cmdline/utils/test_multiline.py::TestMultilineInput::test_edit_pre_bash_comment
3.71s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestGroupRenamingMigration::test_group_string_update
3.70s call     aiida/backends/tests/cmdline/commands/test_code.py::TestVerdiCodeSetup::test_interactive_remote
3.63s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDbLogMigrationRecordCleaning::test_dblog_correct_export_of_logs
3.55s call     aiida/cmdline/utils/test_multiline.py::TestMultilineInput::test_edit_comment
3.55s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDbLogMigrationRecordCleaning::test_dblog_calculation_node
3.54s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDbLogMigrationRecordCleaning::test_dblog_unique_uuids
3.50s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDbLogMigrationRecordCleaning::test_metadata_correctness
3.44s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestNodePrefixRemovalMigration::test_data_node_type_string
3.37s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestParameterDataToDictMigration::test_data_node_type_string
3.24s call     aiida/backends/tests/cmdline/commands/test_code.py::TestVerdiCodeCommands::test_code_duplicate_interactive
3.19s call     aiida/backends/tests/test_nodes.py::TestNodeDeletion::test_indep2w
3.18s call     aiida/backends/djsite/db/subtests/migrations/test_migrations_many.py::TestDataMoveWithinNodeMigration::test_data_type_string
3.15s call     aiida/backends/tests/cmdline/commands/test_computer.py::TestVerdiComputerCommands::test_computer_duplicate_interactive
3.14s call     aiida/backends/tests/cmdline/commands/test_code.py::TestVerdiCodeSetup::test_mixed
3.12s call     aiida/backends/tests/cmdline/commands/test_code.py::TestVerdiCodeSetup::test_interactive_upload
3.12s call     aiida/backends/tests/cmdline/commands/test_computer.py::TestVerdiComputerSetup::test_interactiv
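Aside: a timing report like the one above can be produced with pytest's built-in duration reporting (the `slowest 50 durations` headers further down in this thread are what that option prints). A typical invocation, assuming the suite is run through pytest:

```shell
pytest --durations=50
```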

@CasperWA Looking at number 1, perhaps it is not necessary to export every small graph individually; instead, we could first create all of the data, then export once, then import and compare?

@sphuber Many of the long-running tests are obviously migrations. I wonder, is there anything that could be sped up there in general?

sphuber (Contributor) commented Dec 16, 2019

I don't think so, I'm afraid. The specific tests themselves do nothing special: they just create some nodes in the setup and then check the results after the migration. I think the majority of the time is spent migrating the database schema back and forth.

greschd (Member) commented Dec 17, 2019

If we can't fix them, maybe we should mark them as slow so they can easily be deselected? Reference: https://docs.pytest.org/en/latest/mark.html
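For illustration, a minimal sketch of how that could look with pytest markers (the `slow` marker name is an assumption used here for illustration, not something already defined in the test suite):

```python
# conftest.py -- register the marker so pytest does not warn about unknown markers
def pytest_configure(config):
    config.addinivalue_line('markers', 'slow: marks a test as slow (deselect with -m "not slow")')
```

```python
# in a test module
import pytest


@pytest.mark.slow
def test_high_level_workflow_links():
    ...
```

Developers could then run `pytest -m "not slow"` locally for quick feedback, while CI keeps running the full suite.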

ltalirz (Member, Author) commented Feb 22, 2022

There's a new top offender in town:
https://github.com/aiidateam/aiida-core/blame/842d47dcc441919adf052d7ecd794e189aa46ba4/tests/cmdline/commands/test_code.py#L322

Logs:

110.22s call     tests/cmdline/commands/test_code.py::test_code_setup_local_duplicate_full_label_interactive[sleep 1; vim -cwq]
20.37s call     tests/tools/archive/orm/test_links.py::test_high_level_workflow_links
19.83s call     tests/cmdline/commands/test_run.py::TestAutoGroups::test_autogroup_filter_class
10.57s call     tests/tools/archive/orm/test_links.py::test_complex_workflow_graph_export_sets
9.42s call     tests/tools/archive/orm/test_links.py::test_link_flags
6.63s call     tests/cmdline/commands/test_daemon.py::test_daemon_restart
6.63s call     tests/transports/test_all_plugins.py::TestExecuteCommandWait::test_transfer_big_stdout
6.18s call     tests/cmdline/commands/test_code.py::test_code_setup_remote_duplicate_full_label_interactive[sleep 1; vim -cwq]
6.12s call     tests/cmdline/commands/test_computer.py::test_interactive[sleep 1; vim -cwq]
6.06s call     tests/test_nodes.py::TestNodeDeletion::test_delete_cases
5.54s call     tests/manage/test_profile_access.py::test_lock
4.77s call     tests/cmdline/commands/test_code.py::test_interactive_remote[vim -cwq]
4.69s call     tests/cmdline/commands/test_data.py::TestVerdiData::test_reachable
4.69s call     tests/orm/nodes/data/test_data.py::test_data_exporters[data_plugin1]
4.26s call     tests/cmdline/commands/test_code.py::test_interactive_upload[vim -cwq]
4.19s call     tests/cmdline/commands/test_code.py::test_code_duplicate_interactive[vim -cwq]
4.19s call     tests/cmdline/commands/test_code.py::test_code_duplicate_ignore[vim -cwq]
4.17s call     tests/cmdline/commands/test_computer.py::test_computer_duplicate_interactive[vim -cwq]
4.15s call     tests/cmdline/commands/test_code.py::test_mixed[vim -cwq]
3.80s call     tests/cmdline/commands/test_process.py::test_list_worker_slot_warning
3.54s call     tests/cmdline/commands/test_daemon.py::test_daemon_start_number
3.45s call     tests/cmdline/commands/test_daemon.py::test_daemon_status
3.27s call     tests/restapi/test_threaded_restapi.py::test_run_threaded_server
3.17s call     tests/cmdline/commands/test_daemon.py::test_daemon_start
3.15s call     tests/test_nodes.py::TestNodeDeletion::test_indep2w
3.05s call     tests/cmdline/utils/test_multiline.py::test_new_comment[sleep 1 ; vim -c "g!/^#=/s/$/Test" -cwq]
3.03s call     tests/cmdline/utils/test_multiline.py::test_edit_comment[sleep 1 ; vim -c "g!/^#=/s/$/Test" -cwq]
3.02s call     tests/cmdline/utils/test_multiline.py::test_edit_multiline_template[sleep 1 ; vim -c "g!/^#=/s/$/Test" -cwq]

git seems to blame @sphuber

chrisjsewell (Member) commented:

> git seems to blame @sphuber

I always blame @sphuber 😝

But yeah, I had also noticed that.

sphuber (Contributor) commented Feb 22, 2022

#5376 should fix that one

ltalirz (Member, Author) commented Feb 22, 2022

Thanks for the fix, it's gone:

============================= slowest 50 durations =============================
20.48s call     tests/tools/archive/orm/test_links.py::test_high_level_workflow_links
19.84s call     tests/cmdline/commands/test_run.py::TestAutoGroups::test_autogroup_filter_class
11.08s call     tests/tools/archive/orm/test_links.py::test_complex_workflow_graph_export_sets

ltalirz (Member, Author) commented Mar 30, 2023

Update - looks like verdi group tests are now leading the test durations:

============================= slowest 50 durations =============================
38.36s call     tests/cmdline/commands/test_run.py::TestAutoGroups::test_autogroup_clashing_label
32.61s call     tests/cmdline/commands/test_group.py::TestVerdiGroup::test_add_remove_nodes
32.38s call     tests/cmdline/commands/test_group.py::TestVerdiGroup::test_move_nodes
17.92s call     tests/tools/archive/orm/test_links.py::test_high_level_workflow_links
17.24s call     tests/cmdline/commands/test_run.py::TestAutoGroups::test_autogroup_filter_class
15.63s call     tests/cmdline/commands/test_group.py::TestVerdiGroup::test_delete
11.57s call     tests/cmdline/commands/test_group.py::TestVerdiGroup::test_list_order
10.55s call     tests/engine/processes/calcjobs/test_calc_job.py::test_restart_after_daemon_reset
9.92s call     tests/cmdline/commands/test_node.py::TestVerdiNode::test_node_show
9.58s call     tests/cmdline/commands/test_group.py::TestVerdiGroup::test_show_limit
9.17s call     tests/tools/archive/orm/test_links.py::test_link_flags

I wonder whether this indicates that something about groups got slower?

sphuber (Contributor) commented Mar 30, 2023

I see you searched for is:issue is:open template 😄

The reason is not that groups got slower; it is due to #5846. That PR changed certain CLI tests to run through an actual subprocess, which was necessary to catch a critical bug that was introduced in v2.0.0 but was missed because we were running the CLI tests in memory.

I spent a lot of effort to try and minimize the additional runtime in the tests, but this was the best tradeoff I could find.
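For context, here is an illustrative sketch of the difference (not the actual code from #5846): a click command can be invoked in memory via `CliRunner`, or through a real subprocess. The subprocess variant also exercises the installed entry point and a fresh interpreter, which is why it catches problems the in-memory path misses, at the cost of startup time per test. The toy `cli` command below is hypothetical, and the subprocess test assumes `verdi` is installed on `PATH`:

```python
import subprocess

import click
from click.testing import CliRunner


@click.command()
def cli():
    """Toy command standing in for a verdi subcommand (hypothetical)."""
    click.echo('hello')


def test_cli_in_memory():
    # Fast: the command object is invoked directly inside the test process,
    # so there is no interpreter startup per invocation.
    result = CliRunner().invoke(cli)
    assert result.exit_code == 0
    assert 'hello' in result.output


def test_cli_subprocess():
    # Slower but more realistic: a fresh interpreter runs the installed
    # entry point, so packaging, environment setup and the real argument
    # parsing are exercised too (assumes `verdi` is on PATH).
    completed = subprocess.run(['verdi', '--version'], capture_output=True, text=True, check=False)
    assert completed.returncode == 0
```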

ltalirz (Member, Author) commented Mar 30, 2023

Cheers, good to know that this is behind those timing increases.

I will close this issue for now, since there is no immediate action we're planning.
Feel free to reopen when there is.

ltalirz closed this as completed Mar 30, 2023