
Improve test speed #533

Closed · DominicOram opened this issue May 14, 2024 · 8 comments

@DominicOram (Contributor) commented May 14, 2024

Unit tests are now taking ~45 sec (https://github.com/DiamondLightSource/dodal/actions/runs/9084455713/job/24965427206), whereas the previous run took only ~20 sec (https://github.com/DiamondLightSource/dodal/actions/runs/9017299416/job/24775541622).

The main difference between the two runs is #525.

Acceptance Criteria

@callumforrester (Contributor)

From passing --durations=10 to pytest:
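For reference, the reports below come from an invocation along these lines (a minimal sketch; the exact test selection may have differed):

# append a "slowest 10 durations" report (setup/call/teardown phases) to the run summary
pytest --durations=10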

The 10 slowest tests from before #525 (total local run time of 17s)

=============================== slowest 10 durations ===============================
3.00s call     tests/devices/unit_tests/test_focusing_mirror.py::test_mirror_set_voltage_sets_and_waits_settle_timeout_expires
2.01s call     tests/devices/unit_tests/test_zocalo_results.py::test_subscribe_only_on_called_stage
1.01s call     tests/devices/unit_tests/test_xspress3mini.py::test_stage_fails_in_failed_acquire_state
0.15s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i22]
0.14s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[i22]
0.14s call     tests/devices/unit_tests/test_zocalo_interaction.py::test_run_start[with_exception-expected_message1]
0.13s call     tests/beamlines/unit_tests/test_beamline_utils.py::test_device_is_new_after_clearing
0.13s call     tests/devices/unit_tests/test_gridscan.py::test_scan_within_limits_1d[-1-5-1-False]
0.11s setup    tests/devices/unit_tests/test_attenuator.py::test_set_transmission_in_run_engine
0.11s call     tests/devices/unit_tests/test_eiger.py::test_given_in_free_run_mode_and_not_all_frames_collected_in_time_when_unstaged_then_odin_stopped_and_exception_thrown

The 10 slowest tests after #525 (total local run time of 35s)

=============================== slowest 10 durations ===============================
3.00s call     tests/devices/unit_tests/test_focusing_mirror.py::test_mirror_set_voltage_sets_and_waits_settle_timeout_expires
2.01s call     tests/devices/unit_tests/test_zocalo_results.py::test_subscribe_only_on_called_stage
1.43s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[i22]
1.26s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[p38]
1.26s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i22]
1.19s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[p38]
1.06s call     tests/beamlines/unit_tests/test_beamline_utils.py::test_device_is_new_after_clearing
1.00s call     tests/devices/unit_tests/test_xspress3mini.py::test_stage_fails_in_failed_acquire_state
0.90s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i03]
0.83s setup    tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[i03]

Looks like the device instantiation tests are slow, so we should profile them. Possibly #415 will help as well?
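One way to profile them is to run a single parametrised test under cProfile; a rough sketch, with the test ID taken from the listing above and an arbitrary stats file name:

# profile one slow instantiation test and dump the stats to a file
python -m cProfile -o instantiation.prof -m pytest "tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i22]"

# show the 20 most expensive calls by cumulative time
python -c "import pstats; pstats.Stats('instantiation.prof').sort_stats('cumtime').print_stats(20)"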

@DominicOram (Contributor, Author)

We're also seeing a lot of timeout errors in tests now. This could be related.

@olliesilvester (Collaborator) commented May 16, 2024

Annoyingly, this doesn't seem to have a single cause. I agree that #415 should help a lot, since we suddenly have many more devices. Removing the logging that was added in ophyd-async 0.3a4 gets the total test time down to ~30s. I'll try to add something that deals with this.

@callumforrester (Contributor)

@coretl, @DominicOram and I were discussing strange slowness in mock signals too; have you seen any of that?
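To reproduce the mock-signal slowness in isolation, a timing sketch along these lines may help (0.3-era ophyd-async API assumed; DeviceCollector and the Motor import path have moved between releases, and SIM:MOTOR: is a made-up prefix):

import asyncio
import time

from ophyd_async.core import DeviceCollector
from ophyd_async.epics.motion import Motor  # import path as of ophyd-async 0.3


async def main() -> None:
    start = time.monotonic()
    # mock=True connects every signal to a mock backend instead of real PVs
    async with DeviceCollector(mock=True):
        motor = Motor("SIM:MOTOR:", name="motor")
    print(f"mock connect took {time.monotonic() - start:.3f}s")


asyncio.run(main())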

@olliesilvester (Collaborator) commented May 16, 2024

Yeah, ignore what I said about the logs. Running the dodal tests with a version of ophyd-async that includes everything from v0.3a4 except the "Create mock signal" commit gives a reasonable run time (~18s), so the slowdown comes from that commit.
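One way to pin down a commit like this is to install ophyd-async straight from git at candidate commits and re-time the suite; a sketch, with <sha> as a placeholder for the commit under test:

# install ophyd-async at a specific commit, then re-run the timing report
pip install "git+https://github.com/bluesky/ophyd-async@<sha>"
pytest --durations=10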

@coretl (Collaborator) commented May 17, 2024

> Yeah, ignore what I said about the logs. Running the dodal tests with a version of ophyd-async that includes everything from v0.3a4 except the "Create mock signal" commit gives a reasonable run time (~18s), so the slowdown comes from that commit.

Tracked as bluesky/ophyd-async#312

Will try and take a look this afternoon

@coretl (Collaborator) commented May 17, 2024

Done in bluesky/ophyd-async#316

@DominicOram (Contributor, Author)

Confirmed this fixes the issue, thank you!
