Improve test speed #533
From a passing run, the 10 slowest tests from before #525 (total local run time of 17s):

```
=========================== slowest 10 durations ===========================
3.00s call  tests/devices/unit_tests/test_focusing_mirror.py::test_mirror_set_voltage_sets_and_waits_settle_timeout_expires
2.01s call  tests/devices/unit_tests/test_zocalo_results.py::test_subscribe_only_on_called_stage
1.01s call  tests/devices/unit_tests/test_xspress3mini.py::test_stage_fails_in_failed_acquire_state
0.15s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i22]
0.14s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[i22]
0.14s call  tests/devices/unit_tests/test_zocalo_interaction.py::test_run_start[with_exception-expected_message1]
0.13s call  tests/beamlines/unit_tests/test_beamline_utils.py::test_device_is_new_after_clearing
0.13s call  tests/devices/unit_tests/test_gridscan.py::test_scan_within_limits_1d[-1-5-1-False]
0.11s setup tests/devices/unit_tests/test_attenuator.py::test_set_transmission_in_run_engine
0.11s call  tests/devices/unit_tests/test_eiger.py::test_given_in_free_run_mode_and_not_all_frames_collected_in_time_when_unstaged_then_odin_stopped_and_exception_thrown
```

The 10 slowest tests after (total local run time of 35s):

```
=========================== slowest 10 durations ===========================
3.00s call  tests/devices/unit_tests/test_focusing_mirror.py::test_mirror_set_voltage_sets_and_waits_settle_timeout_expires
2.01s call  tests/devices/unit_tests/test_zocalo_results.py::test_subscribe_only_on_called_stage
1.43s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[i22]
1.26s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[p38]
1.26s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i22]
1.19s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[p38]
1.06s call  tests/beamlines/unit_tests/test_beamline_utils.py::test_device_is_new_after_clearing
1.00s call  tests/devices/unit_tests/test_xspress3mini.py::test_stage_fails_in_failed_acquire_state
0.90s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_device_creation[i03]
0.83s setup tests/beamlines/unit_tests/test_device_instantiation.py::test_devices_are_identical[i03]
```

Looks like the device instantiation tests are slow, so we should profile them. Possibly #415 will help as well.
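The duration tables above are the kind of report pytest produces with its `--durations=10` flag. To actually profile a slow setup, one option is a quick cProfile pass; this is a minimal sketch, where `make_devices` is a hypothetical stand-in for a beamline's device-instantiation fixture, not dodal's real code:

```python
import cProfile
import io
import pstats

def make_devices():
    """Hypothetical stand-in for a slow device-instantiation setup."""
    return [sum(i * i for i in range(10_000)) for _ in range(50)]

# Profile the setup and report the top functions by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
make_devices()
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
print(report)
```

The cumulative-time sort is usually the most useful view here, since it surfaces which call inside the setup (connection, logging, signal creation, etc.) dominates the wall time.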
We're also seeing a lot of timeout errors in tests now. This could be related.
Annoyingly, this doesn't seem to have a single cause. I agree that #415 should help a lot, as we suddenly have a lot more devices. Removing the logging which was added in ophyd_async 0.3a4 gets the total test time to ~30s. I'll try to add something which deals with this.
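As an illustrative sketch only (not the fix that actually landed in ophyd_async), one cheap way to cut logging overhead in a test suite is to raise the level of the chatty logger so its records are discarded early; the logger name below mirrors the library discussed here and is an assumption:

```python
import logging

# Assumed logger name; any real project would target whichever logger
# is actually emitting the expensive records.
noisy = logging.getLogger("ophyd_async")
noisy.setLevel(logging.WARNING)

# DEBUG/INFO calls are now filtered before handlers or formatting run.
noisy.debug("this record is dropped cheaply")
print(noisy.getEffectiveLevel() == logging.WARNING)
```

The point of the sketch is that `isEnabledFor` short-circuits before any formatting work, so a flood of debug records stops costing time.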
@coretl @DominicOram and I were discussing strange slowness in mock signals too, have you seen any of that?
Yeah, ignore what I said about the logs. Running the dodal tests with a version of …
Tracked as bluesky/ophyd-async#312. Will try and take a look this afternoon.
Done in bluesky/ophyd-async#316 |
Confirmed this fixes the issue, thank you! |
Unit tests are now taking ~45 sec (https://github.com/DiamondLightSource/dodal/actions/runs/9084455713/job/24965427206), whereas the last run only took ~20 sec (https://github.com/DiamondLightSource/dodal/actions/runs/9017299416/job/24775541622).
The main difference is #525.
Acceptance Criteria