
Delay at startup and high cpu usage on Windows in Python 3.12 #2555

Closed · kimdre opened this issue Jan 16, 2024 · 23 comments · Fixed by #2730
kimdre commented Jan 16, 2024

Description

When I start a test with, say, 50 peak users and a spawn rate of 10, Locust won't send any requests for about 5 seconds and then starts them all at once.
The same happens when I increase or decrease the peak user count while a test is running: the test stops completely.

It is also odd that it complains about high CPU usage, which is not the case at all (the CPU is a Ryzen 7 7800X3D).

This issue has only occurred since I updated Locust to the latest version a while ago. Before that, user ramp-up/-down ran butter-smooth.

I use Locust via the web UI.

PS C:\Users\kimdr\OneDrive\Dokumente\Git\magicline-webhook-endpoint\app> locust -f .\tests\load_test.py
[2024-01-16 19:59:24,612] Kim-PC-Win-11/INFO/locust.main: Starting web interface at http://localhost:8089 (accepting connections from all network interfaces)
[2024-01-16 19:59:24,620] Kim-PC-Win-11/INFO/locust.main: Starting Locust 2.20.1
[2024-01-16 19:59:34,757] Kim-PC-Win-11/INFO/locust.runners: Ramping to 50 users at a rate of 10.00 per second
[2024-01-16 19:59:40,314] Kim-PC-Win-11/WARNING/root: CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
[2024-01-16 19:59:42,107] Kim-PC-Win-11/INFO/locust.runners: All users spawned: {"WebsiteUser": 50} (50 total users)
[2024-01-16 19:59:43,914] Kim-PC-Win-11/INFO/load_test: Response 200: {'entityId': 1210013898, 'uuid': 'abf6cc05-6ff6-4a95-b846-b2744cd931cf', 'payload': [{'timestamp': 1705431575350, 'type': 'CUSTOMER_CHECKOUT', 'content': None}]}
[2024-01-16 19:59:43,921] Kim-PC-Win-11/INFO/load_test: Response 200: {'entityId': 1210006695, 'uuid': '6c8220fa-15b4-4e42-b650-a07e035689ea', 'payload': [{'timestamp': 1705431574758, 'type': 'CUSTOMER_CHECKOUT', 'content': None}]}
...

This is what it looks like in the UI, with the weird startup and ramp-up (a negative spike at the end of the graph):
[screenshot: Locust web UI charts]

Command line

locust -f .\tests\load_test.py

Locustfile contents

import logging

from locust import FastHttpUser, TaskSet, task, between

from dependencies import headers, settings
from tests.utils import TestClient


logger = logging.getLogger(__name__)

print("Preparing Test Client...")
test_client = TestClient(
    magicline_tenant_name=settings.magicline_tenant_name,
    magicline_api_key=settings.magicline_api_key,
    is_sandbox=True
)


class PerformanceTest(TaskSet):

    def on_start(self):
        self.headers = headers
        self.client.headers = self.headers

    def on_stop(self):
        pass  # add code that you want to run during ramp down

    @task(300)
    def test_api_performance(self):
        self.client.headers = headers

        # Light-weight function that generates fake data using pre-fetched data and the faker package
        data = test_client.generate_fake_event()

        res = self.client.post("/api/v1/magicline/event", json=data, headers=headers)
        logger.info(f"Response {res.status_code}: {data}")

    @task(1)
    def test_health_check(self):
        self.client.get("/api/v1/health", headers=self.headers)


class WebsiteUser(FastHttpUser):
    tasks = [PerformanceTest]
    network_timeout = 5.0
    connection_timeout = 5.0
    wait_time = between(0.5, 1)

Python version

3.12

Locust version

2.20.1

Operating system

Windows 11

@kimdre kimdre added the bug label Jan 16, 2024
cyberw (Collaborator) commented Jan 16, 2024

Can you reproduce this with a locustfile that doesn't use any of your dependencies?

If possible, can you downgrade Locust and find the latest version that doesn't have this issue?

cyberw (Collaborator) commented Jan 16, 2024

Also, what if you use tasks directly on the User instead of using a separate TaskSet?

kimdre (Author) commented Jan 16, 2024

Can you reproduce this with a locustfile that doesn't use any of your dependencies?

If possible, can you downgrade Locust and find the latest version that doesn't have this issue?

Just tested it with only locust imported and I was able to reproduce it. 😕
I also tested it with 2.19.0 and 2.19.1, one of which must have been the version I used before, and got the same results.

Could it maybe be an updated Python version or a Windows update?
I may have updated Python as well, from 3.12.0 to 3.12.1, in the meantime.

Edit:

Also, what if you use tasks directly on the User instead of using a separate TaskSet?

This also didn't help

cyberw (Collaborator) commented Jan 16, 2024

Hmm… very strange. Can you try running just a plain locustfile with none of your "special" stuff? And maybe try changing the task weight from 300 to 1?

kimdre (Author) commented Jan 16, 2024

Same results.

I used the basic example from the docs and changed the URL:

from locust import HttpUser, task

class HelloWorldUser(HttpUser):
    @task
    def hello_world(self):
        self.client.get("/api/v1/health")

PS C:\Users\kimdr\OneDrive\Dokumente\Git\magicline-webhook-endpoint\app> locust -f .\tests\example.py
[2024-01-16 22:02:39,663] Kim-PC-Win-11/INFO/locust.main: Starting web interface at http://localhost:8089 (accepting connections from all network interfaces)
[2024-01-16 22:02:39,669] Kim-PC-Win-11/INFO/locust.main: Starting Locust 2.20.1
[2024-01-16 22:03:07,883] Kim-PC-Win-11/INFO/locust.runners: Ramping to 100 users at a rate of 10.00 per second
[2024-01-16 22:03:20,163] Kim-PC-Win-11/WARNING/root: CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
[2024-01-16 22:03:23,718] Kim-PC-Win-11/INFO/locust.runners: All users spawned: {"HelloWorldUser": 100} (100 total users)
[2024-01-16 22:03:37,111] Kim-PC-Win-11/WARNING/locust.runners: CPU usage was too high at some point during the test! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines

kimdre (Author) commented Jan 16, 2024

By the way, the same happens when I run it headless:

PS C:\Users\kimdr\OneDrive\Dokumente\Git\magicline-webhook-endpoint\app> locust --headless --users 100 --spawn-rate 10 -H https://api.test.maki-it.com -f .\tests\load_test.py
Preparing Test Client...
 [ Done ] Get customers...
 [ Done ] Get customers contracts]]
[2024-01-16 22:16:39,979] Kim-PC-Win-11/INFO/locust.main: No run time limit set, use CTRL+C to interrupt
[2024-01-16 22:16:39,979] Kim-PC-Win-11/INFO/locust.main: Starting Locust 2.20.1
Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                                                         0     0(0.00%) |      0       0       0      0 |    0.00        0.00

[2024-01-16 22:16:39,980] Kim-PC-Win-11/INFO/locust.runners: Ramping to 100 users at a rate of 10.00 per second
Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                                                         0     0(0.00%) |      0       0       0      0 |    0.00        0.00

[2024-01-16 22:16:45,682] Kim-PC-Win-11/WARNING/root: CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                                                         0     0(0.00%) |      0       0       0      0 |    0.00        0.00

Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                                                         0     0(0.00%) |      0       0       0      0 |    0.00        0.00

[2024-01-16 22:16:54,805] Kim-PC-Win-11/INFO/load_test: Response 200: {'entityId': 1210005273, 'uuid': '67e7e35e-7dcf-4143-836e-7edbb41e5e09', 'payload': [{'timestamp': 1705439799981, 'type': 'CUSTOMER_CHECKIN', 'content': None}]}
[2024-01-16 22:16:54,806] Kim-PC-Win-11/INFO/load_test: Response 200: {'entityId': 1210005767, 'uuid': '2c1655a6-d638-4d02-804d-be139eb1e759', 'payload': [{'timestamp': 1705439801308, 'type': 'CLASS_BOOKING_CREATED', 'content': None}]}
...

cyberw (Collaborator) commented Jan 16, 2024

Wow. That is really weird. I'll have to try it on my Windows machine some time. I don't expect I'll be able to reproduce it, though :-/

kimdre (Author) commented Jan 16, 2024

Thanks in advance. Your project is awesome. 👍

cyberw (Collaborator) commented Jan 27, 2024

Sorry for not getting back to you sooner. I tried the simplified script on my machine with no issues.

C:\Users\larsh\git\locust\examples>locust -H http://example.com --headless --users 100 -r 10
[2024-01-27 15:19:37,020] lars-amd/INFO/locust.main: No run time limit set, use CTRL+C to interrupt
[2024-01-27 15:19:37,021] lars-amd/INFO/locust.main: Starting Locust 2.21.0
Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                 0     0(0.00%) |      0       0       0      0 |    0.00        0.00

[2024-01-27 15:19:37,023] lars-amd/INFO/locust.runners: Ramping to 100 users at a rate of 10.00 per second
Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                 0     0(0.00%) |      0       0       0      0 |    0.00        0.00

Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                            20  20(100.00%) |   2032    2025    2049   2025 |    0.00        0.00
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                20  20(100.00%) |   2032    2025    2049   2025 |    0.00        0.00

Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                            60  60(100.00%) |   2034    2018    2049   2018 |    3.33        3.33
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                60  60(100.00%) |   2034    2018    2049   2018 |    3.33        3.33

Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                           120 120(100.00%) |   2039    2018    2083   2018 |    8.00        8.00
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                               120 120(100.00%) |   2039    2018    2083   2018 |    8.00        8.00

[2024-01-27 15:19:46,087] lars-amd/INFO/locust.runners: All users spawned: {"HelloWorldUser": 100} (100 total users)
Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                           200 200(100.00%) |   2041    2018    2083   2018 |   12.86       12.86
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                               200 200(100.00%) |   2041    2018    2083   2018 |   12.86       12.86

Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                           300 300(100.00%) |   2037    2015    2083   2015 |   17.78       17.78
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                               300 300(100.00%) |   2037    2015    2083   2015 |   17.78       17.78

Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                           400 400(100.00%) |   2037    2015    2083   2015 |   25.00       25.00
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                               400 400(100.00%) |   2037    2015    2083   2015 |   25.00       25.00

Type     Name                                  # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/v1/health                           500 500(100.00%) |   2037    2015    2083   2015 |   34.00       34.00
--------|------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                               500 500(100.00%) |   2037    2015    2083   2015 |   34.00       34.00

[2024-01-27 15:19:53,385] lars-amd/INFO/locust.main: Shutting down (exit code 1)

cyberw (Collaborator) commented Jan 27, 2024

I also tried with the web UI. It is very noisy when I use Ctrl-C to exit, but other than that I had no issues (with the simplified test).

kimdre (Author) commented Jan 28, 2024

After continuing to work with Locust, I noticed that this issue only seems to occur with targets over HTTPS.
When I load test via plain HTTP, it works fine.
Could the SSL connection somehow cause something unexpected in Locust?

cyberw (Collaborator) commented Jan 28, 2024

Is your server maybe not responding? HTTPS works fine for me as well.

cyberw (Collaborator) commented Jan 28, 2024

Wait, now I WAS able to reproduce it with HTTPS. Probably some exception is being triggered and then there is an infinite loop. I will investigate.

cyberw (Collaborator) commented Jan 28, 2024

For now, you can try using FastHttpUser as a workaround.

kimdre (Author) commented Jan 28, 2024

For now, you can try using FastHttpUser as a workaround.

The FastHttpUser is actually what I'm using already. 😅 Thanks for investigating!

cyberw (Collaborator) commented Feb 10, 2024

Huh. There seems to be a regression in OpenSSL 3.0.13 / Python 3.12:

python/cpython#95031

I was able to reproduce with 3.12 but not with 3.10.

kimdre (Author) commented Feb 10, 2024

Interesting. I hope there will be a fix soon.
So the current workarounds are either to use plain HTTP or an older Python version (<= 3.11).
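To check which OpenSSL build a given interpreter is linked against (the combination under suspicion in this thread), the stdlib `ssl` module exposes it; a quick diagnostic sketch:

```python
import ssl
import sys

# Show the Python version and the OpenSSL build it was compiled against;
# the thread points at OpenSSL 3.x on Python 3.12 as the affected combo.
print(sys.version.split()[0])
print(ssl.OPENSSL_VERSION)       # e.g. "OpenSSL 3.0.13  30 Jan 2024"
print(ssl.OPENSSL_VERSION_INFO)  # version tuple, e.g. (3, 0, 0, 13, 0)
```

Comparing this output on a 3.11 and a 3.12 install can confirm whether the OpenSSL versions actually differ between the working and broken setups.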

cyberw (Collaborator) commented Feb 10, 2024

I'm looking into it. It might be possible to just set an insecure option somewhere.

cyberw (Collaborator) commented Feb 10, 2024

A really awkward workaround is letting your users share a connection pool. That seems to avoid the issue....

from locust import FastHttpUser
from geventhttpclient.client import HTTPClientPool

class QuickstartUser(FastHttpUser):
    client_pool = HTTPClientPool(concurrency=100)
    ...

@cyberw cyberw changed the title Locust is not sending any requests while ramping up Delay at startup and high cpu usage on Windows in Python 3.12 Feb 10, 2024

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 10 days.

@github-actions github-actions bot added the stale Issue had no activity. Might still be worth fixing, but dont expect someone else to fix it label Apr 11, 2024
vethan commented Apr 12, 2024

Any news on this one other than the workaround?

cyberw (Collaborator) commented Apr 12, 2024

There’s an open PR for requests that could fix it for HttpUser:
psf/requests#6667

(maybe something similar could be implemented for geventhttpclient as well)

Either way, this can't really be fixed in Locust itself, as it is an underlying issue.

@cyberw cyberw removed the stale Issue had no activity. Might still be worth fixing, but dont expect someone else to fix it label Apr 12, 2024
cyberw (Collaborator) commented May 17, 2024

If you encounter this issue, install the latest and greatest requests from GitHub (no release from them yet; it might be another month).
