This issue was moved to a discussion.



performance latency after updating to Fastapi > 0.82.0 #5602

Closed
9 tasks done
OpetherMB opened this issue Nov 9, 2022 · 8 comments

Comments

@OpetherMB

First Check

  • I added a very descriptive title to this issue.
  • I used the GitHub search to find a similar issue and didn't find it.
  • I searched the FastAPI documentation, with the integrated search.
  • I already searched in Google "How to X in FastAPI" and didn't find any information.
  • I already read and followed all the tutorial in the docs and didn't find an answer.
  • I already checked if it is not related to FastAPI but to Pydantic.
  • I already checked if it is not related to FastAPI but to Swagger UI.
  • I already checked if it is not related to FastAPI but to ReDoc.

Commit to Help

  • I commit to help with one of those options 👆

Example Code

#db dependency

def get_db():
    with SessionLocal() as db:
        try:
            yield db
        except:
            raise
        finally:
            db.close()


#exceptions.py
# catch all exception in a middleware

app.middleware("http")(exceptions.catch_exceptions_middleware)

Description

I noticed that after updating FastAPI to any version > 0.82.0, my integration tests get stuck and run noticeably slower, which was not the behavior of any version <= 0.82.0. Can anyone confirm the same behavior, please? I cannot understand why I have such long latency. Could my dependency be causing this? Since 0.82.0 I have changed it to the form above, following the fix in PR #5122:

Allow exit code for dependencies with yield to always execute, by removing capacity limiter for them, to e.g. allow closing DB connections without deadlocks. PR #5122 by @adriangb.
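For context, FastAPI's yield dependencies follow ordinary generator semantics: the code after `yield` runs as teardown once the response is done. A stdlib-only sketch of that mechanism (no FastAPI involved; `fake_session` is a made-up stand-in for `SessionLocal`, and the manual `next()`/`close()` calls imitate what the framework does around a request):

```python
from contextlib import contextmanager

teardown_log = []

@contextmanager
def fake_session():
    # stand-in for SessionLocal(); not a real DB session
    teardown_log.append("open")
    try:
        yield "session"
    finally:
        teardown_log.append("close")

def get_db():
    # mirrors the dependency above: code after yield runs as teardown
    with fake_session() as db:
        try:
            yield db
        finally:
            teardown_log.append("finally")

gen = get_db()
db = next(gen)   # dependency resolved; the request would be handled here
gen.close()      # the framework triggers teardown after the response
print(teardown_log)  # ['open', 'finally', 'close']
```

The change in PR #5122 concerns *where* this teardown runs (outside the capacity limiter), not the ordering shown here, which is plain Python.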

Can anybody help me figure out why updating to any version > 0.82.0 causes this much latency?

Thank you guys.

Operating System

Linux

Operating System Details

No response

FastAPI Version

0.82.0

Python Version

3.10

Additional Context

No response

@OpetherMB OpetherMB added the question Question or problem label Nov 9, 2022
@OpetherMB
Author

@tiangolo @dmontagu can you please help if you have time? It's a bit critical for me.

@jgould22

jgould22 commented Nov 9, 2022

It really helps other people debug problems like this if you can produce a test harness that demonstrates the issue succinctly.

@tiangolo
Owner

Thanks @jgould22 ! 🍰

Indeed, please provide an example code that shows your problem so that I (we) can replicate it and investigate it here. Otherwise it's just blind guesses... 😅

You can start by upgrading other dependencies too, in case that has anything to do with it.

@OpetherMB
Author

Thank you for your replies @tiangolo, @jgould22, I appreciate it.

OK, let's add more context. I have a lot of tests, unit and integration, but to illustrate the performance problem I am going to take one of the simplest tests I have, which checks that my strings get stripped correctly before being inserted into the DB.

Then I'll show the timing of this test on both FastAPI 0.82.0 and FastAPI 0.85.0 (or any version greater than 0.82.0).

So first, here is how I handle my dependency:

#db dependency

def get_db():
    with SessionLocal() as db:
        try:
            yield db
        # re-raise exceptions, which will be caught by a middleware
        except:
            raise
        finally:
            db.close()
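As an aside, the `except: raise` clause is a no-op, and the explicit `db.close()` duplicates what the `with` block already does, since a SQLAlchemy `Session` used as a context manager closes itself on exit. A stdlib sketch of the simpler, equivalent shape (`StubSession` is a made-up stand-in so the behavior can be checked without a database):

```python
class StubSession:
    """Stand-in for a SQLAlchemy Session; counts close() calls."""
    def __init__(self):
        self.close_calls = 0

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()      # like Session.__exit__, which calls close()
        return False      # never swallow exceptions

    def close(self):
        self.close_calls += 1

def get_db():
    # equivalent, simpler dependency: the with-block handles cleanup
    with StubSession() as db:
        yield db

gen = get_db()
db = next(gen)    # dependency resolved
gen.close()       # teardown: __exit__ runs, session closed exactly once
```

This is a simplification sketch, not a claim about the cause of the slowdown.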

This is my middleware that catches all raised exceptions:

#exceptions.py
# catch all exception in a middleware

app.middleware("http")(exceptions.catch_exceptions_middleware)


async def catch_exceptions_middleware(request: Request, call_next):
    try:
        resp = await call_next(request)

    except TimeoutError:
        return JSONResponse(
            status_code=502,
            content={"detail": "DB connection timeout probably because of Queue limit error"},
        )
    except IntegrityError as exc:
        if isinstance(exc.orig, NotNullViolation):  # check the original DBAPI exception
            return JSONResponse(
                status_code=status.HTTP_400_BAD_REQUEST,
                content={"detail": f" There is a missing field {str(exc.__dict__['orig'])}"},
            )
        elif isinstance(exc.orig, UniqueViolation):  # check the original DBAPI exception
            return JSONResponse(
                status_code=status.HTTP_409_CONFLICT,
                content={"detail": f"Already exists {str(exc.__dict__['orig'])}"},
            )
        elif isinstance(exc.orig, ForeignKeyViolation):
            return JSONResponse(
                status_code=status.HTTP_400_BAD_REQUEST,
                content={"detail": f"{str(exc.__dict__['orig'])}"},
            )
        else:
            return JSONResponse(
                status_code=500,
                content={"detail": f"{str(exc.__dict__['orig'])}"},
            )

    except (RequestValidationError, ValidationError) as exc:
        return JSONResponse(
            {"status": "ERROR", "message": "Validation Error", "errors": exc.errors()},
            status_code=HTTP_422_UNPROCESSABLE_ENTITY,
        )

    except Exception as e:
        return JSONResponse(content=repr(e), status_code=500)

    return resp

OK, now let's take a look at the test:

# integration tests
from fastapi.testclient import TestClient
from sqlalchemy.orm import Session
from app.services.config import settings

def test_data(client: TestClient, db: Session) -> None:
    good_label = "strip me !"
    labels = [
        f"\t{good_label}\r",
        f" {good_label}\r",
        f" {good_label}\n",
        f"   {good_label}\t ",
    ]
    for label in labels:
        data_building = {"label": label}
        response = client.post(
            f"{settings.API_V1_STR}/building/",
            json=data_building,
        )
        assert response.status_code == 201
        content_building = response.json()
        building_id = content_building["building_id"]

        assert content_building["label"] == good_label
        assert building_id > 0

        # get building
        response = client.get(f"{settings.API_V1_STR}/building/{building_id}")
        assert response.status_code == 200
        content_get_building = response.json()

        assert content_get_building["label"] == good_label

Let's compare the results.

For FastAPI 0.82.0 or less, the test above takes around 0.70 s, as you can see in the following logs:

zeus:ftth$ docker exec -it arobase  pytest   -s -v app/tests/api/test_basic.py
========================================================= test session starts ==========================================================
platform linux -- Python 3.10.4, pytest-7.2.0, pluggy-1.0.0 -- /usr/local/bin/python
cachedir: .pytest_cache
rootdir: /code
plugins: cov-4.0.0, anyio-3.6.2, Faker-15.3.1
collected 1 item                                                                                                                       

app/tests/api/test_basic.py::test_data [2022-11-10 12:46:19,364] [INFO] app starting..
PASSED[2022-11-10 12:46:19,980] [INFO] App shutting down...


========================================================== 1 passed in 0.69s ===========================================================

For FastAPI > 0.82.0, it takes 1.58 s to do the same thing:

zeus:ftth$ docker exec -it arobase  pytest   -s -v app/tests/api/test_basic.py
========================================================= test session starts ==========================================================
platform linux -- Python 3.10.4, pytest-7.2.0, pluggy-1.0.0 -- /usr/local/bin/python
cachedir: .pytest_cache
rootdir: /code
plugins: cov-4.0.0, anyio-3.6.2, Faker-15.3.1
collected 1 item                                                                                                                       

app/tests/api/test_basic.py::test_data [2022-11-10 12:48:05,111] [INFO] app starting..
PASSED[2022-11-10 12:48:06,612] [INFO] App shutting down...


========================================================== 1 passed in 1.58s ===========================================================

As you can imagine, this is only one simple test, but since I have a whole lot more of them, you can clearly feel the latency difference between version 0.82.0 and any version > 0.82.0.

Could the way I raise exceptions in my session dependency have something to do with it? Maybe something changed in how exceptions from dependencies are handled?
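One way to narrow down where the extra ~0.9 s goes is to time each request individually rather than relying on the pytest total. A stdlib sketch of the idea (`fake_request` below is a placeholder standing in for a `client.post`/`client.get` call, which is not reproduced here):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# placeholder standing in for a TestClient request
def fake_request():
    time.sleep(0.01)
    return {"status_code": 201}

response, elapsed = timed(fake_request)
print(f"request took {elapsed * 1000:.1f} ms")
```

Wrapping each `client.post`/`client.get` in the test this way would show whether the slowdown is uniform across requests or concentrated in the first one (startup) or the teardown.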

@github-actions github-actions bot removed the answered label Nov 10, 2022
@tiangolo
Owner

It's hard to tell without seeing all the code and being able to run something. If you can provide a minimal, self-contained code example that shows your problem so that I can replicate it and investigate it here that would be great. 🙏 Otherwise it's just blind attempts at imagining your code and what could be happening.

@OpetherMB
Author

Thanks @tiangolo
I've updated the uvicorn dependency and it now performs the same as the previous versions. I had to update uvicorn to the latest version; I guess it's better to install dependencies using fastapi[all].

@tiangolo
Owner

Thanks for reporting back and closing the issue 👍

@github-actions
Contributor

Assuming the original need was handled, this will be automatically closed now. But feel free to add more comments or create new issues or PRs.

@tiangolo tiangolo reopened this Feb 27, 2023
Repository owner locked and limited conversation to collaborators Feb 27, 2023
@tiangolo tiangolo converted this issue into discussion #6175 Feb 27, 2023

