Query is not processed #1135

Open
0ralo opened this issue Mar 16, 2024 · 3 comments

Comments

0ralo commented Mar 16, 2024

  • asyncpg version: 0.29.0
  • PostgreSQL version: PostgreSQL 15.5 (Debian 15.5-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
  • Do you use a PostgreSQL SaaS? If so, which? Can you reproduce the issue with a local PostgreSQL install?: I use Postgres on another local server
  • Python version: 3.10.0
  • Platform: Windows 10
  • Do you use pgbouncer?: No
  • Did you install asyncpg with pip?: I am not sure, I use Poetry
  • If you built asyncpg locally, which version of Cython did you use?: -
  • Can the issue be reproduced under both asyncio and uvloop?: I am not sure

Sorry if I did something wrong; this is my first issue.
I wrote a query that works in Postgres and want to use it from my Python code. Queries before this one caused no problems, but this one does. I use Python 3.10.0 + FastAPI + SQLAlchemy + asyncpg as the driver. The query:

```sql
with foo as (
    select en.name, en.sources, en.id, x.*
    from crosstab($$
        with d as (
            select source_id, date_trunc('day', dt) dt, sum(count_ev)::bigint val
            from events.v_stat_count
            where dt >= date_trunc('day', now() - '15d'::interval) and dt <= now()
            group by 1, 2
        ), s as (
            select distinct source_id from d
        )
        select s.source_id, x.dt, sum(coalesce(val, 0))::bigint val
        from pg_catalog.generate_series(now()::date - interval '15d', now()::date, '1d') x(dt)
        cross join s
        left join d on d.dt = x.dt and d.source_id = s.source_id
        group by 1, 2
        order by 1, 2 desc;
    $$) as x(source_id uuid, d1 bigint, d2 bigint, d3 bigint, d4 bigint, d5 bigint,
             d6 bigint, d7 bigint, d8 bigint, d9 bigint, d10 bigint, d11 bigint,
             d12 bigint, d13 bigint, d14 bigint, d15 bigint)
    join meta_data.event_name en on en.id = x.source_id
    order by 1, nlevel(en.sources)
)
select foo.name, end2.description, foo.id, foo.sources,
       d1, d2, d3, d4, d5, d6, d7, d8, d9, d10, d11, d12, d13, d14, d15,
       d1+d2+d3+d4+d5+d6+d7+d8+d9+d10+d11+d12+d13+d14+d15 as sum
from foo
left join meta_data.event_name en on en.sources = foo.sources
left join nd.event_name_desc end2 on end2.id = en.id
```

With echo=True, all that SQLAlchemy shows is

sqlalchemy.engine.Engine [generatedx in 0.00xxs] ()

and nothing more. I run the query with session.execute(text(query)) (it stops there) and then _.fetchall().
Other queries work. If I change the driver to psycopg (3), the query works.

elprans (Member) commented Mar 18, 2024

Do you get an error? It's not clear from the description what the actual issue is.

0ralo (Author) commented Mar 18, 2024

> Do you get an error? It's not clear from the description what the actual issue is.

It loads forever when I execute this query. It does not even reach .fetchall(); it stops at the session.execute(text()) line. asyncpg in a Python terminal, with no SQLAlchemy, does not process the query either; it just stops. No logs, no errors.

0ralo (Author) commented Apr 3, 2024

Upd:
I cannot(?) get a result even when I put the query into a view. select * from view_name also stops at the text() call in SQLAlchemy.
Upd 2: I can get a result, but it takes far longer than it used to (approximately 9 min instead of 0.269 s in DBeaver, for 35 rows).
logs:
2024-04-03 10:11:43,328 INFO sqlalchemy.engine.Engine BEGIN (implicit)
2024-04-03 10:11:43,328 INFO sqlalchemy.engine.Engine select * from nd.get_events_15
2024-04-03 10:11:43,328 INFO sqlalchemy.engine.Engine [generated in 0.00011s] ()
2024-04-03 10:20:01,330 INFO sqlalchemy.engine.Engine ROLLBACK
query took 8m18.12 s
