Replies: 2 comments 1 reply
-
TLDR: are there use cases for the multi_requests function other than the Baidu engine?

A note about the async HTTP client and the multi_requests function. As its name says, the function sends multiple requests at the same time and returns all the responses at once. All other HTTP requests could run in the sync world: there is no parallelism in the engine code. However, in the master branch, all outgoing HTTP requests are sent through async HTTP clients, with bridges from sync to async and vice versa. There was a time when SearX(NG) was supposed to switch entirely to async: https://github.com/searx/searx/wiki/Milestones#milestone-12---async

Async is (nearly, if not for sure) an anti-pattern in a Flask application: if we need parallelism, the way to go is threads. Async also makes HTTP streaming complex. After each change, I have to ask someone on the team (usually Paul) to deploy it on a public instance and monitor for memory leaks. So #2685 drops all async code and relies only on a sync HTTP client... and multi_requests is dropped too: this function is no longer used (the Bing engine used it, but not anymore). The only remaining potential use case for multi_requests seems to be the Baidu engine.
So my question is: are there other use cases for the multi_requests function?
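For context, the behavior described above (send several requests at once, block until all responses are back) does not need asyncio at all; in a purely sync world it maps directly onto a thread pool. A minimal sketch — the function name and the injectable `fetch` callable are hypothetical, not SearXNG's actual code:

```python
from concurrent.futures import ThreadPoolExecutor


def multi_requests(fetch, urls, max_workers=4):
    """Send several requests in parallel threads and return all
    responses at once, preserving the order of the input URLs.

    `fetch` is any callable taking a URL and returning a response;
    in SearXNG it would be the sync HTTP client's GET.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order and blocks until every
        # request has finished, mirroring the "all responses at once"
        # contract of the original multi_requests
        return list(pool.map(fetch, urls))
```

Because the thread pool hides behind an ordinary blocking call, an engine using it stays fully sync and needs no sync-to-async bridge.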
-
One significant improvement in terms of latency and reliability can be achieved by sending multiple requests at once while waiting only for the first answer. This is easily done using asyncio. It works like a charm with Tor, which is otherwise slow and frequently blocked by downstream engines.
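The "race several requests, keep the first answer" pattern described above can be sketched with `asyncio.wait` and `FIRST_COMPLETED`; the `first_response` helper below is hypothetical, not code from any SearXNG branch:

```python
import asyncio


async def first_response(coros, timeout=None):
    """Start all request coroutines concurrently and return the result
    of whichever finishes first, cancelling the slower ones.

    Racing the same query over several routes (e.g. several Tor
    circuits) this way trades bandwidth for latency and reliability.
    """
    tasks = [asyncio.ensure_future(c) for c in coros]
    done, pending = await asyncio.wait(
        tasks, timeout=timeout, return_when=asyncio.FIRST_COMPLETED
    )
    for task in pending:
        # the remaining requests are no longer needed
        task.cancel()
    # take any finished task (normally exactly one at this point)
    return done.pop().result()
```

Note this only makes sense with a concurrency primitive underneath; with the thread-based approach of #2685 the equivalent would be `concurrent.futures.wait(..., return_when=FIRST_COMPLETED)` on a thread pool.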
-
There is a WIP / unfinished PR about the network stack, see #2685.
I'm opening this discussion to collect ideas about possible improvements to the network stack:
What would you like to be able to do regarding the network?
What bothers you about the current implementation?