
async events in addons · Issue #4207 · mitmproxy/mitmproxy

Closed
@iTosun

Description


Is your feature request related to a problem? Please describe.

It would be a nice feature if the mitmproxy events could run as async functions; this would enable addon developers to use async methods.

Describe the solution you'd like

Make the events async!

Describe alternatives you've considered

I've tried scheduling futures on the currently running event loop, but changes made there don't seem to apply to the HTTP flow object.
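This limitation can be simulated without mitmproxy itself. A minimal sketch (`Flow`, `sync_hook`, and `proxy_core` are hypothetical stand-ins, not mitmproxy APIs) of why a task scheduled from a synchronous hook mutates the flow too late:

```python
import asyncio


class Flow:
    """Hypothetical stand-in for mitmproxy's HTTPFlow, for illustration only."""
    def __init__(self):
        self.headers = {}


async def mutate(flow):
    # The mutation only happens once the event loop actually runs this task.
    flow.headers["x-added"] = "too late"


def sync_hook(flow):
    # A synchronous addon hook that schedules a future instead of blocking.
    asyncio.get_running_loop().create_task(mutate(flow))


async def proxy_core():
    flow = Flow()
    sync_hook(flow)
    # The proxy core proceeds as soon as the sync hook returns; the
    # scheduled task has not run yet, so the flow is forwarded unchanged.
    return "x-added" in flow.headers


print(asyncio.run(proxy_core()))  # False: the change missed the flow
```

With truly async hooks, the proxy core would await the hook coroutine, so the mutation would land before the flow is forwarded.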

Additional context

I would love to see async support; it could make some advanced addons more efficient and extensible.

Activity

lionel126 commented on Mar 10, 2021

Extremely looking forward to this.

rachmadaniHaryono commented on Mar 28, 2021
```python
import asyncio
import os
import signal

from mitmproxy import ctx
from mitmproxy.tools import console, main


class Addons:

    async def f1(self):
        count = 0
        while True:
            await asyncio.sleep(1)
            count += 1
            ctx.log.info('count: {}'.format(count))


opts = main.options.Options()
master = console.master.ConsoleMaster(opts)
if hasattr(main, "proxy"):  # mitmproxy < 7: wire up the legacy proxy server
    pconf = main.proxy.config.ProxyConfig(opts)
    master.server = main.proxy.server.ProxyServer(pconf)
loop = asyncio.get_event_loop()

# addons section
ao_obj = Addons()
master.addons.add(ao_obj)
loop.create_task(ao_obj.f1())

loop.add_signal_handler(
    signal.SIGINT, getattr(master, "prompt_for_exit", master.shutdown))
loop.add_signal_handler(signal.SIGTERM, master.shutdown)
if os.name == "nt":
    # Windows: periodically wake the loop so signal handling gets a chance to run
    async def wakeup():
        while True:
            await asyncio.sleep(0.2)
    asyncio.ensure_future(wakeup())
master.run()
```

The example above is based on #4259 and targets mitmproxy v6.0.2.

Edits: added a simple adjustment for 7.0.0.dev; more changes based on mitmproxy.tools.main:run; grouped the addons section.

roniemartinez commented on Apr 7, 2021

+1

Prinzhorn (Member) commented on May 27, 2021

One thing I'm concerned about is race conditions. They exist with @concurrent, and maybe we can get rid of them with async events.

I just switched to @concurrent to see how things go. I immediately ran into a race condition where server_connected was racing request. This crashed my app, because it assumes the connection exists when the request arrives (a SQL constraint failed).

websocket_message seems to behave as expected, that means multiple messages from different connections are processed concurrently but on the same connection message B will only be processed when message A is done. That's perfect. I'd love to see connection-wide serialization in all of mitmproxy. That means the request event cannot happen before server_connected is done and so on. Is this feasible?

```python
import time

from mitmproxy.script import concurrent


@concurrent
def server_connected(data):
    print('server_connected')
    print('server_connected: sleeping')
    time.sleep(5)
    print('server_connected: done sleeping')


@concurrent
def request(flow):
    print('request')
    print('request: sleeping')
    time.sleep(5)
    print('request: done sleeping')
```

```shell
curl --insecure --proxy "localhost:8080" https://www.google.com/
```

```
server_connected
server_connected: sleeping
request
request: sleeping
server_connected: done sleeping
request: done sleeping
```
mhils (Member) commented on Aug 27, 2021

> I'd love to see connection-wide serialization in all of mitmproxy. That means the request event cannot happen before server_connected is done and so on. Is this feasible?

Yes and no. The good news for you is that most events are already serialized, i.e. the respective part in the proxy core blocks until a specific hook completes. That won't change: you'll never see request appear before requestheaders has completed.

The order of request and server_connected is not deterministic. If a request reuses an existing connection, server_connected obviously comes before request. If you set connection_strategy to lazy, request should happen even before server_connect. So no guarantees here. :)
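For reference, connection_strategy is a regular mitmproxy option, so the lazy behaviour described above can be toggled on the command line (addon.py is a placeholder script name):

```shell
# eager (the default) connects upstream as early as possible, so
# server_connected tends to fire before request; lazy defers the upstream
# connection until it is actually needed, reversing that order.
mitmdump --set connection_strategy=lazy -s addon.py
```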

mhils (Member) commented on Aug 27, 2021

Regarding the actual suggestion: this is something we definitely plan to implement at some point, we can probably/hopefully use this to ditch .reply entirely. Let's see.
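For context, this was eventually implemented (the commit referenced at the end of this thread closed the issue), and current mitmproxy releases accept coroutine hooks directly. A minimal sketch of what such an addon looks like, with an illustrative header name and sleep standing in for real async work:

```python
import asyncio


class AsyncAddon:
    # The proxy awaits this coroutine before the flow proceeds, so no
    # @concurrent decorator or manual task scheduling is needed, and
    # changes made after the await still land on the flow.
    async def request(self, flow):
        await asyncio.sleep(0.1)  # stand-in for an async DB or gRPC call
        flow.request.headers["x-checked"] = "yes"


addons = [AsyncAddon()]
```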

Prinzhorn (Member) commented on Aug 27, 2021

> If you set connection_strategy to lazy, request should happen even before server_connect.

That makes sense, but I am explicitly using lazy. I've looked at my exception from back then again, and my example above is not accurate. What failed was my NOT NULL constraint for responses.server_connection_id (not request, which I associate with requests.client_connection_id). That happened only with @concurrent, and only rarely; never without @concurrent. So request was processed on my end after server_connected, because the long-running gRPC call in server_connected did not prevent request from firing, and somehow request arrived earlier on the other end (I have a talent for race conditions).

But if I understand correctly, async hooks solve this, because they would "block" in a similar fashion to the current synchronous hooks, but without blocking unrelated events (e.g. other connections). And since I'm now using asyncio gRPC, it should just work and perform better.

mhils (Member) commented on Aug 27, 2021

> But if I understand correctly, async hooks solve this, because they would "block" in a similar fashion to the current synchronous hooks, but without blocking unrelated events (e.g. other connections). And since I'm now using asyncio gRPC, it should just work and perform better.

Ah, you are hitting a special case here: we are not waiting for server_connected to complete before notifying the HTTP layer that the connection has been established (I didn't think this would be useful to anyone, and it surely isn't faster). If you have a concrete use case here, please open a separate issue; this should be straightforward to adjust. In fact, at the moment it's a bit more complicated because we don't block.

added a commit that references this issue on Feb 2, 2022: 806cce1

8 remaining items not shown.

Metadata

Assignees: none
Labels: kind/feature (New features / enhancements)
Milestone: none
Participants: @Prinzhorn, @mhils, @roniemartinez, @lionel126, @rachmadaniHaryono
