Another potential feature request: in-flow task-concurrency groups. If I have 10 flows, each hitting 10 third-party APIs for data, I'll be juggling 10 different rate limits. It'd be really handy to have something like:
@task(tags=["fetch_data"])
def fetch_data(date: dt) -> Something:
    return some_http_response_payload

@task
def process_stuff(payloads: list) -> None:
    process_and_save_or_something(payloads)

@flow(task_tag_concurrency=dict(fetch_data=3))
async def some_flow(start: dt, end: dt):
    futures = [fetch_data.submit(d) for d in dates_between(start, end)]
    process_stuff(futures)
Either that or something akin to
with task_run_limit(5):
    ...some collection of tasks, of which 5 at most will run concurrently