# ask-community
a
is there a way to limit task submissions from a map some_task.map(ten_thousand_item_list) is there a way to stop 10,000 tasks from being submitted at once?
k
you can use tags to set task run concurrency limits
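tag-based limits are created per tag and tasks opt in via the `tags` argument on the `@task` decorator; a sketch of the CLI side (the tag name `file_api` is illustrative, and exact syntax can vary by Prefect version):

```shell
# Allow at most 10 concurrent runs of any task tagged "file_api"
# (tag name is illustrative; see Prefect's tag-based concurrency-limit docs)
prefect concurrency-limit create file_api 10
```

tasks then pick up the limit with something like `@task(tags=["file_api"])`.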
a
yeah that stops them from executing
but i don't want the tasks to actually be submitted, so they don't count against the orchestration API limit
k
ah that's a good point
a
obviously can batch up the work into N equal chunks then map them
but not ideal
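the chunking workaround can be sketched as a plain-Python helper (names and batch size are illustrative, not a Prefect API):

```python
def chunk(items, batch_size):
    """Split a list into consecutive batches of at most batch_size items."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# e.g. map over each batch in turn instead of all 10,000 items at once:
# for batch in chunk(ten_thousand_item_list, 1_000):
#     some_task.map(batch)
batches = chunk(list(range(10_000)), 1_000)
print(len(batches))     # 10 batches
print(len(batches[0]))  # 1000 items each
```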
k
yeah I was going to suggest that but it's not the most satisfying answer
a
would be ideal if map could take a kwarg of like "batch_size" or something similar
any idea or should i just go with the chunking?
k
and then try to align the retries on 429s with the runtime of your mapped tasks
but that's pretty imprecise too I think
a
yeah not ideal
I think im just going to do 1000 at a time and live with it, but this would be ideal as an API config on map task
not sure the technical feasibility of it but yeah
k
you can rate limit the submission of tasks using our global concurrency tools
a
and i assume in this context the slow_my_flow references a tag
k
a global concurrency limit
settings -> concurrency -> global tab
a
appreciate it ill give it a whirl now :^)
🙌 1
```python
from prefect import flow
from prefect.concurrency.sync import rate_limit

@flow
def _check_file_path_flow():
    mapped_files = get_already_mapped_files()

    for record in mapped_files:
        rate_limit("file_api", occupy=1)
        updated_path = get_updated_path(record)
        update_path_and_name(updated_path)
```
looks like it's still submitting sequentially, granted i just took a cursory glance for these values on the link you sent
k
get_updated_path.submit(record)
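putting that fix into the earlier snippet, the loop would look something like this (a sketch reusing the same hypothetical task names; assumes a global concurrency limit named `file_api` already exists):

```python
from prefect import flow
from prefect.concurrency.sync import rate_limit

@flow
def _check_file_path_flow():
    mapped_files = get_already_mapped_files()

    for record in mapped_files:
        # Block until a "file_api" rate-limit slot frees up, then submit
        # the task to the task runner instead of calling it inline, so the
        # tasks run concurrently while submission stays rate limited.
        rate_limit("file_api", occupy=1)
        future = get_updated_path.submit(record)
        update_path_and_name.submit(future)
```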
a
oh lol right
k
you're good lol
a
now its ripping
🙌 1
thanks Kevin
k
you got it
a
fwiw it'd be nice to have a tooltip in the UI for this
or a link to docs
k
there's a link to docs if you haven't created a concurrency limit yet
a
👍