Thread
#prefect-community

    Chris Eberly

    2 years ago
hello again, one more question: is there a way to “yield” a batch of results and start feeding them to a mapping process? for example, in your X-Files tutorial, if we got say 10 episode URLs at a time, could we start the scraping process asynchronously and then collect the results at the end? apologies if i’m missing something obvious, or if this is not a good use case
the context here is reading ~thousands of URLs in batches; there’s no need to wait until that finishes to start processing them
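For reference, the breadth-first pattern the question describes (gather every URL first, then map over the full list) looks roughly like this in Prefect 1.x. This is a minimal sketch; get_episode_urls and scrape_episode are hypothetical stand-ins for the tutorial’s tasks:

```python
from prefect import task, Flow

@task
def get_episode_urls():
    # hypothetical stand-in: return the complete list of episode URLs
    return [f"https://example.com/episode/{i}" for i in range(1000)]

@task
def scrape_episode(url):
    # hypothetical stand-in: scrape one episode page
    return f"scraped {url}"

with Flow("x-files-scraper") as flow:
    urls = get_episode_urls()
    # .map() fans out one scrape_episode task per URL, but only after
    # get_episode_urls has returned the *entire* list; nothing starts
    # scraping while URLs are still being collected
    results = scrape_episode.map(urls)

flow.run()
```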

    Chris White

    2 years ago
    Hey Chris! this is our white whale right now; we call it “depth first mapping” and our current bottleneck is described here: https://stories.dask.org/en/latest/prefect-workflows.html#pain-points-when-using-dask
we revisit this regularly and have a strong goal of supporting it

    Chris Eberly

    2 years ago
    that’s awesome, let me know if i can help

    Chris White

    2 years ago
    will do!
feel free to dig into the task mapping code within the task_runner.py file to get a feel for how this all works
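To illustrate the depth-first idea in plain Python, independent of Prefect’s task runner: the sketch below (all names hypothetical) starts scraping each batch of URLs as soon as it arrives instead of waiting for the full list, using asyncio.

```python
import asyncio

async def fetch_url_batches(batch_size=10):
    # hypothetical producer: yields batches of URLs as they are discovered
    urls = [f"https://example.com/page/{i}" for i in range(35)]
    for i in range(0, len(urls), batch_size):
        await asyncio.sleep(0.1)  # simulate discovery latency
        yield urls[i:i + batch_size]

async def scrape(url):
    # hypothetical per-URL work
    await asyncio.sleep(0.05)
    return f"scraped {url}"

async def main():
    pending = []
    # kick off scraping for each batch the moment it arrives
    # ("depth-first"), rather than collecting every URL first
    async for batch in fetch_url_batches():
        pending.extend(asyncio.create_task(scrape(u)) for u in batch)
    results = await asyncio.gather(*pending)
    print(f"collected {len(results)} results")

asyncio.run(main())
```

The same shape applies whatever runs the per-URL work (asyncio, threads, or a Dask cluster): the key is that the producer yields incrementally instead of returning one big list.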