Hello again, one more question: is there a way to “yield” a batch of results and start feeding them into a mapping process right away? For example, in your guys’ X-Files tutorial, if we got, say, 10 episode URLs at a time, could we kick off the scraping asynchronously for each batch as it arrives and then collect all the results at the end? Apologies if I’m missing something obvious, or if this isn’t a good use case.
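To make the question concrete, here’s a rough plain-asyncio sketch of the pattern I have in mind — the names (`episode_url_batches`, `scrape_episode`) are made up for illustration, not from the tutorial:

```python
import asyncio

# Hypothetical stand-in: yields batches of episode URLs as they're found.
def episode_url_batches(batch_size=10):
    urls = [f"https://example.com/episode/{i}" for i in range(25)]
    for start in range(0, len(urls), batch_size):
        yield urls[start:start + batch_size]

# Hypothetical stand-in for scraping one episode page.
async def scrape_episode(url):
    await asyncio.sleep(0)  # placeholder for real network I/O
    return f"scraped:{url}"

async def main():
    # Start scraping tasks as soon as each batch of URLs is yielded...
    tasks = []
    for batch in episode_url_batches():
        tasks.extend(asyncio.create_task(scrape_episode(u)) for u in batch)
    # ...then collect every result at the end.
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
```

That is, scraping for the first batch begins before the later batches of URLs have even been produced, and `gather` collects everything once at the end.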