Are there any resources/examples of an ETL pipeline for live data ingestion that reads from a queue and, for each new event, creates a new flow executed by a central Dask cluster with autoscaling? (Is this actually a good use case for Prefect, or should we stick with batch jobs?)
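Something along these lines is what I have in mind. This is only a rough sketch, assuming Prefect 2 with the prefect-dask collection; the queue consumer, the event shape, and the Dask scheduler address are placeholders, not a working setup:

```python
# Rough sketch: one Prefect flow run per queue event, with tasks
# executed on an existing Dask cluster via prefect-dask.
# The consumer object and the scheduler address are assumptions.
from prefect import flow, task
from prefect_dask import DaskTaskRunner


@task
def transform(event: dict) -> dict:
    # Placeholder transformation step for a single event.
    return {**event, "processed": True}


@task
def load(record: dict) -> None:
    # Placeholder load step (e.g. write to a database or warehouse).
    print(f"loaded: {record}")


@flow(task_runner=DaskTaskRunner(address="tcp://dask-scheduler:8786"))
def process_event(event: dict) -> None:
    # Each flow run submits its tasks to the shared Dask cluster.
    record = transform(event)
    load(record)


def consume_forever(consumer) -> None:
    # Blocking loop: `consumer` is any iterable of incoming events
    # (Kafka, SQS, RabbitMQ, ...). Each message triggers a new flow run.
    for message in consumer:
        process_event(message)
```

The part I'm unsure about is whether kicking off a flow run per event like this scales sensibly, or whether it's better to micro-batch the queue and run scheduled flows instead.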