Yes, the solution for this is mapping. You can use mapping to generate child tasks at runtime based on some dynamic state of the world.
You can have one task in your flow that determines this dynamic state, e.g. a task that returns a list of files to be processed. You can then map over that list to process the files in parallel. The mapped task (i.e. the function) that processes each file can also behave differently depending on the input it receives, e.g. if the file is a CSV it may do X, and if it is Parquet it may do Y. As long as you assign a DaskExecutor or LocalDaskExecutor to your flow, the mapped children will all run in parallel.
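Here is a minimal sketch of that pattern, assuming the Prefect 1.x API (`prefect.executors.LocalDaskExecutor` and `task.map`); the task names `list_files` and `process_file` and the file paths are purely illustrative:

```python
from prefect import Flow, task
from prefect.executors import LocalDaskExecutor


@task
def list_files():
    # In a real flow this might scan a directory or query an API;
    # the paths below are placeholders.
    return ["data/a.csv", "data/b.parquet", "data/c.csv"]


@task
def process_file(path):
    # Branch on the file type carried in the mapped input.
    if path.endswith(".csv"):
        print(f"parsing CSV: {path}")
    elif path.endswith(".parquet"):
        print(f"reading Parquet: {path}")


with Flow("dynamic-file-processing") as flow:
    files = list_files()
    # One child task is generated per element of `files` at runtime.
    process_file.map(files)

# The executor decides whether mapped children run in parallel.
flow.executor = LocalDaskExecutor()

if __name__ == "__main__":
    flow.run()
```

With `LocalDaskExecutor` the mapped children run in parallel on the local machine; swapping in `DaskExecutor` pointed at a Dask cluster distributes them across workers without changing the flow code.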