Welcome to the community, @Laurens! You can definitely call Spark submit jobs from Prefect, and there are some tasks in the Task Library that can help you with that - this PR contains an example in which Databricks Spark Submit jobs are submitted in parallel using mapping and the LocalDaskExecutor.
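Here is a minimal sketch of that pattern (the flow name, job specs, and the secret name below are illustrative assumptions, not the exact code from the PR): DatabricksSubmitRun is mapped over a list of Spark submit payloads, and LocalDaskExecutor runs the mapped children in parallel.

```python
from prefect import Flow, unmapped
from prefect.executors import LocalDaskExecutor
from prefect.tasks.databricks import DatabricksSubmitRun
from prefect.tasks.secrets import PrefectSecret

# Illustrative Databricks Runs Submit payloads - one per Spark job
job_specs = [
    {
        "new_cluster": {
            "spark_version": "10.4.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        "spark_submit_task": {
            "parameters": ["--class", "org.example.Main", f"dbfs:/jars/job{i}.jar"]
        },
    }
    for i in range(1, 4)
]

submit_run = DatabricksSubmitRun()

with Flow("databricks-spark-submit-parallel") as flow:
    # assumes a JSON secret holding {"host": ..., "token": ...}
    conn = PrefectSecret("DATABRICKS_CONNECTION_STRING")
    # .map() creates one task run per job spec; unmapped() reuses the
    # same connection secret for every mapped child
    submit_run.map(json=job_specs, databricks_conn_secret=unmapped(conn))

flow.executor = LocalDaskExecutor()  # run the mapped children in parallel
# flow.run()  # or register the flow with your Prefect project
```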
Whether you want your Prefect flow to wait for the results of a Spark job is certainly configurable. Overall, Prefect doesn't require you to use any predefined tasks from our library - you can build your own logic so that Prefect either waits for those Spark jobs before continuing, or doesn't; it's up to you to design it as you wish. Taking DatabricksSubmitRun as an example, it has a polling_period_seconds parameter that controls how often it polls for the results of the Spark job.
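For instance, this short sketch (the value shown is just an example) configures how often the task checks Databricks for the run's status; the task itself blocks until the run reaches a terminal state, so downstream tasks in your flow only start once the Spark job is done.

```python
from prefect.tasks.databricks import DatabricksSubmitRun

submit_run = DatabricksSubmitRun(
    polling_period_seconds=60,  # check the Databricks run status once per minute
)
```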