John Ramirez
05/13/2020, 3:12 PM
.map() or submit a single job and manage running the different models within the single spark job.

nicholas
05/13/2020, 3:59 PM
.map() in Prefect if you want to take advantage of the Prefect semantics (like state handlers, retries, conditional branches in the map, etc.). Depending on how much overhead there is to starting/stopping the Spark job, that will probably be the big deciding factor in whether you want to manage it within a single job or not. Hopefully that's at least somewhat helpful.
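(For context, a minimal sketch of the mapped approach nicholas describes, assuming the Prefect Core 0.x API from this era; run_model, the model names, and the Spark submission step are placeholders, not anything from the thread.)

```python
from datetime import timedelta

from prefect import Flow, task


@task(max_retries=2, retry_delay=timedelta(minutes=1))
def run_model(model_name: str):
    # Placeholder: submit or run the Spark work for one model here,
    # e.g. shell out to spark-submit or call your own submission helper.
    print(f"running {model_name}")


with Flow("run-models") as flow:
    # One mapped task run per model; each gets its own Prefect
    # state handling and retries, rather than one monolithic Spark job.
    results = run_model.map(["model_a", "model_b", "model_c"])

if __name__ == "__main__":
    flow.run()
```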