Hey
@Nova Westlake I think Prefect, especially 2.0, can definitely accommodate your use case. For the most part, when you're building out your workflows you're just writing native Python, and it's definitely possible to build a workflow that "pauses" due to some unforeseen error and resumes when it's able to.
In regards to this question:
"I think what I'm most stuck on is is Prefect made to manage multiple separate tools that are built as Docker containers? I want to sandbox these data processing steps from each other in terms of their dependencies, library versions etc. But it feels like Prefect is more intended to be a monolithic app? How would you normally manage running tools that need to use python libraries with different versions in prefect, etc?"
There's likely more than one way to accomplish this in 2.0, and it depends on your use case, but as an initial thought you could set up a flow of flows, where each subflow is deployed with its own Docker container and its own dependencies. More on
deployments here, and on the
docker-container infrastructure here. Each flow/container can then be triggered from the
Orion API client via create_flow_run_from_deployment, which ensures each flow runs inside its own container. You can also control flow states from the API: since 2.0 works so well with native Python, it's straightforward to include conditional logic around the success of the individual subflows in your code to handle failures. There's definitely more to consider here for sure, but I think what you're looking for is very doable. As you're scoping it out, feel free to ask any questions that come up, we're more than happy to try and help 😄
It's also worth noting that for complex scenarios and more production-heavy use cases you can reach out to
cs@prefect.io for paid support, which can be a bit more hands-on in helping you get this pipeline set up.