# prefect-community
w
Hi all. Wondering if anybody has experience running Prefect inside an API context using a third-party worker, like Celery or something. What I mean is declaring the flow/tasks with Prefect Core, and then calling `flow.run()` within a Celery task, for example. Anyone?
k
I thought Prefect and Celery were an either-or thing. Haven't come across them being used together, but I guess you could if it works. We'll see if anyone in the community chimes in
w
Yeah, we actually have been using Prefect with the whole server running in Kubernetes, but that's more of an ETL setup, and it works great. For a new project, I was trying to get away from Celery chains and chords, so it could make sense to declare the flow using Orion. But then I would need something to actually run the flow without the server overhead. That's not something I've heard of or seen anybody doing yet, though, so I'm not too sure it's a good idea
We haven't explored Orion much, but on 1.0 the server was fairly complex and not really suited for this kind of approach; way too many pieces to tie together. If Orion turns out to be more flexible, it could be awesome to use it with a custom executor (Celery, RQ, Huey, etc.). Monitoring would be great and definitely a game changer compared to current tools (thinking of Flower and RQ Dashboard). Having the ability to write proper flows (one flow could be an executor task, or maybe each flow task could be an executor task) is also great, and there aren't many options for that at the moment. I understand this isn't exactly what it was designed to do, but it could be an option
k
Yeah, I think orchestrating with a backend will just add overhead, so I can see why you want to do this. But I guess the question is: why use Celery rather than moving the work over to Prefect + Dask? Is it because you were already using Celery?
w
Actually it's because of our experience with 1.0. Everything works great, but it's just too much: there were 7-8 services to look after on K8s. I can make Celery work with one worker and that's it (I already have RabbitMQ/Redis anyway). It definitely works great when the job itself is a whole ETL flow. But for background jobs (which do have a "workflow" aspect), as it stands, it's just too much.
k
Ah yeah, I don't think Prefect 1.0 is meant for these batch jobs. I can see what you're trying to do, though, because there are useful flow-level semantics you still get with `flow.run()`, like state handling, triggers, etc. So I personally think this use case makes sense as long as you can still get observability for failed runs, but I think you have tools that log that somewhere anyway, so you don't need the Prefect backend