spehle
02/27/2024, 12:01 PM

Nate
02/27/2024, 2:40 PM
"i therefore need at least one deployment py-file for every project"
this isn't necessarily true. I prefer not to run a python file to deploy my flows, so I scan for changed python files in CI and deploy flow entrypoints from those files using
prefect --no-prompt deploy some/file.py:some_flow
where all my deployment config is defined in the prefect.yaml
I'm not sure I understand the need for a systemd process here; is it in lieu of CI that would deploy on changes pushed to version control?
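(A rough sketch of the scan-and-deploy approach Nate describes above, assuming CI runs against a git checkout and that all deployment settings live in prefect.yaml; the base branch, file paths, and flow names here are made up for illustration, not taken from the thread.)

```python
"""Hypothetical CI step: find changed .py files and deploy their flow entrypoints."""
import subprocess

# Map of source files to the flow functions they contain.
# In a real repo this could be derived from prefect.yaml or by parsing the files.
FLOW_ENTRYPOINTS = {
    "flows/etl.py": "etl_flow",           # made-up example
    "flows/reporting.py": "report_flow",  # made-up example
}

def changed_python_files(base_ref: str = "origin/main") -> list[str]:
    """Return .py files changed between the base branch and HEAD."""
    diff = subprocess.run(
        ["git", "diff", "--name-only", f"{base_ref}...HEAD"],
        check=True, capture_output=True, text=True,
    ).stdout
    return [path for path in diff.splitlines() if path.endswith(".py")]

if __name__ == "__main__":
    for path in changed_python_files():
        flow_func = FLOW_ENTRYPOINTS.get(path)
        if flow_func:
            # Deployment config (work pool, schedule, etc.) comes from prefect.yaml;
            # --no-prompt keeps the CLI non-interactive for CI.
            subprocess.run(
                ["prefect", "--no-prompt", "deploy", f"{path}:{flow_func}"],
                check=True,
            )
```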

spehle
02/27/2024, 6:26 PM
("flow.serve must be actively running.")
• can you elaborate on 'i scan for changed python files in CI'?
• the need for a systemd process is that the 'actively running' script has to keep running no matter what the server does (crash and reboot, update and reboot). as I am not the one operating the server, sometimes I'll log in and the uptime will be back to 5 hours, meaning there was a reboot that I did not know about. in these cases, prefect server as well as my deployment scripts would have to be started without me doing it manually
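(For context, a minimal sketch of the kind of long-running script spehle describes: the process blocks inside .serve(), so something like a systemd unit with Restart=always has to bring it back after a reboot. The flow name and schedule are invented for the example.)

```python
"""serve_flows.py - the 'actively running' script a systemd unit would keep alive."""
from prefect import flow

@flow(log_prints=True)
def nightly_etl():
    print("running the nightly ETL...")  # placeholder flow body

if __name__ == "__main__":
    # .serve() registers the deployment and then blocks, polling for scheduled
    # or manually triggered runs. If the host reboots, this script must be
    # restarted (e.g. by systemd), otherwise the deployment's runs never execute.
    nightly_etl.serve(name="nightly-etl", cron="0 2 * * *")
```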

Nate
02/27/2024, 7:00 PM
you're using .serve here, that explains the py deployment files and systemd
I was talking about a situation where you have your flows defined in some source files and then you run something like
prefect --no-prompt deploy --all
to deploy all your deployment definitions in a prefect.yaml. That's not something you need with .serve, but if you want to be declarative and/or do dynamic infra provisioning then it might be worth exploring.
even with serve at scale I'd personally want to containerize things. are you doing everything as local processes for now? if so then yeah, I see the need for separate venvs
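(One way to reduce the number of processes, and therefore systemd units, when using .serve at scale is to serve several flows from a single process with prefect.serve; a minimal sketch with made-up flow and deployment names. That one process is then the thing you would containerize, or give its own venv if running locally.)

```python
"""One long-running process serving several deployments."""
from prefect import flow, serve

@flow
def ingest():
    ...

@flow
def report():
    ...

if __name__ == "__main__":
    # serve() blocks and handles runs for all listed deployments in this one
    # process, so a single systemd unit (or container) covers them all.
    serve(
        ingest.to_deployment(name="ingest-hourly", interval=3600),
        report.to_deployment(name="report-daily", cron="0 6 * * *"),
    )
```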

spehle
02/27/2024, 7:53 PM