# prefect-getting-started
s
How are you organizing your Prefect server environment and deployments? Let's say I have a folder with X projects and a folder with the respective virtual environments (one for each project). I therefore need at least one deployment py-file for every project, in which I can define the venv to use, etc. From what I understood, I need to run each deployment py-file in order to deploy it to the Prefect server. To avoid doing this manually, and to restart the process after a reboot, I would define it as a systemd service. But then I would have to define X services and manage them manually, which is cumbersome given the number of projects. In contrast, with Airflow I would only run Airflow as a systemd service and it would scan for the DAGs and run them accordingly. Is something similar possible in Prefect, like a meta-deployer that's capable of handling all the folder/import/venv structures?
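For reference, each of these per-project deployment py-files is basically a serve script along these lines (the flow name, body, and schedule are illustrative):

```python
# sketch of one per-project deployment file; the flow body, name,
# and schedule are illustrative
from prefect import flow

@flow(log_prints=True)
def project_x_pipeline():
    print("running project X")

if __name__ == "__main__":
    # .serve() blocks and keeps polling the Prefect server for scheduled
    # runs, so this process has to stay alive, hence the systemd idea
    project_x_pipeline.serve(name="project-x", cron="0 6 * * *")
```

Each script gets run with its project's own interpreter, e.g. `envs/project-x/bin/python deploy.py` (paths illustrative), so it picks up its own venv.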
n
Hi @spehle - there are a lot of different valid ways to do this, but just to comment:
> I therefore need at least one deployment py-file for every project
This isn't necessarily true. I prefer not to run a Python file to deploy my flows, so I scan for changed Python files in CI and deploy flow entrypoints from those files using `prefect --no-prompt deploy some/file.py:some_flow`, where all my deployment config is defined in the `prefect.yaml`.

I'm not sure I understand the need for a systemd process here; is it in lieu of CI that would deploy on changes pushed to version control?
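The scan itself can be a small CI step, something like this (the `flows/` layout and the `some_flow` entrypoint name are assumptions, adapt to your repo):

```bash
# hypothetical CI step: redeploy only the flow files this push changed;
# the flows/ path and the some_flow entrypoint name are placeholders
for f in $(git diff --name-only HEAD~1 HEAD -- 'flows/*.py'); do
    prefect --no-prompt deploy "$f:some_flow"
done
```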
s
• I have one virtual environment for every project, so I cannot deploy them all from one file (the documentation says "when serving multiple deployments, the only requirement is that they share a Python environment; they can be executed and scheduled independently of each other"), and the process has to run indefinitely (the documentation says "For remotely triggered or scheduled runs to be executed, your script with `flow.serve` must be actively running.").
• Can you elaborate on "I scan for changed Python files in CI"?
• The need for a systemd process is that the 'actively running' script must keep running no matter what the server does (crash and reboot, update and reboot). As I am not the one operating the server, sometimes I log in and the uptime is back to 5 hours, meaning there was a reboot I did not know about. In those cases the Prefect server as well as my deployment scripts would have to be started without me doing it manually, which means one unit per project (sketched below).
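Concretely, each project would need its own unit along these lines (paths and names are hypothetical):

```ini
# hypothetical per-project unit, e.g. /etc/systemd/system/prefect-project-x.service
[Unit]
Description=Prefect serve for project X
After=network.target

[Service]
# the project's own venv interpreter runs the serve script
ExecStart=/srv/envs/project-x/bin/python /srv/projects/project-x/deploy.py
Restart=always

[Install]
WantedBy=multi-user.target
```

That is X units to write, enable, and keep in sync, which is the overhead I would like to avoid.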
n
Gotcha, I didn't know you were using `.serve` here; that explains the py deployment files and systemd. I was talking about a situation where you have your flows defined in some source files and then you run something like `prefect --no-prompt deploy --all` to deploy all your deployment definitions in a `prefect.yaml`, which is not something you need to have with `.serve`, but if you want to be declarative and/or do dynamic infra provisioning then that might be something to explore.

Even with serve at scale I'd personally want to containerize things. Are you doing everything as local processes for now? If so, then yeah, I see the need for separate venvs.
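For illustration, a `prefect.yaml` collecting every deployment could look roughly like this (entrypoints and work pool are made up, and the exact schedule keys depend on your Prefect version):

```yaml
# rough sketch of a prefect.yaml; entrypoints, work pool, and schedule
# keys are illustrative, check the schema for your Prefect version
deployments:
  - name: project-x
    entrypoint: projects/project_x/flow.py:project_x_pipeline
    schedules:
      - cron: "0 6 * * *"
    work_pool:
      name: default
  - name: project-y
    entrypoint: projects/project_y/flow.py:project_y_pipeline
    work_pool:
      name: default
```

Then the single `prefect --no-prompt deploy --all` call registers all of them at once.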
s
Thanks for the info. I'll check out the `prefect.yaml` route; from my perspective it could be a solution to my use case. The containerization and CI implementation looks 'sexy', but I'll have to weigh cost/benefit right now.