Jackson Freitas
09/19/2025, 1:26 PM

Marvin
09/19/2025, 1:26 PM

Marvin
09/19/2025, 1:32 PM
Set PREFECT_API_URL on your host to http://localhost:4200/api.
Minimal docker-compose.yml (server + UI only)
version: "3.9"
services:
  prefect-server:
    image: prefecthq/prefect:3-latest
    command: >
      prefect server start
      --host 0.0.0.0
      --port 4200
      --ui
    ports:
      - "4200:4200"
    volumes:
      - prefect_data:/root/.prefect
    restart: unless-stopped

volumes:
  prefect_data:
How to use it
1) Start the server:
docker compose up -d
Open the UI at http://localhost:4200.
2) Point your host CLI at the containerized server:
- One-time profile setting:
  prefect config set PREFECT_API_URL="http://localhost:4200/api"
- Or per-command:
  PREFECT_API_URL="http://localhost:4200/api" prefect deploy ...
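To double-check the connection from your host, here is a minimal Python sketch (assuming the compose file above is running and PREFECT_API_URL is set as shown) that prints the API URL your local client resolves and hits the server's health endpoint:

# check_connection.py - sketch: confirm the resolved API URL and that the server answers
# Assumes the compose setup above is running and PREFECT_API_URL points at it
import urllib.request

from prefect.settings import PREFECT_API_URL

api_url = PREFECT_API_URL.value()  # e.g. http://localhost:4200/api
print("Client API URL:", api_url)

with urllib.request.urlopen(f"{api_url}/health", timeout=5) as resp:
    print("Server healthy:", resp.status == 200)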
3) Deploy and run a flow from your host (a minimal example flow file is sketched after these steps):
- Create a work pool (for local process execution):
  prefect work-pool create my-pool -t process
- Start a worker (in another terminal on your host):
  prefect worker start -p my-pool
- From your project directory, deploy your flow:
  prefect deploy ./path/to/flow.py:my_flow -n my-deployment -p my-pool
- Trigger an ad-hoc run:
  prefect deployment run "my_flow/my-deployment" --watch
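For reference, the flow file referenced by the deploy command could be as small as this sketch (the path ./path/to/flow.py and the flow name my_flow are placeholders; adapt them to your project):

# ./path/to/flow.py - minimal example flow matching the deploy command above (placeholder names)
from prefect import flow


@flow(log_prints=True)
def my_flow(name: str = "world"):
    print(f"Hello, {name}! Running against the self-hosted Prefect server.")


if __name__ == "__main__":
    # Quick local test without a deployment
    my_flow()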
Notes
- The compose file above persists Prefect state to a Docker volume (prefect_data) so restarts don’t reset your server’s database (SQLite by default).
- If port 4200 is in use, change both the container’s --port flag and the host port mapping, and update PREFECT_API_URL accordingly.
- For more advanced/production setups (e.g., Postgres, reverse proxies), see the self-hosting guides.
Docs
- Self-host Prefect Server with Docker
- Self-host with Docker Compose
If you’d like, I can extend the compose file to include a worker service as well so everything runs inside Docker.

Jackson Freitas
09/19/2025, 2:21 PM

Marvin
09/22/2025, 3:15 PM

Marvin
09/22/2025, 3:22 PM
- List your deployments: prefect deployment ls
- Trigger a run: prefect deployment run 'flow-name/deployment-name'
We verified these CLI commands exist in Prefect 3.x:
- prefect deploy --help
- prefect deployment --help
- prefect worker start --help
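If you'd rather check from Python than from the CLI, a small sketch using the Prefect client (it assumes PREFECT_API_URL points at your server) lists registered deployments, mirroring prefect deployment ls:

# list_deployments.py - sketch: list deployments via the Prefect Python client
# Assumes PREFECT_API_URL points at your self-hosted server
import asyncio

from prefect.client.orchestration import get_client


async def main():
    async with get_client() as client:
        for deployment in await client.read_deployments():
            print(deployment.name, deployment.id)


if __name__ == "__main__":
    asyncio.run(main())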
3) Example: add a worker to your docker-compose
Below is a worker service that:
- Connects to your Prefect Server container
- Mounts a host folder to /app inside the worker (so your flow can read files there; an example flow using /app is sketched after the compose snippet)
- Installs the needed integrations using uv at container start, then starts the worker
Assumptions:
- Your Prefect Server service is named prefect-server and exposes 4200
- You already created a process work pool named my-process-pool (if not, see notes below)
services:
  # ... your existing prefect-server service here ...
  prefect-worker:
    image: prefecthq/prefect:3-latest
    depends_on:
      - prefect-server
    environment:
      # Point the worker at your Prefect API in the server container
      PREFECT_API_URL: http://prefect-server:4200/api
      PREFECT_LOGGING_LEVEL: INFO
    volumes:
      # Map a host folder (adjust ./host_app) to /app in the container
      - ./host_app:/app
    # Install uv (if not present) and your integrations with uv, then start the worker
    command: >
      bash -lc "
      python -m pip install -U uv &&
      uv pip install --system
      prefect-airbyte
      prefect-github
      prefect-email
      prefect-databricks
      && prefect worker start --pool my-process-pool --name container-worker-1
      "
    restart: unless-stopped
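To illustrate why the /app mount matters, here is a hypothetical flow (the file name read_input_flow.py and the input file name are made up for this example) that reads a file you drop into ./host_app on the host, since the process worker runs flows with /app mounted:

# /app/read_input_flow.py - hypothetical flow reading from the mounted /app folder
from pathlib import Path

from prefect import flow


@flow(log_prints=True)
def read_input_flow(filename: str = "input.txt"):
    # /app inside the container maps to ./host_app on the host (per the volume above)
    path = Path("/app") / filename
    if path.exists():
        print(f"{path} has {len(path.read_text())} characters")
    else:
        print(f"{path} not found; add a file to ./host_app on the host")


if __name__ == "__main__":
    read_input_flow()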
Notes and quick commands
- Create a process work pool (run on your host, pointed at the same API):
  - prefect work-pool create -t process my-process-pool
- Make sure your host CLI is pointed to the same server (if you’ve published 4200):
  - export PREFECT_API_URL=http://localhost:4200/api
- Register a deployment from your host:
  - From your project root: prefect deploy
  - Select my-process-pool when prompted (or set it in your deployment config)
- Trigger runs without entering the container:
  - UI: open the deployment and click “Run”
  - CLI: prefect deployment run 'flow-name/deployment-name'
  - Python: run_deployment (see the sketch after these notes)
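As a Python alternative to the CLI trigger, here is a small sketch using prefect.deployments.run_deployment (replace the placeholder deployment name; it assumes PREFECT_API_URL points at your server):

# trigger_run.py - sketch: trigger a deployment run from Python
# Assumes PREFECT_API_URL points at your server; the deployment name is a placeholder
from prefect.deployments import run_deployment

if __name__ == "__main__":
    flow_run = run_deployment(
        name="flow-name/deployment-name",  # replace with your actual flow-name/deployment-name
        timeout=0,  # return immediately instead of waiting for the run to finish
    )
    print("Created flow run:", flow_run.id)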
When to use this pattern
- The example above uses a process worker, which runs your flow code in the same environment as the worker. That’s ideal when you:
  - Need simple local file access via a volume (like /app)
  - Want to centrally install Python packages once in the worker container
- If you switch to a Docker or Kubernetes work pool, be sure the runtime image for each flow has these integrations installed (not just the worker image); a rough deploy-with-image sketch follows below.
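For instance, with a Docker work pool you might deploy with an image that bundles the integrations. A rough sketch only (the pool name, deployment name, and image tag are placeholders, and it assumes a Docker-type work pool already exists and Docker is available locally):

# deploy_docker.py - hypothetical sketch: deploy to a Docker work pool with a prepared image
# Assumes a Docker-type work pool named "my-docker-pool" exists; names and tags are placeholders
from prefect import flow


@flow(log_prints=True)
def my_flow():
    print("Hello from a Docker work pool")


if __name__ == "__main__":
    my_flow.deploy(
        name="my-docker-deployment",
        work_pool_name="my-docker-pool",        # placeholder pool name
        image="my-registry/my-flow-image:dev",  # ensure this image installs the integrations (e.g. via requirements.txt)
        push=False,                             # keep the image local for this sketch
    )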
If you share your current docker-compose for the server, I can tailor the worker service exactly to your setup.