# ask-marvin
@Marvin ValueError: Either an image or remote storage location must be provided when deploying a deployment.
That error means Prefect can't tell how your workers will get your flow code at run time. In Prefect 3.x, every deployment must provide one of:
- a container image that already contains your code, or
- a remote storage location (git, object storage, etc.) from which the worker can pull your code.

Quick fixes:
- If you're using the Python API, pass `image=...` to `deploy(...)`, or create the flow from a remote source with `Flow.from_source(...)`.
- If you're using the CLI (`prefect deploy`), define either an image build or a remote code source in your `prefect.yaml`.

Below are concrete examples for both paths.

Python API examples

1) Use a container image (workers will run this image)
```python
from prefect.flows import Flow

Flow.from_source(
    ".",  # local project root used for the image build context
    entrypoint="flows/etl.py:etl",
).deploy(
    name="etl-k8s",
    work_pool_name="kubernetes-pool",
    image="ghcr.io/acme/prefect-etl:0.1.0",  # image your workers can pull
    build=True,   # build the image
    push=True,    # push it to the registry
)
```
2) Use a remote git repository for code
```python
from prefect.flows import Flow

Flow.from_source(
    "https://github.com/your-org/your-repo",   # or git@github.com:org/repo.git
    entrypoint="flows/etl.py:etl",
).deploy(
    name="etl-from-git",
    work_pool_name="process-pool",  # or any pool type
)
```
3) Use remote object storage (S3/GCS/Azure) for code
```python
from prefect.filesystems import RemoteFileSystem
from prefect.flows import Flow

storage = RemoteFileSystem(basepath="s3://my-bucket/prefect-project")  # gs://..., az://... also work
Flow.from_source(
    storage,
    entrypoint="flows/etl.py:etl",
).deploy(
    name="etl-from-s3",
    work_pool_name="process-pool",
)
```
CLI (`prefect deploy`) examples

In your project root, set up a `prefect.yaml` with either a remote code source or an image.

A) Remote source (git), no image required
```yaml
name: my-project
prefect-version: ">=3.0.0"

pull:
- prefect.deployments.steps.git_clone:
    repository: https://github.com/your-org/your-repo
    branch: main

deployments:
- name: etl-from-git
  entrypoint: flows/etl.py:etl
  work_pool:
    name: process
```
Then run:
```bash
prefect deploy
```
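If the repository is private, the `git_clone` pull step in `prefect.yaml` can carry credentials. One documented pattern references a Secret block that holds a personal access token (the block name `my-github-token` here is a placeholder for whatever you create in your workspace):

```yaml
pull:
- prefect.deployments.steps.git_clone:
    repository: https://github.com/your-org/your-private-repo
    access_token: "{{ prefect.blocks.secret.my-github-token }}"
```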
B) Remote source (object storage)
```yaml
name: my-project
prefect-version: ">=3.0.0"

pull:
- prefect.deployments.steps.pull_from_remote_storage:
    url: s3://my-bucket/prefect-project  # gs://... or az://... also work
    requires: s3fs  # use gcsfs / adlfs for GCS / Azure

deployments:
- name: etl-from-s3
  entrypoint: flows/etl.py:etl
  work_pool:
    name: process
```
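The worker pulls code from the bucket at run time, but something still has to put it there. With `prefect-aws` installed, a `push` section in `prefect.yaml` can upload your project when you run `prefect deploy` (the bucket and folder names here are placeholders matching the pull example):

```yaml
push:
- prefect_aws.deployments.steps.push_to_s3:
    requires: prefect-aws
    bucket: my-bucket
    folder: prefect-project
```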
C) Container image (typical for Docker/Kubernetes/ECS)
```yaml
name: my-project
prefect-version: ">=3.0.0"

build:
- prefect_docker.deployments.steps.build_docker_image:
    id: build-image
    requires: prefect-docker
    image_name: ghcr.io/acme/prefect-etl
    tag: "0.1.0"

push:
- prefect_docker.deployments.steps.push_docker_image:
    requires: prefect-docker
    image_name: "{{ build-image.image_name }}"
    tag: "{{ build-image.tag }}"

deployments:
- name: etl-k8s
  entrypoint: flows/etl.py:etl
  work_pool:
    name: kubernetes-pool
    job_variables:
      image: "{{ build-image.image }}"
```
Then:
```bash
prefect deploy
```
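A side note on the `entrypoint` strings used in every example above: they always take the form `path/to/file.py:flow_function_name`. A small stdlib-only sketch (a hypothetical helper, not Prefect's internal code) of how such a string splits into its two parts:

```python
# Hypothetical helper (not Prefect's internal code): shows how an entrypoint
# string like "flows/etl.py:etl" splits into a file path and a flow function name.
def parse_entrypoint(entrypoint: str) -> tuple[str, str]:
    """Split 'path/to/file.py:flow_function_name' into (path, function)."""
    path, sep, func = entrypoint.rpartition(":")
    if not sep or not func or not path.endswith(".py"):
        raise ValueError(
            f"Invalid entrypoint {entrypoint!r}; "
            "expected 'path/to/file.py:flow_function_name'"
        )
    return path, func

print(parse_entrypoint("flows/etl.py:etl"))  # ('flows/etl.py', 'etl')
```

The path is relative to the project root (for the CLI) or to the source you pass to `from_source(...)` (for the Python API).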
Valid ways to provide a "remote storage location":
- Git: public HTTPS or private HTTPS/SSH URLs (e.g., https://github.com/org/repo, git@github.com:org/repo.git)
- Object storage: s3://bucket/prefix, gs://bucket/prefix, az://container/prefix
- Other fsspec-compatible remotes (via `RemoteFileSystem`), e.g., SMB

Common pitfalls:
- Local-only code with remote workers: if your workers run in Kubernetes/ECS/etc., they can't see your laptop's filesystem. Use an image or a remote source.
- Image not pushed: if you set `image` but don't push it to a registry your workers can access, runs will fail. Use `push=True` or push the image yourself.
- Private git repo without credentials: ensure your workers have access (SSH keys, tokens, or environment credentials).
- Incorrect entrypoint: it must be `path/to/file.py:flow_function_name`.
- Mixing old 2.x patterns: in 3.x, do not use `Deployment.build_from_flow()` or `prefect deployment build`. Use `Flow.from_source(...).deploy(...)` or `prefect deploy` instead.

Docs and references:
- Storing flow code (sources, git, object storage): Store flow code for deployments
- Prefect YAML overview: prefect.yaml configuration

Can you share:
- whether you're using the Python API or the CLI?
- your `prefect.yaml` (if using the CLI) or the code snippet you're running (if using Python)?
- your work pool type (process, docker, kubernetes, etc.)?

I'll tailor the fix to your setup.