# ask-marvin
j
@Marvin I have the source for my deployment set to a local directory, and a worker on the same machine with access to that directory. I have added a new file locally that isn't yet tracked in the git repository. When I run my deployment, my run fails as it cannot see the file, despite it being present in the local directory. Why is this?
m
thought for 403.1 seconds
Short answer: because your deployment is running from a Git checkout, not your working directory. In Prefect 3.x, if you create a deployment from a path that’s inside a Git repository, Prefect records the repo URL and a specific commit and adds a git_clone pull step. At runtime, the worker clones that commit into a fresh working directory and runs from there. Untracked files in your local working copy are not part of that commit, so they aren’t present when the run executes, even if the worker is on the same machine.

How to confirm
- Inspect the deployment; you should see a git_clone pull step and a pinned commit:
Copy code
prefect deployment inspect "<flow-name>/<deployment-name>"
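For reference, a Git-backed deployment’s recorded pull step looks roughly like this (the repository URL and branch are purely illustrative; per the above, a pinned commit recorded at deploy time may appear as well):
Copy code
pull:
  - prefect.deployments.steps.git_clone:
      repository: https://github.com/your-org/your-repo.git  # illustrative URL
      branch: main  # or a specific commit recorded at deploy time, as described above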
Docs: Deployment versioning and Pull steps (git_clone, set_working_directory, etc.)

Ways to fix
- Commit the new file and redeploy. Since runs pull a specific commit, the file must be tracked and committed to be included.
- If you really want to run against a local working copy (not recommended for reproducibility), override the pull steps to just set the working directory on the worker instead of cloning Git. Example prefect.yaml snippet:
Copy code
deployments:
  - name: local-dev
    entrypoint: path/to/flow.py:flow
    pull:
      - prefect.deployments.steps.pull.set_working_directory:
          directory: /absolute/path/to/your/working/copy
Note: this only works if the worker can access that exact path and you accept the lack of version pinning.

If your project is not in Git (or you point to a directory outside any Git repo), Prefect won’t add a git_clone step and will instead package the local source for the run; in that case, local files are included.

If helpful, share your prefect.yaml and the output of prefect deployment inspect, and I can point to the exact setting causing this.
j
@Marvin There does not seem to be a git_clone step present. Here is my full inspect output:
Copy code
{
    'id': '02cb2db8-0828-49fb-9dea-2c53e5887f2f',
    'created': '2025-09-19T22:22:01.365984Z',
    'updated': '2025-09-22T22:29:23.422000Z',
    'name': 'build_trenton_TEST',
    'version': '42f082c725070d78eaac3071cabbe76a',
    'description': 'None',
    'flow_id': 'd9966598-165d-422a-83e4-54441da4bf67',
    'concurrency_limit': None,
    'global_concurrency_limit': None,
    'concurrency_options': None,
    'paused': False,
    'schedules': [],
    'job_variables': {},
    'parameters': {},
    'pull_steps': [{'prefect.deployments.steps.set_working_directory': {'directory': '/home/intel/repos/hfw-ci-prefect'}}],
    'tags': [],
    'labels': {'prefect.flow.id': 'd9966598-165d-422a-83e4-54441da4bf67'},
    'work_queue_name': 'default',
    'last_polled': '2025-09-22T22:29:23.422004Z',
    'parameter_openapi_schema': {'title': 'Parameters', 'type': 'object', 'properties': {}},
    'path': None,
    'entrypoint': 'src/jobs/trenton/build.py:FLOW_build',
    'storage_document_id': None,
    'infrastructure_document_id': None,
    'created_by': None,
    'updated_by': None,
    'work_queue_id': None,
    'enforce_parameter_schema': True,
    'work_pool_name': 'build',
    'status': 'READY',
    'automations': []
}
m
thought for 114.1 seconds
Thanks for sharing the inspect output — that clarifies it. You’re not using a git clone at runtime. Your deployment’s pull steps only do:
Copy code
{'prefect.deployments.steps.set_working_directory': {'directory': '/home/intel/repos/hfw-ci-prefect'}}
So Prefect is just chdir’ing into that path and running from the local filesystem; there’s no packaging or copying of files here. Given that, your new untracked file should be visible on disk, unless one of these is happening:

1) The worker is containerized (Docker/K8s)
- Even on the same machine, a container cannot see host files unless you mount them.
- If your “build” work pool is Docker or Kubernetes, the path /home/intel/repos/hfw-ci-prefect inside the container won’t contain your host files (and thus your new file).
- Action: check your pool type:
Copy code
prefect work-pool inspect build
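If that shows a Docker pool, one way to make the host path visible inside the container is a volume mount via the deployment’s work pool job variables. A minimal sketch, assuming the standard Docker worker volumes variable (host:container syntax; paths copied from your inspect output, adjust as needed):
Copy code
deployments:
  - name: build_trenton_TEST
    entrypoint: src/jobs/trenton/build.py:FLOW_build
    work_pool:
      name: build
      job_variables:
        # mount the host checkout at the same path inside the container
        volumes:
          - /home/intel/repos/hfw-ci-prefect:/home/intel/repos/hfw-ci-prefect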
- If it’s Docker/K8s, add a volume mount for that host path (as sketched above) or switch to a Git-based source.

2) Python is importing an installed package instead of your working tree (src/ layout)
- Your entrypoint is inside src/…, which is fine, but imports inside your code may resolve to an already-installed package from site-packages. New files you just added won’t be present there unless you reinstall.
- Common fixes:
- Add a pull step to install your project in editable mode so the runtime sees local files:
Copy code
deployments:
  - name: build_trenton_TEST
    entrypoint: src/jobs/trenton/build.py:FLOW_build
    pull:
      - prefect.deployments.steps.set_working_directory:
          directory: /home/intel/repos/hfw-ci-prefect
      - prefect.deployments.steps.utility.run_shell_script:
          script: pip install -e .
- Or set PYTHONPATH so Python can import directly from your src directory:
Copy code
job_variables:
  env:
    PYTHONPATH: /home/intel/repos/hfw-ci-prefect/src
3) It’s a data/config file and your code is looking in a different directory
- Confirm the runtime cwd is what you expect and reference files via absolute paths derived from the cwd or the file location.

4) Permissions
- If the worker runs as another user, ensure that user can read the file.

Next steps I recommend
- Share the output of prefect work-pool inspect build (to confirm the worker type) and the exact error you see (ImportError, FileNotFoundError, etc.).
- If you are using Docker/K8s, confirm/mount the host path into the container.
- If you’re on a process worker, try the editable install step above or set PYTHONPATH.

Docs
- Pull steps (set_working_directory, run_shell_script)
- Run flows in local processes

If you paste the error and pool inspect output, I can point you to the precise fix.
j
@Marvin I figured out the error. My directory structure is "/home/user/repos/project/src/common", and my importing file was "/home/user/repos/project/src/job_code". I specified my source as "/home/user/repos/". I changed my source to "/home/user/repos/src" and it started to work as expected.
m
thought for 26.5 seconds
Nice catch; that explains it. Because your deployment’s pull step is just set_working_directory, Prefect runs your flow with the working directory set to whatever you choose as the “source.” With a src-layout project, Python can only import your modules if either:
- the working directory is the src folder, or
- PYTHONPATH includes src, or
- the project is installed (e.g., pip install -e .)

By changing the source to /home/user/repos/src, your modules (common, job_code) became importable from the runtime CWD, so the new file showed up as expected.

If you’d prefer to keep the source at the project root, here are equivalent options:
- Add PYTHONPATH so Python can import from src:
Copy code
deployments:
  - name: your-deployment
    entrypoint: src/job_code/flow.py:flow
    pull:
      - prefect.deployments.steps.set_working_directory:
          directory: /home/user/repos/project
    job_variables:
      env:
        PYTHONPATH: /home/user/repos/project/src
- Or install your project in editable mode at runtime:
Copy code
deployments:
  - name: your-deployment
    entrypoint: src/job_code/flow.py:flow
    pull:
      - prefect.deployments.steps.set_working_directory:
          directory: /home/user/repos/project
      - prefect.deployments.steps.utility.run_shell_script:
          script: pip install -e .
Docs for pull steps: set_working_directory and run_shell_script.

If you want fully reproducible runs across machines, consider using a Git source and pinning a commit; otherwise your current approach is fine for local dev.
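If you do go that route, a sketch of what the pull section could look like (repository URL and branch are illustrative, and the new file must be committed and pushed so the worker’s clone contains it):
Copy code
deployments:
  - name: your-deployment
    entrypoint: src/job_code/flow.py:flow
    pull:
      - prefect.deployments.steps.git_clone:
          repository: https://github.com/your-org/your-repo.git  # illustrative
          branch: main  # or pin a specific commit for reproducibility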