Chris Arderne 12/22/2022, 7:56 AM
(more details in thread!)

1. 🏗️ Structuring project: We're using v2 with a GCS Storage block, a KubernetesJob block and a Flow Deployment (all three defined in Python files). Because we're using GCS Storage, the flow files are not in the Docker image, so we can only use relative imports in the flow like `from .utils import foo`, which can make local development and bug-checking a bit more error-prone. We will probably shift to just using the Docker image's local files, but haven't yet experimented with this in Prefect v2. Are there any best practices around this? (Project structure in thread)

2. 🏃♀️ Running flows: Prefect v1 had a `prefect run ...` command in the CLI, which made it possible to run a local flow. Prefect v2 doesn't seem to have this? We can't add an `if __name__ == "__main__"` block to the flow file, because it has relative imports, so running it directly doesn't work. As a workaround I've added a `run.py` in a parent directory that imports and runs it, but this isn't very ergonomic. In either case, if I want to pass parameters I must create my own CLI for doing so... Have I missed something in the CLI, or is there a plan to add this? Or is there a better pattern I should be using instead?
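A sketch of the `run.py` workaround described above. The flow name, parameters, and CLI flags are hypothetical, and a plain function stands in for the actual Prefect flow so the snippet is self-contained:

```python
# run.py -- hypothetical wrapper living above src/ (names are illustrative,
# not from the real repo). It imports the flow through the package so
# relative imports resolve, and forwards CLI parameters to it.
# Usage would be e.g.:  python run.py --name etl --retries 2
import argparse

# In the real project this would be:
#     from src.flows.flow import my_flow
# A plain function stands in for the Prefect flow here.
def my_flow(name: str, retries: int = 0) -> str:
    return f"ran {name} with retries={retries}"

def main(argv=None):
    parser = argparse.ArgumentParser(description="Ad-hoc local flow runner")
    parser.add_argument("--name", required=True)
    parser.add_argument("--retries", type=int, default=0)
    args = parser.parse_args(argv)
    # Call the flow like an ordinary function, passing the parsed parameters.
    return my_flow(args.name, retries=args.retries)
```

This is exactly the "create my own CLI" pain point: every parameter has to be mirrored as an `argparse` argument by hand.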
```
.
├── Dockerfile
├── requirements.txt
└── src
    ├── blocks.py
    ├── deployments.py
    ├── flows
    │   ├── __init__.py
    │   ├── flow.py
    │   └── utils
    │       └── __init__.py
    └── run.py
```
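The relative-import failure in question 2 can be reproduced without Prefect at all. This standalone sketch builds a throwaway copy of the `src/flows` layout (file contents are illustrative, not from the real repo) and shows that running `flow.py` directly fails, while running it as a module from the project root works:

```python
# Demonstrates why `python src/flows/flow.py` breaks on `from .utils import foo`
# but `python -m src.flows.flow` does not. Layout mirrors the tree above;
# the foo() body is made up for the demo.
import subprocess
import sys
import tempfile
import textwrap
from pathlib import Path

def demo():
    root = Path(tempfile.mkdtemp())
    pkg = root / "src" / "flows"
    (pkg / "utils").mkdir(parents=True)
    (root / "src" / "__init__.py").write_text("")
    (pkg / "__init__.py").write_text("")
    (pkg / "utils" / "__init__.py").write_text(
        "def foo():\n    return 'hello from utils'\n"
    )
    (pkg / "flow.py").write_text(textwrap.dedent("""\
        from .utils import foo

        if __name__ == "__main__":
            print(foo())
    """))
    # Running the file directly: Python sees no parent package, so the
    # relative import raises ImportError and the process exits non-zero.
    direct = subprocess.run(
        [sys.executable, str(pkg / "flow.py")],
        capture_output=True, cwd=root,
    )
    # Running as a module from the project root: the package context exists,
    # so the relative import resolves and the flow body runs.
    as_module = subprocess.run(
        [sys.executable, "-m", "src.flows.flow"],
        capture_output=True, cwd=root, text=True,
    )
    return direct.returncode, as_module.stdout.strip()
```

So `python -m src.flows.flow` from the repo root is one lighter-weight alternative to a separate `run.py`, though it still leaves the parameter-passing problem unsolved.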
3. Deploying: We'd rather not use `prefect deployment ...`, as it means configuring a bunch of stuff in a CLI argument and seems like a messier CI/CD story; what am I missing there? Version-controlling the deployment YAMLs and editing them directly doesn't seem like an option, as they'll immediately get out of date with the blocks? What's the best pattern here? (What I'm doing just seems weird and inconsistent, doing the blocks via the CLI (`prefect block register -f src/blocks.py`) and the deployments by running a file (`python src/deployments.py`), and it has implications for the kinds of imports each file can do.)
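For reference, here is a sketch of what a Python-defined `deployments.py` can look like in Prefect 2, using `Deployment.build_from_flow` and loading the already-registered blocks. The block names, flow name, and import path are assumptions for illustration, not taken from the thread, and applying it requires a reachable Prefect API:

```python
# deployments.py -- sketch of defining a deployment in Python (Prefect 2.x).
# Block names and the flow import below are hypothetical.
from prefect.deployments import Deployment
from prefect.filesystems import GCS
from prefect.infrastructure import KubernetesJob

from flows.flow import my_flow  # assumed flow location/name

# Load blocks previously registered via `prefect block register -f src/blocks.py`
storage = GCS.load("my-gcs-block")            # assumed block name
k8s_job = KubernetesJob.load("my-k8s-block")  # assumed block name

deployment = Deployment.build_from_flow(
    flow=my_flow,
    name="my-flow-deployment",
    storage=storage,
    infrastructure=k8s_job,
)

if __name__ == "__main__":
    deployment.apply()  # registers the deployment with the Prefect API
```

Keeping both blocks and deployments as Python files like this at least gives one consistent, version-controlled path for CI/CD, instead of splitting state between CLI arguments and YAML.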