Leo Kacenjar
01/30/2023, 8:01 PM
prefect deployment build /workspace/flows/endpoints.py:run_flow --path /workspace --name endpoints -q dev -ib docker-container/buddy -o endpoints.yaml
And then when I try to run it, I get an error that makes it seem like the flow code is still being copied from somewhere.
20:31:19.125 | ERROR | Flow run 'loyal-jaguarundi' - Flow could not be retrieved from deployment.
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/prefect/engine.py", line 269, in retrieve_flow_then_begin_flow_run
flow = await load_flow_from_flow_run(flow_run, client=client)
File "/usr/local/lib/python3.10/site-packages/prefect/client/utilities.py", line 47, in with_injected_client
return await fn(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/prefect/deployments.py", line 175, in load_flow_from_flow_run
await storage_block.get_directory(from_path=deployment.path, local_path=".")
File "/usr/local/lib/python3.10/site-packages/prefect/filesystems.py", line 147, in get_directory
copytree(from_path, local_path, dirs_exist_ok=True)
File "/usr/local/lib/python3.10/shutil.py", line 559, in copytree
return _copytree(entries=entries, src=src, dst=dst, symlinks=symlinks,
File "/usr/local/lib/python3.10/shutil.py", line 513, in _copytree
raise Error(errors)
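(Rough sketch of the step that fails in the traceback above: when a deployment has no remote storage block attached, the flow run falls back to a LocalFileSystem block and tries to copy deployment.path into its working directory, which is the copytree call shown. This assumes Prefect 2.x; the /workspace path is taken from the build command above.)

import asyncio
from prefect.filesystems import LocalFileSystem

async def main():
    # Effective storage when no --storage-block/-sb is passed to `prefect deployment build`.
    storage = LocalFileSystem()
    # Same call as in the traceback; it runs shutil.copytree(deployment.path, ".", dirs_exist_ok=True).
    # This only succeeds if /workspace already holds the flow code inside the flow-run
    # container (baked into the image or mounted in); otherwise the run fails with
    # "Flow could not be retrieved from deployment."
    await storage.get_directory(from_path="/workspace", local_path=".")

asyncio.run(main())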
Christopher Boyd
01/31/2023, 2:46 PM
Leo Kacenjar
01/31/2023, 4:42 PM
> I'm not sure what you're looking to accomplish here - is there an issue with adding the code into the docker container directly?
Our team leverages feature-branch-based development, so our individual data projects and bug fixes live on short- or long-lived git branches. Our data engineers work locally on their laptops. We use GitLab CI for automated testing, and we have some automated QA that runs all of our pipelines and compares output between branches. Right now we use the same image for local development, CI/CD, and production. This gives us really great parity between environments and makes debugging and dependency management pretty easy. We don't bake the code into the container because we'd need to create a copy of the image every time we create a feature branch. It would also make running flows from different branches more laborious, and I'm not sure how it would work for local development.
> If there is, where is your code located normally?
Our code is stored in a GitLab repository. It's checked out locally for development, during CI/CD, and on our production server.
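(One hedged option for the branch-per-deployment workflow described above is a git-based storage block, so each deployment pulls its branch at run time instead of baking code into the image. The sketch below assumes the prefect-gitlab collection is installed; the repository URL, branch, token, and block name are placeholders, not values from this thread.)

# Register one GitLab storage block per feature branch; the flow-run container
# then clones that branch at run time instead of expecting /workspace to exist.
from prefect_gitlab.credentials import GitLabCredentials
from prefect_gitlab.repositories import GitLabRepository

gitlab_repo = GitLabRepository(
    repository="https://gitlab.com/my-org/my-pipelines.git",      # placeholder URL
    reference="feature/my-branch",                                # branch to pull at run time
    credentials=GitLabCredentials(token="<read-only token>"),     # placeholder token
)
gitlab_repo.save("buddy-feature-branch", overwrite=True)

# The saved block could then be referenced at build time, e.g.
#   prefect deployment build ... -sb gitlab-repository/buddy-feature-branch
# or passed as storage= to Deployment.build_from_flow.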
Christopher Boyd
01/31/2023, 4:48 PM
Leo Kacenjar
01/31/2023, 4:50 PM
Christopher Boyd
01/31/2023, 4:51 PM
Leo Kacenjar
01/31/2023, 4:51 PM
Christopher Boyd
01/31/2023, 4:53 PM
Leo Kacenjar
01/31/2023, 4:56 PM
Christopher Boyd
01/31/2023, 4:57 PM
>>> from prefect.deployments import Deployment
>>> from prefect.filesystems import S3
>>> storage = S3.load("dev-bucket")  # load a pre-defined block
>>> deployment = Deployment.build_from_flow(
...     flow=my_flow,
...     name="s3-example",
...     version="2",
...     tags=["aws"],
...     storage=storage,
...     infra_overrides={"env.PREFECT_LOGGING_LEVEL": "DEBUG"},
... )
>>> deployment.apply()
But instead of S3.load("dev-bucket") you can create one on the fly to use as your storage.
Leo Kacenjar
01/31/2023, 4:59 PM
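(A minimal sketch of the "create one on the fly" suggestion from 4:57 PM above, reusing the flow and queue names from the build command at the top of the thread; the bucket path and the flow import path are placeholders, and s3fs would need to be available for the S3 block to work.)

from prefect.deployments import Deployment
from prefect.filesystems import S3
from prefect.infrastructure import DockerContainer

from flows.endpoints import run_flow  # flow from the original build command

# Storage block created inline rather than loaded with S3.load();
# build_from_flow uploads the flow files to it unless skip_upload=True.
storage = S3(bucket_path="my-bucket/endpoints")  # placeholder bucket

deployment = Deployment.build_from_flow(
    flow=run_flow,
    name="endpoints",
    work_queue_name="dev",
    storage=storage,
    infrastructure=DockerContainer.load("buddy"),  # the existing infra block from the thread
)
deployment.apply()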