# prefect-community
b
Hi Everyone, I'm having an issue with deployment storage. This deployment is using the `process` infrastructure and the `s3` block for filesystem storage. I build my deployment, the YAML is generated, and my flow code is uploaded to S3. I have Prefect Orion running locally, and this deployment is applied to my local Prefect instance. I then start a local agent within my Python virtual environment. In Orion, I click `Run` on my deployment. The flow run is moved to my local work queue, my local agent picks up the flow run and builds the infrastructure. The agent then runs the Prefect engine, which begins pulling my flow code from S3. In doing so, I receive this flow run failure log:
```
Flow could not be retrieved from deployment.
Traceback (most recent call last):
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\engine.py", line 254, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_flow_run(flow_run, client=client)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\client.py", line 104, in with_injected_client
    return await fn(*args, **kwargs)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\deployments.py", line 55, in load_flow_from_flow_run
    await storage_block.get_directory(from_path=None, local_path=".")
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\filesystems.py", line 399, in get_directory
    return await self.filesystem.get_directory(
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\filesystems.py", line 260, in get_directory
    return self.filesystem.get(from_path, local_path, recursive=True)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\fsspec\spec.py", line 801, in get
    self.get_file(rpath, lpath, **kwargs)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\fsspec\spec.py", line 769, in get_file
    outfile = open(lpath, "wb")
FileNotFoundError: [Errno 2] No such file or directory: 'C:/Users/BSTEFA~1/AppData/Local/Temp/tmp5t7vj96aprefect/aurora\\aurora_constants.py'
```
Expectation: I would expect the Prefect engine to use S3 file paths when pulling down the flow code, but it seems to be using local file paths. Is this an issue with my deployment/flow setup, or is this a bug?
Environment:
• Windows 10
• Python 3.10.5
• Prefect 2.2.0
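The mixed separators in the failing path (`...Temp/tmp5t7vj96aprefect/aurora\\aurora_constants.py`) hint at what is going wrong: a POSIX-style key from remote storage gets joined onto a Windows local path, and the `open(lpath, "wb")` shown in the traceback then fails because the intermediate `aurora` directory was never created. A minimal stdlib reproduction of that failure mode (the file names are hypothetical, and this is a sketch of the symptom, not Prefect's or fsspec's actual code path):

```python
import os
import tempfile

def write_like_the_traceback(lpath: str, data: bytes) -> None:
    # Mimics the failing frame in the log: open the local path for
    # writing without first creating its parent directories.
    with open(lpath, "wb") as f:
        f.write(data)

def write_with_parents(lpath: str, data: bytes) -> None:
    # Defensive version: ensure the parent directory exists before
    # opening the file.
    os.makedirs(os.path.dirname(lpath), exist_ok=True)
    with open(lpath, "wb") as f:
        f.write(data)

tmp = tempfile.mkdtemp(prefix="tmpprefect")
# Hypothetical flow file nested one directory below the temp root,
# mirroring the traceback above.
target = os.path.join(tmp, "aurora", "aurora_constants.py")

try:
    write_like_the_traceback(target, b"CONSTANT = 1\n")
except FileNotFoundError as exc:
    print("fails like the log, errno:", exc.errno)  # errno 2

write_with_parents(target, b"CONSTANT = 1\n")
print(os.path.exists(target))  # True
```

The point of the sketch is only that `FileNotFoundError` on `open(..., "wb")` means the destination directory is missing, which is consistent with the download step resolving paths incorrectly on Windows.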
b
Hey Blake, thanks for reaching out! We also received your case through the chat bot. We'll continue this dialogue through email if that's okay with you.
b
Hi Bianca, thank you for looking at this. Happy to continue with email!
p
Hi @Bianca Hoch I believe I am running into a similar issue: https://prefect-community.slack.com/archives/CL09KU1K7/p1661204463771199
r
Hey everyone! I am also running into this very weird error of absolute local paths showing up in remote flow runs. I have a hunch that it might be caused by the Deployment not “accepting” the storage that I try to configure and thus falling back to the local storage. I am defining the deployment via Python like this:
```python
from prefect.deployments import Deployment
from prefect.filesystems import GCS
from prefect.infrastructure import KubernetesJob

storage_block = GCS.load("dev")
infrastructure_block = KubernetesJob.load("dev")

deployment = Deployment.build_from_flow(
    flow=orion_flow,
    name="sample-deployment",
    version=1,
    work_queue_name="kubernetes",
    storage=storage_block,
    infrastructure=infrastructure_block,
)
deployment.apply()
```
But when I inspect the deployment that was created, the storage does not seem to be configured at all. Would be super happy about any hints whatsoever :)
j
This PR has been merged and I think it should fix your problem. I expect the next release will be out by early next week.
r
Perfect, thanks
p
Thank you so much! I patched my env with that change and I finally got my environment to work!