# prefect-community

Blake Stefansen

09/26/2022, 2:32 PM
Hey y'all, I'm getting this weird error when running a prefect 2 deployment from a custom agent running in an ubuntu docker container. My flow code cannot be found when downloading from my s3 storage block during a deployment run. I normally upload the flow code when building my deployment. I can confirm that my flow file definitely exists in my s3 bucket. Any ideas?

Khuyen Tran

09/26/2022, 2:41 PM
I haven't seen this error before. Do you get the same error when running the deployment on your local machine?

Jeff Hale

09/26/2022, 2:46 PM
Hi Blake. Can you please move the code and error to this thread to keep the main channel clean?

Blake Stefansen

09/26/2022, 2:47 PM
Deployment
from prefect.deployments import Deployment
from prefect.filesystems import S3

from flows.prefect_isp_flow import isp_flow  # flow defined in flows/prefect_isp_flow.py

storage = S3.load("isp-flow-filesystem")  # load a pre-defined block
deployment = Deployment.build_from_flow(
    flow=isp_flow,
    name="isp_flow_deployment",
    tags=[],
    storage=storage,
    skip_upload=False,
    work_queue_name="kubernetes",
    apply=True
)
Flow Run
Flow could not be retrieved from deployment.
Traceback (most recent call last):
  File "<frozen importlib._bootstrap_external>", line 879, in exec_module
  File "<frozen importlib._bootstrap_external>", line 1016, in get_code
  File "<frozen importlib._bootstrap_external>", line 1073, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpm90cai6aprefect/flows\\prefect_isp_flow.py'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/prefect/engine.py", line 257, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_flow_run(flow_run, client=client)
  File "/usr/local/lib/python3.10/dist-packages/prefect/client/orion.py", line 82, in with_injected_client
    return await fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/prefect/deployments.py", line 70, in load_flow_from_flow_run
    flow = await run_sync_in_worker_thread(import_object, str(import_path))
  File "/usr/local/lib/python3.10/dist-packages/prefect/utilities/asyncutils.py", line 57, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(call, cancellable=True)
  File "/usr/local/lib/python3.10/dist-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/dist-packages/prefect/utilities/importtools.py", line 193, in import_object
    module = load_script_as_module(script_path)
  File "/usr/local/lib/python3.10/dist-packages/prefect/utilities/importtools.py", line 156, in load_script_as_module
    raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'flows\\prefect_isp_flow.py' encountered an exception
๐Ÿ™ 1
@Khuyen Tran So if I run a deployment with my agent running locally, I get this error. My docker agent is running ubuntu, but my local agent is using windows 10
Encountered exception during execution:
Traceback (most recent call last):
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\engine.py", line 1216, in orchestrate_task_run
    result = await run_sync(task.fn, *args, **kwargs)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\prefect\utilities\asyncutils.py", line 57, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(call, cancellable=True)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "C:\Users\BSTEFA~1\AppData\Local\Temp\tmprw8ls9cjprefect\flows\prefect_isp_flow.py", line 42, in download_isp_file
    s3.download_file(
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\boto3\s3\inject.py", line 190, in download_file
    return transfer.download_file(
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\boto3\s3\transfer.py", line 320, in download_file
    future.result()
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\futures.py", line 103, in result
    return self._coordinator.result()
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\futures.py", line 266, in result
    raise self._exception
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\tasks.py", line 139, in __call__
    return self._execute_main(kwargs)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\tasks.py", line 162, in _execute_main
    return_value = self._main(**kwargs)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\download.py", line 642, in _main
    fileobj.seek(offset)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\utils.py", line 378, in seek
    self._open_if_needed()
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\utils.py", line 361, in _open_if_needed
    self._fileobj = self._open_function(self._filename, self._mode)
  File "C:\Users\BStefansen\.virtualenvs\lbx-broadband-pipeline-4k_4F-H8\lib\site-packages\s3transfer\utils.py", line 272, in open
    return open(filename, mode)
FileNotFoundError: [Errno 2] No such file or directory: './data\\TX_CSV_MOCK.csv.570556bc'
Is it possible that uploading my flow code from Windows would mess up the file paths when trying to download everything on Ubuntu? Windows sometimes uses `\` for file paths, and I wonder if that's the problem.
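The suspicion above can be checked directly with Python's standard-library `pathlib`: a backslash is a path separator only on Windows, so a relative path written on Windows collapses into one opaque file name on Linux. A minimal sketch illustrating the mismatch (not Prefect code):

```python
from pathlib import PurePosixPath, PureWindowsPath

# A path as written by Windows tooling (a single literal backslash).
win_style = "flows\\prefect_isp_flow.py"

# On Windows, the backslash is a separator, so the file name resolves cleanly.
assert PureWindowsPath(win_style).name == "prefect_isp_flow.py"

# On Linux, the backslash is an ordinary character, so the whole string is
# treated as a single file name -- which is why the agent looks for a file
# literally named 'flows\prefect_isp_flow.py' inside the temp directory.
assert PurePosixPath(win_style).name == "flows\\prefect_isp_flow.py"
```

This matches the traceback: `/tmp/tmpm90cai6aprefect/flows\prefect_isp_flow.py` is one (nonexistent) file name, not a `flows/` subdirectory.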
Nevermind, it looks like the `data` folder never got uploaded with the flow code, perhaps because the folder is empty? I don't see it in my `.prefectignore` file. Sorry @Khuyen Tran: yes, the deployment does work on my local Windows agent now.
It looks like the data folder issue was unrelated, as expected. I still get the same error with my Ubuntu Docker agent:
Flow could not be retrieved from deployment.
Traceback (most recent call last):
  File "<frozen importlib._bootstrap_external>", line 879, in exec_module
  File "<frozen importlib._bootstrap_external>", line 1016, in get_code
  File "<frozen importlib._bootstrap_external>", line 1073, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpqwjpl7japrefect/flows\\prefect_isp_flow.py'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/prefect/engine.py", line 257, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_flow_run(flow_run, client=client)
  File "/usr/local/lib/python3.10/dist-packages/prefect/client/orion.py", line 82, in with_injected_client
    return await fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/prefect/deployments.py", line 70, in load_flow_from_flow_run
    flow = await run_sync_in_worker_thread(import_object, str(import_path))
  File "/usr/local/lib/python3.10/dist-packages/prefect/utilities/asyncutils.py", line 57, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(call, cancellable=True)
  File "/usr/local/lib/python3.10/dist-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/dist-packages/prefect/utilities/importtools.py", line 193, in import_object
    module = load_script_as_module(script_path)
  File "/usr/local/lib/python3.10/dist-packages/prefect/utilities/importtools.py", line 156, in load_script_as_module
    raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'flows\\prefect_isp_flow.py' encountered an exception
I can see that the `tmp` folder does exist on the Ubuntu agent, but not `tmpqwjpl7japrefect`. I believe `tmpqwjpl7japrefect` should be created automatically when the flow code is downloaded?
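On the temp-directory question: Python's `tempfile` module creates directories of the form `tmpXXXXXXXXprefect` on demand, so the agent does not need it to pre-exist. A rough sketch of the general pattern (an assumption about how such temp dirs arise, not Prefect's actual implementation):

```python
import os
import tempfile

# mkdtemp creates a fresh 'tmp<random><suffix>' directory and returns its path.
tmpdir = tempfile.mkdtemp(suffix="prefect")
assert os.path.isdir(tmpdir)

# Joining a path that still contains a Windows backslash produces a single
# odd file name on Linux instead of a 'flows' subdirectory:
bad = os.path.join(tmpdir, "flows\\prefect_isp_flow.py")
assert not os.path.exists(bad)  # nothing was ever written under that name
```

So the directory itself is not the problem; the backslash in the entrypoint is.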
So, if anyone's curious, I figured it out. The issue is with the deployment entrypoint. I've resorted to using the YAML generated from the CLI. When creating a YAML on Windows, the file path uses `\`, which is not compatible with Linux, so I manually change the file path to use `/` instead:
entrypoint: flows/prefect_isp_flow.py:isp_flow
My suggestion would be for Prefect to generate the deployment entrypoint with an OS-agnostic file path syntax. This resolved my issue, and it took a lot of digging to figure out.
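The manual fix above can also be done in code with the standard library: `PureWindowsPath.as_posix()` rewrites backslashes as forward slashes, which Windows also accepts, so the result runs on both platforms. A small hypothetical helper (`to_portable_entrypoint` is not a Prefect API, just a sketch):

```python
from pathlib import PureWindowsPath

def to_portable_entrypoint(entrypoint: str) -> str:
    """Rewrite a 'path:flow_function' entrypoint to use forward slashes."""
    path, sep, func = entrypoint.partition(":")
    return PureWindowsPath(path).as_posix() + sep + func

print(to_portable_entrypoint("flows\\prefect_isp_flow.py:isp_flow"))
# -> flows/prefect_isp_flow.py:isp_flow
```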

Jeff Hale

09/27/2022, 1:28 PM
Thank you, Blake. We have an open issue on this topic that you can follow here. I referenced this thread, but feel free to add any additional info.