# ask-community
Hey community, I have previously worked with Docker storage for all my flows, but as my repo has grown it has slowed down my CI process, so I am switching to S3 storage. I have a Docker image baked with all the dependencies, files, etc. for the repo. I have set up my storage like so:
from prefect.storage import S3  # Prefect 1.x import path

S3(
    bucket="es-prefect-flows-staging",
    stored_as_script=True,
)
Now after registering my flow I am getting an error after the flow is downloaded (in 🧵):
Failed to load and execute Flow's environment: ImportError("cannot import name 'some_flow' from 'flows.tab.tasks' (/flows/tab/tasks.py)")
I built this image locally and the tasks file is located at /flows/tab/tasks.py.
Wondering how you'd suggest debugging something like this?
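Not from the thread, but one way to poke at an error like this: the traceback shows that flows.tab.tasks was found at /flows/tab/tasks.py, so the module itself imported and it's the name some_flow that's missing from it. A quick sanity check inside the baked image (e.g. docker run -it <image> python) could be:

# Check what the copy of flows/tab/tasks.py baked into the image exposes.
from flows.tab import tasks

print(tasks.__file__)        # confirm which tasks.py was imported
print(sorted(dir(tasks)))    # is `some_flow` actually defined in this copy?

import sys
print(sys.path)              # and what the module search path looks like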
Maybe something to do with the PYTHONPATH, but not sure how I can set this on the S3 storage object? Do I need to set it on the run config?
answered my own question 😆
😄 nice job
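The fix itself isn't shown in the thread, but on the PYTHONPATH question: in Prefect 1.x, environment variables go on the run config rather than on the storage object, so a minimal sketch (assuming a Docker agent; the flow and image names below are made up) would be:

from prefect import Flow
from prefect.run_configs import DockerRun

with Flow("tab-flow") as flow:   # hypothetical flow name
    ...

# env set on the run config is injected into the flow-run container,
# so PYTHONPATH can point at the directory holding the flows/ package.
flow.run_config = DockerRun(
    image="my-registry/flows-image:latest",   # hypothetical image
    env={"PYTHONPATH": "/"},
)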