# prefect-community
b
Hey guys, stuck on something rather weird, so I'm trying to figure out how to resolve it; hopefully someone here might be able to help. After migrating a flow to use GitLab as Storage, things seem to be working fine except for flows which actually import things from a GitLab subdirectory. I keep getting `ModuleNotFoundError("No module named 'tasks'")`. My Storage repository file structure looks a bit like this:
```
root-of-repo/
├── flow_hello.py
└── tasks/
    ├── __init__.py
    └── hello.py
```
And the in-code flow config looks like this:
```python
flow.storage = GitLab(
    repo="XXXXX",
    host="XXXXX",
    path="flows/flow_hello.py",
    secrets=["GITLAB_ACCESS_TOKEN"],
    ref="master",
)
```
But it seems to be failing to resolve the import so I'm a little bit confused. Am I somehow able to add it to the path of the agent despite it being a GitLab storage?
z
The GitLab storage pulls a single file from your repo to load your flow object, so unless the rest of your module happens to be available wherever your agent is running, you'll run into a missing-module error.
Welcome by the way 🙂
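To illustrate the point above: if the shared `tasks` package is importable in the agent's environment (for instance, pip-installed there from the repo), the single fetched flow file resolves its imports fine. This stdlib-only sketch recreates the hypothetical repo layout in a temp directory and puts it on `sys.path` to stand in for an installed package:

```python
import os
import runpy
import sys
import tempfile

# Hypothetical sketch: when the repo's packages are on the agent's
# Python path, the lone flow file fetched by GitLab storage imports fine.
with tempfile.TemporaryDirectory() as repo:
    # Recreate the repo layout: tasks/ package plus flow_hello.py
    pkg = os.path.join(repo, "tasks")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write("")
    with open(os.path.join(pkg, "hello.py"), "w") as f:
        f.write("def say_hello():\n    return 'hello'\n")
    flow_file = os.path.join(repo, "flow_hello.py")
    with open(flow_file, "w") as f:
        f.write("from tasks.hello import say_hello\nresult = say_hello()\n")

    sys.path.insert(0, repo)  # stands in for pip-installing the package
    try:
        namespace = runpy.run_path(flow_file)  # loads the flow file successfully
    finally:
        sys.path.remove(repo)

print(namespace["result"])  # hello
```

In practice this means packaging the shared code (or baking it into the agent's image/environment) so every machine that runs flows can import it, independent of what the storage fetches.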
b
Hehe, thank you! Appreciate the rapid response. So, going to ask an annoying follow-up question: what's the recommendation when productionizing? Our setup so far runs a local agent which gets updated once changes are detected and re-registers the flows that changed. The issue we've hit with that, though, is that when taking the agent down to update dependencies, ongoing tasks don't recover and instead fail with a heartbeat timeout. We thought the GitLab Storage would be more flexible and wouldn't involve bringing the agent down, but it seems like another wall we're hitting here. Any advice would be very much welcome 🙂
z
I don’t think you should need to redeploy the agent when there are changes; it launches new flow runs in a separate process.