# ask-community
i
Prefect 2 question, remote storage. I'm trying to build & deploy a pipeline according to the documentation: https://docs.prefect.io/concepts/deployments/#build-the-deployment. I ran the command as explained in the docs, and I see that Prefect puts the WHOLE folder I'm in (the whole working directory) into the bucket - take a look at the screenshot. If I run the command from my $HOME path, will it upload my whole HOME directory to the remote bucket? I think, by logic, it should upload only the flow, or the flow directory - the path defined in the command - not the whole surrounding environment. Should I open an issue for that, or is there some idea behind this behavior?
o
Hey, you're right - it does upload everything in the folder you're in (and below), unless you have a .prefectignore file where you've specified patterns to ignore. See the section about .prefectignore files here: https://docs.prefect.io/concepts/deployments/#build-the-deployment
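A rough stdlib-only sketch of how ignore patterns filter an upload set (this is not Prefect's actual implementation - real .prefectignore files use full gitignore-style syntax, and the file names and patterns below are made up for illustration):

```python
# Illustration of ignore-pattern filtering, similar in spirit to .prefectignore.
# Uses simple fnmatch globs; gitignore syntax is richer (negation, anchoring, etc.).
from fnmatch import fnmatch


def filter_uploads(paths, ignore_patterns):
    """Return only the paths that match none of the ignore patterns."""
    return [
        p for p in paths
        if not any(fnmatch(p, pattern) for pattern in ignore_patterns)
    ]


files = ["flow.py", "notes/scratch.txt", "data/big.csv", ".env"]
patterns = ["*.csv", ".env", "notes/*"]
print(filter_uploads(files, patterns))  # ['flow.py']
```

Without any patterns, everything in the working directory goes up - which matches the behavior described above.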
i
But maybe it would be better to at least do it like Docker does - provide a way to specify the build context in the command?
o
I agree. 🙂 Our solution right now is to upload more files than we actually need, so each flow has its own copy of.. every flow and related module, in its own storage path. Not optimal - it wastes space and bandwidth - but it works okay for now. I'm expecting the build command to be improved over time.
❤️ 3
🙏 2
💯 1
a
Supporting flow code in Docker images is on the immediate roadmap.
🚀 2
j
@Oscar Björhn just to better understand: basically you are creating an S3 storage block for each one of your flows, is that right? I am having some issues with deployments similar to the one Iuliia reported, and I'm looking for alternatives 🙂
o
Yeah, exactly. They all have the same base storage path but different subfolders; that way a developer working on flow_one won't overwrite the .py files related to flow_two.
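A minimal sketch of the layout described here, assuming one shared base path with a subfolder per flow (the bucket and flow names are hypothetical - in Prefect 2 this would correspond to one S3 storage block per flow, each configured with a different `bucket_path`):

```python
# Hypothetical per-flow storage layout: shared base path, one subfolder
# per flow, so uploading flow_one's code never touches flow_two's files.
from posixpath import join

BASE_PATH = "my-bucket/prefect-flows"  # assumed shared base storage path


def storage_path(flow_name: str) -> str:
    """Storage subfolder holding a single flow's code."""
    return join(BASE_PATH, flow_name)


print(storage_path("flow_one"))  # my-bucket/prefect-flows/flow_one
print(storage_path("flow_two"))  # my-bucket/prefect-flows/flow_two
```

Each flow's deployment then builds against its own subfolder, at the cost of duplicated shared modules in every path.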
👏 2
🙌 1
smart 3
j
Thanks Oscar, this seems to solve the problem - that's a good hacky solution.
o
Nice! 🙂