Is there a way to import a custom module when using...
# ask-community
a
Is there a way to import a custom module when using `GitHub` storage? The reason for this is two-fold. First, since we would be using the same block of code for just about every flow, it would help keep things semi-DRY. Second, it would allow us to pass different configs depending on environment variables (`dev` vs `prod`).
```python
from prefect_utils import (
    RUN_CONFIG,
    STORAGE,
)

flow.storage = STORAGE
flow.run_config = RUN_CONFIG
```
It works with `S3` storage, but when I try to use `GitHub` storage I get the error below.
```
Failed to load and execute Flow's environment: ModuleNotFoundError("No module named 'prefect_utils'")
```
The module is in the same folder as the flow (`flows/`). Is this an issue because, when the flow is run, it is run from the parent directory, and so a problem with relative paths?
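That hypothesis is easy to check in isolation: Python resolves a bare `import prefect_utils` against the entries on `sys.path`, not against the location of the flow file. A minimal sketch using a throwaway temp directory that mimics the `flows/` layout (all paths and values here are hypothetical, not from the actual repo):

```python
import importlib
import os
import sys
import tempfile

# Hypothetical layout mirroring the repo: <root>/flows/prefect_utils.py
root = tempfile.mkdtemp()
flows = os.path.join(root, "flows")
os.makedirs(flows)
with open(os.path.join(flows, "prefect_utils.py"), "w") as f:
    f.write("STORAGE = 'dummy-storage'\n")
importlib.invalidate_caches()

# With only the parent directory on sys.path, the bare import fails --
# the same ModuleNotFoundError seen above:
sys.path.insert(0, root)
try:
    importlib.import_module("prefect_utils")
    outcome = "found"
except ModuleNotFoundError:
    outcome = "missing"
print(outcome)  # missing

# Putting flows/ itself on sys.path makes the same import succeed:
sys.path.insert(0, flows)
print(importlib.import_module("prefect_utils").STORAGE)  # dummy-storage
```

So whether the import works depends entirely on which directory the runner happens to put on the path, not on where the flow file sits.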
So it works on `S3` storage using the code below, but not on `GitHub`:
```python
import sys
import os

# Put the parent directory on the import path so flows/ is importable
sys.path.insert(0, os.path.abspath('..'))

import prefect
from prefect import task, Flow
from flows.prefect_utils import (
    RUN_CONFIG,
    STORAGE,
)
```
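One caveat with a snippet like the one above: `os.path.abspath('..')` resolves against the process's current working directory, which differs between a notebook, a local run, and an agent. Anchoring on `__file__` instead pins the path to the script's own location. A small sketch of that alternative (not Prefect-specific, and an assumption about what the runner needs):

```python
import os
import sys

# Resolve the directory containing this script, then its parent, so the
# same imports work no matter where the process was started from.
script_dir = os.path.dirname(os.path.abspath(__file__))
parent_dir = os.path.dirname(script_dir)
sys.path.insert(0, parent_dir)
```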
One guess is that when the repo is cloned it lands in its own directory, so in the case of `RepoA` the import path after the clone would be `RepoA.flows.prefect_utils`. Though I'm not sure how to remedy this.
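That guess can also be checked in isolation: if it's the clone's parent directory that ends up on `sys.path`, then only the fully qualified `RepoA.flows.prefect_utils` import resolves, and only if each level is a package with an `__init__.py`. A sketch with throwaway directories (`RepoA` and the dummy value are hypothetical stand-ins):

```python
import importlib
import os
import sys
import tempfile

# Hypothetical clone location: <parent>/RepoA/flows/prefect_utils.py
parent = tempfile.mkdtemp()
flows = os.path.join(parent, "RepoA", "flows")
os.makedirs(flows)
for package_dir in (os.path.join(parent, "RepoA"), flows):
    open(os.path.join(package_dir, "__init__.py"), "w").close()
with open(os.path.join(flows, "prefect_utils.py"), "w") as f:
    f.write("RUN_CONFIG = 'dummy-run-config'\n")
importlib.invalidate_caches()

# With the clone's parent on sys.path, the repo-qualified import works:
sys.path.insert(0, parent)
mod = importlib.import_module("RepoA.flows.prefect_utils")
print(mod.RUN_CONFIG)  # dummy-run-config
```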
The directory looks like:
```
pipeline/
├─ flows/
│  ├─ __init__.py
│  ├─ my_flow.py
│  ├─ prefect_utils.py
├─ notebooks/
│  ├─ jupyter_notebook_to_run
```
So I guess the first question is: do the two Storage classes both start in the same place (i.e. in the `pipeline` directory), or does the `GitHub` Storage class start one directory above the `pipeline` directory?
n
Hi @Alex Welch - unfortunately not; GitHub storage doesn't replicate the directory structure of your entire repo, it pulls only your flow script.
I do think this is something we could support, so if you're interested in opening a feature request for it, you could do so here
a
aw man…
is that the same in the event of `DockerRun`? Or could I issue a command to clone the repo and then run Prefect at that point?
n
I think typically you'd want to use Docker storage or something similar, which would let you bundle all of your dependencies into the image.
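For reference, a configuration sketch of what that could look like with Prefect 1.x Docker storage; this is untested, and all specifics (the registry URL, image name, and the `files`/`env_vars` mappings) are assumptions for illustration, not values from this thread:

```python
from prefect.storage import Docker

# Hypothetical config: copy the shared module into the image and put its
# directory on PYTHONPATH so `import prefect_utils` resolves at runtime.
flow.storage = Docker(
    registry_url="my-registry.example.com",  # assumption
    image_name="pipeline-flows",             # assumption
    files={"flows/prefect_utils.py": "/modules/prefect_utils.py"},
    env_vars={"PYTHONPATH": "/modules"},
)
```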
a
does it build the image for every flow?