
Ovo Ojameruaye

12/17/2021, 10:44 PM
Hi, I am experiencing some weird behavior using Prefect Server. I can run a flow on Prefect Core and it executes completely, but when I spin up the UI and run the exact same flow on a local agent (same machine and environment), I get a ModuleNotFoundError. I have modules I import as part of my flow.

Kevin Kho

12/17/2021, 10:45 PM
I honestly can’t think of anything other than an environment mismatch. What env is your agent running in? I know it’s one of pyenv or pipenv that doesn’t work (I use conda myself).

Ovo Ojameruaye

12/17/2021, 10:53 PM
I started the agent in the same conda environment.
Is there a different way to make sure the local agent is running in the same environment? It looks that way from the logs. It's running in a venv called "prefect".
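One quick way to check this, sketched here rather than taken from the thread: log the interpreter and import path from inside a task, and compare them against the conda environment the flow was registered from.

import sys

import prefect
from prefect import task, Flow

@task
def log_environment():
    # Log which interpreter and sys.path the agent's flow run is actually using;
    # a mismatch with the registration environment would explain a ModuleNotFoundError.
    logger = prefect.context.get("logger")
    logger.info("Interpreter: %s", sys.executable)
    logger.info("sys.path: %s", sys.path)

with Flow("env-check") as flow:
    log_environment()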

Kevin Kho

12/18/2021, 1:43 AM
If you use prefect agent local start in conda, then it should run from the same environment.

Ovo Ojameruaye

12/18/2021, 5:36 PM
Hmmm, that's what I currently do.
Sorted this out. I had to explicitly append the path to the module I wanted to import inside every task that imports it, rather than once for the entire script. I guess this is related to how the tasks are passed to the executors:
import sys
import importlib

import prefect
from prefect import task

sys.path.append(DATA_DIR)  # DATA_DIR is defined earlier in the script; appending only here was not enough

@task()
def get_i():
    sys.path.append(DATA_DIR)  # moved inside the task so the path is set wherever the task actually runs
    fetch = importlib.import_module("logistics-pipeline")
    # ... use the imported module ...
    logger = prefect.context.get("logger")  # grab the task-run logger at run time
    logger.info("Get Task Completed")
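A variant of the same workaround, as a sketch (the helper name and arguments are hypothetical, not from the thread): factor the path setup and import into one function and call it at the top of each task, so the append isn't copy-pasted everywhere.

import sys
import importlib

import prefect
from prefect import task

def load_pipeline_module(module_dir, module_name="logistics-pipeline"):
    # Hypothetical helper: ensure the module directory is importable in whatever
    # process the task ends up running in, then import the package by name.
    if module_dir not in sys.path:
        sys.path.append(module_dir)
    return importlib.import_module(module_name)

@task()
def get_i():
    fetch = load_pipeline_module(DATA_DIR)  # DATA_DIR as in the original script
    logger = prefect.context.get("logger")
    logger.info("Get Task Completed")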

Aram Panasenco

12/20/2021, 11:51 PM
Hey Ovo, I was experiencing the same issue, and Anna Geller's comment in a thread from 2 months ago helped me resolve it: https://prefect-community.slack.com/archives/CL09KU1K7/p1635253346416400?thread_ts=1635243185.382700&cid=CL09KU1K7

Ricardo Gaspar

02/09/2022, 2:04 PM
@Anna Geller @Kevin Kho is there a way to send the custom package code to S3 as part of the flow registration upload? I’d like to be able to run both locally and on ECR/EC2 (via UniversalRun). Currently I’m using S3 storage. Is the only option to use docker images?

Kevin Kho

02/09/2022, 2:15 PM
No, there is no way to do that for now. Recently though, cloudpickle 2.0 added support for deep-copying modules, so maybe it will become possible. For now, you need to either have the custom modules installed on the EC2 instance (if they don’t change often) or package them in a Docker container. I am not seeing how you would get this working for both ECS and EC2 simultaneously, because I think both will require configuration exposed by their respective RunConfigs.
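For the Docker route, a minimal sketch assuming Prefect 1.x Docker storage (the registry URL, image name, and paths are placeholders, not from the thread): copy the custom package into the image and put it on PYTHONPATH so the flow can import it wherever it runs.

from prefect import Flow
from prefect.storage import Docker

storage = Docker(
    registry_url="123456789.dkr.ecr.us-east-1.amazonaws.com",  # placeholder ECR registry
    image_name="logistics-flows",
    python_dependencies=["pandas"],  # pip-installable dependencies
    files={"/local/path/to/my_package": "/opt/my_package"},  # copy the custom module into the image
    env_vars={"PYTHONPATH": "/opt"},  # make the copied module importable inside the container
)

with Flow("logistics-pipeline", storage=storage) as flow:
    ...  # tasks that import the custom package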