    Ovo Ojameruaye

    9 months ago
    Hi, I am experiencing some weird behavior using Prefect Server. I can run a flow on Prefect Core and it executes completely, but when I spin up the UI and run the exact same flow on a local agent (same machine and environment), I get a ModuleNotFoundError. I have custom modules that I import as part of my flow.
    Kevin Kho

    9 months ago
    I honestly can’t think of anything other than an environment mismatch. What env is your agent running in? I know it’s one of pyenv or pipenv that doesn’t work (I use conda myself)
    Ovo Ojameruaye

    9 months ago
    I started the agent in the same conda environment
    Is there a different way to make sure the local agent is running in the same environment? It looks that way from the logs. It's running in a venv called "prefect"
    Kevin Kho

    9 months ago
    If you use prefect agent local start in conda, then it should run from the same environment
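    As a sanity check, here is a minimal sketch you could run once in the process that executes the flow and once in the shell where the agent starts; if the printed interpreter paths differ, the two environments don't actually match:

    import sys
    import prefect

    # Compare these values across both environments; a different
    # sys.executable means the agent is using a different env.
    print(sys.executable)
    print(prefect.__version__)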
    Ovo Ojameruaye

    9 months ago
    Hmmm, that's what I currently do.
    Sorted this out. I had to explicitly append the path to the module I wanted to import within every task that imports it, rather than once for the entire script. I guess this is related to how the tasks are passed to the executors
    import sys
    import importlib

    import prefect
    from prefect import task

    @task()
    def get_i():
        # Moved inside the task: append the path at task run time so the
        # module can be found when the task executes, not just at import time.
        sys.path.append(DATA_DIR)  # DATA_DIR is defined elsewhere in the script
        fetch = importlib.import_module("logistics-pipeline")
        # ...
        logger = prefect.context.get("logger")
        logger.info("Get Task Completed")
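    For completeness, a minimal sketch of how such a task might be wired into a flow and registered (the flow and project names here are made up):

    from prefect import Flow

    with Flow("logistics-flow") as flow:  # hypothetical flow name
        get_i()

    flow.register(project_name="demo")  # hypothetical project name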
    Aram Panasenco

    9 months ago
    Hey Ovo, I was experiencing the same issue, and Anna Geller's comment in a thread from 2 months ago helped me resolve it: https://prefect-community.slack.com/archives/CL09KU1K7/p1635253346416400?thread_ts=1635243185.382700&cid=CL09KU1K7
    Ricardo Gaspar

    7 months ago
    @Anna Geller @Kevin Kho is there a way to send the custom package code to S3 as part of the flow registration upload? I’d like to be able to run both locally and on ECR/EC2 (via UniversalRun). Currently I’m using S3 storage. Is the only option to use Docker images?
    Kevin Kho

    7 months ago
    No, there is no way to do that for now. Recently though, cloudpickle 2.0 added support for deep-copying modules, so maybe it will become possible. For now, you need to either have the custom modules installed on the EC2 instance (if they don’t change often) or package them in a Docker container. I am not seeing how you would get this working for both ECS and EC2 simultaneously, because I think both will require configuration exposed by their respective RunConfigs
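    For reference, cloudpickle 2.0 exposes this via register_pickle_by_value, which makes cloudpickle serialize a module's code by value rather than by import reference. A minimal sketch (my_custom_module and some_function are placeholders for your own package):

    import cloudpickle
    import my_custom_module  # placeholder for your own package

    # Register the module so its code is pickled by value; the unpickling
    # side then does not need my_custom_module installed.
    cloudpickle.register_pickle_by_value(my_custom_module)

    payload = cloudpickle.dumps(my_custom_module.some_function)  # hypothetical attribute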