Thread
#prefect-community

    sidravic

    5 months ago
    Hello folks, I'm running flows on Prefect 1.2 using ECSRun, and my setup follows the approach described in this issue under Conclusions pt. 5: it uses task_definition_arn with the container named flow. While I'm able to trigger the flows, each flow crashes with this error:
    copilot/flow/8d31faa7f1ba   File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    copilot/flow/8d31faa7f1ba     return _bootstrap._gcd_import(name[level:], package, level)
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
    copilot/flow/8d31faa7f1ba   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
    copilot/flow/8d31faa7f1ba ModuleNotFoundError: No module named '/root/'
    However, I've ensured that the flows folder from my project is on the PYTHONPATH, and I can't entirely figure out what (if anything) cloudpickle is trying to do to access those flows at execution time.
    Anna Geller

    5 months ago
    Part 5 of that conclusion section doesn't cover task_definition_arn, though - it's mainly focused on task_role_arn and execution_role_arn. Can you share your ECSRun?
    Re your error "ModuleNotFoundError: No module named '/root/'" - this is a Dockerfile issue. Can you share your Dockerfile? Is it based on the Prefect base image? You may also check this example.
    "I can't entirely figure out what (if anything) that cloudpickle is trying to do to access those flows at the time of execution."
    You don't necessarily need to use cloudpickle - check this doc and this example using script storage rather than pickle.
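    For illustration, a minimal sketch of what script-based storage can look like in Prefect 1.x (the bucket name, S3 key, and local path below are placeholders, not values from this thread):
    from prefect import Flow, task
    from prefect.storage import S3
    
    @task
    def say_hello():
        print("hello")
    
    with Flow("hello-flow") as flow:
        say_hello()
    
    # stored_as_script=True uploads the flow file itself rather than a
    # cloudpickle byte stream, so the run-time container re-imports the
    # script instead of unpickling it
    flow.storage = S3(
        bucket="my-flow-bucket",                   # placeholder bucket name
        key="flows/hello_flow.py",                 # placeholder S3 key
        stored_as_script=True,
        local_script_path="flows/hello_flow.py",   # placeholder local path
    )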

    sidravic

    5 months ago
    Yes. My Dockerfile is this:
    FROM prefecthq/prefect:latest-python3.8
    
    ENV ANALYTICS_APP_DIR /app
    
    WORKDIR ${ANALYTICS_APP_DIR}
    
    RUN apt-get update -y \
        && apt-get install -y curl build-essential libssl-dev libffi-dev lib32ncurses5-dev git libsnappy-dev postgresql-client \
        && apt-get install -y telnet vim unzip
    
    ADD poetry.lock $ANALYTICS_APP_DIR/
    ADD pyproject.toml $ANALYTICS_APP_DIR/
    
    ENV PYTHONPATH=$ANALYTICS_APP_DIR
    ENV POETRY_VIRTUALENVS_PATH=$ANALYTICS_APP_DIR/
    
    RUN pip install poetry
    
    RUN poetry install
    
    ADD . $ANALYTICS_APP_DIR/
    
    
    CMD ["/bin/bash", "-c", "./ops/launch-ecs-agent.sh"]
    And the ECSRun looks like this:
    from prefect.run_configs import ECSRun
    
    ecs_run_config = ECSRun(
        labels=[f"{cfg('ENV')}"],
        task_definition_arn=task_definition_arn,
        task_role_arn=task_role_arn,
        execution_role_arn=execution_role_arn,
        cpu=1024,
        memory=2048,
        run_task_kwargs=dict(
            cluster=f"{cfg('ECS_CLUSTER_NAME')}",
            launchType=f"{cfg('LAUNCH_TYPE')}",
            overrides=dict(
                containerOverrides=[
                    dict(
                        name="flow",
                        command=["poetry", "run", "prefect", "run", "hello-flow"],
                        cpu=1024,
                        memory=2048,
                    )
                ]
            ),
        ),
    )
    My ECS agent container image also contains the flow code, so I use the same image and task_definition_arn in my ECSRun config, and I override the command using the overrides argument.
    Thank you, the modules approach to storage got me to a working flow. Thanks very much, Anna, and thanks for the repos as well. Very helpful.
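    For reference, a minimal sketch of the Module storage approach that resolved this thread, assuming the flow code is baked into the image and importable on the PYTHONPATH (the module path and project name are placeholders):
    from prefect import Flow, task
    from prefect.storage import Module
    
    @task
    def say_hello():
        print("hello")
    
    with Flow("hello-flow") as flow:
        say_hello()
    
    # Module storage records only an import path; at run time the container
    # imports the flow from the image, so no pickling or file download happens
    flow.storage = Module("myproject.flows.hello_flow")  # placeholder module path
    
    # reuse the ECSRun config shown earlier in the thread
    flow.run_config = ecs_run_config
    
    flow.register(project_name="analytics")  # placeholder project name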