# show-us-what-you-got
itay livni:
Hi - I am being lazy about re-configuring another docker build/push interface. Is it possible to use `prefect.environments.storage.docker` as the mechanism to build and push containers while using non-Docker storage for containerized environments? (I ran into a bug deploying this, and I wonder whether that might be the reason.)
```python
flow.storage = S3(
    bucket="s3-prefect-flow-storage",
    secrets=["AWS_CREDENTIALS"],
)
```
```python
docker = Docker(
    registry_url=ecr_repo_url,
    python_dependencies=["pandas", ...],
    dockerfile=docker_flpth,
    image_name="annoying_docker",
    image_tag="latest",
    local_image=True,
)
docker.build(push=True)
```
josh:
Hi @itay livni, personally I think the image you set in `environment.metadata` should be built independently, using something like Docker's CLI. The `Docker` storage has a lot of logic baked into it around flow storage, and therefore does some extra things for storing the flow that you will not need in your metadata image.
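Building the metadata image independently, as suggested above, might look like the following sketch with the Docker CLI; the registry URL, region, and Dockerfile path are placeholders, not taken from the thread's gists:

```shell
# Authenticate the Docker CLI against ECR (region and registry URL are placeholders)
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin "$ECR_REPO_URL"

# Build the image from your own Dockerfile, tag it, and push it
docker build -t "$ECR_REPO_URL/annoying_docker:latest" -f Dockerfile .
docker push "$ECR_REPO_URL/annoying_docker:latest"
```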
👍 1
upvote 1
itay livni:
Although I do like the health checks. It's like a pre-push hook.
josh:
Yeah, they are nice 🙂 You could call them directly (https://github.com/PrefectHQ/prefect/blob/master/src/prefect/environments/storage/_healthcheck.py), although some might not be viable without being run inside your flow's image.
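As a rough stand-in for the full healthcheck, you can at least verify that the pushed image can import Prefect and its serialization dependency before relying on it; the image tag here is an assumption based on the snippet above, not a Prefect API:

```shell
# Smoke-test the pushed image (registry URL and tag are placeholders)
docker run --rm "$ECR_REPO_URL/annoying_docker:latest" \
  python -c "import prefect, cloudpickle; print(prefect.__version__)"
```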
itay livni:
@josh
1. I built and pushed the docker image to ECR using the AWS CLI (docker-gist).
2. Ran the deploy script (gist).
3. Started the `FargateAgent` (start-agent-gist) successfully.
4. Invoked a quick run.
The flow got scheduled but did not run: no ECS tasks began, and the agent doesn't seem to know that a quick run was invoked. -- Thanks
josh:
The default label is `s3-flow-storage`.
itay livni:
@josh Thanks, that worked. The new error from Cloud is `Failed to load and execute Flow's environment: ValueError('Flow is not contained in this Storage')`. I changed the storage and agent configuration to `labels=s3-flow-storage`.
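For reference, matching the label on both the flow's storage and the agent might look like the following sketch; it assumes Prefect 0.x's `FargateAgent` and `S3` storage, and is not taken from the thread's gists:

```python
from prefect import Flow
from prefect.agent.fargate import FargateAgent
from prefect.environments.storage import S3

with Flow("etl-moc") as flow:
    ...  # tasks elided

# S3 storage implicitly tags the flow with the "s3-flow-storage" label
flow.storage = S3(
    bucket="s3-prefect-flow-storage",
    secrets=["AWS_CREDENTIALS"],
)

# The agent must carry the matching label, or scheduled runs are never picked up
agent = FargateAgent(labels=["s3-flow-storage"])
agent.start()
```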
josh:
How are you registering the flow?
itay livni:
@josh
```python
pushlog = etl_moc_flow.register(
    project_name="market-on-close",
    build=True,
    set_schedule_active=False,
)
```
josh:
Hmm interesting, and this flow has both S3 storage and an image name in the environment metadata? FWIW I just did this and it was working for me 🙂
itay livni:
@josh the environment is simply configured like this:
```python
etl_moc_flow.environment = LocalEnvironment(
    metadata={"image": image}
)
```
Is there anything else I have to add?
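Putting the pieces from this thread together, a complete deploy script might look like the following sketch; the import paths match Prefect 0.x, the task body is elided, and the image URI is a placeholder:

```python
from prefect import Flow
from prefect.environments import LocalEnvironment
from prefect.environments.storage import S3

with Flow("market-on-close-etl") as etl_moc_flow:
    ...  # tasks elided

# Flow code lives in S3; execution happens inside the pre-built ECR image
etl_moc_flow.storage = S3(
    bucket="s3-prefect-flow-storage",
    secrets=["AWS_CREDENTIALS"],
)
etl_moc_flow.environment = LocalEnvironment(
    metadata={"image": "<account>.dkr.ecr.<region>.amazonaws.com/annoying_docker:latest"}
)

etl_moc_flow.register(
    project_name="market-on-close",
    build=True,
    set_schedule_active=False,
)
```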
josh:
No that should be it! Would you mind opening an issue so we can track?
itay livni:
Will do. Thanks.