# ask-community
Hi everyone, I'm running the prefect-agent (Docker agent) on an Azure VM and I've connected it to my Prefect Cloud, but I'm unsure how to send my flow runs to the Docker agent. So far all my registered flows still go to the Local agent. When I'm not inside my VM the Local agent goes offline, while the Docker agent seems to be always online (it keeps sending a heartbeat every 10 seconds or so). My simple flow is in the second image. I also tried the code below, but the flow still went to my Local agent. My Docker agent is using the 'prod' label.
from prefect import task, Flow
from prefect.run_configs import DockerRun


@task
def say_hello(x=1):
    return f"Test {x}"


with Flow("hello-world") as flow:
    first = say_hello()
    second = say_hello(2)

flow.run_config = DockerRun()
flow.register(project_name='test', labels=['prod'])
So my question: how do I send the flow created inside my Azure VM to the VM's Docker agent via Prefect Cloud orchestration?
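For context, Prefect Cloud only hands a flow run to an agent whose labels cover all of the flow's labels, so the 'prod' label has to be attached to the flow as well as to the agent. A minimal sketch of the flow side, assuming Prefect 1.x and the same project name as above:

from prefect import task, Flow
from prefect.run_configs import DockerRun


@task
def say_hello(x=1):
    return f"Test {x}"


with Flow("hello-world") as flow:
    say_hello()

# Attach the label on the run config; a flow run is only offered to agents
# whose labels include every label on the flow.
flow.run_config = DockerRun(labels=["prod"])
flow.register(project_name="test")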
I was able to route the flow to my Docker agent now that it has the correct labels, but I currently get an error when trying to trigger the flow. Is this because I don't refer to an image in my code, since I'm trying to have the Docker agent run a 'normal' Python script instead of a Docker image? I was hoping to trigger my code the same way as with the Local agent, just via the Docker agent, since that one is always online while the Local agent goes offline when I break my SSH connection to the VM.
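As an aside, and assuming Prefect 1.x: as far as I know, a DockerRun without an image falls back to a default prefecthq/prefect image matching your installed Prefect version, so the missing image is not necessarily the cause. You can still pin one explicitly if you want a specific environment (the tag below is just an example):

from prefect.run_configs import DockerRun

# Example image tag; any image with Prefect (and your flow's dependencies) installed works.
run_config = DockerRun(
    image="prefecthq/prefect:latest",
    labels=["prod"],
)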
Hey @Luuk, let's walk through what happens when you register. When you register, the Flow gets saved in Storage. When an agent runs the Flow, it retrieves the Flow from Storage and runs it. Storage can be S3, GitHub, or Docker. The default is Local storage, which saves the file locally, and Prefect keeps track of the local path that contains the Flow. The agent looks for the file locally (inside the container, in your case), but of course it won't find it there because that path doesn't exist. You would need to use a Storage accessible by the container, like S3 Storage or GitHub Storage. Or you could put the file in the Docker container and point to it. I have an example here
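To make that concrete, here is a minimal sketch of the GitHub Storage + DockerRun combination (the repo and path are placeholders; S3 Storage works the same way), assuming Prefect 1.x:

from prefect import task, Flow
from prefect.run_configs import DockerRun
from prefect.storage import GitHub


@task
def say_hello(x=1):
    return f"Test {x}"


with Flow("hello-world") as flow:
    say_hello()

# Placeholder repo/path: the agent's container pulls this file from GitHub at
# run time instead of looking for it on the local filesystem.
flow.storage = GitHub(repo="your-org/your-flows", path="flows/hello_world.py")
flow.run_config = DockerRun(labels=["prod"])
flow.register(project_name="test")

For a private repo you would also need to give the agent access, e.g. via the access_token_secret argument of GitHub storage.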