# prefect-community
m
Hello guys I'm having issues getting my Local Agent to execute my flow runs I first register two test flows like this:
```python
# ADD FLOWS TO STORAGE
storage.add_flow(test_flow)
storage.add_flow(test_flow2)

# BUILD STORAGE ONCE, AFTER ADDING THE FLOWS
storage = storage.build()

test_flow.storage = storage
test_flow2.storage = storage

# REGISTER FLOWS IN UI/DB
test_flow.register(build=False)
test_flow2.register(build=False)
```
And they indeed appear in the UI (image below). Then I start the agent in my CLI (image below). But when I try to run one of these flows, they stay permanently scheduled (image below). What am I doing wrong? Ty
z
Hi @Manuel Mourato, taking a look now. Will have an answer for you in just one sec. 🙂
m
@Zachary Hughes it’s another Manuel 🙂 // @Manuel Mourato
z
Darn autocorrect-- thank you.
m
np…just added my last name to my display name to avoid confusion
z
@Manuel Mourato It looks like you might have an issue with label mismatches. From your screenshots, it looks like your local agent is spinning up with labels, but your flow doesn't appear to have those labels.
m
Hm, the way I start up the agent is the same as in the documentation:
```
prefect agent start
```
But I will try to give one of those labels to the flows and see if it works.
z
You should be able to resolve this by labeling your test flow to match your local labels. If you're trying to achieve more than a test flow, happy to walk you through that as well. https://docs.prefect.io/orchestration/agents/local.html#local-agent
It looks like you're custom-building your storage as well. Are you using the default local storage, or are you using Docker storage?
m
I am using a Docker image as storage, yes 🙂 Adding the label corresponding to my local machine made the agent pick up the run, but now I am getting an "unsupported storage" exception. I see that for these cases we should use a Docker Agent, which, if I understand correctly, checks whether the storage used was Docker and, if so, starts a process inside a new container to run the flow. Is this correct? If so, does the new container use the python_dependencies passed to the storage container?
z
Aha, there we go. If you're using Docker storage, you'll need to use a Docker Agent, so definitely agreed there. 🙂 The Docker Agent polls for work, and when it finds it, uses the Docker daemon to spin up a new container for running the flow. I believe passing your `python_dependencies` to your Docker storage should ensure your flow has the specified components.
m
Hmm, I see. In my case some dependencies are not pip-installable, but I can create a Dockerfile with the dependencies I want and other custom variables, so no harm done. Thank you so much for your help @Zachary Hughes
z
Glad I could help! Don't hesitate to reach out with any other questions you may have.