# prefect-server
Adam:
Hey guys! I'm really stuck here, but I'm sure I'm missing something easy within the documentation. I have a Prefect server running on an independent Linux machine and I'm developing in a Windows environment. I want to deploy my flow (from my Windows environment) to my Prefect server. Local storage doesn't seem to be an option, as my Linux machine has no access to the network that my .py file is on. So... Git storage? We use Azure DevOps as our repo, but I'm honestly lost as to how I can deploy or connect that repo to my server. Any help would be greatly appreciated.
Kyle:
Hey Adam, I have a somewhat similar setup. Our backend is deployed on AWS and I am doing local development work on my MacBook. I am using S3 storage for my flows. I use the following command to register:
env PREFECT__SERVER__HOST=https://prefect.dev.MYCOMPANY.net prefect register -f --project "test" -p flow.py
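(One setup note, which I'm assuming applies here: with a self-hosted server the Prefect CLI/client usually also needs to be switched out of Cloud mode first, roughly like this.)

```sh
# One-time: point the Prefect CLI/client at a self-hosted server instead of Prefect Cloud
prefect backend server
```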
Adam:
Hey @Kyle McChesney, thanks... I'll try that. How do you have the storage set up? Local?
Kyle:
That URL is the URL for our server. It's likely possible to configure this in your `.prefect` config if you have one.
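(For reference, a minimal sketch of what that config could look like, assuming Prefect 1.x; the hostname and port below are placeholders, not values from this thread.)

```toml
# ~/.prefect/config.toml -- file-based equivalent of the PREFECT__SERVER__HOST env var
[server]
host = "https://prefect.dev.MYCOMPANY.net"
port = "443"
```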
I am using AWS S3 for storage.
Adam:
Ok cool, so you're using the S3 storage class?
Kyle:
Yeah, example:
```python
from prefect import Flow
from prefect.storage import S3

with Flow(
    'flow',
    # On registration the flow is serialized and uploaded to this bucket;
    # the agent pulls it back down from S3 when a run is scheduled
    storage=S3('my-bucket'),
) as flow:
    ...
```
👍 1
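(Side note: registration can also be done from Python instead of the CLI; a rough sketch, where the project name is just an example.)

```python
if __name__ == '__main__':
    # Builds the storage (uploads the flow to the S3 bucket above) and
    # registers the flow's metadata with the server under the given project
    flow.register(project_name='test')
```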
I just granted the EC2 instances running the backend server access to that bucket.
👍 1
Where do you want your flows to actually run?
Adam:
On the local Prefect server.
Kyle:
As far as I know, you need some other kind of compute to actually run the flows. The server itself is not capable of doing it.
Adam:

https://www.youtube.com/watch?v=HuwA4wLQtCM

Watching this video from Duke University, they have an Azure VM which acts as their server, and they don't seem to mention any other compute. Or again... am I missing a trick?
Kyle:
https://medium.com/the-prefect-blog/the-prefect-hybrid-model-1b70c7fd296 - this blog covers the high-level ideas. I skimmed this video and it does seem like they skip over agent configuration somewhat. If you've used Airflow, I think the scheduler is a similar concept (https://airflow.apache.org/docs/apache-airflow/stable/concepts/scheduler.html). Basically the agent(s) query the server, and once there is a scheduled flow run on the backend that "matches" the agent, the agent initiates the flow run. It's basically just a long-running process that pings the server on some interval and manages the actual execution of the flow.
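(To make the "matches" part concrete: in Prefect 1.x, flows and agents can both carry labels, and an agent only picks up runs whose labels it has. A rough sketch, where the label name is just an example.)

```python
from prefect import Flow
from prefect.run_configs import LocalRun

with Flow(
    'flow',
    # Runs of this flow will only be picked up by agents started with this label
    run_config=LocalRun(labels=['on-prem-server']),
) as flow:
    ...
```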
Adam:
Yeah, I guess I figured that if the same machine the server is on also has an agent on it, there's no reason it shouldn't be able to run flows. I'll read that article now, cheers!
Kyle:
I mean, you could certainly do that. Just start an agent process on the same "box" as the one running the server. I guess the key is that you need to run an agent process somewhere and register it with the backend. It's not "built in" to the server process.
👍 1
(I might not have been super clear on that)
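(A minimal sketch of that last point, assuming a Prefect 1.x local agent started on the same machine as the server; the label is a placeholder and should match whatever labels the flow's run config uses.)

```sh
# Long-running process that polls the server and executes matching flow runs
prefect agent local start --label on-prem-server
```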