# prefect-community
m
Hi, I was looking into Prefect Orion and have a question: I want my flow to get triggered every time a new file gets uploaded to a blob storage. Also the name of the file should be passed to the flow as a parameter. What would be the best way to solve this? Creating a new deployment every time seems a bit overkill to me.
a
Are you on AWS? You could trigger your Prefect flow from a Lambda function any time a new file arrives in your S3 bucket, including passing the file name
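Roughly, such a handler could look like this (`process_file` and the flow body are just placeholder names for illustration):
```python
from prefect import flow, get_run_logger


@flow
def process_file(file_name: str):
    # Placeholder flow body; replace with your real processing logic.
    get_run_logger().info("Processing %s", file_name)


def handler(event, context):
    # S3 "ObjectCreated" notifications carry the uploaded object's key
    # under Records[].s3.object.key.
    for record in event["Records"]:
        process_file(file_name=record["s3"]["object"]["key"])
```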
m
I am on Azure, but I could use serverless functions nevertheless, thank you for the idea. I am wondering how exactly I would trigger the flow, since I did not find anything regarding flow runs except Deployments in the Orion docs
a
you would point your serverless function to your PREFECT_API_URL and your PREFECT_API_KEY and run your flow - no need for a deployment in this scenario. We'll have recipes in the future that explain this pattern more
🙌 1
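As a minimal sketch of that setup (the URL, key, and module name are placeholders; in a real serverless function you would normally set these as environment variables in the function's configuration rather than in code):
```python
import os

# Placeholders: point these at your Orion server or Prefect Cloud workspace.
os.environ["PREFECT_API_URL"] = "https://<your-orion-url>/api"
os.environ["PREFECT_API_KEY"] = "<your-api-key>"

from my_flows import hello  # hypothetical module containing your flow function

# Calling the flow function directly; the run is reported to the API configured above.
hello(user="triggered from a serverless function")
```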
that said, you could also trigger a flow run from a deployment from AWS Lambda
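If you do go the deployment route, a rough sketch of that API call could look like this (the deployment ID and parameters are placeholders; check the REST API reference for the exact route and payload):
```python
import os

import requests

# Placeholder: a deployment for the flow must already exist.
deployment_id = "<deployment-id>"

response = requests.post(
    f"{os.environ['PREFECT_API_URL']}/deployments/{deployment_id}/create_flow_run",
    headers={"Authorization": f"Bearer {os.environ['PREFECT_API_KEY']}"},
    json={"parameters": {"user": "from AWS Lambda"}},
)
response.raise_for_status()
print(response.json()["id"])  # ID of the flow run that was just scheduled
```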
m
Okay, I did not find any references to flow runs in the concept docs section, but found them in the API docs section now, thank you 🙂
Is there any advantage of using deployments over flow runs if I don't have a schedule?
a
the advantage is that you can trigger it via an API call
I didn't mean using FlowRun explicitly, I meant just running your flow function from a serverless function, as you would run any Python function locally
m
Ah, I see. Then I would use the serverless function as the execution environment for a ConcurrentFlowRunner, did I get that right?
๐Ÿ‘ 1
But I would have to create a new deployment if my parameter values change, right?
a
it will get clearer once we release a recipe / blog post, but the idea is: you run it as any Python script/application and use Prefect mainly for observability/visibility
m
And how would I use a serverless function to trigger a flow run inside my Kubernetes cluster, and pass parameters from this serverless function to the flow?
a
I meant it quite literally: you trigger a Python function from AWS Lambda:
```python
from prefect import task, flow
from prefect import get_run_logger


@task
def say_hi(user_name: str):
    logger = get_run_logger()
    logger.info("Hello %s!", user_name)


@flow
def hello(user: str = "world"):
    say_hi(user)


if __name__ == "__main__":
    hello(user="from AWS Lambda")
```
the value for the user parameter could be passed from the Lambda input
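e.g. with a minimal handler wrapping the flow above (the module name and event shape are just assumptions for illustration):
```python
from hello_flow import hello  # hypothetical module holding the flow defined above


def handler(event, context):
    # Whatever JSON payload triggers the Lambda arrives as `event`;
    # pull the parameter out of it and pass it straight to the flow.
    hello(user=event.get("user", "world"))
```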
m
And what if I want my flow to run inside a Kubernetes cluster due to a custom Docker image I need, how could I do that and still trigger it from a Lambda function?
a
Then you trigger it from a Kubernetes pod :)
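One way to sketch that combination, assuming you use the kubernetes Python client (all names, the image, and the namespace are placeholders): the serverless function creates a Kubernetes Job that runs the flow script inside your custom image, so the flow itself executes in a pod.
```python
from kubernetes import client, config


def trigger_flow_job(file_name: str) -> None:
    # load_kube_config() reads a local kubeconfig; use load_incluster_config()
    # when this code itself runs inside the cluster.
    config.load_kube_config()
    job = client.V1Job(
        api_version="batch/v1",
        kind="Job",
        metadata=client.V1ObjectMeta(generate_name="hello-flow-"),
        spec=client.V1JobSpec(
            template=client.V1PodTemplateSpec(
                spec=client.V1PodSpec(
                    restart_policy="Never",
                    containers=[
                        client.V1Container(
                            name="flow",
                            image="my-registry/hello-flow:latest",  # your custom image
                            command=["python", "flow.py"],          # the flow script
                            env=[client.V1EnvVar(name="USER_NAME", value=file_name)],
                        )
                    ],
                )
            )
        ),
    )
    client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```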