# prefect-community
Hi, I was looking into Prefect Orion and have a question: I want my flow to get triggered every time a new file gets uploaded to blob storage, and the name of the file should be passed to the flow as a parameter. What would be the best way to solve this? Creating a new deployment every time seems a bit overkill to me.
Are you on AWS? You could trigger your Prefect flow from a Lambda function any time a new file arrives in your S3 bucket, including passing the file name.
I am on Azure, but I could use serverless functions nevertheless, thank you for the idea. I am wondering how exactly I would trigger the flow, since I did not find anything regarding Flow Runs except Deployments in the Orion docs.
you would point your serverless function to your PREFECT_API_URL and your PREFECT_API_KEY and run your flow - no need for a deployment in this scenario. We'll have recipes in the future that explain this pattern more
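A minimal sketch of what that serverless handler could look like, assuming your flow lives in a hypothetical module `my_flows` (the URL value and the `MY_API_URL` / `MY_API_KEY` names are placeholders, not Prefect specifics):

```python
import os

# Sketch of an AWS Lambda handler that calls a Prefect flow directly.
# PREFECT_API_URL / PREFECT_API_KEY are the env vars the Prefect client
# reads; the concrete values below are placeholders.

def configure_prefect(api_url: str, api_key: str) -> dict:
    """Point the Prefect client at your Orion API via environment variables."""
    os.environ["PREFECT_API_URL"] = api_url
    os.environ["PREFECT_API_KEY"] = api_key
    return {k: os.environ[k] for k in ("PREFECT_API_URL", "PREFECT_API_KEY")}

def handler(event, context):
    configure_prefect(
        os.environ.get("MY_API_URL", "https://orion.example.com/api"),  # placeholder
        os.environ.get("MY_API_KEY", ""),  # keep the key in Lambda config, not code
    )
    # Import after the env vars are set so the client picks them up,
    # then run the flow like any Python function.
    from my_flows import hello  # hypothetical module containing your @flow
    hello(user=event.get("user", "world"))
```

The key idea is that nothing Prefect-specific happens in the trigger: the handler just calls the flow function, and Prefect records the run against whatever API the env vars point at.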
🙌 1
That said, you could also trigger a flow run from a deployment from AWS Lambda.
Okay, I did not find any references to Flow Runs in the concept docs section, but I found them in the API docs section now, thank you 🙂
Is there any advantage of using deployments over flow runs if I don't have a schedule?
the advantage is that you can call it via an API call
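For the deployment route, that API call is just an HTTP POST. A stdlib-only sketch of building it (the `create_flow_run` route matches the Orion REST API as I understand it, but check your version's API docs before relying on it):

```python
import json
import urllib.request

def build_flow_run_request(api_url: str, api_key: str,
                           deployment_id: str, parameters: dict):
    """Build the POST request asking Orion to create a flow run from a
    deployment, with per-run parameters. Sending it is then just
    urllib.request.urlopen(request)."""
    body = json.dumps({"parameters": parameters}).encode()
    return urllib.request.Request(
        url=f"{api_url}/deployments/{deployment_id}/create_flow_run",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

This should also answer the "new deployment per parameter change" worry: the `parameters` in the request body are meant to override the deployment's defaults for that one run, so one deployment can serve many differently-parameterized runs.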
I didn't mean using FlowRun explicitly, I meant just running your flow function from a serverless function, as you would run any Python function locally
Ah, I see. Then I would use the serverless function as the execution environment for a ConcurrentFlowRunner, did I get that right?
๐Ÿ‘ 1
But I would have to create a new deployment if my parameter values change, right?
it will get clearer once we release a recipe / blog post, but the idea is: you run it as any Python script/application and use Prefect mainly for observability/visibility
And how would I use a serverless function to trigger a flow run inside my kubernetes cluster, and pass parameters from this serverless function to the flow?
I meant it quite literally: you trigger a Python function from AWS Lambda:
from prefect import task, flow
from prefect import get_run_logger

@task
def say_hi(user_name: str):
    logger = get_run_logger()
    logger.info("Hello %s!", user_name)

@flow
def hello(user: str = "world"):
    say_hi(user)

if __name__ == "__main__":
    hello(user="from AWS Lambda")
the value for the user parameter could be passed from the Lambda input
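Tying this back to the file-upload trigger from the start of the thread: the file name comes out of the event payload the function receives. A sketch for the S3 case, assuming the `hello` flow above lives in a hypothetical module `my_flows` (S3 object keys arrive URL-encoded, hence the decoding):

```python
from urllib.parse import unquote_plus

def file_name_from_s3_event(event: dict) -> str:
    """Extract the uploaded object's key from an S3 put-event payload."""
    record = event["Records"][0]
    return unquote_plus(record["s3"]["object"]["key"])

def handler(event, context):
    # `hello` is the @flow from the snippet above; assumed importable here.
    from my_flows import hello  # hypothetical module name
    hello(user=file_name_from_s3_event(event))
```

An Azure Function would look the same in spirit, just reading the blob name from its own trigger binding instead of the S3 event shape.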
And what if I want my flow to run inside a Kubernetes cluster because I need a custom Docker image? How could I do that and still trigger it from a Lambda function?
Then you trigger it from a Kubernetes pod :)