# ask-community
Steve Gee
Hey all, we've been test driving Prefect for a month or so now - we have some basic flows/sub-flows, tasks, deployments, etc. I feel like I'm missing something conceptually, though. When running deployments via the command line (prefect deployment run) it is possible to pass parameters to the flow via the --param option. It is also possible to override the parameters for the deployment in the UI. What I'm missing is how you pass different parameters via scheduling - there doesn't appear to be a way to do this?

A simple use case: we have a rather complex Python script that accepts the name of a configuration as part of our ETL pipeline. If we essentially wrap this script in a flow (with sub-flows and tasks where applicable), is it not possible to schedule the same deployment X number of times, passing in a different 'config' for each scheduled run? What is the best approach to use otherwise?
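A minimal sketch of the two per-run options mentioned above (CLI --param and the UI override); the deployment name "run-complex-script/nightly" is made up, and run_deployment is the programmatic counterpart of prefect deployment run --param:

# CLI equivalent:
#   prefect deployment run run-complex-script/nightly --param config_to_use=prod
from prefect.deployments import run_deployment

# trigger a single run of an existing deployment, overriding parameters for that run only
run_deployment(
    name="run-complex-script/nightly",      # hypothetical flow/deployment name
    parameters={"config_to_use": "prod"},
)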
Nate
hey @Steve Gee - you can have many deployments of the same flow, so if you have some flow like
from prefect import flow

@flow
def run_complex_script(config_to_use: str):
    ...  # load the named config and run the ETL steps
you can create a deployment from this flow for each unique set of parameters you want to schedule it with. Alternatively, if you're able to add logic at the beginning of your flow that discovers the right config at runtime (instead of passing it in), you could create just one deployment and assign many schedules to it. In the future we're likely to add the ability to have something like a default set of parameters associated with each schedule on a deployment.
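A minimal sketch of the "one deployment per parameter set" approach described above; the deployment names, config names, and cron strings are purely illustrative:

from prefect import flow, serve

@flow
def run_complex_script(config_to_use: str):
    ...  # load the named config and run the ETL steps

if __name__ == "__main__":
    # each deployment carries its own default parameters and its own schedule
    serve(
        run_complex_script.to_deployment(
            name="nightly-config-a",
            parameters={"config_to_use": "config_a"},
            cron="0 2 * * *",
        ),
        run_complex_script.to_deployment(
            name="nightly-config-b",
            parameters={"config_to_use": "config_b"},
            cron="0 3 * * *",
        ),
    )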
Steve Gee
Thanks Nate! Just to make sure I understand correctly: when you say multiple deployments, that implies creating each one via the flow's 'to_deployment' method and then serving each of the created deployments via 'serve'. How well does this scale? If, for instance, I have a hundred of these, what are the implications for concurrency?
Nate
if you're using serve(*many_deployments) then yeah, that's what I mean. If you really have "a hundred of these", I would suggest that you might be able to "add logic at the beginning of your flow that discovers the right config at runtime" - but no, I wouldn't want to serve 100 dynamically created deployments in the same process.
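A sketch of the alternative suggested above: a single deployment whose flow resolves its own config at runtime, so no per-run parameters are needed. The discover_config helper and its weekday rule are purely illustrative:

from datetime import datetime
from prefect import flow, serve

def discover_config() -> str:
    # illustrative only: pick a config by weekday; in practice this might read
    # a database, an environment variable, or a config service
    return "weekend_config" if datetime.now().weekday() >= 5 else "weekday_config"

@flow
def run_complex_script():
    config_to_use = discover_config()
    ...  # load the named config and run the ETL steps

if __name__ == "__main__":
    # one deployment; additional schedules can be attached to it via the UI or API
    serve(run_complex_script.to_deployment(name="etl", cron="0 2 * * *"))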
Steve Gee
Thanks again, Nate! I'll look into these ideas. One more question, if you don't mind: does Prefect Cloud help address some of these concerns with triggers and automations? Does each deployment run execute in its own thread/process if you were to trigger it with a different param?