Mike Gallaspy
11/21/2024, 9:16 PM

Nate
11/21/2024, 9:22 PM
In [1]: from prefect import flow, task
In [2]: @flow
   ...: def f(): task(lambda x: x + 1).map(range(200)).result()
In [3]: f()
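(The mapped call above fans out one task per input and then gathers all the results; a rough stdlib analogue of that fan-out/gather shape — not Prefect's implementation, just the pattern:)

```python
from concurrent.futures import ThreadPoolExecutor

def f():
    # Fan out one unit of work per input, then gather all results in
    # order -- roughly what task(...).map(range(200)).result() does above.
    with ThreadPoolExecutor() as executor:
        return list(executor.map(lambda x: x + 1, range(200)))
```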
since this takes about 3 seconds to finish in prefect 3.x

Mike Gallaspy
11/21/2024, 9:23 PM

Nate
11/21/2024, 9:24 PM

Mike Gallaspy
11/21/2024, 9:25 PM

Nate
11/21/2024, 9:40 PM

Nate
11/21/2024, 9:41 PM
as_completed should help with that

Mike Gallaspy
11/21/2024, 9:43 PM
task.submit

Mike Gallaspy
11/21/2024, 9:43 PM

Nate
11/21/2024, 9:45 PM
future.wait() was blocking the main thread for each future, I could be wrong

Mike Gallaspy
11/21/2024, 9:46 PM

Nate
11/21/2024, 9:46 PM

Mike Gallaspy
11/21/2024, 9:50 PM

Nate
11/21/2024, 9:53 PM
with as_completed you know it's done, so the future.result() call, which would normally be blocking, should happen instantly
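(Prefect's futures roughly mirror the stdlib concurrent.futures interface; a minimal stdlib sketch of the pattern being described here — handling each future as it completes, so result() on it returns immediately:)

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def consume_as_completed(inputs):
    with ThreadPoolExecutor() as executor:
        futures = [executor.submit(lambda x: x + 1, x) for x in inputs]
        results = []
        # Each future that pops out of as_completed is already done,
        # so .result() returns instantly instead of blocking.
        for future in as_completed(futures):
            results.append(future.result())
        return results
```

Note that completion order is not submission order, which is the point: you act on whichever task finishes first instead of waiting on them one by one.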
but overall yeah, the list comp would be blocking, because I'm exhausting as_completed instead of doing something for each completed future that pops out

Nate
11/21/2024, 9:57 PM

Mike Gallaspy
11/21/2024, 10:00 PM

Mike Gallaspy
11/21/2024, 10:01 PM

Mike Gallaspy
11/21/2024, 10:54 PM

Mike Gallaspy
11/21/2024, 10:55 PM

Nate
11/21/2024, 10:55 PM
what does prefect config view say? i.e. are you running against an ephemeral server, oss server, or cloud?

Mike Gallaspy
11/21/2024, 10:56 PM
you are connected to:
<http://127.0.0.1:4200>
PREFECT_PROFILE='local'
PREFECT_API_URL='<http://127.0.0.1:4200/api>' (from profile)
Nate
11/21/2024, 10:56 PM
prefect server start going someplace?

Mike Gallaspy
11/21/2024, 10:56 PM

Nate
11/21/2024, 10:57 PM

Mike Gallaspy
11/21/2024, 10:57 PM

Nate
11/21/2024, 10:58 PM

Mike Gallaspy
11/21/2024, 11:00 PM

Mike Gallaspy
11/21/2024, 11:20 PM

Nate
11/21/2024, 11:31 PM
serve(*many_tasks)
• you can horizontally scale these (e.g. N pods) arbitrarily without race conditions because of consumer groups
• all the task features apply, i.e. caching, results, retries, etc.
• from somewhere like a webapp, you can call some_task.delay(**task_kwargs) to "background" that task without blocking
so this is good for cases where you want to offload a bunch of work to happen concurrently somewhere on static infra, but the caller (or delay-er) doesn't need the result of that background task
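(A minimal stdlib sketch of that fire-and-forget shape — a long-lived worker consuming from a queue while the caller returns immediately. This only illustrates the idea; Prefect's serve/.delay() machinery additionally handles results, retries, caching, and consumer groups:)

```python
import queue
import threading

task_queue = queue.Queue()

def worker():
    # A long-lived consumer, loosely analogous to serve(*many_tasks):
    # it pulls submitted work off the queue and executes it.
    while True:
        func, kwargs = task_queue.get()
        try:
            func(**kwargs)
        finally:
            task_queue.task_done()

def delay(func, **kwargs):
    # Loosely analogous to some_task.delay(**task_kwargs): enqueue the
    # work and return immediately, without waiting for a result.
    task_queue.put((func, kwargs))

# Start the background consumer; the caller never blocks on the work.
threading.Thread(target=worker, daemon=True).start()
```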
am I going off the rails here or does that sound like something you're interested in?

Mike Gallaspy
11/21/2024, 11:56 PM

Mike Gallaspy
11/22/2024, 12:00 AM

Nate
11/22/2024, 12:33 AM
> what a metaphorical consumer/producer pattern is
haha fair enough. well cool, those examples are almost all docker-compose and should be mostly up to date, lmk if you have any specific questions