alvin goh

12/08/2019, 7:40 AM
So here's a use case I'm having trouble with... I have a Flask server running with Waitress, and I want to run a Prefect flow whenever I receive a request. At the same time, I want to enable caching in the Prefect flow, which does not work if I run Waitress with more than one thread (it seems the cache is local to the thread's memory...), so requests only hit the cache if they land on the same worker thread. Is there a way to let workers on different threads share the same cache? I'm not entirely sure what happens when Waitress spawns threads...
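[Editor's note: for context, here is a minimal sketch of the setup being described, assuming Prefect 0.x task-level caching via `cache_for` and Waitress's `threads` option. The route, flow, and task names are illustrative, not taken from the thread.]

```python
from datetime import timedelta

from flask import Flask
from prefect import Flow, task
from waitress import serve


# Cache this task's result for an hour. Prefect Core keeps this cache
# in local in-memory context, which is the crux of the issue above.
@task(cache_for=timedelta(hours=1))
def expensive_lookup():
    # Stand-in for the real expensive work.
    return sum(range(10_000_000))


with Flow("request-triggered-flow") as flow:
    expensive_lookup()

app = Flask(__name__)


@app.route("/run")
def run_flow():
    # Run the Prefect flow on each incoming request.
    state = flow.run()
    return str(state.is_successful())


if __name__ == "__main__":
    # With threads > 1, each Waitress worker thread sees its own copy of
    # the in-memory cache, so cache hits depend on which thread serves
    # the request; with threads=1 the cache behaves as expected.
    serve(app, host="0.0.0.0", port=8080, threads=4)
```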
Jenny

12/08/2019, 2:06 PM
Hi Alvin, thanks for the question! I don't know enough about Flask and Waitress to answer this but let me see what I can find out.
Jeremiah

12/08/2019, 2:21 PM
Hi Alvin, I’m not familiar with Waitress so I can’t speak to its specific implementation. However, Prefect tries in all ways to be backend- and execution-agnostic, and we require all functionality to work in a global, distributed environment. As a result, we ship very few stateful features in Core except when we know for sure that they can work in a shared memory space, as with a single thread. However, for caching support across processes, time, and even flows, you may want to take a look at Prefect Cloud, which was designed to support that use case.