# ask-community
l
Hi, everyone! I'm new to Prefect and have a question regarding a use case we saw whilst testing it in our environment:
- Use of a connection pool to connect to databases: we saw this issue (https://github.com/PrefectHQ/prefect/issues/1876) and tried the approach there, but when we tried to register the flow to the server, it threw the error:
  flow.register(project_name='name')
  TypeError: can't pickle _thread._local objects
- This is a major issue for us: within our flow we would ideally run some light queries in a lot of tasks, so connection pooling is mandatory for this use case in particular.
If anyone could point us in the right direction it would be very helpful! Thanks in advance, Luis
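(For context, the error itself is easy to reproduce without Prefect: many connection-pool implementations keep per-thread state in a `threading.local()` object, which the pickle module refuses to serialize. A minimal stdlib-only demonstration:)

```python
import pickle
import threading

# Connection pools often stash per-thread connections in a
# threading.local(); pickling one fails with the same error
# Prefect surfaces during flow.register().
tls = threading.local()
try:
    pickle.dumps(tls)
except TypeError as exc:
    print(f"TypeError: {exc}")
```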
k
Hi @Luis Henrique, the error seems like you’re using a DaskExecutor. Is that right?
l
We're actually running everything locally
Local server, local executor
Local agent
And thanks for the quick response @Kevin Kho!
k
I see, are you returning the connection in a task? What storage are you using?
l
We tried returning it in a task, when that didn't work we proceeded with the suggestion in the link above (using a global variable). And this worked fine for running the code directly, but when we tried to register the flow it threw the pickle error.
As for the storage, we're running everything in the sort of "default" configuration (meaning we only set the configurations from tutorials and such, nothing too deep)
k
I see, can you try
from prefect.storage import Local
flow.storage = Local(stored_as_script=True)
So that it avoids serialization?
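(A minimal sketch of the pattern script-based storage enables: because the flow file is re-executed rather than unpickled, a module-level pool can be shared across tasks. Here sqlite3 stands in for a real pooled database and plain functions stand in for `@task`-decorated functions; all names are illustrative.)

```python
import sqlite3

# flow.py -- stored as a script, so this module-level object is
# re-created whenever the script is executed and is never pickled.
# (sqlite3 stands in for a real connection pool here.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")

def insert_item(name):   # would be an @task in the real flow
    conn.execute("INSERT INTO items VALUES (?)", (name,))

def count_items():       # would be an @task in the real flow
    return conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]

insert_item("a")
insert_item("b")
print(count_items())  # -> 2
```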
l
We didn't try that! Thanks for the idea! We'll test it and see what happens. Many thanks!
I'll post back here whether it worked or not
a
@Luis Henrique - Any update? I am experiencing a very similar issue...
l
Hi, @Anastasia Belov! Yes, the solution provided by @Kevin Kho worked, but it needed some extra work for deployment. What we'll do here is actually move the database requests into an API so that we can bypass this in Prefect
And then the flow will only have to make an HTTP call or Lambda invocation
Just clarifying what I meant by work: through storing the flow as a script we were able to share the same connection throughout many tasks in the same file.
k
Note this only works for the LocalExecutor @Anastasia Belov, because the parallel executors would require serialization of task outputs. You would need to make the connection inside those tasks
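(A minimal sketch of that per-task approach, again with sqlite3 standing in for a real database and a plain function standing in for an `@task`; names are illustrative.)

```python
import sqlite3

# With a parallel executor, anything a task closes over must be
# picklable, so open the connection inside the task itself
# rather than sharing a module-level pool.
def query_one(db_path):  # would be an @task in the real flow
    conn = sqlite3.connect(db_path)  # created per task run
    try:
        return conn.execute("SELECT 1").fetchone()[0]
    finally:
        conn.close()

print(query_one(":memory:"))  # -> 1
```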