# prefect-server
t
Hello everyone! I built a simple ETL and it is working perfectly. When I run the agent in the cloud I get an error saying "cannot pickle psycopg2", any clues?
n
Hi @Tomás Emilio Silva Ebensperger - it sounds like you've baked part of the `psycopg2` module into your flow's metadata, perhaps by referencing it outside of a task block. Since a flow's metadata (right now) needs to be pickleable so that it can be sent to the API, you'll need to make sure you're not inadvertently introducing non-pickleable objects.
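Concretely, the fix is usually to create the connection at run time, inside a task, rather than at flow-definition time. A minimal sketch of the pattern (the connection string and query are placeholders, not from your pipeline):

import psycopg2
from prefect import task, Flow

@task
def fetch_rows():
    # The connection is opened at run time, inside the task, so no
    # non-pickleable psycopg2 object is attached to the flow's metadata.
    conn = psycopg2.connect("postgresql://user:pass@host/db")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            return cur.fetchall()
    finally:
        conn.close()

with Flow("etl") as flow:
    rows = fetch_rows()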
t
makes sense, thank you
👍 1
🙌 1
One question though: our pipeline uses an OOP design with a lot of composition and attribute sharing among classes. How would you go about this issue in the cloud, since I have my DB object constantly querying the DB all along the pipeline?
n
Ah, good question. I personally use OOP by default when writing flows for similar reasons; for instance, you can write a task following normal class-inheritance patterns like this:
import googleapiclient.discovery
from prefect import Task

class CheckInstance(Task):
    def list_instances(self, service, project, zone):
        # Plain helper method shared by the task; returns None
        # when the zone has no instances.
        result = service.instances().list(project=project, zone=zone).execute()
        return result.get("items")

    def run(self, credentials, instance, project, zone):
        service = googleapiclient.discovery.build(
            "compute", "v1", credentials=credentials
        )

        instances = self.list_instances(service, project, zone)

        super(CheckInstance, self).run()
        # Guard against list_instances returning None for an empty zone.
        return any(i["name"] == instance for i in instances or [])
In a case like that, there's nothing preventing you from creating instances of your object that also inherit the `Task` class, and calling `super(Instance, self).run()` to call the parent class run method, thereby letting you share methods and attributes.
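For completeness, wiring an instance of that task into a flow might look something like this (the flow name and the instance/project/zone values are made up, and resolving credentials via `google.auth.default()` inside a task is just one approach):

import google.auth
from prefect import task, Flow

@task
def get_credentials():
    # Resolve application-default credentials at run time, inside a task,
    # so the credentials object never becomes part of the flow's metadata.
    creds, _ = google.auth.default()
    return creds

check = CheckInstance()  # an instance of the Task subclass above

with Flow("gcp-instance-check") as flow:
    creds = get_credentials()
    exists = check(
        credentials=creds,
        instance="my-vm",
        project="my-project",
        zone="us-central1-a",
    )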
t
thanks!
One trick that solved everything was to avoid declaring a self.conn and self.cursor on the database class; I am pretty much creating and destroying the connection every time I query the DB.
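Something like this, in case it helps anyone else (the class name and DSN are just illustrative):

import contextlib
import psycopg2

class Database:
    """Stores only the DSN string, so instances stay pickleable."""

    def __init__(self, dsn):
        self.dsn = dsn

    @contextlib.contextmanager
    def connection(self):
        # Open a fresh connection per query and always close it, instead
        # of keeping a long-lived self.conn / self.cursor around.
        conn = psycopg2.connect(self.dsn)
        try:
            yield conn
        finally:
            conn.close()

    def query(self, sql, params=None):
        with self.connection() as conn, conn.cursor() as cur:
            cur.execute(sql, params)
            return cur.fetchall()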
🙌 1