# prefect-community
s
I have retry configured for my task as follows:
```python
@task(max_retries=3, retry_delay=timedelta(seconds=5))
def write_delta_lookup_table():
    ......
```
My image build failed with:
```
ValueError: Task <Task: write_delta_lookup_table> has retry settings but some upstream dependencies do not have result types. See <https://docs.prefect.io/core/concepts/results.html> for more details.
```
Why does task retry require result types? I am not passing values from one task to another. Also, the retry example at https://docs.prefect.io/core/concepts/tasks.html#overview doesn't show the use of a result type. What am I missing here? (Any documentation with a simple retry use case would be very helpful.)
c
Hi @Saranya Elumalai - what version of Prefect are you running? I believe we removed this check recently. Regardless, if you add:
```python
from prefect.engine.results import LocalResult

# when you initialize your Flow
Flow(..., result=LocalResult(), ...)
```
to your Flow initialization, this check will not fail.
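For context on why retries want a result type: a result is just a small object that knows how to persist a task's return value so it can be read back later (for example, when a downstream task is retried). This is an illustrative stdlib-only sketch of the idea, not Prefect's actual `LocalResult` implementation:

```python
import pickle
import tempfile
from pathlib import Path


class ToyLocalResult:
    """Toy stand-in for a result class: persists a task's return
    value to disk so it can be reloaded later, e.g. on retry."""

    def __init__(self, directory=None):
        # Default to a fresh temp directory, like a local results dir.
        self.directory = Path(directory or tempfile.mkdtemp())

    def write(self, value, filename="output"):
        # This is the step that fails if `value` cannot be pickled.
        path = self.directory / filename
        path.write_bytes(pickle.dumps(value))
        return path

    def read(self, filename="output"):
        return pickle.loads((self.directory / filename).read_bytes())


result = ToyLocalResult()
result.write({"company": "acme"})
print(result.read())  # {'company': 'acme'}
```

The key point is that `write` serializes the value to disk, which is why every upstream task's return value must be serializable once retries are in play.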
s
@Chris White I am using Prefect Core 0.11.2. Does this version still have that check?
c
If you’re running into the error it must - the patch I provided above or upgrading your Core version should resolve the issue for you
s
thanks, let me try LocalResult and update you
👍 1
@Chris White The build is successful, but the task failed with the following error
```
Unexpected error: TypeError("can't pickle SSLContext objects")

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/prefect/engine/runner.py", line 48, in inner
    new_state = method(self, state, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/prefect/engine/task_runner.py", line 977, in get_task_run_state
    result = self.result.write(value, filename="output", **prefect.context)
```
Here is my code:
```python
with Flow(f"store-etl", result=LocalResult()) as flow:
    company = write_delta_lookup_table()
    .......
```
```python
@task(max_retries=3, retry_delay=timedelta(seconds=10))
def get_S3_connection():
```
What am I missing here?
c
That code looks like it came from an old version of Prefect - you should confirm that your Docker image also has the newest version of Prefect installed. Regardless, tasks must return objects that are capable of being serialized by `cloudpickle`, else there is no way to persist them to disk. I see that you are creating an S3 client; for situations like this we have a WIP for shareable non-picklable objects here: https://github.com/PrefectHQ/prefect/pull/3139
s
@Chris White 1. Which code looks like the old version? 2. Both tasks get_S3_connection() and write_S3_data() face the same issue when retries are added. What are the recommended task return types?
c
The traceback looks like it came from an old version of Prefect; anything that can be serialized by cloudpickle is a valid return type, and most Python objects satisfy this. Database connections and client objects, or anything that maintains a reference to such objects, are generally not serializable.
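A quick way to check whether a return value will survive this serialization step is to try pickling it yourself. A small stdlib sketch (using `pickle` rather than `cloudpickle`, which follows the same basic rules for objects like live connections; `sqlite3` stands in here for any client/connection object):

```python
import pickle
import sqlite3


def is_picklable(obj):
    """Return True if obj can be serialized with pickle."""
    try:
        pickle.dumps(obj)
        return True
    except (TypeError, AttributeError, pickle.PicklingError):
        return False


# Plain data structures are fine as task return values.
print(is_picklable({"rows": [1, 2, 3]}))  # True

# Live connection/client objects are not.
conn = sqlite3.connect(":memory:")
print(is_picklable(conn))  # False
conn.close()
```

If `is_picklable` returns False for a task's return value, return the underlying data (rows, dicts, paths) instead of the client itself.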
a
Hey @Saranya Elumalai and @Chris White, I'm facing a similar issue. I have a function that returns a MongoDB client object, and I am unable to pass it to other tasks. Is there any workaround?
c
Hi @Anish Chhaparwal at the moment I suggest you recreate the client within each task to avoid this issue. This PR: https://github.com/PrefectHQ/prefect/pull/3139 will ultimately provide a better pattern for these sorts of situations in the future!
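The recreate-the-client-per-task pattern can be sketched like this, with a hypothetical `make_client` helper and `sqlite3` standing in for the MongoDB client (the same shape applies to `pymongo.MongoClient`):

```python
import sqlite3


def make_client():
    # Hypothetical helper: build the connection fresh wherever it is
    # needed, instead of returning it from one task and passing it on.
    return sqlite3.connect(":memory:")


def write_rows():
    # Each "task" opens its own client rather than receiving one.
    client = make_client()
    client.execute("CREATE TABLE t (x INTEGER)")
    client.execute("INSERT INTO t VALUES (1)")
    client.commit()
    # Return only picklable data, never the client itself.
    rows = client.execute("SELECT x FROM t").fetchall()
    client.close()
    return rows


print(write_rows())  # [(1,)]
```

The cost is one connection per task, but each task's inputs and outputs stay picklable, so retries and result persistence work.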
s
Yes @Chris White @Anish Chhaparwal, that's what I did.
👍 1
a
Okay, thanks for the quick response. Appreciate it. :)