# ask-community
j
Hello, is there a way to set the cache_policy to None by default? Most of our tasks need to be rerun, and caching is causing multiple issues for us.
One of the issues we are encountering is that some tasks fail to compute the cache key. When this happens, some of the tasks are skipped as well, yet the final status of the flow run is still successful.
Error encountered when computing cache key - result will not be persisted.
Traceback (most recent call last):
  File "/opt/venv/lib/python3.11/site-packages/prefect/task_engine.py", line 155, in compute_transaction_key
    key = self.task.cache_policy.compute_key(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.11/site-packages/prefect/cache_policies.py", line 168, in compute_key
    policy_key = policy.compute_key(
                 ^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.11/site-packages/prefect/cache_policies.py", line 168, in compute_key
    policy_key = policy.compute_key(
                 ^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.11/site-packages/prefect/cache_policies.py", line 296, in compute_key
    return hash_objects(hashed_inputs, raise_on_failure=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.11/site-packages/prefect/utilities/hashing.py", line 66, in hash_objects
    return stable_hash(cloudpickle.dumps((args, kwargs)), hash_algo=hash_algo)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.11/site-packages/cloudpickle/cloudpickle.py", line 1529, in dumps
    cp.dump(obj)
  File "/opt/venv/lib/python3.11/site-packages/cloudpickle/cloudpickle.py", line 1295, in dump
    return super().dump(obj)
           ^^^^^^^^^^^^^^^^^
TypeError: cannot pickle 'DeserializingConsumer' object
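The traceback shows the default cache policy cloudpickling the task's inputs to build a cache key and failing because one of the inputs (a Kafka DeserializingConsumer) is not picklable. A minimal sketch of the per-task workaround discussed below, assuming Prefect 3.x; the thread uses cache_policy=None, and recent 3.x releases also expose a NO_CACHE policy for the same purpose (consume_messages and its parameters are placeholder names):

from prefect import task
from prefect.cache_policies import NO_CACHE

# Disable cache-key computation for a task whose inputs cannot be pickled,
# e.g. one that receives a Kafka DeserializingConsumer.
@task(cache_policy=NO_CACHE)
def consume_messages(consumer, topic: str):
    ...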
b
Hi Jezreel! Outside of setting the cache_policy to None in the task decorator, I don't believe there's an out-of-the-box way to set it globally. If you wouldn't mind, could you open a feature request here?
Out of curiosity, when you see that error you shared, is it during the first execution of a flow run? Or does it happen when retrying a flow run?
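Since there is no built-in global default, one hedged workaround (a project convention, not a Prefect feature) is a thin project-local wrapper around prefect.task that every module imports instead of the upstream decorator:

import functools
from prefect import task as prefect_task
from prefect.cache_policies import NO_CACHE

# Project-local decorator that defaults cache_policy to NO_CACHE.
# prefect.task accepts the function positionally or keyword-only options,
# so a partial keeps both @task and @task(...) usable.
task = functools.partial(prefect_task, cache_policy=NO_CACHE)

@task
def extract(path: str) -> list[str]:
    ...

@task(retries=2)
def transform(rows: list[str]) -> list[str]:
    ...

Because keyword arguments supplied at call time override those baked into a functools.partial, individual tasks can still opt back into caching by passing their own cache_policy.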
j
Actually, I am not fully sure, because when we first ran it the flow crashed with an OOM error after we upgraded it to v3. Once we increased the memory, we started seeing the error above. We also tried setting the cache_policy to None for the task and it ran successfully; when we unset the cache_policy and ran it again, we hit the error again.
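If you want to keep caching for the other inputs and only drop the unpicklable one, Prefect 3's INPUTS policy supports subtracting parameter names. A hedged sketch, where "consumer" stands in for whatever argument carries the DeserializingConsumer:

from prefect import task
from prefect.cache_policies import INPUTS

# Hash every input except the unpicklable "consumer" argument when building
# the cache key; the remaining parameters still participate in caching.
@task(cache_policy=INPUTS - "consumer")
def process_batch(consumer, topic: str, batch_size: int):
    ...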
I created a discussion here @Bianca Hoch https://github.com/PrefectHQ/prefect/discussions/16240