Hi everyone, I'm running a task that always fails. The task is cached and the JSON serializer is being used.
I was expecting the uncaught exception to trigger a failed task state, and failed task states should not be cached.
Is it expected behaviour for the JSON serializer to attempt to serialize an exception?
I'll add sample code soon.
Deceivious
03/24/2023, 2:29 PM
Copy code
from datetime import timedelta
from prefect import flow, task
from prefect.filesystems import LocalFileSystem
from prefect.serializers import JSONSerializer
from prefect.tasks import task_input_hash
@flow(name="sdfsdf")
def main():
    tas1()


@task(
    name="1",
    cache_key_fn=task_input_hash,
    cache_expiration=timedelta(minutes=5),
    persist_result=True,
    result_storage=LocalFileSystem(base_path=".strage-Temp"),
    result_serializer=JSONSerializer(jsonlib="json"),
)
def tas1():
    1 / 0  # deliberately raise ZeroDivisionError


if __name__ == "__main__":
    main()
Deceivious
03/24/2023, 2:36 PM
While the task does end up in a failed state, it is not because of the exception directly but because the exception cannot be serialized, which is a side effect of the exception. Unsure if this is how it should be.
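The serialization failure can be reproduced without Prefect at all: the standard-library `json` module refuses to encode exception instances, so any result serializer that delegates to `json.dumps` will hit a `TypeError` here. A minimal demonstration:

```python
import json

# An exception instance, like the ZeroDivisionError raised by tas1()
# above, is not JSON serializable with the default encoder:
try:
    json.dumps(ZeroDivisionError("division by zero"))
except TypeError as exc:
    print(exc)  # e.g. "Object of type ZeroDivisionError is not JSON serializable"
```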
Zanie
03/24/2023, 2:39 PM
That sounds like a bug. We can add support to our JSONSerializer implementation for exceptions — can you open an issue just for that (without the caching stuff)?
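One possible shape for such support (a hypothetical sketch, not Prefect's actual `JSONSerializer` implementation) is a `default` hook that converts exceptions into plain dicts before encoding:

```python
import json
import traceback

def encode_exception(obj):
    """Fallback encoder: represent exceptions as plain dicts.

    Hypothetical sketch only; Prefect's real serializer may differ.
    """
    if isinstance(obj, BaseException):
        return {
            "type": type(obj).__name__,
            "message": str(obj),
            "traceback": traceback.format_exception(type(obj), obj, obj.__traceback__),
        }
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

try:
    1 / 0
except ZeroDivisionError as exc:
    payload = json.dumps(exc, default=encode_exception)

print(payload)
```

Round-tripping the other way (rebuilding a live exception from the dict) would need extra care, which is presumably part of why a dedicated issue is worth opening.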