# ask-marvin
s
This message was deleted.
m
Your message was too long by 553 tokens - please shorten it and try again. For reference, here's your message at the allowed limit:
I have a Prefect flow that spawns a number of tasks and am seeing this sort of failure for one or two of those tasks on some executions. Provide some context for what I am seeing in the message.
```Encountered exception during execution: Traceback (most recent call last): File "/usr/local/lib/python3.10/site-packages/prefect/engine.py", line 2035, in orchestrate_task_run await client.set_task_run_name( File "/usr/local/lib/python3.10/site-packages/prefect/client/orchestration.py", line 1951, in set_task_run_name return await self._client.patch( File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1951, in patch return await self.request( File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1559, in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) File "/usr/local/lib/python3.10/site-packages/prefect/client/base.py", line 282, in send response = await self._send_with_retry( File "/usr/local/lib/python3.10/site-packages/prefect/client/base.py", line 216, in _send_with_retry response = await request() File "/usr/local/lib/python3.10/site-packages/httpx/_client.py
```
j
@Marvin I have a Prefect flow that spawns a number of tasks and am seeing this sort of failure for one or two of those tasks on some executions. Provide some context for what I am seeing in the message.
```
prefect.task_runs
Finished in state Failed(message='Task run encountered an exception KeyError: 493', type=FAILED, result=UnpersistedResult(type='unpersisted', artifact_type='result', artifact_description='Unpersisted result of type `KeyError`'))
```
m
The message you're seeing indicates a failure in your Prefect flow, specifically a task run that has thrown a `KeyError`. This exception usually occurs when you try to access a dictionary key that doesn't exist. In the context of Prefect, the `KeyError: 493` possibly indicates an attempt to access a result or a piece of data using the key `493`, which is not present in the expected dictionary.

The `Failed` state is self-explanatory: it's one of the failure states a task can end up in when something has gone wrong during its execution. There are myriad reasons for a task to fail, ranging from user-defined exceptions to system errors.

The `UnpersistedResult` part of the message implies that the result of the task has not been saved to any persistent storage. In Prefect, results can be persisted to various storage media to enable caching and reuse. "Unpersisted result of type `KeyError`" suggests that the exception itself was not saved, but Prefect still reported the nature of the result as a `KeyError`. This might happen because of how the flow is configured or due to the specific logic of the task.

In summary, your task tried to access a non-existent key in a dictionary, resulting in a `KeyError`, and the result (or exception) was not persisted, meaning it wasn't written to a long-term storage solution. To resolve this, you will need to debug your task's logic to ensure all necessary keys are present in the relevant dictionaries and to review your flow's configuration regarding result persistence.
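A tiny hypothetical illustration of the scenario described above, using made-up names (`variant_config`, `unload_variant`) rather than anything from the actual flow:

```python
# Hypothetical sketch of task logic that could raise KeyError: 493; the dict
# and task names are invented for illustration only.
from prefect import task

variant_config = {1: "daily_sales", 2: "inventory"}  # key 493 is not present

@task
def unload_variant(variant_id: int) -> str:
    # variant_config[variant_id] would raise KeyError for a missing id;
    # an explicit lookup-and-check makes the failure easier to diagnose.
    config = variant_config.get(variant_id)
    if config is None:
        raise ValueError(f"No config found for variant {variant_id}")
    return config
```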
The message you're seeing indicates that a task within your Prefect flow has thrown an exception, specifically a `KeyError` with the key being `493`. A `KeyError` typically means that the Python dictionary being accessed does not have an item with the specified key. It's a common error and could be due to reasons such as a missing configuration value, incorrect data being passed to the task, or other similar issues in your task's logic.

As for `UnpersistedResult`, this suggests that the outcome of the task that raised the exception was not saved to long-term storage (like a database or a file system) and was only maintained in memory during the flow run. Results are unpersisted by default when working with Prefect's server or Prefect Cloud, to avoid unintentional data storage or exposure of sensitive data.

To resolve this issue, you may want to check your task's code, especially around the places where it uses the key `493`, and ensure that the pertinent data is being passed correctly and exists as expected. Additionally, if you want to persist results, you could configure your tasks to use Prefect's result persistence options for saving task outputs.
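A minimal sketch of that persistence suggestion, assuming Prefect 2.x, where `persist_result` and `result_storage` are the relevant task arguments (the task name and storage path here are hypothetical):

```python
from prefect import flow, task
from prefect.filesystems import LocalFileSystem

# Minimal sketch, assuming Prefect 2.x: persist_result writes the task's return
# value to the configured storage so it survives beyond the flow run.
@task(persist_result=True, result_storage=LocalFileSystem(basepath="/tmp/prefect-results"))
def extract(key: str) -> dict:
    return {"key": key}

@flow
def pipeline() -> None:
    extract(key="example")
```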
The message you're receiving indicates that a task within your Prefect flow has failed due to an exception, a `KeyError` to be precise, associated with the key `493`. A `KeyError` typically arises when a dictionary in Python is accessed using a key that does not exist within that dictionary. This could be the result of a missing piece of information expected in your data, a mistyped key, or accessing the dictionary before the key has been set.

The term `UnpersistedResult` refers to result data that has not been saved to a long-term storage system, such as a database or cloud storage. Instead, the result of the task is stored in memory for the duration of the flow run but is not available for future use or reference. This means that the result of the failed task, namely the `KeyError` exception, was not preserved beyond the current flow run's execution.

To address the issue, you would need to debug the task logic to understand why it's attempting to access a key that does not exist in the relevant dictionary. You may need to review the data source, the algorithm, or the state of the input data when the task is executed to pinpoint the cause of the error. If necessary, updating the error handling within the task to manage such cases might also improve the robustness of your flow.
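And a short sketch of the error-handling suggestion, with an invented `queries` mapping and task name standing in for whatever actually raised the `KeyError`:

```python
from prefect import task, get_run_logger

# Hypothetical example of surfacing a missing key explicitly instead of letting
# a bare KeyError bubble up from the task.
queries = {"daily_sales": "UNLOAD (...)", "inventory": "UNLOAD (...)"}

@task
def query_for_variant(name: str) -> str:
    logger = get_run_logger()
    try:
        return queries[name]
    except KeyError:
        logger.error("No query defined for variant %r; known variants: %s", name, sorted(queries))
        raise
```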
j
@Marvin This error is happening in a portion of the stack that is not in my code:
```
Jan 20th, 2024
Encountered exception during execution:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/prefect/engine.py", line 2035, in orchestrate_task_run
    await client.set_task_run_name(
  File "/usr/local/lib/python3.10/site-packages/prefect/client/orchestration.py", line 1951, in set_task_run_name
    return await self._client.patch(
  File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1951, in patch
    return await self.request(
  File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1559, in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
  File "/usr/local/lib/python3.10/site-packages/prefect/client/base.py", line 282, in send
    response = await self._send_with_retry(
  File "/usr/local/lib/python3.10/site-packages/prefect/client/base.py", line 216, in _send_with_retry
    response = await request()
  File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1646, in send
    response = await self._send_handling_auth(
  File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1674, in _send_handling_auth
    response = await self._send_handling_redirects(
  File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1711, in _send_handling_redirects
    response = await self._send_single_request(request)
  File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1748, in _send_single_request
    response = await transport.handle_async_request(request)
  File "/usr/local/lib/python3.10/site-packages/httpx/_transports/default.py", line 371, in handle_async_request
    resp = await self._pool.handle_async_request(req)
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 268, in handle_async_request
    raise exc
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 251, in handle_async_request
    response = await connection.handle_async_request(request)
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection.py", line 103, in handle_async_request
    return await self._connection.handle_async_request(request)
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/http2.py", line 185, in handle_async_request
    raise exc
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/http2.py", line 144, in handle_async_request
    await self._send_request_body(request=request, stream_id=stream_id)
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/http2.py", line 261, in _send_request_body
    await self._send_end_stream(request, stream_id)
  File "/usr/local/lib/python3.10/site-packages/httpcore/_async/http2.py", line 280, in _send_end_stream
    self._h2_state.end_stream(stream_id)
  File "/usr/local/lib/python3.10/site-packages/h2/connection.py", line 883, in end_stream
    frames = self.streams[stream_id].end_stream()
KeyError: 493
```
m
Your message was too long by 480 tokens - please shorten it and try again. For reference, here's your message at the allowed limit:
This error is happening in a portion of the stack that is not in my code:
```Jan 20th, 2024 Encountered exception during execution: Traceback (most recent call last): File "/usr/local/lib/python3.10/site-packages/prefect/engine.py", line 2035, in orchestrate_task_run await client.set_task_run_name( File "/usr/local/lib/python3.10/site-packages/prefect/client/orchestration.py", line 1951, in set_task_run_name return await self._client.patch( File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1951, in patch return await self.request( File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1559, in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) File "/usr/local/lib/python3.10/site-packages/prefect/client/base.py", line 282, in send response = await self._send_with_retry( File "/usr/local/lib/python3.10/site-packages/prefect/client/base.py", line 216, in _send_with_retry response = await request() File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1646, in send response = await self._send
```
n
can you show the part of the trace where your code brought you to what you shared above?
also what task runner are you using?
aha looks like this
ok my first two questions are moot - are your flows long running?
j
Thanks for jumping in Nate. This is not a long-running flow; max duration is just a few minutes. The flow spawns tasks that UNLOAD data from Redshift to S3 across 72 variants of query/file. The quickest take seconds, the longest under 5 minutes.
I suspect that the missing key is something that fails in the creation of those tasks. I was just going to add some retry logic to see if that smooths things over.
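A minimal sketch of that retry idea, assuming Prefect 2.x task arguments `retries` and `retry_delay_seconds`; the task name and signature are hypothetical stand-ins for the real UNLOAD tasks:

```python
from prefect import task

# Minimal sketch of adding retries to the UNLOAD tasks, assuming Prefect 2.x;
# unload_to_s3 and its parameters are invented for illustration.
@task(retries=3, retry_delay_seconds=30)
def unload_to_s3(query: str, s3_path: str) -> None:
    ...  # run the Redshift UNLOAD for this query/file variant
```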
n
interesting - i'm not sure i have a great recommendation at this point but let me know if this remains a problem for you after adding retries. will keep my eye out for potential CoEs here