# ask-community
I have a flow that is passing pandas DataFrames between tasks. When it runs in Prefect Cloud, I get this logging error that I don't see locally:
April 22nd 2020 at 12:02:34pm EDT | CloudHandler
CRITICAL 
Failed to write log with error: Object of type ndarray is not JSON serializable
Has anyone else seen this before?
Hi @Luke Orland! Prefect uses cloudpickle to serialize payloads between tasks, so that error could be because the DataFrame isn't serializable. I'll also check on the logs side to be sure there's nothing happening in Cloud that could be causing that.
Ah @Luke Orland, it looks like you're trying to send a DataFrame to the Cloud Logs; since logs are also required to be JSON serializable, that may be the source of your error.
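A minimal sketch of why this fails, using only `json`, `logging`, and `numpy` (the logger name here is made up; the point is just that anything handed to a Cloud log handler has to survive `json.dumps`):

```python
import json
import logging

import numpy as np

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("flow")

arr = np.arange(3)

# Shipping a log record to Cloud means JSON-serializing its payload,
# and json.dumps rejects numpy objects outright.
try:
    json.dumps(arr)
except TypeError as exc:
    print(exc)  # Object of type ndarray is not JSON serializable

# Logging a plain-string summary instead keeps the record JSON-safe.
logger.info("array summary: shape=%s head=%s", arr.shape, arr[:2].tolist())
```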
DataFrames are not usually JSON serializable, and the individual data points themselves are often of types like np.int64, which aren't serializable either. So you need to make sure both are converted to serializable types.
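Two common ways to do that conversion, sketched with `json`, `numpy`, and `pandas` (`NumpyEncoder` is a hypothetical helper name, not part of any library):

```python
import json

import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": [3.0, 4.0]})

# Option 1: let pandas emit JSON itself, then parse it back into
# plain dicts/lists of built-in Python types.
records = json.loads(df.to_json(orient="records"))
print(json.dumps(records))

# Option 2: convert numpy scalars explicitly. .item() turns an
# np.int64 into a plain int; .tolist() does the same for arrays.
print(json.dumps(np.int64(7).item()))  # 7

# A custom encoder handles both cases in one place.
class NumpyEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, np.generic):   # np.int64, np.float64, ...
            return obj.item()
        if isinstance(obj, np.ndarray):
            return obj.tolist()
        return super().default(obj)

print(json.dumps({"n": np.int64(5), "v": np.arange(2)}, cls=NumpyEncoder))
# {"n": 5, "v": [0, 1]}
```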
Makes sense. I am in fact logging the DataFrames; that was a mistake. Thanks, all!