Thanks Kevin, you’re a LEGEND!
I solved this by just passing `serializer=JSONSerializer()` to the S3Result object when initialising it.
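For reference, a minimal sketch of what that looks like (assuming Prefect 1.x; the bucket name and the task are placeholders):

    from prefect import task
    from prefect.engine.results import S3Result
    from prefect.engine.serializers import JSONSerializer

    # Pass the serializer when initialising the result object
    result = S3Result(bucket="my-bucket", serializer=JSONSerializer())

    @task(result=result)
    def produce_data():
        return {"rows": 42}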
Kevin Kho
02/17/2022, 12:13 AM
Ah ok that sounds good!
Farid
02/17/2022, 2:14 AM
Do you have an example of how I can inherit and extend a task from the task library?
I want to use the output of `task_1` inside the query that is then passed to the SnowflakeQuery task. This doesn't work right now because I think `task_1`'s output is only usable inside another task, not when it's used to format a query in the flow definition, i.e.:
    with Flow("example") as flow:
        table_name = task_1()
        SnowflakeQuery(query=f"select * from {table_name};")
Kevin Kho
02/17/2022, 2:17 AM
In this specific case, I would just tell you to make an intermediate task and pass table_name to that to format the query
✅ 1
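A minimal sketch of that intermediate-task approach (assuming Prefect 1.x; `task_1`, the flow name, and the Snowflake connection arguments are placeholders):

    from prefect import Flow, task
    from prefect.tasks.snowflake import SnowflakeQuery

    @task
    def task_1():
        return "my_table"  # placeholder: produces the table name at runtime

    @task
    def build_query(table_name):
        # Format the query inside a task, once table_name has been materialised
        return f"select * from {table_name};"

    snowflake_query = SnowflakeQuery(account="...", user="...", password="...")  # placeholder credentials

    with Flow("snowflake-example") as flow:
        table_name = task_1()
        snowflake_query(query=build_query(table_name))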
Kevin Kho
02/17/2022, 2:18 AM
But you can subclass any Task in the task library and then override the run method with your own
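For example, a rough sketch of subclassing SnowflakeQuery and overriding run (the table-name formatting here is purely illustrative):

    from prefect.tasks.snowflake import SnowflakeQuery

    class TableQuery(SnowflakeQuery):
        """Illustrative subclass that builds its query from a table name at runtime."""

        def run(self, table_name: str = None, **kwargs):
            # Construct the query here, then delegate to the parent task's run()
            query = f"select * from {table_name};"
            return super().run(query=query, **kwargs)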
Kevin Kho
02/17/2022, 2:19 AM
SnowflakeQuery(query=task(lambda table_name: f"select * from {table_name};")(table_name))
Farid
02/17/2022, 2:27 AM
I would add an intermediary task but just fyi the lambda method did not work:
    prefect.exceptions.ClientError: [{'path': ['get_or_create_task_run_info'], 'message': 'Expected type UUID!, found ""; Could not parse UUID: ', 'extensions': {'code': 'INTERNAL_SERVER_ERROR', 'exception': {'message': 'Expected type UUID!, found ""; Could not parse UUID: ', 'locations': [{'line': 2, 'column': 101}], 'path': None}}}]
Kevin Kho
02/17/2022, 2:29 AM
That’s the first time I’ve seen that one... but it’s ok, the lambda isn’t as pleasant from a coding perspective anyway even if we fix it, since you get an ugly name in your DAG