Question about mapped tasks and custom state handlers for them - I've got a state handler for when a mapped task fails; however, because the object passed to the state handler is a Task and not a TaskRun, task.name is the generic task name, not the mapped task run name. Similarly, in the state handler I don't have the task_run_id either. Is there any way to get the task_run_name in the state handler for a mapped task?
Forgot to mention - if I try object.task_run_name, it prints the format string used to generate the mapped task run name, not the actual name. I'm using the pattern here where my task_run_name = "{table_name}", and I get {table_name} output in the state handler, because that's what it's set to at the Task level rather than it being computed at the TaskRun level.
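For illustration, a minimal sketch of the setup being described, assuming Prefect 1.x; the task, argument, and handler names are placeholders:

```python
import prefect
from prefect import Flow, task

def notify_on_failure(task, old_state, new_state):
    # The handler receives the Task object, not the TaskRun, so task.name is
    # the generic task name and task_run_name is still the un-rendered template.
    if new_state.is_failed():
        print(task.name)           # e.g. "load_table"
        print(task.task_run_name)  # prints "{table_name}", not the rendered value
    return new_state

@task(task_run_name="{table_name}", state_handlers=[notify_on_failure])
def load_table(table_name):
    ...

with Flow("mapped-example") as flow:
    load_table.map(table_name=["users", "orders", "events"])
```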
Kevin Kho
08/04/2021, 6:49 PM
Hey @David Elliott, you can use the map_index and the task_full_name, which includes the map_index. Would that work for you? You can also use the task result to see the error message. Is the purpose debugging?
David Elliott
08/04/2021, 8:41 PM
I can't see those two options in there - I logged the dir(object) passed in the state handler and it gave me basically all the elements of a Task (here), which doesn't include map_index or task_full_name, and on trying to access those attributes I get a 'no attribute' error.
I think it's probably not possible, would be a nice addition though (to pass in the TaskRun rather than (or in addition to) the Task)
It's for outputting to Slack, but with the specific name of the mapped task run.
Kevin Kho
08/04/2021, 8:42 PM
It's not in the task, it will be in the context - so prefect.context.get("map_index") inside the state handler.
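A sketch of what that could look like inside the handler, assuming Prefect 1.x where the runtime context (map_index, task_full_name, etc.) is populated during the task run; names are illustrative:

```python
import prefect

def notify_on_failure(task, old_state, new_state):
    if new_state.is_failed():
        # Read the mapped run's details from the runtime context rather than
        # from the Task object passed into the handler.
        map_index = prefect.context.get("map_index")
        task_full_name = prefect.context.get("task_full_name")  # e.g. "load_table[2]"
        print(f"{task_full_name} (map_index={map_index}) failed: {new_state.result}")
    return new_state
```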
David Elliott
08/04/2021, 9:28 PM
Ahh let me try that!
David Elliott
08/04/2021, 9:35 PM
You're a star, thank you @Kevin Kho!!
I used the prefect.context.task_run_id with GQL to pull the task_run_name.
Had totally not considered using the context!
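Something along these lines, assuming a Prefect Cloud/Server backend whose GraphQL API exposes task_run_by_pk with a name field holding the rendered task_run_name; the Slack webhook is a hypothetical placeholder:

```python
import prefect
from prefect.client import Client

def notify_on_failure(task, old_state, new_state):
    if new_state.is_failed():
        # task_run_id is only populated in context when running against Cloud/Server.
        task_run_id = prefect.context.get("task_run_id")
        result = Client().graphql(
            """
            query($id: uuid!) {
              task_run_by_pk(id: $id) { name }
            }
            """,
            variables={"id": task_run_id},
        )
        task_run_name = result["data"]["task_run_by_pk"]["name"]
        # SLACK_WEBHOOK_URL is a hypothetical placeholder for wherever you post:
        # requests.post(SLACK_WEBHOOK_URL, json={"text": f"{task_run_name} failed"})
        print(f"{task_run_name} failed: {new_state.result}")
    return new_state
```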
Kevin Kho
08/04/2021, 9:37 PM
Thank you! This is better than just having the map_index, in my opinion. If you have this, you should be good.