Ryan Sattler
09/30/2021, 6:33 AM
part1, part2 = my_tuple_task(input)
otherresult1 = othertask1(part1)
otherresult2 = othertask2(part2)
However now we want to map over multiple inputs to have multiple parallel pipelines of the above:
whole_tuple_result = my_tuple_task.map(inputs)
otherresult1 = othertask1.map(whole_tuple_result) # tasks must break up tuple inside the function
otherresult2 = othertask2.map(whole_tuple_result)
Is there a way to maintain the elegant tuple destructuring while still mapping over the result? Trying to do it directly gives this error:
TypeError: Task is not iterable. If your task returns multiple results, pass nout to the task decorator/constructor, or provide a Tuple return-type annotation to your task.
(we’ve already set that, which is why the first example works)
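For context, a minimal sketch of one way this could be wired up, assuming Prefect 1.x (which the nout wording in the error suggests). The task bodies, the get_first/get_second helpers, and the flow name are illustrative assumptions, not taken from the thread; the common workaround shown is to map small extractor tasks over the mapped tuple result rather than destructuring it directly:

from prefect import Flow, task

@task(nout=2)
def my_tuple_task(x):
    # nout=2 is what allows `part1, part2 = my_tuple_task(input)` in the non-mapped case
    return x + 1, x * 2

@task
def othertask1(part1):
    return part1

@task
def othertask2(part2):
    return part2

# hypothetical helper tasks that pull one element out of each mapped tuple
@task
def get_first(t):
    return t[0]

@task
def get_second(t):
    return t[1]

with Flow("tuple-mapping-sketch") as flow:
    inputs = [1, 2, 3]
    whole_tuple_result = my_tuple_task.map(inputs)
    # destructuring whole_tuple_result here raises the TypeError quoted above,
    # so instead map the extractor tasks over it element-wise
    part1s = get_first.map(whole_tuple_result)
    part2s = get_second.map(whole_tuple_result)
    otherresult1 = othertask1.map(part1s)
    otherresult2 = othertask2.map(part2s)

The trade-off in this sketch is two extra (cheap) mapped tasks per pipeline, in exchange for keeping othertask1/othertask2 unaware of the tuple, instead of unpacking it inside each downstream task as in the second example above.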
Kevin Kho
Ryan Sattler
09/30/2021, 6:41 AM