Nicolas Bigaouette
10/17/2020, 6:21 PM
Flow seems to support this. But I have a specific requirement where one (or more) of the commands are actually RPCs. Those RPCs are custom; the remote command to be executed is added to a queue and the remote worker polls that queue. After having executed a command, the remote worker will POST its result to the backend. There are multiple instances of the backend running (for redundancy and scaling), so the result will probably not be received by the process that created the RPC call. This creates a discontinuity in the code: the task driving the workflow has to stop when enqueuing its remote command, and another instance will pick up the work at a later time.
I can probably model my sequences with Prefect just fine, except I am not sure how to handle the discontinuity described above. I thought I could raise a "pause" signal in the task that initiated the custom RPC. Then the backend instance receiving the POST could send a "continue" signal to the paused task. Is that even possible? Can such a workflow be modeled by Prefect?
Thanks!!
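A minimal sketch of the task side of that idea, assuming Prefect 0.x (Prefect Core with a backend): the task enqueues the RPC and raises a PAUSE signal so the run stops in a Paused state. It assumes prefect.context exposes task_run_id and a resume flag when a paused run is resumed; enqueue_remote_command and fetch_rpc_result are hypothetical helpers for the custom queue and the shared DB:

import prefect
from prefect import task
from prefect.engine.signals import PAUSE

@task
def run_remote_command(command):
    task_run_id = prefect.context.get("task_run_id")

    if not prefect.context.get("resume"):
        # First pass: enqueue the RPC for the remote worker, tagging it with
        # this task run's id so the backend instance that later receives the
        # worker's POST knows which run to resume, then pause this run.
        enqueue_remote_command(command, task_run_id=task_run_id)  # hypothetical helper
        raise PAUSE("Waiting for the remote worker's result")

    # Second pass, after a backend instance resumed the run: pick up the
    # result that the worker POSTed, e.g. from the shared DB.
    return fetch_rpc_result(task_run_id)  # hypothetical helper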
nicholas
mutation SetTaskRunStates {
  set_task_run_states(input: { states: [
    {
      task_run_id: <<TASK_RUN_ID>>,
      state: {
        type: "Resume",
        message: <<you can use this field to store some message about why it was resumed/what resumed it>>
      }
    }
  ] }) {
    states {
      id
    }
  }
}
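A rough sketch of how whichever backend instance receives the worker's POST could send that mutation from Python, going through Prefect Core's Client (which reads the server endpoint from the local Prefect config); the handler name is illustrative, and the Resume state's message field is simply reused to note what resumed the run:

from prefect import Client

def handle_rpc_result(task_run_id, result_summary):
    # Called by whichever backend instance receives the remote worker's POST.
    # result_summary should be a short plain-text note (or small payload) to
    # embed in the Resume state's message field.
    mutation = f"""
    mutation {{
      set_task_run_states(input: {{ states: [{{
        task_run_id: "{task_run_id}",
        state: {{ type: "Resume", message: "{result_summary}" }}
      }}] }}) {{
        states {{ id }}
      }}
    }}
    """
    Client().graphql(mutation)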
Nicolas Bigaouette
10/19/2020, 2:13 PM
So the backend receives the POST results and calls the graphql mutation on the prefect server to resume the task. I could either embed the POST result in the message or go through my DB for that information, keeping track of the task id at every step.
nicholas
You can also query for Paused task runs in case you want to handle them without storing the task_run_id first:
query PausedTaskRuns {
  task_run(where: { state: { _eq: "Paused" } }) {
    id
    name
  }
}
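The same lookup from Python, again through Client.graphql; this assumes the attribute-style access that Prefect's GraphQLResult provides:

from prefect import Client

# Fetch all currently Paused task runs; each entry carries the id needed by the
# set_task_run_states mutation above, plus the run's name for matching it back
# to the RPC that was enqueued.
result = Client().graphql(
    """
    query PausedTaskRuns {
      task_run(where: { state: { _eq: "Paused" } }) {
        id
        name
      }
    }
    """
)
paused = result.data.task_run  # assumes GraphQLResult's dot-style access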
Nicolas Bigaouette
10/19/2020, 2:24 PM
nicholas