Shuchita Tripathi
04/18/2022, 7:12 PM
task1.py
def task1():
    @task
    def run_terraform_lt():
        tf = Terraform(working_dir="law_tf")
        tf.init()
        tf.apply()
task2.py
def task2():
    @task
    def run_terraform_rt():
        tf = Terraform(working_dir="rt_tf")
        tf.init()
        tf.apply()
The input is similar to this:
{
    "task1": {
        "id": "foo",
        "task_name": "task1"
    },
    "task2": {
        "id": "bar",
        "task_name": "task2"
    }
}
I am getting the task name from this dictionary. Based on the value of "task_name", I have to create a flow combining all tasks.
I am creating a flow where I am trying to add them, but the tasks are not getting added to the flow. Does anyone have an idea of how this can be achieved? Here is the snippet of my flow-creation code. It is inside a for loop through which I am extracting the task name and other variables.
Fina Silva-Santisteban
04/18/2022, 8:54 PM
with Flow('Parent Flow') as flow:
    dates = ['2021-01-04', '2021-01-05', '2021-01-06']  # a list of dates
    for i in range(len(dates)):
        # run child flow with i as parameter
When I check the Prefect UI, it seems like the child flows are running in parallel (I’m using threads), which is in general great, but in this case I’d like the child flows to run sequentially. Is there a way to force the parent flow to run them that way? 🤔
RAISS Zineb
04/18/2022, 11:39 PM
Jacob Blanco
04/19/2022, 4:23 AM
Omar Sultan
04/19/2022, 8:13 AM
Sergey Gerasimov
04/19/2022, 8:24 AM
Joshua Greenhalgh
04/19/2022, 12:31 PM
Mini Khosla
04/19/2022, 1:31 PM
Joshua Greenhalgh
04/19/2022, 1:33 PM
Dekel R
04/19/2022, 2:11 PM
Pedro Machado
04/19/2022, 2:20 PM
We use wait_for_flow_run to wait for a subflow run. The subflow is a long-running job that can take more than 12 hours, and sometimes it times out.
I noticed that wait_for_flow_run uses watch_flow_run, which raises an exception if the flow has been running for more than 12 hours. The 12-hour timeout is hardcoded.
See https://github.com/PrefectHQ/prefect/blob/afda99411f91582ad187bf33671268d8d3c3c2c0/src/prefect/backend/flow_run.py#L95
We plan to upgrade to 2.0 shortly after it's released. Is this timeout limitation for subflows also implemented in Orion?
Hugo Shi
04/19/2022, 2:42 PM
Shuchita Tripathi
04/19/2022, 3:24 PM
wiretrack
04/19/2022, 3:34 PM
Prasanth Kothuri
04/19/2022, 3:51 PM
Task 'copy_from_s3_to_sftp': Exception encountered during task execution!
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/prefect/engine/task_runner.py", line 876, in get_task_run_state
    value = prefect.utilities.executors.run_task_with_timeout(
  File "/usr/local/lib/python3.9/dist-packages/prefect/utilities/executors.py", line 454, in run_task_with_timeout
    return task.run(*args, **kwargs)  # type: ignore
  File "flows/k8s/my_flow_name.py", line 46, in copy_from_s3_to_sftp
SystemError: unknown opcode
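`SystemError: unknown opcode` typically means pickled code objects were built under one Python version and executed under another (e.g. a flow registered from Python 3.8 but run on a 3.9 image). A small stdlib-only guard, where the expected version tuple is a hypothetical example you would pin to your registration environment, makes the mismatch obvious:

```python
import sys

def assert_same_python(expected_major_minor):
    """Raise if the running interpreter differs from the Python version the
    flow was registered with; a mismatch is the classic cause of
    'SystemError: unknown opcode' when executing unpickled task code."""
    running = sys.version_info[:2]
    if tuple(expected_major_minor) != running:
        raise RuntimeError(
            f"Flow registered under Python {tuple(expected_major_minor)}, "
            f"but running under {running}"
        )
    return running
```

Calling this at the top of the flow turns a cryptic opcode error into an explicit version complaint.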
Adi Gandra
04/19/2022, 3:58 PM
prefect agent kubernetes install -k {key} --mem-request 4G --mem-limit 6G --cpu-request 2 --rbac | kubectl apply -f -
Nothing seems to happen
I just get the message:
deployment.apps/prefect-agent configured
role.rbac.authorization.k8s.io/prefect-agent-rbac unchanged
rolebinding.rbac.authorization.k8s.io/prefect-agent-rbac unchanged
Any ideas on how to successfully upgrade my Prefect agent?
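The `configured`/`unchanged` output means kubectl applied the manifest, but if the pod template did not change (for example, the same image tag), Kubernetes rolls out no new agent pod. A sketch of how to check and force a rollout, assuming the deployment is named `prefect-agent` as in the output above:

```shell
# See which image the agent deployment currently runs
kubectl get deployment prefect-agent \
  -o jsonpath='{.spec.template.spec.containers[0].image}'

# Force a fresh rollout so pods are recreated from the applied spec
kubectl rollout restart deployment prefect-agent
kubectl rollout status deployment prefect-agent
```

`rollout restart` recreates the pods even when the spec hash is unchanged.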
Chris Reuter
04/19/2022, 4:28 PM
Philip MacMenamin
04/19/2022, 7:41 PM
I'm trying to use upstream_tasks on a ShellTask.
This works:
brt_commands = create_brt_command.map(adoc_fp=updated_adocs)
brt_commands_logged = log(item=brt_commands, desc="BRT commands")
brts = shell_task.map(
command=brt_commands, upstream_tasks=[tomogram_fps]
)
This fails:
brt_commands = create_brt_command.map(adoc_fp=updated_adocs)
brt_commands_logged = log(item=brt_commands, desc="BRT commands")
brts = shell_task.map(
command=brt_commands, upstream_tasks=[tomogram_fps, brt_commands_logged]
)
Philip MacMenamin
04/19/2022, 8:12 PM
If I
export PREFECT__LOGGING__LEVEL=INFO
I see:
python3 tmp/shell_task.py
[2022-04-19 21:09:30+0100] INFO - prefect.FlowRunner | Beginning Flow run for 'My Flow'
[2022-04-19 21:09:30+0100] INFO - prefect.TaskRunner | Task 'MyShellTask': Starting task run...
[2022-04-19 21:09:30+0100] INFO - prefect.MyShellTask | lsto echo
[2022-04-19 21:09:30+0100] INFO - prefect.TaskRunner | Task 'MyShellTask': Finished task run for task with final state: 'Success'
[2022-04-19 21:09:30+0100] INFO - prefect.TaskRunner | Task 'problem': Starting task run...
[2022-04-19 21:09:30+0100] INFO - prefect.TaskRunner | FAIL signal raised: FAIL('Oh no!')
[2022-04-19 21:09:30+0100] INFO - prefect.TaskRunner | Task 'problem': Finished task run for task with final state: 'Failed'
[2022-04-19 21:09:30+0100] INFO - prefect.FlowRunner | Flow run FAILED: some reference tasks failed.
If I
export PREFECT__LOGGING__LEVEL=ERROR
I see nothing. Ideally I'd like to see only messages about broken stuff. Ideas?
Jason
04/19/2022, 10:53 PM
Makefile would spin up a few Docker containers that one could test flows on. How can I abstract between a local and ECS agent in my flows to allow something like an env var to swap between?
Jai P
04/20/2022, 12:58 AM
Josh
04/20/2022, 3:21 AM
Error during execution of task: ClientError([{'path': ['create_task_run_artifact'], 'message': 'Task run <task_run_id> not found', 'extensions': {'code': 'INTERNAL_SERVER_ERROR'}}])
Ahmed Ezzat
04/20/2022, 7:06 AM
Chris Reuter
04/20/2022, 2:10 PM
Jason
04/20/2022, 2:55 PM
I can import projects from both Python and Jupyter Notebook:
from projects.example.flows.hello_world
from projects import Config
But when I run the same import from Prefect, I get the following error:
pipenv run prefect run -m projects.example.flows.hello_world
No module named 'projects'
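`prefect run -m` resolves the module with the normal Python import machinery, so the directory that contains the `projects` package has to be on `sys.path`. A sketch, assuming the command is run from the project root:

```shell
# Make the project root importable before invoking the CLI
export PYTHONPATH="$PWD:$PYTHONPATH"
pipenv run prefect run -m projects.example.flows.hello_world
```

An alternative with the same effect is installing the package into the environment (e.g. `pip install -e .`) so no PYTHONPATH juggling is needed.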
Constantino Schillebeeckx
04/20/2022, 3:19 PM
Chris Reuter
04/20/2022, 6:55 PM
Leigh-Ann Friedel
04/20/2022, 7:31 PM
Mars
04/20/2022, 8:09 PM
I use flow.run() for local dev, then edit the code and switch to flow.register(). I don't like editing the code to switch between local dev and deployment, and it doesn't feel like your typical Python webapp dev loop. Is there a better way to do this?
Jason
04/20/2022, 8:10 PM
ValueError: Local Secret "AWS_ACCOUNT_ID" was not found.
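In Prefect 1.x, local secrets are read from the context, and can be supplied either as an environment variable of the form `PREFECT__CONTEXT__SECRETS__<NAME>` or in `~/.prefect/config.toml`. A sketch; the account id value is a placeholder:

```shell
# Option 1: environment variable (the double underscores are significant)
export PREFECT__CONTEXT__SECRETS__AWS_ACCOUNT_ID="123456789012"

# Option 2: put the same secret in ~/.prefect/config.toml instead:
# [context.secrets]
# AWS_ACCOUNT_ID = "123456789012"
```

Either form makes `Secret("AWS_ACCOUNT_ID").get()` resolve when running with local secrets enabled.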