Prasanth Kothuri
01/20/2022, 4:53 PM

Tomek Florek
01/20/2022, 5:13 PM
@task like trigger=all_successful, log_stdout=True, max_retries, etc. How can I set it up for these tasks? Would maybe using functions with decorators be advised as best practice here instead?
Tim Enders
01/20/2022, 6:57 PM

Matt Alhonte
01/20/2022, 8:40 PM

Mathijs Miermans
01/20/2022, 10:26 PM

Tom Shaffner
01/21/2022, 12:04 AM

Mathijs Miermans
01/21/2022, 12:26 AM

Suresh R
01/21/2022, 8:03 AM

Kamil Gorszczyk
01/21/2022, 11:30 AM

shijas km
01/21/2022, 11:59 AM

Guilhelm PANAGET
01/21/2022, 1:23 PM

Kirk Quinbar
01/21/2022, 2:01 PM

Philipp Eisen
01/21/2022, 2:31 PM
01/21/2022, 2:53 PMlogger = prefect.context.get("logger")
<http://logger.info|logger.info>("An info message.")
I tried running it from the interface on Prefect Cloud but that doesn't do anything either. What am I missing here?Stephen Herron
01/21/2022, 3:13 PMimport pandas
This doesn’t seem to work unless I specifically supply the custom image (and the task execution arn). I would have expected if run_config:image is null it would use the one from the task_definition? Does it default to something other than the container def?Luis Aguirre
01/21/2022, 5:02 PM*data*: {encoding: 'orion', blob: '{"encoding": "file", "blob": "file:///tmp/78442fec16214918a437e768fa384762"}'}
in the state
section. Is there something I should configure to get the whole response? ThanksPrateek Saigal
01/21/2022, 5:12 PMflow.schedule = CronSchedule("49 21 * * 1-5",start_date=pendulum.datetime(2022, 1, 1, tz="Asia/Kolkata"),)
flow.register(
project_name=project_name,
idempotency_key=flow.serialized_hash(),
labels=["test"],
)
What could be the reason for this?Leon Kozlowski
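The cron string "49 21 * * 1-5" fires at 21:49 on Monday through Friday. For sanity-checking what the schedule should produce, a stdlib-only sketch of that same rule (not Prefect code, just an illustration of the cron expression):

```python
from datetime import datetime, timedelta

def next_run(after: datetime) -> datetime:
    """Next fire time for cron '49 21 * * 1-5': 21:49 Mon-Fri."""
    candidate = after.replace(hour=21, minute=49, second=0, microsecond=0)
    if candidate <= after:
        candidate += timedelta(days=1)
    while candidate.weekday() > 4:  # 5=Sat, 6=Sun -> roll forward to Monday
        candidate += timedelta(days=1)
    return candidate

# A Friday evening after 21:49 rolls over to the following Monday
print(next_run(datetime(2022, 1, 21, 22, 0)))
```

Separately, one frequent cause of a register call that appears to do nothing is idempotency_key=flow.serialized_hash(): if the serialized flow has not changed since the last registration, Prefect 1.x skips creating a new version.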
Leon Kozlowski
01/21/2022, 5:57 PM

Suresh R
01/21/2022, 6:50 PM

Seth Coussens
01/21/2022, 7:07 PM

Leon Kozlowski
01/21/2022, 7:34 PM

Jason Motley
01/21/2022, 8:12 PM

Matthew Millendorf
01/21/2022, 8:12 PM
01/21/2022, 9:04 PMwith Flow() as flow:
pattern?Michael Bell
01/21/2022, 9:26 PMprefect
and dask-cloudprovider
right now. It seems dask-cloudprovider[aws]
relies on aiobotocore
which pins to a very specific botocore
version and that's causing conflicts when trying to set up my environment. Anyone have any experience with this?Madison Schott
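One common workaround for this kind of pin conflict (a sketch, not an official fix) is to install both packages in a single pip invocation so the resolver can pick mutually compatible aiobotocore/botocore versions, then inspect the pins if it still fails:

```shell
# Installing in one pass lets pip's resolver reconcile the botocore pins;
# installing the packages separately is what usually triggers the conflict.
pip install "prefect[aws]" "dask-cloudprovider[aws]"

# If resolution still fails, inspect which package pins what:
pip install pipdeptree
pipdeptree --packages aiobotocore,botocore
```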
Madison Schott
01/21/2022, 9:33 PM

Alex To
01/21/2022, 11:10 PM
bareflow)
Our use case is slightly different: we will be using the tool for container execution orchestration, in which each task simply invokes a container (within our k8s cluster or ECS) or a job (Databricks job). Each container is an atomic unit of work written in any language. This architecture decouples orchestration from the actual functional task (container) and avoids recoding hundreds of existing tasks/containers. This has been working well for us with our internal tool.
Our flow would simply be: task1: call container-A; task2: call Databricks-job-B; task3: call container-C after task1 and task2 are completed.
My questions are:
1. Based on the documentation, my best option is to run a local agent on some EC2 instances with a LocalDaskExecutor. Using any other agent type would require additional resources and add more latency, e.g. with a Kubernetes agent, one pod for the flow run and another for the actual container; latency = double the spin-up time. The downside is the scaling problem with a local agent. Do I understand it correctly? Any other approach?
2. Any plan to add ECSRunTask to the AWS tasks? This would run any arbitrary task defined outside of the Prefect context in ECS, similar to the Airflow ECS Operator. I am surprised it's not already on the list.
Thanks
Tim Enders
01/22/2022, 3:09 PM

Chris K.
01/22/2022, 4:09 PM

Chris K.
01/22/2022, 4:11 PM