Daniel Komisar
03/29/2022, 1:19 PM
Alex Prokop
03/29/2022, 3:14 PM
Wieger Opmeer
03/29/2022, 3:27 PM
Kevin Kho
Raimundo Pereira De Souza Neto
03/29/2022, 5:35 PM
prefect = "2.0a13"
, I did read this link, but I'm using async flows.
More info in the thread.
Ben Welsh
03/29/2022, 6:33 PM
Ben Welsh
03/29/2022, 6:34 PM
Scott Aefsky
03/29/2022, 7:38 PM
task_definition_path
, and running into an issue. My flows run on different ECR images, so I don't want to specify the image in the Task Definition file. However, when I leave it out, and specify it in the constructor:
run_config = ECSRun(image=ECR_IMAGE, task_definition_path='s3://<path>')
I get an error that container.image should not be empty. My task definition file that I'd like to use is:
networkMode: awsvpc
cpu: 1024
memory: 2048
containerDefinitions:
  - name: flow
    logConfiguration:
      logDriver: awslogs
      options:
        awslogs-group: test_prefect_log_group
        awslogs-region: us-east-1
        awslogs-stream-prefix: ecs-prefect
        awslogs-create-group: "True"
Is there any way to use a Task Definition file without specifying an image? Or is there another/better way to make sure my ECS tasks get logged? Thanks for any help.
Wei Mei
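One direction worth exploring (a sketch only, resting on the assumption that ECSRun also accepts an in-memory task_definition dict alongside task_definition_path): keep the shared definition as a Python dict and inject the per-flow image before building the run config. ECR_IMAGE below is a hypothetical placeholder for the per-flow image URI.

```python
# Sketch: build the ECS task definition in code so the image can be set
# per flow, instead of leaving it out of a shared YAML file.
# ECR_IMAGE is a hypothetical placeholder, not a real registry URI.
ECR_IMAGE = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-flow:latest"

task_definition = {
    "networkMode": "awsvpc",
    "cpu": 1024,
    "memory": 2048,
    "containerDefinitions": [
        {
            "name": "flow",
            "image": ECR_IMAGE,  # injected here rather than in the YAML file
            "logConfiguration": {
                "logDriver": "awslogs",
                "options": {
                    "awslogs-group": "test_prefect_log_group",
                    "awslogs-region": "us-east-1",
                    "awslogs-stream-prefix": "ecs-prefect",
                    "awslogs-create-group": "True",
                },
            },
        }
    ],
}
```

If the assumption about the run-config API holds, the dict would then be passed as ECSRun(task_definition=task_definition) instead of pointing at the S3 file.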
03/29/2022, 8:51 PM
date and today context in a task, but getting None.
date = prefect.context.get("date")
today = prefect.context.get("today")

def get_data():
    print(f"{date}")
    print(today)
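The None values are consistent with the lookups running at module import time, before the runtime has populated the context. A stand-in dict (this is not Prefect itself, just an illustration of the timing) shows the difference between reading at import time and reading inside the function body:

```python
# Stand-in for prefect.context, to illustrate timing only.
context = {}

# Evaluated once, at import time, before the run populates the context.
date = context.get("date")

def get_data():
    # Evaluated at call time, after the context has been populated.
    return context.get("date")

# The runtime fills in context values when the flow actually runs.
context["date"] = "2022-03-29"

print(date)        # None
print(get_data())  # 2022-03-29
```

Moving the context.get() calls inside the task body, so they execute at run time, is the usual fix for this pattern.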
Dominick Olivito
03/30/2022, 12:27 AM
print("hello world 1")
if not skip_task2:
    print("hello world 2")
if not skip_task3:
    print("hello world 3")
print("hello world 4")
here's an attempt in Prefect:
@task()
def print_task(value: str) -> str:
    logger = prefect.context.get("logger")
    logger.info(value)
    return value

with Flow("case_skip_flow") as flow:
    skip_task2 = Parameter("skip_task2", True)
    skip_task3 = Parameter("skip_task3", True)
    task1 = print_task("hello world 1")
    with case(skip_task2, False):
        task2 = print_task("hello world 2", upstream_tasks=[task1])
    with case(skip_task3, False):
        task3 = print_task(
            "hello world 3", upstream_tasks=[task1, task2], task_args={"skip_on_upstream_skip": False}
        )
    task4 = print_task("hello world 4", upstream_tasks=[task3], task_args={"skip_on_upstream_skip": False})
I'm getting into trouble with the upstream skip status for task3:
• if I set skip_on_upstream_skip to True, then when task2 is skipped, so is task3, regardless of the value of skip_task3
• if I set skip_on_upstream_skip to False, then task3 runs regardless of the value of skip_task3
How can I implement a pattern like this?
Maria
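A toy model may show why both settings fail (an assumption-laden sketch of Prefect 1.x semantics, not the real engine): case() is itself implemented through skip propagation, with the condition task raising SKIP, so skip_on_upstream_skip governs the case-condition skip and the task2 skip through the same switch.

```python
# Toy model (not the real Prefect engine): a task runs unless some upstream
# skipped AND skip_on_upstream_skip is True. Crucially, the skip emitted by
# a case() condition travels through the same mechanism as any other
# upstream skip, so one flag controls both.
def task_runs(upstream_skips, skip_on_upstream_skip):
    return not (any(upstream_skips) and skip_on_upstream_skip)

# skip_on_upstream_skip=True: task2's skip drags task3 down too,
# regardless of task3's own case condition.
print(task_runs(upstream_skips=[True, False], skip_on_upstream_skip=True))   # False

# skip_on_upstream_skip=False: task3 ignores every skip, including the one
# its own case() condition emits, so it always runs.
print(task_runs(upstream_skips=[True, True], skip_on_upstream_skip=False))   # True
```

Given that one flag covers both cases, a workaround that may be worth trying is to decouple the branches, e.g. having task3 not list the skippable task2 as upstream, or joining a skippable branch back in with Prefect 1.x's merge from prefect.tasks.control_flow before downstream tasks depend on it.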
03/30/2022, 6:35 AMflow.run_config = DockerRun(image="my_image", host_config={"binds":["/var/run/docker.sock:/var/run/docker.sock"]})
But I also need to pull an image when it's not available, and this step fails since I am not authenticated from inside my flow container.
I can probably create a shell task that does docker login for me before the image pull, but I am wondering if there are better options?
Michael Smith
03/30/2022, 8:00 AM
RamonV
03/30/2022, 8:03 AM
Yeachan Park
03/30/2022, 8:12 AM
ep
03/30/2022, 9:00 AM
Andreas Ntonas
03/30/2022, 9:17 AM
Jacob Blanco
03/30/2022, 9:18 AM
Matthias
03/30/2022, 11:05 AM
Eddie Atkinson
03/30/2022, 11:13 AM
prefect/task/{ECS-task-id}. My aim is to display those logs in our webapp to show admin users the status of the flows they've started.
Ideally I'd be able to use the ID of a flow run, use that to query the ECS task it ran on, and then use that to query the logs of the flow.
Bennett Lambert
03/30/2022, 11:13 AM
Andreas Nord
03/30/2022, 11:16 AM
Jose Daniel Posada Montoya
03/30/2022, 1:27 PM
Adam Roderick
03/30/2022, 4:38 PM
FuETL
03/30/2022, 5:01 PM
Mia
03/30/2022, 5:40 PM
from prefect.storage import GitLab

storage = GitLab(repo="org/repo",
                 path="/hello/hello_k8s.py",
                 ref="prefect")
And when I try to run the flow, I get a 404 file-not-found error, but the file is there. Is there a way to print out what path the Prefect runner is using?
Shuchita Tripathi
03/30/2022, 5:45 PM
Chris Reuter
03/30/2022, 6:45 PM
Fina Silva-Santisteban
03/30/2022, 6:52 PM
flow.set_dependencies(
    task=SnowflakeQuery,
    keyword_tasks=dict(query='''SELECT * FROM dummy_table;''')
)
flow.set_dependencies(
    task=save_query_result_as_df,
    keyword_tasks=dict(result_set=SnowflakeQuery)
)
The task save_query_result_as_df currently only does a print() of the result_set. I'm confused about a few things:
• I didn't provide the SnowflakeQuery with any authentication. Why doesn't it throw an error about that?
• The print statement prints out <class 'prefect.tasks.snowflake.snowflake.SnowflakeQuery'>, which makes me think the SnowflakeQuery task wasn't run? If it wasn't run, that would at least explain why it didn't throw an error 🙂
but how can I make it run?
Scott Aefsky
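The printed <class '...'> is consistent with the class object itself being passed into the flow rather than an instance, so no query ever ran, which would also explain the missing authentication error. A minimal stand-in (not the real prefect.tasks.snowflake.SnowflakeQuery, whose constructor takes credentials such as account, user, and password) illustrates the difference:

```python
# Stand-in class illustrating the class-vs-instance mixup; the real
# Prefect task would connect to Snowflake inside run().
class SnowflakeQuery:
    def __init__(self, query=None):
        self.query = query

    def run(self):
        # The real task would authenticate and execute the query here.
        return f"rows for: {self.query}"

# Passing the class object just forwards the class downstream; run() is
# never called, so nothing executes and nothing can fail:
print(SnowflakeQuery)  # e.g. <class '__main__.SnowflakeQuery'>

# An instance, by contrast, is something whose run() can actually execute:
result = SnowflakeQuery(query="SELECT * FROM dummy_table;").run()
print(result)
```

So passing an instance, e.g. task=SnowflakeQuery(account=..., user=..., password=...) with real credentials in place of the placeholders, rather than task=SnowflakeQuery, should let the query actually run.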
03/30/2022, 6:54 PM
Anna Geller