Elio
06/27/2022, 2:19 PM
https://puu.sh/J8fy5/f9869c73c7.png
Alex Cannon
06/27/2022, 2:36 PM
flow.py
1. I register the flow with storage=Github(..., ref='main')
2. I check out a new branch my-feature-branch and make some edits to the flow
3. I re-register the flow from this new branch, but still leave ref='main'
...so when the Flow is run, the version registered to the Prefect backend is from my-feature-branch, but the agent running the flow will have pulled down the version from the main branch?
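The scenario in those steps can be reproduced with plain local git repositories, with no Prefect involved (all repo, file, and branch names here are placeholders), to see which version a ref='main' checkout would hand the agent at run time:

```python
import pathlib
import subprocess
import tempfile

def git(*args, cwd):
    """Run a git command quietly, failing loudly on error."""
    subprocess.run(["git", *args], cwd=cwd, check=True, capture_output=True)

work = pathlib.Path(tempfile.mkdtemp())
repo = work / "origin-repo"
repo.mkdir()

# Commit a "main" version of the flow file.
git("init", cwd=repo)
git("config", "user.email", "dev@example.com", cwd=repo)
git("config", "user.name", "dev", cwd=repo)
(repo / "flow.py").write_text("main version")
git("add", "flow.py", cwd=repo)
git("commit", "-m", "initial flow", cwd=repo)
git("branch", "-M", "main", cwd=repo)

# Edit the flow on a feature branch -- "registration" would happen here.
git("checkout", "-b", "my-feature-branch", cwd=repo)
(repo / "flow.py").write_text("feature version")
git("commit", "-am", "edits on feature branch", cwd=repo)

# What an agent whose storage says ref='main' pulls at run time:
git("clone", "--branch", "main", str(repo), str(work / "agent-checkout"), cwd=work)
print((work / "agent-checkout" / "flow.py").read_text())  # main version
```

In other words: registering from the feature branch updates the metadata in the backend, but the code the agent executes still comes from whatever ref the storage points at.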
Thanks in advance!
Joshua Greenhalgh
06/27/2022, 3:22 PMprefect.exceptions.ClientError: [{'path': ['secret_value'], 'message': 'An unknown error occurred.', 'extensions': {'code': 'INTERNAL_SERVER_ERROR'}}]
Andreas Nord
06/27/2022, 3:37 PM[2022-06-27 17:16:15+0200] ERROR - prefect.DbtShellTask | /bin/bash: C:UsersUSERAD~1AppDataLocalTempprefect-6rbqzzlx: No such file or directory
This line in prefect.tasks.shell.py generates a strange path:
with tempfile.NamedTemporaryFile(prefix="prefect-") as tmp:
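For reference, the tempfile docs note that on Windows a NamedTemporaryFile cannot be opened a second time (e.g. by a shell) while it is still open, which is one reason handing tmp.name to bash fails there. A stdlib-only sketch of the usual workaround, with delete=False and manual cleanup (the script content is just an example):

```python
import os
import tempfile

# Windows-safe pattern: close the named temp file before another process
# opens it by path, and delete it ourselves since delete=False.
tmp = tempfile.NamedTemporaryFile(
    prefix="prefect-", suffix=".sh", mode="w", delete=False
)
try:
    tmp.write("echo hello\n")
    tmp.close()  # must be closed before a shell can read it on Windows
    # subprocess.run(["bash", tmp.name], check=True) would run the script here
    script_path = tmp.name
finally:
    os.unlink(tmp.name)  # manual cleanup because delete=False
```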
Ilhom Hayot o'g'li
06/27/2022, 4:14 PM
Andreas
06/27/2022, 5:06 PM
When we pass a future to another task, the task automatically receives the future's result value as a variable, which is the expected behavior. However, this is not the case when we have a flow with subflows and we pass a Prefect state from a flow run (that has a return value) to a downstream flow, where we have to manually call `.result().result()` to extract the flow result's value. Is this the expected behavior? If yes, this seems a little counterintuitive to me: when passing data from tasks Prefect automatically extracts the result value, but not when the result is coming from a flow run.
Kevin Mullins
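The double unwrap described in that question can be modeled in a few lines of plain Python (a toy model of the idea, not Prefect's actual State class): the subflow run's terminal state wraps the state of its return value, so each layer needs its own .result() call.

```python
# Toy model: a flow-run "state" whose payload is itself a state.
class ToyState:
    def __init__(self, payload):
        self._payload = payload

    def result(self):
        return self._payload

task_state = ToyState(42)          # the task's state inside the subflow
flow_state = ToyState(task_state)  # the subflow-run state seen by the parent

# One .result() only peels off the outer (flow-run) layer:
assert isinstance(flow_state.result(), ToyState)
# Two are needed to reach the actual value:
assert flow_state.result().result() == 42
```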
06/27/2022, 5:08 PMFailed to retrieve task state with error: ClientError([{'path': ['get_or_create_task_run_info'], 'message': 'Expected type UUID!, found ""; Could not parse UUID: ', 'extensions': {'code': 'INTERNAL_SERVER_ERROR', 'exception': {'message': 'Expected type UUID!, found ""; Could not parse UUID: ', 'locations': [{'line': 2, 'column': 101}], 'path': None}}}])
@Mason Menges Pointed me to https://discourse.prefect.io/t/how-to-fix-the-error-prefect-exceptions-clienterror-message-expected-type-uuid-found-could-not-parse-uuid/832 to assist.
I ended up having to register the flow three times for it to start working. The first registration was the only time there were changes to the flow; I had to register an additional two times with zero changes to get the error to stop. Is there an issue related to this article, or anything that can be done to dig into what exactly is going on?
Constantino Schillebeeckx
06/27/2022, 6:01 PM
Arthur Jacquemart
06/27/2022, 7:32 PM
Wei Mei
06/27/2022, 7:35 PM
dbt_project.yml file [ERROR not found]
Nicholas Thompson
06/27/2022, 10:38 PM
scheduler="processes". The input I'm passing to the mapped task is ordered, ideally so that tasks assigned to the first items in the list are run first; however, this doesn't seem to be the case. On my machine, which has six cores, on a given task run I can see that the mapped tasks with map_index values 0, 1, 3, 13, 14, and 15 are run first, but ideally I would want this first batch of tasks to have map_index values of 0, 1, 2, 3, 4, and 5.
tl;dr: is it possible to ensure that when mapping a task, the mapped tasks get run in the same order they appear in the input array?
Yupei Chen
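On the ordering question above: the scheduler makes no ordering guarantee, but one possible workaround (a plain-Python sketch of the idea, not Prefect API code) is to split the ordered input into core-sized batches and run one batch at a time, e.g. by making each mapped batch upstream of the next:

```python
# Split an ordered input into batches matching the worker count, so the
# first six items must finish before any later item starts.
def chunk(items, size):
    return [items[i:i + size] for i in range(0, len(items), size)]

batches = chunk(list(range(16)), 6)
print(batches)  # [[0, 1, 2, 3, 4, 5], [6, 7, 8, 9, 10, 11], [12, 13, 14, 15]]
```

This trades some parallel efficiency (stragglers hold up the whole batch) for predictable ordering.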
06/27/2022, 10:41 PM
prefect auth list-tenants
I get below:
ValueError: badly formed hexadecimal UUID string
Nvm, user error.
Matthew Seligson
06/28/2022, 12:22 AM
Isara Ovin
06/28/2022, 8:41 AMzsh: segmentation fault python app.py
when I'm trying to run a flow using a LocalDaskExecutor with threads. Can someone please help?
Andreas
06/28/2022, 9:43 AM
Abhishek Mitra
06/28/2022, 9:48 AM
Florian Guily
06/28/2022, 10:10 AM
Dominik Wagner
06/28/2022, 10:10 AM
1. run dbt source freshness
   a. If I get an error, fail the flow
2. run dbt build
I’ve figured out a way to fail the flow by scanning the result from the PrefectFuture, but somehow this doesn’t feel right (simplified code snippet in 🧵).
Two questions 🙏:
1. Is there a “better” way to handle this?
2. I’d also like to add some kind of notification when the flow fails (email and/or Slack). I can’t find any integration within Prefect itself, so would the recommended route be to send a webhook directly in the flow?
Rajvir Jhawar
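The "scan the result and fail" step described above can at least be factored into a small checker that raises, since an uncaught exception marks the flow run failed before `dbt build` starts. This is a hypothetical plain-Python sketch: the function name, exception type, and the "[ERROR" marker are assumptions, not Prefect or dbt API.

```python
# Hypothetical helper: decide whether the `dbt source freshness` step
# should fail the flow, based on its exit code and captured output.
class SourceFreshnessError(RuntimeError):
    pass

def check_freshness(return_code: int, output: str) -> None:
    if return_code != 0 or "[ERROR" in output:
        raise SourceFreshnessError(f"dbt source freshness failed:\n{output}")

# A clean run passes through silently; a failing one raises,
# which fails the flow run before the build step.
check_freshness(0, "1 of 1 PASS freshness of source.my_table")
```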
06/28/2022, 10:24 AM
Rohit
06/28/2022, 11:17 AM
redsquare
06/28/2022, 11:34 AM
customizations to the k8sflowrunner cc @Kevin Kho
KubernetesFlowRunner(
    namespace="prefect",
    customizations=[
        {"op": "add", "path": "/spec/ttlSecondsAfterFinished", "value": 10},
    ],
)
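The customizations list above is a JSON-Patch-style (RFC 6902) "add" operation applied to the generated Kubernetes job manifest. A toy plain-Python applier (not the real patch library, and the manifest here is a stub) shows what the op does:

```python
# Stub manifest and a minimal "add"-op applier, for illustration only.
manifest = {"spec": {"template": {"spec": {}}}}
patch = {"op": "add", "path": "/spec/ttlSecondsAfterFinished", "value": 10}

# Walk the path segments down to the parent, then set the final key.
keys = patch["path"].strip("/").split("/")
target = manifest
for key in keys[:-1]:
    target = target[key]
target[keys[-1]] = patch["value"]

print(manifest["spec"]["ttlSecondsAfterFinished"])  # 10
```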
FuETL
06/28/2022, 12:44 PM
Failed to load and execute Flow's environment: StorageError('An error occurred while unpickling the flow:\n ImportError("cannot import name \'MyService\' from partially initialized module \'myflows.services.myservice\' (most likely due to a circular import) (/app/src/myflows/services/myservices.py)")\nThis may be due to a missing Python module in your current environment. Please ensure you have all required flow dependencies installed.')
This is happening randomly; the same flow with the same args can work without any issue. Is there a way to debug this? I'm using S3 storage.
Jessica Smith
06/28/2022, 12:59 PM
Abhishek Mitra
06/28/2022, 1:10 PM
Madhup Sukoon
06/28/2022, 4:02 PM
Yupei Chen
06/28/2022, 5:00 PM
flow.register(project_name='tutorial')
When using script-based flow storage, is it okay to leave this line at the bottom? Or will it attempt to re-register the flow on a scheduled flow run?
Or, using the CLI:
prefect register --path hello_flow.py --project tutorial --label my-label
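With script-based storage the script is re-executed to load the flow, so a bare flow.register(...) at module level would run again in that context; the usual recommendation is the standard __name__ guard. A plain-Python demonstration of the guard mechanism itself (the print stands in for the register call):

```python
import contextlib
import io

# A module that only "registers" when run directly as a script,
# not when it is loaded/executed by something else.
SCRIPT = """
def register():
    print("registering")

if __name__ == "__main__":
    register()
"""

def run_module_as(name):
    """Execute SCRIPT under the given __name__ and capture its stdout."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(compile(SCRIPT, "flow.py", "exec"), {"__name__": name})
    return buf.getvalue()

print(repr(run_module_as("__main__")))  # 'registering\n' -- direct execution
print(repr(run_module_as("flow")))      # ''              -- loaded, no register
```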
Dylan
06/28/2022, 5:17 PM
Jason Motley
06/28/2022, 6:05 PM
Adam
06/28/2022, 6:15 PM
Yupei Chen
06/28/2022, 6:56 PM
For example, suppose we want to construct a flow with one root task; if this task succeeds, we want to run task B. If instead it fails, we want to run task C.
However, when I run my flow, it looks like the root task is being run twice.
This is not what I expected; how can I achieve the root task only running once?
Edit: Figured it out. Store the mapped result into a variable and pass that into the upstream_tasks parameter.
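The fix in the edit above (store the mapped result once and reuse it) can be illustrated with a plain-Python stand-in for a flow context, where each call adds another copy of the task (a toy model, not Prefect internals):

```python
# Toy "flow": every call to add_task registers one more task instance.
flow_tasks = []

def add_task(name):
    flow_tasks.append(name)
    return name

# Calling the root task once per branch registers (and runs) it twice:
flow_tasks.clear()
upstream_for_b = add_task("root")
upstream_for_c = add_task("root")
print(flow_tasks.count("root"))  # 2 -- the root task would run twice

# Storing the result in a variable and reusing it registers it once;
# the single reference is what gets passed via upstream_tasks to B and C.
flow_tasks.clear()
root = add_task("root")
upstream_for_b = root
upstream_for_c = root
print(flow_tasks.count("root"))  # 1 -- root runs once
```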