Jacob Blanco
05/23/2022, 12:58 AM

Jeff Kehler
05/23/2022, 2:57 AM
I'm using the BigQueryTask from prefect.tasks.gcp. I want to be able to pass a Parameter value into this task, but I am unable to do so. It appears the request being sent to BigQuery contains the following, which generates a JSON error:
{'value': <Parameter: min_date>}
The above value is not valid JSON, which explains why the Google API won't accept this request.

Sumant Agnihotri
05/23/2022, 4:18 AM

Raymond Yu
05/23/2022, 5:14 AM
We're seeing issues with wait_for_flow_run for a long-running DatabricksSubmitRun in another flow, even when the Databricks job runs to completion without an issue. We noticed this can occasionally result in the error enclosed below, which causes no heartbeat to be detected. Has anyone encountered this? Any ideas on what may be causing this and how to address the issue?
Error during execution of task: ClientError([{'path': ['flow_run'], 'message': 'request to <http://hasura:3000/v1alpha1/graphql> failed, reason: read ECONNRESET', 'extensions': {'code': 'INTERNAL_SERVER_ERROR', 'exception': {'message': 'request to <http://hasura:3000/v1alpha1/graphql> failed, reason: read ECONNRESET', 'type': 'system', 'errno': 'ECONNRESET', 'code': 'ECONNRESET'}}}])
Jacob Blanco
05/23/2022, 7:27 AM

Sander
05/23/2022, 7:31 AM

Vadym Dytyniak
05/23/2022, 8:29 AM

Vadym Dytyniak
05/23/2022, 8:29 AM
Failed to load and execute flow run: FlowStorageError('An error occurred while unpickling the flow:\n AttributeError("Can\'t get attribute \'_make_function\' on <module \'cloudpickle.cloudpickle\' from \'/usr/local/lib/python3.10/dist-packages/cloudpickle/cloudpickle.py\'>")\nThis may be due to one of the following version mismatches between the flow build and execution environments:\n - cloudpickle: (flow built with \'2.1.0\', currently running with \'2.0.0\')')
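The error above (cloudpickle 2.1.0 at build time vs 2.0.0 at run time) means the registration environment and the execution image need their dependencies pinned to the same versions. A small generic sketch for failing fast when a runtime version differs from the build-time one; the helper itself is an illustration, not a Prefect API:

```python
# Illustrative helper: compare the version of a package installed in the
# current environment against the version used when the flow was built.
# Pinning e.g. cloudpickle==2.1.0 in both images avoids the unpickling error.
from importlib import metadata

def assert_version(package: str, built_with: str) -> None:
    running = metadata.version(package)
    if running != built_with:
        raise RuntimeError(
            f"{package} mismatch: flow built with {built_with}, "
            f"currently running with {running}"
        )
```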
Vadym Dytyniak
05/23/2022, 8:30 AM
prefect[aws]==1.2.1
Vadym Dytyniak
05/23/2022, 8:35 AM

Valentin Baert
05/23/2022, 8:44 AM

Florian Guily
05/23/2022, 10:25 AM
I'm looking at the ref parameter, but it refers to a "SHA-1 value, tag, or branch name". Where can I find this SHA-1 value for a given branch name?

Todd de Quincey
05/23/2022, 10:28 AM

Olivér Atanaszov
05/23/2022, 11:19 AM

Ilhom Hayot o'g'li
05/23/2022, 12:08 PM

Matthew Seligson
05/23/2022, 12:46 PM

ale
05/23/2022, 1:23 PM
I tried to map over a dict, but got the following error:
At least one upstream state has an unmappable result.
Looking at the docs, it seems that map should work with an Iterable. Afaik, dict is an Iterable in Python, so I'm a bit confused 😅
I’m using Prefect 0.15.16
Mateo Merlo
05/23/2022, 2:11 PM

Joshua Greenhalgh
05/23/2022, 2:36 PM
try:
    scheduled_start_time: DateTime = prefect.context.scheduled_start_time
except AttributeError:
    raise Exception("No start/end time params and no schedule")
Joshua Greenhalgh
05/23/2022, 2:36 PM

Jessica Smith
05/23/2022, 3:07 PM

Jonathan Mathews
05/23/2022, 3:31 PM

Rob McInerney
05/23/2022, 4:36 PM
with Flow(FLOW_NAME, storage=set_storage(FLOW_NAME), run_config=set_run_config()) as flow:
    dataset = Parameter("dataset", default={})
    dataset_is_valid = check_dataset_is_valid(dataset)
    with case(dataset_is_valid, True):
        access_token = get_access_token()
        response = refresh_dataset(dataset, access_token)
        result = check_result(dataset, response, access_token)
        flow.set_reference_tasks([response, result])
    with case(dataset_is_valid, False):
        flow.set_reference_tasks([dataset_is_valid])
Ilhom Hayot o'g'li
05/23/2022, 5:58 PM

Josh
05/23/2022, 8:59 PM
This may be due to one of the following version mismatches between the flow build and execution environments:
 - python: (flow built with '3.7.13', currently running with '3.7.12')
The CI/CD system registering my flow is, I guess, using Python 3.7.13. But how do I know which Python the execution environment uses if it's executing from a Docker image on a Prefect Docker agent?

Josh
05/23/2022, 10:47 PM

HaoboGu
05/24/2022, 2:27 AM

Dylan Lim
05/24/2022, 2:52 AM

Guillaume Latour
05/24/2022, 6:57 AM
Intermediate task results are being written to the /home/<user>/.prefect/results folder, with <user> being the user launching the Prefect server & agent on the other server (and who is not present in the Docker container).
I've added this line to the configuration: flows.checkpointing = false, then relaunched the server and agent, but nothing has changed: the results folder is still being filled with intermediate task results.
Am I doing something wrong? Is it possible to prevent these intermediate results from being written?
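For reference, in Prefect 1.x checkpointing is controlled in the environment where the flow run actually executes (the agent / flow-run container), not by the server's configuration, which would explain why a server-side flows.checkpointing = false has no visible effect. One way to disable it is an environment variable on the agent, following Prefect 1.x's config-to-env naming convention:

```shell
# Disable result checkpointing where the flow actually runs (Prefect 1.x).
# Set this in the agent's environment, or pass it into the run container.
export PREFECT__FLOWS__CHECKPOINTING=false
```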
Ty in advance

Emma Rizzi
05/24/2022, 7:17 AM