redsquare
12/13/2022, 1:46 PM
/usr/local/lib/python3.10/runpy.py:126: RuntimeWarning: 'prefect.engine' found in sys.modules after import of package 'prefect', but prior to execution of 'prefect.engine'; this may result in unpredictable behaviour
chara
12/13/2022, 1:50 PM
Task 'TaskPreprocessing': Exception encountered during task execution!
Traceback (most recent call last):
  File "my-file.py", line 105, in run_job
    job.RunNamespacedJob(
  File "/usr/local/lib/python3.8/site-packages/prefect/utilities/tasks.py", line 456, in method
    return run_method(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/prefect/tasks/kubernetes/job.py", line 730, in run
    job = api_client_job.read_namespaced_job_status(
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/api/batch_v1_api.py", line 1393, in read_namespaced_job_status
    return self.read_namespaced_job_status_with_http_info(name, namespace, **kwargs)  # noqa: E501
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/api/batch_v1_api.py", line 1480, in read_namespaced_job_status_with_http_info
    return self.api_client.call_api(
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 348, in call_api
    return self.__call_api(resource_path, method,
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 180, in __call_api
    response_data = self.request(
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 373, in request
    return self.rest_client.GET(url,
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/rest.py", line 239, in GET
    return self.request("GET", url,
  File "/usr/local/lib/python3.8/site-packages/kubernetes/client/rest.py", line 233, in request
    raise ApiException(http_resp=r)
kubernetes.client.exceptions.ApiException: (500)
Reason: Internal Server Error
HTTP response headers: HTTPHeaderDict({'Audit-Id': '<Audit-Id>', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'X-Kubernetes-Pf-Flowschema-Uid': '<X-Kubernetes-Pf-Flowschema-Uid>', 'X-Kubernetes-Pf-Prioritylevel-Uid': '<X-Kubernetes-Pf-Prioritylevel-Uid>', 'Date': 'Tue, 13 Dec 2022 11:02:40 GMT', 'Content-Length': '150'})
HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"rpc error: code = Unavailable desc = transport is closing","code":500}
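The 500 body above ("rpc error: code = Unavailable desc = transport is closing") is typically a transient server-side/gRPC failure rather than a problem with the job itself, so an idempotent call like a job-status poll can usually just be retried. A minimal sketch of a retry wrapper; `call_with_retries` is a hypothetical helper, not part of the Kubernetes client:

```python
import time


def call_with_retries(fn, attempts=3, delay=1.0, retriable=(Exception,)):
    """Call fn(), retrying on `retriable` exceptions with a fixed delay.

    Only safe when fn is idempotent, e.g. a read-only status poll.
    Re-raises the last exception once attempts are exhausted.
    """
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except retriable:
            if attempt == attempts:
                raise
            time.sleep(delay)
```

Assuming the Kubernetes client from the traceback, usage might look like `call_with_retries(lambda: api.read_namespaced_job_status(name, namespace), retriable=(ApiException,))`.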
Miel Vander Sande
12/13/2022, 3:10 PM
Ashley Felber
12/13/2022, 3:19 PM
Fred Birchall
12/13/2022, 4:04 PM
Is there a way to .submit a task and use one of the parameters as the task name? I know we can currently do my_task.submit(name="My Task") and this will show up as my_task-a1b2c3-0 in the UI, but what I would like to do is:
from prefect import flow, task

@flow(name="My Flow")
def my_flow():
    my_task("Hello")
    my_task(name="World")

@task(name_from_param="name")
def my_task(name):
    pass
Then in the Prefect UI I will see:
Running Flow: My Flow
Running Task: Hello-a1b2c3-0
Running Task: World-a1b2c3-1
A bit of background for context: we run all of our tasks using either AWS Lambda or DBT. For Lambda, I've built a central function (which is decorated as a task) with the signature run_lambda(lambda_name, payload={}), and most users are using it like run_lambda.submit("MyLambdaFunction"), but then in the logs all the task names are run_lambda-[ascii]-0, which is annoying when things go pop, as you have to click into the task run, where I have added logger.info("Lambda name: %s", lambda_name) for traceability. For all of my personal flows I follow the convention run_lambda.with_options(name="Lambda MyLambdaFunction").submit("MyLambdaFunction"), which gives me the desired Prefect task names in the UI, but getting everyone else to follow suit is another matter. I've attempted many solutions to dynamically name a task, mainly focused on custom decorators, but they have all failed in one way or another… so I wanted to see if the Prefect community has any ideas or solutions! Thanks
Thomas Opsomer
12/13/2022, 5:19 PM
Slackbot
12/13/2022, 5:35 PM
Josh
12/13/2022, 5:35 PM
Jason
12/13/2022, 5:40 PM
Claire Herdeman
12/13/2022, 5:46 PM
Ashley Felber
12/13/2022, 5:54 PM
Ashley Felber
12/13/2022, 7:21 PM
Senthil M.K.
12/13/2022, 8:14 PM
Kalise Richmond
12/13/2022, 8:27 PM
Kalise Richmond
12/13/2022, 8:39 PM
scott
12/13/2022, 9:40 PM
Has anyone used create_flow_run in Prefect 1.0 (https://docs-v1.prefect.io/api/latest/tasks/prefect.html#create-flow-run)? Trying to translate to Prefect 2, and this method create_flow_run is somewhat confusing: a task that creates a flow within a flow?
Parwez Noori
12/14/2022, 3:14 AM
Edmund Tian
12/14/2022, 4:49 AM
Young Ho Shin
12/14/2022, 5:41 AM
I thought PREFECT_LOGGING_HANDLERS_CONSOLE_LEVEL=999 run_flow.py would do the job, but this seems to turn off the logs on the web UI as well as the console.
Matt Alhonte
12/14/2022, 5:48 AM
We're getting RuntimeError: There is no current event loop in thread 'ThreadPoolExecutor-0_0'. instead of propagating the actual error (which it used to do!). It does this both with custom tasks and with the official Jupyter task. This kinda popped up out of left field — any idea what's going on? (It's important cuz we have Slack alerting based on those errors.) On Prefect 1.3.1, btw.
redsquare
12/14/2022, 12:25 PM
Slackbot
12/14/2022, 12:53 PM
Ravi Kumar
12/14/2022, 1:33 PM
Daniel Bernstein
12/14/2022, 1:56 PM
botocore.errorfactory.ResourceNotFoundException: An error occurred (ResourceNotFoundException) when calling the GetLogEvents operation: The specified log stream does not exist.
However, the task is generating logs that appear in AWS CloudWatch. Maybe the agent is looking for logs under a different name than the one the task is writing to?
I'd prefer not to revert to the earlier versions that appear to work in the dev environment, but I can't seem to get the newer versions of prefect and prefect-aws to function as expected. @Taylor Curran
Jeff Hale
12/14/2022, 3:02 PM
Nathan R
12/14/2022, 4:36 PM
scott
12/14/2022, 4:52 PM
What's the Prefect 2 equivalent of trigger = all_finished in Prefect 1? This thread suggests that it's using .submit() (https://discourse.prefect.io/t/how-can-i-trigger-downstream-tasks-based-on-upstream-task-s-state/106) on the task that you want to always run regardless of upstream task failures?
Dan Marling
12/14/2022, 4:52 PM
Gabrielle Bennett
12/14/2022, 4:54 PM
Kalise Richmond
12/14/2022, 5:36 PM