Konstantin
01/18/2022, 3:39 PM
Johan Wåhlin
01/18/2022, 3:46 PM
Samay Kapadia
01/18/2022, 3:51 PM
Why do I get "The secret KUBERNETES_API_KEY was not found" if I’m running the prefect agent inside the cluster? According to this doc it will attempt an in-cluster connection, but my hello world task seems to keep failing 😭
Jason Motley
01/18/2022, 3:55 PM
Is there a way to use the max_retries feature but for an entire flow? I.e. if the flow fails for some reason, retry it 5 minutes later.
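(Illustration only, not a reply from the thread: one commonly suggested pattern in Prefect 1.x, assuming a Cloud/Server backend, is a flow-level state handler that schedules a fresh run a few minutes after a failure. The handler and flow names below are made up.)

import pendulum
import prefect
from prefect import Flow, task
from prefect.client import Client

def retry_flow_after_5_minutes(flow, old_state, new_state):
    # When the whole flow ends in a Failed state, ask the backend to schedule
    # a brand-new run of the same flow five minutes from now.
    # Note: this naive version re-schedules on every failure, forever.
    if new_state.is_failed():
        Client().create_flow_run(
            flow_id=prefect.context.get("flow_id"),
            scheduled_start_time=pendulum.now("UTC").add(minutes=5),
        )
    return new_state

@task
def hello():
    print("hello")

with Flow("retry-whole-flow", state_handlers=[retry_flow_after_5_minutes]) as flow:
    hello()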
Muddassir Shaikh
01/18/2022, 4:04 PM
from datetime import timedelta
from prefect import task

@task(task_run_name="{task_name_from_tup(details)}", max_retries=3, retry_delay=timedelta(minutes=1))
def processing(details):
    ...  # some code
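(A side note, hedged: the task_run_name template string is formatted with the task's inputs by name, so an expression like {task_name_from_tup(details)} will not be evaluated. Prefect 1.x also accepts a callable for task_run_name, called with the task's inputs as keyword arguments; a minimal sketch with a made-up naming function:)

from datetime import timedelta
from prefect import task

def name_from_details(**kwargs):
    # kwargs holds the task's inputs; the "details" tuple layout is an assumption.
    return f"processing-{kwargs['details'][0]}"

@task(task_run_name=name_from_details, max_retries=3, retry_delay=timedelta(minutes=1))
def processing(details):
    ...  # some code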
Yusuf Khan
01/18/2022, 5:40 PM
Frank Oplinger
01/18/2022, 10:05 PM
Philipp Eisen
01/19/2022, 12:02 AM
Consider scattering large objects ahead of time
with client.scatter to reduce scheduler burden and
keep data on workers
future = client.submit(func, big_data) # bad
big_future = client.scatter(big_data) # good
future = client.submit(func, big_future) # good
I was wondering, what would be the prefect pattern here to scatter the object ahead of time?
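(Illustration only, not a reply from the thread: one way to act on that warning when a flow runs on a DaskExecutor is to grab the Dask worker's client inside the task and scatter there. Assumes Prefect 1.x; the task names and data are made up.)

from distributed import worker_client
from prefect import Flow, task
from prefect.executors import DaskExecutor

@task
def build_big_data():
    # Stand-in for the large object the warning complains about.
    return list(range(1_000_000))

@task
def process(big_data):
    # worker_client gives access to the Dask client of the cluster running this task.
    with worker_client() as client:
        big_future = client.scatter(big_data)  # keep the data on the workers
        return client.submit(len, big_future).result()

with Flow("scatter-sketch") as flow:
    process(build_big_data())

if __name__ == "__main__":
    flow.run(executor=DaskExecutor())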
Jacob Blanco
01/19/2022, 1:52 AM
Akharin Sukcharoen
01/19/2022, 7:46 AM
Thomas Pedersen
01/19/2022, 8:14 AM
Samay Kapadia
01/19/2022, 8:59 AM
{
"kind": "Status",
"apiVersion": "v1",
"metadata": {},
"status": "Failure",
"message": "jobs.batch \"dummy\" is forbidden: User \"system:serviceaccount:default:default\" cannot get resource \"jobs/status\" in API group \"batch\" in the namespace \"default\"",
"reason": "Forbidden",
"details": {
"name": "dummy",
"group": "batch",
"kind": "jobs"
},
"code": 403
}
For context, I’ve applied the yaml from prefect agent kubernetes install --rbac, so all the permissions should work in theory. I'm stuck on what could be wrong.
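(Illustration only, not a reply from the thread: the 403 above says the default service account in the default namespace is not allowed to read jobs/status. A quick way to confirm what that service account may do, sketched with the kubernetes Python client and run from inside the cluster:)

from kubernetes import client, config

config.load_incluster_config()  # use the pod's service account credentials

review = client.V1SelfSubjectAccessReview(
    spec=client.V1SelfSubjectAccessReviewSpec(
        resource_attributes=client.V1ResourceAttributes(
            group="batch",
            resource="jobs",
            subresource="status",
            verb="get",
            namespace="default",
        )
    )
)
result = client.AuthorizationV1Api().create_self_subject_access_review(review)
print("allowed:", result.status.allowed)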
Muddassir Shaikh
01/19/2022, 9:23 AM
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'Tuple': Finished task run for task with final state: 'Success'
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'List': Starting task run...
[2022-01-19 14:45:49+0530] INFO - prefect.TaskRunner | Task 'List': Starting task run...
[2022-01-19 14:45:50+0530] INFO - prefect.TaskRunner | Task 'List': Starting task run...
[2022-01-19 14:45:50+0530] INFO - prefect.TaskRunner | Task 'List': Starting task run...
Muddassir Shaikh
01/19/2022, 9:25 AM
Tony Waddle
01/19/2022, 10:14 AM
Yueh Han Huang
01/19/2022, 10:16 AM
Muddassir Shaikh
01/19/2022, 12:01 PM
File "/home/infra/prefect_server/lib/python3.8/site-packages/prefect/client/client.py", line 603, in _send_request
response = session.post(
File "/home/infra/prefect_server/lib/python3.8/site-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/home/infra/prefect_server/lib/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/infra/prefect_server/lib/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/infra/prefect_server/lib/python3.8/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=4200): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f3be9b9e700>: Failed to establish a new connection: [Errno 111] Connection refused'))
Example: My GUI server is hosted on Machine A and one agent is on Machine B; the code present on Machine B is to be run and registered on Machine B, but should show its tasks in the Machine A GUI.
Samay Kapadia
01/19/2022, 1:13 PM
dummy runs on the aks-spot node but prefect-job runs on the aks-system node (and I don’t want it running on the system node pool). Is there a way to configure tolerations and affinities for the prefect-job pod?
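(Illustration only, not a reply from the thread: in Prefect 1.x the flow-run job's pod spec can be customized through a job template on the flow's KubernetesRun run config, which is one place tolerations and affinity can go. The AKS taint key, node-pool label and values below are assumptions.)

from prefect import Flow, task
from prefect.run_configs import KubernetesRun

@task
def say_hello():
    return "hello"

# Only the fields being added are spelled out here; the agent populates the
# container details (image, command, env) when it creates the job.
job_template = {
    "apiVersion": "batch/v1",
    "kind": "Job",
    "spec": {
        "template": {
            "spec": {
                "tolerations": [
                    {
                        "key": "kubernetes.azure.com/scalesetpriority",
                        "operator": "Equal",
                        "value": "spot",
                        "effect": "NoSchedule",
                    }
                ],
                "affinity": {
                    "nodeAffinity": {
                        "requiredDuringSchedulingIgnoredDuringExecution": {
                            "nodeSelectorTerms": [
                                {
                                    "matchExpressions": [
                                        {"key": "agentpool", "operator": "In", "values": ["aksspot"]}
                                    ]
                                }
                            ]
                        }
                    }
                },
            }
        }
    },
}

with Flow("dummy") as flow:
    say_hello()

flow.run_config = KubernetesRun(job_template=job_template)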
Thomas Opsomer
01/19/2022, 2:44 PM
Luis Aguirre
01/19/2022, 3:09 PM
Michail Melonas
01/19/2022, 3:10 PM
Tomek Florek
01/19/2022, 3:41 PM
Tom Shaffner
01/19/2022, 4:23 PM
Jake
01/19/2022, 4:42 PM
Muddassir Shaikh
01/19/2022, 5:35 PM
Suresh R
01/19/2022, 6:41 PM
brian
01/19/2022, 7:03 PM
brian
01/19/2022, 7:57 PM
Trigger was "all_successful" but some of the upstream tasks failed.
but all the upstream tasks were successful. Am I missing something?
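(Illustration only, not a reply from the thread: for reference, this is how a downstream task's trigger is set in Prefect 1.x; all_successful is the default, and all_finished or any_successful relax it. Task names are made up.)

from prefect import Flow, task
from prefect.triggers import all_finished

@task
def upstream():
    return 1

@task(trigger=all_finished)  # run even if some upstream task runs did not succeed
def downstream(x):
    return x

with Flow("trigger-sketch") as flow:
    downstream(upstream())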
Martim Lobao
01/19/2022, 11:33 PM
Tony Yun
01/19/2022, 11:51 PM
Tony Yun
01/19/2022, 11:51 PM
Task 'RunNamespacedJob': Exception encountered during task execution!
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/urllib3/response.py", line 697, in _update_chunk_length
self.chunk_left = int(line, 16)
ValueError: invalid literal for int() with base 16: b''
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/urllib3/response.py", line 438, in _error_catcher
yield
File "/usr/local/lib/python3.7/site-packages/urllib3/response.py", line 764, in read_chunked
self._update_chunk_length()
File "/usr/local/lib/python3.7/site-packages/urllib3/response.py", line 701, in _update_chunk_length
raise InvalidChunkLength(self, line)
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)
Kevin Kho
01/19/2022, 11:56 PM
Tony Yun
01/19/2022, 11:58 PM