# ask-community
I'm trying to create a custom worker template. How can I see the "rendered" json for a deployment run for troubleshooting?
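Not a definitive answer, but two CLI commands will at least show the two inputs that get combined into that payload: the work pool's base job template and the deployment's job variables. The pool and deployment names below are placeholders:

```bash
# Show the work pool, including its base job template
# ("job_configuration" plus the "variables" schema)
prefect work-pool inspect "my-cloud-run-pool-v2"

# Show a deployment, including the job variables that get
# substituted into that template at submission time
prefect deployment inspect "my-flow/my-deployment"
```

That still leaves the final rendered body to assemble by hand, but it narrows down where a bad field might be coming from.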
I think this comes out if the logging level is set to debug?
Seems not. Here is how I'm running the worker:
```
PREFECT_DEBUG_MODE=True prefect worker start --pool my-cloud-run-pool-v2
```
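A hedged aside: `PREFECT_DEBUG_MODE` and the logger level are separate settings, so it may also be worth pinning the log level explicitly when starting the worker. This is only a sketch of that variant, not a guarantee that the rendered payload gets logged:

```bash
PREFECT_LOGGING_LEVEL=DEBUG prefect worker start --pool my-cloud-run-pool-v2
```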
I'm getting DEBUG logs, but the rendered JSON is not included. The error I'm trying to troubleshoot is:
```
13:47:19.659 | DEBUG   | GlobalEventLoopThread | prefect._internal.concurrency - Service <prefect.logging.handlers.APILogWorker object at 0x10955b090> added item to batch (size 3729/3000000)
  name: prefect.flow_runs.worker | level: 40 | flow_run_id: f168d7cd-7b90-429d-9c46-bd620c47b573 | task_run_id: None | timestamp: 2024-01-27T18:47:19.651749+00:00

Failed to submit flow run 'f168d7cd-7b90-429d-9c46-bd620c47b573' to infrastructure.
Traceback (most recent call last):
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect/workers/base.py", line 896, in _submit_run_and_capture_errors
    result = await self.run(
             ^^^^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect_gcp/workers/cloud_run_v2.py", line 406, in run
    await run_sync_in_worker_thread(
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect/utilities/asyncutils.py", line 91, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect_gcp/workers/cloud_run_v2.py", line 514, in _create_job_and_wait_for_registration
    self._create_job_error(
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect_gcp/workers/cloud_run_v2.py", line 614, in _create_job_error
    raise exc
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect_gcp/workers/cloud_run_v2.py", line 506, in _create_job_and_wait_for_registration
    JobV2.create(
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/prefect_gcp/models/cloud_run_v2.py", line 161, in create
    response = request.execute()
               ^^^^^^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seandavis/Documents/git/infra/prefect-infra/venv/lib/python3.11/site-packages/googleapiclient/http.py", line 938, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://run.googleapis.com/v2/projects/omicidx-338300/locations/us-central1/jobs?jobId=prefect-benign-gecko&alt=json returned "Invalid JSON payload received. Unknown name "csi" at 'job.template.template.volumes[0]': Cannot find field.". Details: "[{'@type': 'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations': [{'field': 'job.template.template.volumes[0]', 'description': 'Invalid JSON payload received. Unknown name "csi" at 'job.template.template.volumes[0]': Cannot find field.'}]}]">
```
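One tentative observation on the error itself: in the Cloud Run v2 Jobs API, the `Volume` message has no `csi` field (it takes sources such as `secret`, `cloudSqlInstance`, `emptyDir`, `nfs`, and `gcs`), so a volume entry copied from a v1-style (Knative) manifest would be rejected exactly like this. Assuming the goal is a GCS FUSE mount, the shapes differ roughly as follows; the volume and bucket names are placeholders.

v1-style (Knative) volume, which the v2 endpoint rejects:

```json
{"name": "my-volume", "csi": {"driver": "gcsfuse.run.googleapis.com", "volumeAttributes": {"bucketName": "my-bucket"}}}
```

v2-style equivalent using the `gcs` volume source:

```json
{"name": "my-volume", "gcs": {"bucket": "my-bucket", "readOnly": false}}
```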
But I recognize that this could very much be a misconfiguration on my part.
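If it helps to reproduce the rendered payload locally, here is a rough sketch that approximates the worker's substitution step. It assumes Prefect 2.x internals (`get_client().read_work_pool` and `prefect.utilities.templating.apply_values`) and is not the worker's exact code path; the pool name and job variables are placeholders:

```python
import asyncio
import json

from prefect.client.orchestration import get_client
from prefect.utilities.templating import apply_values


async def render_job_config(pool_name: str, job_variables: dict) -> dict:
    """Approximate the JSON a worker would render for a flow run."""
    async with get_client() as client:
        pool = await client.read_work_pool(pool_name)

    template = pool.base_job_template
    # Start from each template variable's declared default, then overlay the
    # deployment's job variables, then substitute into job_configuration.
    defaults = {
        name: spec.get("default")
        for name, spec in template["variables"].get("properties", {}).items()
    }
    values = {**defaults, **job_variables}
    return apply_values(template["job_configuration"], values)


if __name__ == "__main__":
    rendered = asyncio.run(render_job_config("my-cloud-run-pool-v2", {}))
    print(json.dumps(rendered, indent=2))
```

Printing that with the deployment's actual job variables should make a stray `csi` block (or whatever field the API is rejecting) visible before the request ever reaches Cloud Run.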