Oscar Björhn
08/09/2022, 4:49 PM

Mars
08/09/2022, 6:42 PM
PREFECT_API_URL
?) Are there additional agent settings in 2.0 I should be aware of for running a production k8s deployment, like the work queue name? If so, do they have env vars, and where in the docs can I read about them?

Patrick Tan
08/09/2022, 7:42 PM

Matt Delacour
08/09/2022, 7:55 PM

Simon Macklin
08/09/2022, 8:16 PM

Ross Teach
08/09/2022, 8:26 PM

Matt Delacour
08/09/2022, 8:42 PM

John Archer
08/09/2022, 11:35 PM
manifest.json file that is created from the deployment build CLI command. I see you can set the --output for the deployment.yaml file. I have a number of flows in a single repo and would like to keep the structure clean. I know I can run the command from within the directory that contains the flow, but I'm looking to run the commands from the project root. This would also let me simplify CI/CD jobs in the future. Any help would be appreciated.

Thuy Tran
08/10/2022, 4:46 AM
00:35:13.071 | DEBUG | Flow run 'logical-lemming' - Resolving inputs to 'update-record'
00:35:14.896 | DEBUG | prefect.client - Connecting to API at <http://127.0.0.1:4200/api/>
00:35:17.563 | DEBUG | prefect.agent - Checking for flow runs...
00:35:22.591 | DEBUG | prefect.agent - Checking for flow runs...
--- Orion logging error ---
The log worker encountered a fatal error.
Traceback (most recent call last):
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/site-packages/prefect/logging/handlers.py", line 82, in _send_logs_loop
anyio.run(self.send_logs)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/site-packages/anyio/_core/_eventloop.py", line 70, in run
return asynclib.run(func, *args, **backend_options)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 292, in run
return native_run(wrapper(), debug=debug)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/asyncio/runners.py", line 47, in run
_cancel_all_tasks(loop)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/asyncio/runners.py", line 56, in _cancel_all_tasks
to_cancel = tasks.all_tasks(loop)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/asyncio/tasks.py", line 53, in all_tasks
tasks = list(_all_tasks)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/_weakrefset.py", line 65, in __iter__
for itemref in self.data:
RuntimeError: Set changed size during iteration
Worker information:
Approximate queue length: 0
Pending log batch length: 0
Pending log batch size: 0
00:35:27.618 | DEBUG | prefect.agent - Checking for flow runs...
00:35:32.655 | DEBUG | prefect.agent - Checking for flow runs...
00:35:37.680 | DEBUG | prefect.agent - Checking for flow runs...
00:35:42.713 | DEBUG | prefect.agent - Checking for flow runs...
00:35:46.427 | DEBUG | prefect.flows - Parameter 'target_collection' for flow 'update-record' is of unserializable type 'Collection' and will not be stored in the backend.
00:35:46.427 | DEBUG | prefect.flows - Parameter 'sys_collection' for flow 'update-record' is of unserializable type 'Collection' and will not be stored in the backend.
00:35:46.427 | DEBUG | prefect.flows - Parameter 'metadata_collection' for flow 'update-record' is of unserializable type 'Collection' and will not be stored in the backend.
00:35:47.751 | DEBUG | prefect.agent - Checking for flow runs...
00:35:48.177 | INFO | Flow run 'logical-lemming' - Created subflow run 'knowing-avocet' for flow 'update-record'
00:35:48.395 | ERROR | Flow run 'knowing-avocet' - Received invalid parameters
Traceback (most recent call last):
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/site-packages/prefect/engine.py", line 433, in create_and_begin_subflow_run
parameters = flow.validate_parameters(parameters)
File "/Applications/anaconda3/envs/datapipeline/lib/python3.9/site-packages/prefect/flows.py", line 275, in validate_parameters
raise validation_err
prefect.exceptions.ParameterTypeError: 1 validation error for UpdateRecord
end_date
str type expected (type=type_error.str)
00:35:48.435 | INFO | Flow run 'logical-lemming' - Created task run 'update_record_in_db-a12634ec-0' for task 'update_record_in_db'
00:35:48.435 | INFO | Flow run 'logical-
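A minimal stdlib-only sketch of why the ParameterTypeError above appears (update_record here is a hypothetical stand-in for the 'update-record' subflow, which in the real code would be decorated with @flow): the subflow declares end_date as str, so passing a date/datetime object fails validation, and converting to an ISO string first avoids the error.

```python
from datetime import date

# Hypothetical stand-in for the 'update-record' subflow; the real flow
# declares end_date: str, which Prefect validates at call time.
def update_record(end_date: str) -> str:
    if not isinstance(end_date, str):
        # Prefect surfaces this mismatch as ParameterTypeError
        raise TypeError("str type expected")
    return end_date

run_date = date(2022, 8, 10)
# Passing run_date directly would fail validation; an ISO string passes:
print(update_record(run_date.isoformat()))  # 2022-08-10
```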
Vadym Dytyniak
08/10/2022, 8:15 AM
WARNING - agent | Job 'prefect-job-623be8f6' is for flow run '958c388f-f155-4ee8-a279-40c546b99808' which does not exist. It will be ignored.
Benjamin.bgx
08/10/2022, 9:16 AM
block = Azure(azure_storage_connection_string="paste_the_string_here")
block.save("dev")
and with that command line I finalize the storage information for the flow:
prefect deployment build flow.py:flowname \
--name deploy_name --tag dev -sb azure/dev
Can I do the same thing from the UI? In that case, I would just use the command line and reference the block defined in the UI.
Am I right with all these assumptions?
And so, how do I specify the storage for the persistent logs?
Thank you! 🙂

Vincent Chéry
08/10/2022, 11:36 AM

Sudharshan B
08/10/2022, 12:11 PM

Christian Vogel
08/10/2022, 12:18 PM
(begin_task_run pid=81126) from prefect.packaging.docker import DockerPackager
(begin_task_run pid=81126) ImportError: cannot import name 'DockerPackager' from partially initialized module 'prefect.packaging.docker' (most likely due to a circular import) (/home/christian/Documents/ray_and_prefect/temp/temp-venv/lib/python3.9/site-packages/prefect/packaging/docker.py)
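One common cause of a "partially initialized module" ImportError (not necessarily the cause here) is a local file shadowing the library on sys.path, e.g. a script in the project named docker.py or prefect.py. A stdlib-only way to check which file a module name would actually resolve to, without importing it:

```python
import importlib.util

# find_spec reports which file would be loaded for a given module name;
# a path pointing into your project instead of site-packages indicates
# shadowing. "json" is used here as a stand-in for the suspect module.
spec = importlib.util.find_spec("json")
print(spec.origin)
```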
Paul Lucas
08/10/2022, 12:35 PM
trigger=any_failed which works fine when one of those does fail. But when those upstream tasks are all successful, I seem to get a TriggerFailed on that task, and therefore the flow run is FAILED. Is that expected behaviour, and if so, how do I get it to just skip that task instead of marking it as Failed (and therefore the whole flow as failed)? Thanks in advance

Kelvin
08/10/2022, 12:37 PM

Lucien Fregosi
08/10/2022, 12:54 PM
--schedule option (like we do for the storage block, for instance)?
In my automated process I can't edit the deployment.yaml file, as I re-build this file every time in case something has changed.

Vlad Tudor
08/10/2022, 12:59 PM
docker-compose.yaml from running prefect server config (posted below). I should now add a new service at the end that runs a Local Agent, and a service that registers my flow at startup. Is that correct?
For now, if I add a service that runs prefect agent local start, I get ValueError: You have not set an API key for authentication. when running docker-compose up.

Matt Delacour
08/10/2022, 1:13 PM

Nikhil Joseph
08/10/2022, 1:43 PM

haris khan
08/10/2022, 1:47 PM

Mars
08/10/2022, 2:35 PM

Soren Daugaard
08/10/2022, 2:41 PM
✗ poetry run prefect cloud login -k my-key
Traceback (most recent call last):
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/prefect/cli/_utilities.py", line 41, in wrapper
return fn(*args, **kwargs)
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/prefect/utilities/asyncutils.py", line 193, in wrapper
return run_async_in_new_loop(async_fn, *args, **kwargs)
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/prefect/utilities/asyncutils.py", line 140, in run_async_in_new_loop
return anyio.run(partial(__fn, *args, **kwargs))
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/anyio/_core/_eventloop.py", line 70, in run
return asynclib.run(func, *args, **backend_options)
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 292, in run
return native_run(wrapper(), debug=debug)
File "/Users/username/.pyenv/versions/3.9.13/lib/python3.9/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/Users/username/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
return future.result()
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 287, in wrapper
return await func(*args)
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/prefect/cli/cloud.py", line 264, in login
current_workspace = get_current_workspace(workspaces)
File "/Users/username/Library/Caches/pypoetry/virtualenvs/prefect-eval-ZjHwTZGp-py3.9/lib/python3.9/site-packages/prefect/cli/cloud.py", line 64, in get_current_workspace
current_workspace_id = re.match(
AttributeError: 'NoneType' object has no attribute 'groups'
An exception occurred.
I also get an error if I manually set the key via:
prefect config set PREFECT_API_URL="<https://api.prefect.cloud/api/accounts/[our-account-id]/workspace/[our-workspace-id]>"
prefect config set PREFECT_API_KEY="mykey"
And then use prefect profile use my-profile to login.
How do I get my CLI setup to work against our cloud workspace?

Khuyen Tran
08/10/2022, 2:47 PM

Lukáš Pravda
08/10/2022, 3:09 PM
tag = prefect.context.get("artifacts_tag_id")
msgs: list = prefect.context.get("artifacts_content") or []
msgs.append(str(err))
content = create_err_artifact_from_list_of_errs(msgs)
if tag:
update_markdown_artifact(tag, content)
else:
tag = create_markdown_artifact(content)
prefect.context["artifacts_tag_id"] = tag
prefect.context["artifacts_content"] = msgs
but I found out that mutating the context from within a task is not recommended. So I wonder if there is any other prefect way of sharing these two ‘variables’ throughout the life of a prefect flow.
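One Prefect-agnostic pattern (a sketch with hypothetical names, not the library's API) is to thread the shared state through task inputs and outputs instead of mutating the context: each step receives the accumulated list and returns an updated copy.

```python
# Sketch: accumulate error messages by returning new state from each step
# rather than mutating a shared context (collect_error is a hypothetical
# helper; in a real flow each step would be a @task-decorated function).
def collect_error(msgs: list, err: Exception) -> list:
    # returning a fresh list keeps each step a pure function of its inputs
    return msgs + [str(err)]

msgs: list = []
msgs = collect_error(msgs, ValueError("bad record"))
msgs = collect_error(msgs, ValueError("missing id"))
print(msgs)  # ['bad record', 'missing id']
```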
Thank you very much!

Jonathan Mathews
08/10/2022, 4:10 PM

Mohamed Alaa
08/10/2022, 4:34 PM

Jimmy Le
08/10/2022, 4:50 PM

Patrick Tan
08/10/2022, 5:54 PM

Khuyen Tran
08/10/2022, 6:49 PM