Denys Volokh
07/28/2022, 3:50 AM

Jeffrey Lai
07/28/2022, 5:33 AM

Viet Nguyen
07/28/2022, 6:43 AM
from prefect.utilities.asyncio import Sync
Hi everyone, I wonder what the equivalent of this is in Prefect 2.0? I'm trying to implement something similar to as_completed() from Dask. Thank you.
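For the consume-results-in-completion-order pattern that Dask's as_completed() provides, the Python standard library offers asyncio.as_completed. A generic stdlib sketch (not Prefect's API — the task names and delays below are made up for illustration):

```python
import asyncio

async def work(name: str, delay: float) -> str:
    # Simulated unit of work; in a real flow this would be a task or future.
    await asyncio.sleep(delay)
    return name

async def main() -> list:
    # Schedule several coroutines, then consume their results in the order
    # they finish, mirroring what Dask's as_completed() gives you for futures.
    tasks = [asyncio.create_task(work(n, d))
             for n, d in [("slow", 0.03), ("fast", 0.01), ("mid", 0.02)]]
    finished = []
    for fut in asyncio.as_completed(tasks):
        finished.append(await fut)
    return finished

print(asyncio.run(main()))  # completion order: ['fast', 'mid', 'slow']
```

The same pattern also exists for thread/process pools as concurrent.futures.as_completed.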
Michiel Verburg
07/28/2022, 7:17 AM

Rajvir Jhawar
07/28/2022, 9:22 AM

Andreas Nigg
07/28/2022, 9:26 AM
test
|___ deployment.yaml
|___ kayak_database_flows.py
My build command is (run from inside the folder "test")
prefect deployment build kayak_database_flows.py:integrate_kayak_details --name integrate_kayak_db_details -t rmesbtest1 -sb gcs/gcs-prefect-stprage
Then I apply the deployment and run it. Everything is fine so far; the flow gets picked up by the corresponding agent.
However, the agent downloads ALL of the content of the GCS bucket. So really all of the content in this bucket, not only the stuff related to this deployment. Is this expected?
If so, do I need a separate GCS storage block for each of the deployments?

David
07/28/2022, 10:03 AM

Paul Lucas
07/28/2022, 10:14 AM

Andreas
07/28/2022, 10:45 AM
prefect deployment create deployments.yaml
With this we were able to deploy them all at once in Prefect Server. I have seen that there are a lot of changes in GA compared to the latest beta, and I took a look at the updated docs. However, I think that some functionality which is important to me is missing compared to what we had in beta. Is there an easy way in the new GA deployment procedure to have multiple flows in one yaml file, or an alternative way to deploy multiple flows at once, without having to call prefect deployment build ...
and prefect deployment apply deployment.yaml
manually for every single flow that we have?
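One workaround for deploying many flows without a multi-flow yaml is a shell loop over the same two CLI commands. A sketch — the entrypoints are hypothetical, and echo makes it a dry run (remove echo to actually execute):

```shell
#!/bin/sh
set -eu

deploy_all() {
    # Build + apply one deployment per "path.py:flow_name" entrypoint argument.
    # "echo" makes this a dry run -- delete it to run the real commands.
    for entry in "$@"; do
        name=${entry##*:}   # take the flow function name after the colon
        echo prefect deployment build "$entry" --name "$name"
        echo prefect deployment apply "$name-deployment.yaml"
    done
}

# Hypothetical entrypoints -- replace with your own flows.
deploy_all "flows/flow_a.py:flow_a" "flows/flow_b.py:flow_b"
```

The generated yaml filename here is an assumption; adjust it to whatever `prefect deployment build` actually writes for your flow.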
Tarek
07/28/2022, 11:21 AM
labels='prod'
However, the registered flow also uses the id of the Docker instance it is built into (see screenshot). Is there any way to delete that, so that only the labels I give are used?
haris khan
07/28/2022, 11:31 AM

Chu
07/28/2022, 11:42 AM

ash
07/28/2022, 12:06 PM
exit error 2, Instance terminated
on the dashboard, and in the pod logs we are getting No such command "execute"
Riccardo Tesselli
07/28/2022, 12:09 PM

Jehan Abduljabbar
07/28/2022, 12:26 PM

Nelson Griffiths
07/28/2022, 12:59 PM

Lucien Fregosi
07/28/2022, 1:00 PM

Alix Cook
07/28/2022, 1:24 PM

T B
07/28/2022, 1:32 PM
local file system storage block: my-local-storage.
However, when I try to build with "prefect deployment build --storage-block local-file-system/my-local-storage" I get an exception:
File "/usr/local/lib/python3.10/site-packages/prefect/filesystems.py", line 114, in put_directory
shutil.copytree(from_path, local_path, dirs_exist_ok=True)
NameError: name 'from_path' is not defined
Looking at the code (https://github.com/PrefectHQ/prefect/blob/2.0.0/src/prefect/filesystems.py#L114), it seems the variable "from_path" is indeed not defined.
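The NameError indicates that put_directory in the 2.0.0 release calls shutil.copytree(from_path, ...) without ever assigning from_path. Below is a standalone sketch of what the method appears to intend — copy a source directory (defaulting to the current working directory) into the block's base path — with the missing default supplied. The name and signature are illustrative, not Prefect's actual fix:

```python
import shutil
from pathlib import Path
from typing import Optional

def put_directory(basepath: str, local_path: Optional[str] = None,
                  to_path: Optional[str] = None) -> str:
    """Copy local_path (default: the current directory) into basepath/to_path.

    Supplying a default for the source directory is the step the 2.0.0 code
    skipped for `from_path`, which caused the NameError in the traceback.
    """
    from_path = Path(local_path) if local_path else Path.cwd()  # the missing default
    destination = Path(basepath) / (to_path or "")
    shutil.copytree(from_path, destination, dirs_exist_ok=True)
    return str(destination)
```

dirs_exist_ok=True (Python 3.8+) lets repeated uploads overwrite into an existing destination directory, matching the call shown in the traceback.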
Troels Bjørnskov
07/28/2022, 1:47 PM
prefect_test_harness
located? It tries to connect to a server; I want to skip that step

Troels Bjørnskov
07/28/2022, 1:47 PM
@flow
decorator.

Riccardo Tesselli
07/28/2022, 1:53 PM
image_pull_policy
, is it going to be added in the upcoming releases?

Chris Reuter
07/28/2022, 2:08 PM

Matthew Seligson
07/28/2022, 2:21 PM

Sander
07/28/2022, 2:21 PM

Riccardo Tesselli
07/28/2022, 2:25 PM
prefect deployment build flows/hello_world.py:hello_world --name "Hello world dev" --tag kubernetes --tag dev --infra kubernetes-job --infra-block base-dev
and I got this:
File "/Users/tessellir/Projects/il-da-orchestrator/.venv/lib/python3.8/site-packages/prefect/blocks/core.py", line 501, in load
    block_type_slug, block_document_name = name.split("/", 1)
ValueError: not enough values to unpack (expected 2, got 1)
An exception occurred.
I've of course created a block of type KubernetesJob called base-dev in Prefect Cloud. What am I missing?
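The failing line in the traceback expects the block reference to contain a "/" separating a block-type slug from the block name, so passing just "base-dev" yields only one element to unpack. A minimal reproduction of that parse (assuming, based on the code shown, that the expected form is "block-type/block-name"):

```python
def parse_block_slug(name: str):
    # Mirrors the failing line from prefect/blocks/core.py in the traceback:
    # the reference must be "<block-type-slug>/<block-name>".
    block_type_slug, block_document_name = name.split("/", 1)
    return block_type_slug, block_document_name

print(parse_block_slug("kubernetes-job/base-dev"))  # ('kubernetes-job', 'base-dev')

try:
    parse_block_slug("base-dev")  # no type prefix -> the same ValueError
except ValueError as e:
    print(e)
```

If that convention holds, the command would need --infra-block kubernetes-job/base-dev rather than --infra-block base-dev.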
Sushma Adari
07/28/2022, 2:27 PM

Tim Enders
07/28/2022, 2:27 PM

Mathijs Carlu
07/28/2022, 2:57 PM

Mansour Zayer
07/28/2022, 3:28 PM
A -> B (trigger: A finished) -> C (trigger: B successful)
I want C to ignore what happens to A, but only run if B is successful. B will run after A has finished (no matter whether it fails or succeeds).
I looked into state and state_handlers, but they seemed too complicated for such a (seemingly) simple task.
Edit: In other words, I don't want the trigger function to aggregate ALL the upstream tasks — it should just consider the task that I've explicitly set as upstream.
Thanks
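The desired logic — each task's trigger looks only at its direct upstream, so C cares about B and not A — can be sketched in plain Python (this is a simulation of the semantics, not Prefect's trigger API):

```python
# Pure-Python sketch (not Prefect's API) of the dependency logic described:
# B runs once A is finished in any state; C runs only if its *direct*
# upstream B succeeded, so A's failure is ignored.

def run_pipeline(a_succeeds: bool, b_succeeds: bool) -> dict:
    states = {}
    states["A"] = "success" if a_succeeds else "failed"
    # B's trigger is "A finished" -- any terminal state of A lets B run.
    states["B"] = "success" if b_succeeds else "failed"
    # C's trigger considers only its explicit upstream task, B.
    states["C"] = "success" if states["B"] == "success" else "skipped"
    return states

print(run_pipeline(a_succeeds=False, b_succeeds=True))
# A failed, yet C still runs: {'A': 'failed', 'B': 'success', 'C': 'success'}
```

If memory serves, Prefect 1.x triggers already evaluate only a task's immediate upstream edges, so giving B an all_finished trigger and C an all_successful trigger with B as its sole upstream should behave like this sketch — worth verifying against the 1.x docs.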