Abhas P
09/29/2021, 3:04 PM
Tony Yun
09/29/2021, 3:57 PM
Brian Phillips
09/29/2021, 4:16 PM
I'd like to construct a flow object without having the MY_BUCKET secret available (e.g. for generating documentation), while still materializing the config correctly when the flow is run or registered. Curious if anyone has seen similar use cases or solutions?
from prefect import Flow  # missing from the original snippet
from prefect.client import Secret
from prefect.storage import S3
from lazy_object_proxy import Proxy

flow = Flow(
    ...,
    # the Secret is only resolved when the proxy is first accessed
    storage=Proxy(lambda: S3(Secret('MY_BUCKET').get())),
)
Jason Bertman
09/29/2021, 7:38 PM
import os
from flows.wherever import flow
from prefect.executors import LocalDaskExecutor
from prefect.storage import Local
context = {
    # ...
}

flow.executor = LocalDaskExecutor(scheduler="threads", num_workers=8)
flow.storage = Local()

state = flow.run(
    param="something",
    context=context,
)
This works fine, but it doesn't seem to fly with flows that call StartFlowRun, since it tries to reach out to a server. Does anyone have a way to run dependent flows locally? Not sure if the new local runner can do it; I haven't had the chance to try it yet 🙂
Leon Kozlowski
09/29/2021, 10:21 PM
I'm running prefect build … to first build the flow and output a flow.json to then use in my register command, but the build is failing with the error message docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
I believe this is a side effect of trying to build a Docker image while running in a Docker container. Has anyone ever run into this issue?
wiretrack
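(Side note on that DockerException: the FileNotFoundError almost always means the Docker SDK can't find the daemon socket inside the container. If you control how the build container is launched, one common workaround is to mount the host's socket; a sketch, with a hypothetical image name and the build arguments elided as in the message above:)

```shell
# Sketch: mount the host Docker socket so the Python Docker SDK inside the
# container can reach a daemon (the image name here is hypothetical).
docker run \
  -v /var/run/docker.sock:/var/run/docker.sock \
  my-ci-image \
  prefect build ...
```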
09/29/2021, 11:49 PM
Has anyone gotten imagePullSecrets working using the Helm chart? I can make it work when I manually write the deployment and pass the --image-pull-secrets argument, but I can't make it work passing the secret argument in the Helm chart. Any ideas what I might be doing wrong? I'm doing something like imagePullSecrets: mysecretname in the Helm values (inside the "job" section, under "agent").
Aaron Ash
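(One thing worth double-checking, since a bare scalar is a common tripwire here: in Kubernetes, imagePullSecrets is a list, so the values file usually needs a list entry rather than imagePullSecrets: mysecretname. A hedged sketch, with the exact key path depending on the chart version:)

```yaml
# Hypothetical values.yaml fragment for the agent's flow-run jobs;
# note imagePullSecrets takes a list of secret names, not a single scalar.
agent:
  job:
    imagePullSecrets:
      - mysecretname
```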
09/30/2021, 4:21 AM
Antti Tupamäki
09/30/2021, 5:00 AM
Ryan Sattler
09/30/2021, 6:33 AM
part1, part2 = my_tuple_task(input)
otherresult1 = othertask1(part1)
otherresult2 = othertask2(part2)
However now we want to map over multiple inputs to have multiple parallel pipelines of the above:
whole_tuple_result = my_tuple_task.map(inputs)
otherresult1 = othertask1.map(whole_tuple_result) # tasks must break up tuple inside the function
otherresult2 = othertask2.map(whole_tuple_result)
Is there a way to maintain the elegant tuple-destructuring while still mapping over the result? Just trying to do it directly gives the error
TypeError: Task is not iterable. If your task returns multiple results, pass nout to the task decorator/constructor, or provide a Tuple return-type annotation to your task.
(we've already set that, which is why the first example works)
Qin XIA
09/30/2021, 7:48 AM
Italo Barros
09/30/2021, 1:03 PM
Failed to load and execute Flow's environment: FlowStorageError("An error occurred while unpickling the flow:\n TypeError('code() takes at most 15 arguments (16 given)',)\nThis may be due to one of the following version mismatches between the flow build and execution environments:\n - prefect: (flow built with '0.15.6', currently running with '0.15.3')\n - python: (flow built with '3.9.7', currently running with '3.6.12')",)
I already tried to use LocalRun(working_dir="C:\Project_2"), but that doesn't seem to work either. How can I specify to the second agent that it needs to use the desired "env_2"?
Dan Zhao
09/30/2021, 1:27 PM
Ruslan Aliev
09/30/2021, 1:29 PM
Kevin Weiler
09/30/2021, 2:14 PM
I'm using the RenameFlowRun prefect task to rename my flow based on a value in the context. I noticed that if I try something like:
rename_flow = RenameFlowRun(flow_name=f"{prefect.context.today}")
It tries to evaluate the context at deploy time, instead of at runtime. Is there a reasonable way to do this?
@Rob Douglas ended up just pulling the guts out of the run() method of RenameFlowRun and putting it in a task. That works, but it seems a bit dodgy.
PS - the documentation for this is wrong, as the example shows importing FlowRenameTask instead of RenameFlowRun
Max Kureykin
09/30/2021, 2:50 PM
Bob Cavezza
09/30/2021, 3:07 PM
Jessica Smith
09/30/2021, 8:25 PM
Darshan
09/30/2021, 10:33 PM
Andreas Tsangarides
10/01/2021, 9:30 AM
Matt Duck
10/01/2021, 12:06 PM
Tony Yun
10/01/2021, 2:36 PM
I'm getting The secret KUBERNETES_API_KEY was not found when trying to run a Kubernetes Task. I thought I'm using an in-cluster connection, so it shouldn't be required? Otherwise, is there any recommendation on creating a KUBERNETES_API_KEY?
According to https://docs.prefect.io/api/latest/tasks/kubernetes.html#createnamespaceddeployment:
1. Attempt to use a Prefect Secret that contains a Kubernetes API Key. If kubernetes_api_key_secret = None then it will attempt the next two connection methods. By default the value is KUBERNETES_API_KEY so providing None acts as an override for the remote connection.
2. Attempt in-cluster connection (will only work when running on a Pod in a cluster).
3. Attempt out-of-cluster connection using the default location for a kube config file.
Jessica Smith
10/01/2021, 3:28 PM
Will
10/01/2021, 4:23 PM
Jacob Goldberg
10/01/2021, 4:33 PM
The agent logs Successfully pulled image XXXXX and agent | Completed deployment of flow run XXXXX, but the flow's content is not being updated. When I change the runtime labels on Prefect Cloud and execute the same (updated) flow in another environment (Docker agent running on macOS), the updates I made to the flow are reflected…
Matthew Seligson
10/01/2021, 7:37 PM
Pierre Pasquet
10/01/2021, 9:46 PM
I'm running prefect server and prefect agent from a directory where version 3.8.9 is active (I am using pyenv), so I don't know where this version 3.7.10 comes from. How can I tell which Python version my agent uses? I assume, according to the error message, that the faulty version 3.7.10 is tied to the agent.
Failed to load and execute Flow's environment: StorageError("An error occurred while unpickling the flow:\n TypeError('code() takes at most 15 arguments (16 given)')\nThis may be due to one of the following version mismatches between the flow build and execution environments:\n - python: (flow built with '3.8.9', currently running with '3.7.10')")
Anze Kravanja
10/01/2021, 10:00 PM
Tim Finkel
10/01/2021, 10:35 PM
I'm getting Essential container in task exited. I've been trying to debug this for a bit and am not sure how to proceed. Any advice?
Nikhil Acharya
10/02/2021, 1:14 PM
flow.run(parameters=dict(studentId=student, data=data))
Tadej Svetina
10/03/2021, 8:25 AM