Numline1
08/29/2022, 6:20 PM

Numline1
08/29/2022, 6:20 PM

Maggie Dart-Padover
08/29/2022, 7:25 PM
OSError: could not get source code
del the_task
does remove it from my environment, but then if I try to run the task definition again I still get the error. I'm not totally sure if this is a bug or expected behavior. Is my workflow here just wrong for Prefect?

Sam Garvis
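For context on the question above: "OSError: could not get source code" is a plain Python error that tends to surface when inspect cannot recover a function's source, e.g. for objects defined interactively or via exec with no backing file. A minimal stdlib reproduction (the function name is just illustrative):

```python
import inspect

# Define a function with no backing source file, as can happen for
# objects created interactively; inspect cannot recover its source.
namespace = {}
exec("def ephemeral(): return 1", namespace)

try:
    inspect.getsource(namespace["ephemeral"])
except OSError as exc:
    print(f"OSError: {exc}")  # → OSError: could not get source code
```

If the task was defined in a session like this, deleting the name does not restore the missing source; re-defining it in a file-backed module is the usual fix.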
08/29/2022, 7:27 PM
prefect deployment build name.name -n name_dev -t dev_wq_k8s -sb gcs/dev --work-queue=dev_wq_k8s -ib kubernetes-job/dev-k8s-job
I get this error
Job is missing required attributes at the following paths: /apiVersion, /kind, /metadata, /spec (type=value_error)
Even though in the Prefect 2.0 UI for creating a k8s job block, it says Job (Optional)
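The validation error lists exactly the four top-level fields a Kubernetes Job manifest must have. A sketch of a minimal manifest as a Python dict (values here are the common defaults, not taken from the thread):

```python
# Minimal Kubernetes Job manifest containing the four paths the
# validation error complains about: /apiVersion, /kind, /metadata, /spec.
job_manifest = {
    "apiVersion": "batch/v1",
    "kind": "Job",
    "metadata": {"labels": {}},
    "spec": {
        "template": {
            "spec": {
                "completions": 1,
                "parallelism": 1,
                "restartPolicy": "Never",
                "containers": [],  # the agent injects the flow-run container
            }
        }
    },
}

required_paths = ["apiVersion", "kind", "metadata", "spec"]
assert all(key in job_manifest for key in required_paths)
```

So "Job (Optional)" in the UI likely means you may omit the field to get a default manifest, but if you do supply one it must carry all four top-level keys.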
ash
08/29/2022, 8:41 PM

Richard Freeman
08/29/2022, 10:30 PM

Nace Plesko
08/29/2022, 11:04 PM
ShellTask
is not logging all the outputs from the script that it's running. I am looking at https://docs-v1.prefect.io/api/latest/tasks/shell.html#shelltask and it seems like just setting stream_output=True
should do the job, but I'm still seeing just Command failed with exit code 1
. I also tried setting return_all=True
and log_stderr=True
, but still the same behavior.
Has anyone run into this issue in the past?

Walter Cavinaw
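Regarding the ShellTask question above, here is a Prefect-free sketch of what stream_output=True is meant to do — forward each line of the child process's combined stdout/stderr as it appears, instead of only reporting the exit code (function and variable names are mine):

```python
import subprocess

def run_streaming(cmd):
    """Run a shell command, echoing each output line as it arrives, and
    return all captured lines (similar in spirit to ShellTask with
    stream_output=True, return_all=True, log_stderr=True)."""
    proc = subprocess.Popen(
        cmd,
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,  # merge stderr into stdout
        text=True,
    )
    lines = []
    for line in proc.stdout:
        line = line.rstrip("\n")
        print(line)          # in Prefect this would go to the task logger
        lines.append(line)
    proc.wait()
    if proc.returncode != 0:
        # Surface the captured output instead of just the exit code
        raise RuntimeError(
            f"Command failed with exit code {proc.returncode}:\n" + "\n".join(lines)
        )
    return lines

print(run_streaming("echo hello && echo world"))  # → ['hello', 'world']
```

If output still does not appear with stream_output=True, a common culprit is the child process buffering its own stdout; forcing line-buffered or unbuffered output in the script itself can help.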
08/30/2022, 1:01 AM

Yousef Hosny
08/30/2022, 1:47 AM
deployment.apply()
doesn't work from a Jupyter notebook cell, but does work from a .py
script file.
Also, is there any way to deploy a flow to Prefect Cloud using a Jupyter notebook?

Tommy Nam
08/30/2022, 2:22 AM

Anat Tal Gagnon
08/30/2022, 4:17 AM

Faheem Khan
08/30/2022, 7:33 AM

Anat Tal Gagnon
08/30/2022, 8:25 AM

Adrien Besnard
08/30/2022, 9:12 AM
prefect_dbt
and prefect_airbyte
collections exist, I was wondering if it makes sense to have something like a dedicated block for Airbyte (which would let us store the server_url
but also invoke a trigger_sync
function), and the same question for a dbt block (storing all the information found in profiles.yml
, for example)?
• What is the best way to log from a function of a Block
subclass? Can we use get_run_logger()
or is it preferable to use some sort of callbacks that are invoked inside a @task
or something?
Thanks!

Aditya Sharma
08/30/2022, 10:47 AM

Parwez Noori
08/30/2022, 11:20 AM

Tarek
08/30/2022, 12:59 PM

Josh Paulin
08/30/2022, 1:24 PM

Anat Tal Gagnon
08/30/2022, 1:39 PM

Tim Enders
08/30/2022, 1:56 PM
_pickle.PicklingError: Pickling client objects is explicitly not supported.
Clients have non-trivial state that is local and unpickleable.
Pickle error on clients in Prefect 2.0?

Marcelo Ortega
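The error above is general Python behavior rather than something Prefect-specific: objects wrapping live OS resources (sockets, open files, connection pools) refuse to pickle. A minimal stdlib reproduction, plus the usual workaround of passing plain configuration between tasks and constructing the client inside the task that uses it:

```python
import pickle
import socket

# A socket wraps a live OS resource, so pickle refuses it — much like a
# database or API client object returned from one task to another.
sock = socket.socket()
try:
    pickle.dumps(sock)
    pickled_ok = True
except TypeError as exc:
    pickled_ok = False
    print(f"pickling failed: {exc}")
finally:
    sock.close()

# Workaround: pass plain, picklable configuration between tasks and
# build the client where it is needed (values here are placeholders).
db_config = {"host": "localhost", "port": 5432}
assert pickle.dumps(db_config)  # plain data pickles fine
```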
08/30/2022, 2:35 PM

karteek
08/30/2022, 3:53 PM

Matt Melgard
08/30/2022, 3:54 PM
RuntimeError: File system created with scheme 'gcs' from base path 'gcs://mattmelgard-test-bucket-01' could not be created. You are likely missing a Python module required to use the given storage protocol.
This is the CLI command I’m using to build the deployment:
prefect deployment build tutorial-flow/flow/tutorial-flow.py:my_flow -n tutorial-flow-k8s -t test -i kubernetes-job -sb gcs/tutorial-bucket
and I have a block defined for the bucket with the name tutorial-bucket
but I’m not sure what could be missing here, I have pretty much all the python packages installed via pip install 'prefect[all_extras]'
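That RuntimeError usually means no handler for the gcs:// scheme is importable in the interpreter that runs the command — typically the gcsfs package (the module names below are the usual ones for GCS support via fsspec; this is a diagnostic sketch, run it in the same environment the CLI/agent uses):

```python
import importlib.util

# fsspec resolves the "gcs" scheme via the gcsfs package; if gcsfs is
# not importable in *this* interpreter, a filesystem with a gcs:// base
# path cannot be constructed.
for module in ("fsspec", "gcsfs"):
    spec = importlib.util.find_spec(module)
    print(f"{module}: {'found' if spec else 'MISSING'}")
```

Since all_extras was installed with pip, a mismatch between the interpreter that ran pip and the one running the deployment build (or the agent) is a likely explanation.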
Anna Geller
results and state_handler in 2.0?
with Flow(
    "Subscriptions API Import",
    result=GCSResult(bucket="platformsh-vendor-prefect"),
    # executor=LocalExecutor(),
    executor=LocalDaskExecutor(scheduler="threads", num_workers=5),
    state_handlers=[flow_failure],
) as flow:
    ...
Florent VanDeMoortele
08/30/2022, 4:46 PM
prefect.environments.storage.Docker
to create an image with my flow, context, scheduler and all the required virtual environment. Then I register the flow to Prefect with prefect.environments.DaskKubernetesEnvironment
as the environment. I'm using prefect.executors.DaskExecutor
to run it.
To deploy on 2.x, I simply tried to use DockerContainer
but it doesn't work without storage. So I tried installing the main branch of Prefect because of a merged PR ( https://discourse.prefect.io/t/how-to-build-deployments-with-flow-code-and-dependencies-being-baked-into-a-docker-image/1341 ) to use DockerPackager
but it's not compatible with Deployment.build_from_flow
. I also tried pushing my Docker image directly with push_image
but the push doesn't work yet. So I'm a bit confused, because I can't find explicit documentation on Discourse and I've read different answers corresponding to the different 2.x releases.
Is DockerPackager
the best practice for deploying a completely self-contained flow? If so, do you have any idea of the release roadmap to fix this (and update the documentation)? If not, what should I use?
Thank you!

Dylan McReynolds
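For the self-contained-image question above, one common 2.x pattern (sketched here; the base image tag, paths, and file names are placeholders, not taken from the thread) is to bake the flow code into an image yourself and run the deployment with DockerContainer infrastructure pointing at that image:

```dockerfile
# Illustrative only: bake flow code into a Prefect 2 base image
FROM prefecthq/prefect:2-python3.9

# Copy the flow source and extra dependencies into the image
COPY flows/ /opt/prefect/flows/
COPY requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt
```

The deployment then only needs to know the path of the flow file inside the image, so no remote storage block is involved.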
08/30/2022, 5:25 PM
flow
, is there a way to schedule a subflow to start at a particular time in the future? Or some other way to accomplish a long-delayed subflow/task?

Shaoyi Zhang
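A Prefect-free sketch of the "long delay" half of the question above: compute the seconds until the target time and sleep before invoking the subflow (in Prefect the inner function would be @flow-decorated; all names here are placeholders):

```python
import asyncio
from datetime import datetime, timedelta, timezone

async def run_at(target: datetime, coro_fn, *args):
    """Sleep until `target`, then run the given coroutine function."""
    delay = (target - datetime.now(timezone.utc)).total_seconds()
    if delay > 0:
        await asyncio.sleep(delay)
    return await coro_fn(*args)

async def my_subflow(x):  # placeholder for a @flow-decorated function
    return x * 2

# Demo: schedule the "subflow" a fraction of a second in the future
soon = datetime.now(timezone.utc) + timedelta(seconds=0.1)
result = asyncio.run(run_at(soon, my_subflow, 21))
print(result)  # → 42
```

Note the caveat: this keeps a process alive for the whole delay, so for delays of hours or days a scheduled deployment run is the more robust approach.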
08/30/2022, 5:51 PM
EXTRA_PIP_PACKAGES
to Cloud UI? When using the Kubernetes agent, those logs only show up in the job pod's logs and are not available in the Cloud UI.

Renan Matias
08/30/2022, 5:52 PM
http://127.0.0.1:4200/api/deployments/{id}/create_flow_run
and it worked normally (I used Insomnia).
Now, I need to make this same request through Prefect Cloud. I was already able to create the deployment and I was able to run the flow normally through the Prefect UI with the agent active.
Can you help me?

Ashley Felber
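For the Cloud question above: against Prefect Cloud the same endpoint is, to my understanding, namespaced under an account and workspace and requires an API key as a Bearer token. A sketch that builds (but does not send) such a request with the stdlib — the URL shape and all IDs are assumptions to verify against your workspace's API URL in the Cloud UI:

```python
import json
import urllib.request

def build_create_flow_run_request(account_id, workspace_id, deployment_id, api_key):
    """Build (but don't send) a Prefect Cloud create_flow_run request.
    URL shape and Bearer auth reflect my understanding of the Cloud API."""
    url = (
        "https://api.prefect.cloud/api"
        f"/accounts/{account_id}/workspaces/{workspace_id}"
        f"/deployments/{deployment_id}/create_flow_run"
    )
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),  # empty body = default run parameters
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_create_flow_run_request("acct-id", "ws-id", "dep-id", "pnu_xxx")
print(req.full_url)
```

Sending it would be `urllib.request.urlopen(req)`; the Insomnia setup should work unchanged once the URL and the Authorization header are adjusted.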
08/30/2022, 6:23 PM

Marcos
08/30/2022, 6:27 PM
@flow
async def test():
    connected = await connect_db.submit()
    task1 = await db_stuff.submit(wait_for=[connected])
    await disconnect_db.submit(wait_for=[task1])
If db_stuff
fails, disconnect_db
is never executed, but I'd like it to be.
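The wait_for edge above is also a success dependency, so a failed upstream task causes the downstream task to be skipped. A Prefect-free sketch of the cleanup-always-runs alternative — try/finally around the failing work (all task names are placeholders; it may also be worth checking whether your Prefect version offers an allow_failure-style wrapper for wait_for):

```python
import asyncio

events = []  # records execution order for illustration

async def connect_db():
    events.append("connect")
    return "conn"

async def db_stuff(conn):
    events.append("db_stuff")
    raise RuntimeError("query failed")

async def disconnect_db(conn):
    events.append("disconnect")

async def test_flow():
    # try/finally instead of a wait_for edge: disconnect_db runs
    # whether or not db_stuff raises.
    conn = await connect_db()
    try:
        await db_stuff(conn)
    finally:
        await disconnect_db(conn)

try:
    asyncio.run(test_flow())
except RuntimeError:
    pass  # the flow still fails, but cleanup has run

print(events)  # → ['connect', 'db_stuff', 'disconnect']
```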