Ben Muller
11/12/2021, 12:51 AM
Sergey
11/12/2021, 6:43 AM
Failed to load and execute Flow's environment: StorageError('An error occurred while unpickling the flow:
AttributeError("Can't get attribute '_unpickle_timestamp' on <module 'pandas._libs.tslibs.timestamps' from '/usr/local/lib/python3.7/site-packages/pandas/_libs/tslibs/timestamps.cpython-37m-x86_64-linux-gnu.so'>")
This may be due to one of the following version mismatches between the flow build and execution environments:
- prefect: (flow built with '0.15.3', currently running with '0.14.19')
- python: (flow built with '3.7.3', currently running with '3.7.4')')
Is it that I have a version older than on the server?
Adam Everington
11/12/2021, 7:36 AM
def my_state_handler(task, old_state, new_state) -> None:
    send_message(f'the following task failed {task.name} within this flow: {[GET FLOW NAME HERE]}')

@task(state_handlers=[my_state_handler])
def my_task():
    ...
Had a look through the Task class on GitHub and I can see it's passed to various methods, but I couldn't see a property it persisted in.
Chris L.
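A sketch of one way to get the flow name in Prefect 1.x: run metadata (including `flow_name`) is available through `prefect.context` during a flow run. The dict fallback below is only so the snippet runs standalone without Prefect installed; `failed_task_message` is an illustrative helper, not Prefect API:

```python
# Prefect 1.x exposes run metadata through prefect.context during a flow run;
# outside a run (or without Prefect installed) we fall back to a plain dict
# so this sketch stays runnable.
try:
    import prefect
    context = prefect.context
except ImportError:
    context = {"flow_name": "example-flow"}  # stand-in for illustration

def failed_task_message(task_name):
    # prefect.context supports dict-style .get, so this works either way;
    # inside a real run .get("flow_name") returns the current flow's name
    flow_name = context.get("flow_name", "<unknown flow>")
    return f"the following task failed {task_name} within this flow: {flow_name}"

print(failed_task_message("my_task"))
```

Inside a state handler you would call something like this in place of the `GET FLOW NAME HERE` placeholder.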
11/12/2021, 9:39 AM
Vipul
11/12/2021, 1:58 PM
Chris Arderne
11/12/2021, 3:04 PM
VertexRun and GitHub Storage. I'm now trying to get a distributed DaskExecutor to run using dask_cloudprovider.gcp.GCPCluster, with the same Docker image that I already had working with VertexRun, plus the Dask dependencies added. I also created a Packer image based on this.
It works if I run the Flow locally (with prefect flow run …, so Vertex is bypassed): it spins up a Dask cluster and completes successfully. But when I ran it from Prefect Cloud, via Vertex, it provisioned a scheduler which had some errors (failed to restart for crond, nscd, unscd) and then didn't do anything. Aside: after I cancelled the Flow, I had to manually delete this scheduler. The VPC is set up to allow full access within the network, so it shouldn't be anything to do with that.
Any ideas? Has anyone got this working well?
Tim Enders
11/12/2021, 3:41 PM
prefect.context, and I can't do prefect run to get a context.
Tom
11/12/2021, 5:16 PM
load_data_task. This provides me images_train and images_val. Now I want to transform images_train with the task rescaling_images_task.
I need to map over images_train (for each image and label) and then apply rescaling_images_task.
Outside of Prefect this works with the following code:
images_train_rescaled = images_train.map(lambda x, y: (preprocess.rescaling_images()(x, training=True), y))
But how can I run this in a Prefect flow?
preprocess = PreprocessingClass_Images(cfg)

with Flow("Preprocess") as flow:
    # 1. load images
    images_train, images_val = preprocess.load_data_task()

    # 2. rescaling
    # the code below works outside of a Prefect flow
    images_train_rescaled = images_train.map(lambda x, y: (preprocess.rescaling_images()(x, training=True), y))
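In a Prefect 1.x flow, the usual pattern for "apply this to each element" is to wrap the per-pair transformation in its own task and call `rescaling_images_task.map(images_train)` rather than using `Dataset.map` directly inside the flow block. A minimal pure-Python sketch of the per-pair function such a mapped task would run (the `(image, label)` pair shape and all names here are assumptions, since the actual preprocessing class isn't shown):

```python
def rescale_pair(pair, scale=1.0 / 255.0):
    # rescales one (image, label) pair; the image is a plain list of
    # pixel values here, standing in for a real image tensor
    image, label = pair
    return ([p * scale for p in image], label)

# two toy (image, label) pairs standing in for images_train
images_train = [([0, 51, 255], "cat"), ([102, 204, 255], "dog")]

# this loop is what rescaling_images_task.map(images_train) would
# distribute across task runs inside a flow
images_train_rescaled = [rescale_pair(pair) for pair in images_train]
print(images_train_rescaled[0])
```

The key difference from the `Dataset.map` version is that Prefect mapping works over a materialised iterable of pairs, so the lambda-of-two-arguments becomes a function of one pair.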
Vamsi Reddy
11/12/2021, 5:24 PM
Brett Naul
11/12/2021, 7:40 PM
prefect run --watch (and similarly watch_flow_run) don't show any logs until my run has finished (at which point they all show up in one deluge), whereas prefect run flow --logs would stream them continuously. @Michael Adkins, is that the intended behavior or might something be amiss? In this case it was about 5 minutes of no logs; I haven't tried with anything even longer yet.
Xinchi He
11/12/2021, 8:31 PM
Santiago Gonzalez
11/12/2021, 8:44 PM
Max Mose
11/12/2021, 10:47 PM
Kathryn Klarich
11/12/2021, 11:15 PM
Aqib Fayyaz
11/13/2021, 2:44 PM
Cooper Marcus
11/13/2021, 2:45 PM
Carlos Paiva
11/13/2021, 9:25 PM
Mike Lev
11/14/2021, 11:01 AM
Aqib Fayyaz
11/14/2021, 3:36 PM
Aqib Fayyaz
11/14/2021, 6:04 PM
import requests

query = """
mutation {
  create_flow_run(input: { flow_id: "9479ea34-f558-4616-8e64-50c7a508787d" }) {
    id
  }
}
"""
url = "https://api.prefect.io"
response = requests.post(
    url, json={"query": query}, headers={"authorization": "Bearer your-api-key"}
)
print(response.status_code)
print(response.text)
I am using the above Python code to invoke a flow in Prefect Cloud. I have set the value of the API key and changed the flow_id as well, but when I run this file I get the following output:
200
{"errors":[{"path":["create_flow_run"],"message":"Unauthenticated","extensions":{"code":"UNAUTHENTICATED"}}],"data":{"create_flow_run":null}}
Aqib Fayyaz
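An HTTP 200 with an UNAUTHENTICATED error in the body is normal GraphQL-style error reporting: the request reached the API, but the token in the `Authorization` header wasn't accepted (e.g. the `your-api-key` placeholder was left in, or the key lacks permission). A sketch that reads the key from an environment variable and fails loudly on GraphQL errors; the helper names and the `PREFECT_API_KEY` env-var name are this sketch's choices, not Prefect's:

```python
import os
import requests

API_URL = "https://api.prefect.io"

def build_query(flow_id):
    # same mutation as above, with the flow id interpolated
    return f'mutation {{ create_flow_run(input: {{ flow_id: "{flow_id}" }}) {{ id }} }}'

def create_flow_run(flow_id, api_key):
    response = requests.post(
        API_URL,
        json={"query": build_query(flow_id)},
        headers={"Authorization": f"Bearer {api_key}"},
    )
    response.raise_for_status()
    body = response.json()
    # GraphQL APIs often return 200 even on failure; check the errors field
    if body.get("errors"):
        raise RuntimeError(body["errors"][0]["message"])
    return body["data"]["create_flow_run"]["id"]

# api_key = os.environ["PREFECT_API_KEY"]  # avoid hard-coding the key
# print(create_flow_run("9479ea34-f558-4616-8e64-50c7a508787d", api_key))
```

With a valid key the `errors` field is absent and the new flow-run id comes back under `data.create_flow_run.id`, matching the shape of the failing response above.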
11/15/2021, 8:32 AM
Adam Everington
11/15/2021, 8:55 AM
Adam Everington
11/15/2021, 10:26 AM
Joseph Oladokun
11/15/2021, 11:34 AM
Greg Adams
11/15/2021, 3:01 PM
Martin Goldman
11/15/2021, 3:47 PM
John Jacoby
11/15/2021, 3:58 PM
Kostas Chalikias
11/15/2021, 4:12 PM
Tao Bian
11/15/2021, 4:28 PM
John Jacoby
11/15/2021, 4:47 PM
John Jacoby
11/15/2021, 4:47 PM
Austen Bouza
11/15/2021, 4:50 PM
upstream_tasks keyword argument, e.g.
with Flow('my_flow') as flow:
    unrelated = unrelated_task()
    other = other_task(upstream_tasks=[unrelated])
This guarantees that unrelated finishes first even if other doesn't need any values from it.
John Jacoby
11/15/2021, 4:52 PM