Charles Liu
03/22/2021, 9:53 PM

Alexandru Sicoe
03/22/2021, 11:11 PM

Tsang Yong
03/23/2021, 12:19 AM
cluster = KubeCluster.from_yaml(dask_worker_spec_file_path)
cluster.adapt(minimum=1, maximum=10)
executor = DaskExecutor(cluster.scheduler_address)
state = flow.run(executor=executor)
but when I try to access the state I'm getting this.
Python 3.8.6 (default, Dec 11 2020, 14:38:29)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.21.0 -- An enhanced Interactive Python. Type '?' for help.
In [1]: state
Out[1]: <Failed: "Unexpected error: TypeError('Could not serialize object of type Failed.
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
TypeError: cannot pickle '_thread.RLock' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 307, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 58, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/usr/local/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/usr/local/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle '_thread.RLock' object
')">
Any idea what I'm doing wrong?
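The traceback shows distributed's cloudpickle failing to serialize a task's Failed state, which usually means some return value (or raised exception) is holding a thread lock; client and connection objects are the usual culprits. One way to narrow it down is to try pickling candidate return values directly before handing them to Dask. A minimal diagnostic sketch (check_picklable is a hypothetical helper, not Prefect API):

import cloudpickle

def check_picklable(obj):
    # Raises the same TypeError Dask would, but locally and eagerly.
    cloudpickle.dumps(obj)

check_picklable({"rows": [1, 2, 3]})  # plain data pickles fine
# check_picklable(db_connection)      # lock-holding objects raise
#   TypeError: cannot pickle '_thread.RLock' object

Returning plain data (lists, dicts, DataFrames) from tasks, rather than handles such as sessions or cursors, sidesteps the error.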

Mahesh
03/23/2021, 1:04 PM
import prefect
from prefect.tasks.snowflake.snowflake import SnowflakeQuery
from prefect import task, Flow
query = """
SHOW DATABASES;
"""

snowflake_def = SnowflakeQuery(
    account="account",
    user="user",
    password="****",
    database="***",
    warehouse="****",
    role="***",
    query=query,
)

with Flow("hello-snowflake") as flow:
    snowflake_def()

flow.register(project_name="tutorial")
flow.run()
When I trigger a quick run from the UI, I'm facing the issue below:
Unexpected error: TypeError("cannot pickle '_thread.lock' object")
Traceback (most recent call last):
  File "/opt/prefect_env/lib/python3.8/site-packages/prefect/engine/runner.py", line 48, in inner
    new_state = method(self, state, *args, **kwargs)
  File "/opt/prefect_env/lib/python3.8/site-packages/prefect/engine/task_runner.py", line 900, in get_task_run_state
    result = self.result.write(value, **formatting_kwargs)
  File "/opt/prefect_env/lib/python3.8/site-packages/prefect/engine/results/local_result.py", line 116, in write
    value = self.serializer.serialize(new.value)
  File "/opt/prefect_env/lib/python3.8/site-packages/prefect/engine/serializers.py", line 73, in serialize
    return cloudpickle.dumps(value)
  File "/opt/prefect_env/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 72, in dumps
    cp.dump(obj)
  File "/opt/prefect_env/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 540, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle '_thread.lock' object
I set checkpoint to False.
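The failing frame is LocalResult.write, i.e. Prefect checkpointing the task's return value with cloudpickle. The usual way to skip that write is a task-level checkpoint=False, which SnowflakeQuery should forward to Task.__init__ (the flow must be re-registered for the change to take effect). A sketch:

snowflake_def = SnowflakeQuery(
    account="account",
    user="user",
    password="****",
    database="***",
    warehouse="****",
    role="***",
    query=query,
    checkpoint=False,  # skip persisting the return value
)

If checkpointing is wanted, the alternative is to make sure the task returns plain, picklable data rather than an object wrapping a live connection.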

Igor Bondartsov
03/23/2021, 2:45 PM

emre
03/23/2021, 3:01 PM
Any interest in an HttpGetTask (or another task for the HTTP request family)? Lately, I find myself GETting a lot of results from random endpoints, and thought it could save some boilerplate code on my end.
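Until something like that lands in the task library, a hand-rolled version is only a few lines with requests. A sketch (the name http_get and the JSON-only response handling are illustrative choices):

import requests
from prefect import task

@task
def http_get(url: str, params: dict = None):
    # GET a URL and return the parsed JSON body; HTTP errors fail the task.
    response = requests.get(url, params=params, timeout=30)
    response.raise_for_status()
    return response.json()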

Javier Domingo Cansino
03/23/2021, 5:25 PM

Jonathan Wright
03/23/2021, 5:47 PM

Gleb Erokhin
03/23/2021, 7:15 PM

David Elliott
03/23/2021, 8:09 PM
I'm getting 400 Client Error:
... "input.states[0].task_run_id"; Expected non-nullable type UUID! not to be null.
on some of the tasks when I run the flow. I'll put the full stack trace in the 🧵. It's happening on maybe 1 in every 20 tasks. Those tasks then get put into the 'ClientFailed' state (and the UI can't see them), and all of their downstream dependents get set to 'Pending'.
I've tried many dask workers, then just 1 dask worker (for simplicity); same issue. I can't replicate it with the smaller (196-task) flow. I'm wondering if there's some kind of rate limiting going on, whereby there are so many concurrent tasks running simultaneously (there are a tonne all trying to run at the same time) that some of them get a generic error back from Cloud?
I would try adding a task concurrency limit to test this hypothesis, but the UI says it's not included in our plan (even though we're an enterprise tenant). Is it possible to set task concurrency at the flow level?
Also, the UI can't load the schematic of the big flow, though that's less of an immediate concern. Thanks in advance for any advice!
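On the flow-level concurrency question: without Cloud's tag-based concurrency limits, one blunt option is to cap the executor's own parallelism, since at most num_workers tasks can then run at once. A sketch assuming a Prefect 0.14-style flow object (the cap of 8 is arbitrary, and this trades the distributed Dask cluster for a local one):

from prefect.executors import LocalDaskExecutor

# At most 8 tasks execute concurrently, however wide the DAG is.
flow.executor = LocalDaskExecutor(scheduler="threads", num_workers=8)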

Kelly Huang
03/23/2021, 9:11 PM

Jillian Kozyra
03/23/2021, 11:49 PM
If we do from prefect import Flow, Parameter, context, task, unmapped, mypy complains but flows work.
If we do from prefect.src import Flow, Parameter, context, task, unmapped, mypy is happy but Python complains: ModuleNotFoundError: No module named 'prefect.src'
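If the goal is simply to keep the import that works at runtime while quieting mypy, a per-line ignore is a common stopgap. A sketch:

# The names exist at runtime; suppress the mypy complaint on this line only.
from prefect import Flow, Parameter, context, task, unmapped  # type: ignore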
Reece Hart
03/24/2021, 4:05 AM

Michael Wedekindt
03/24/2021, 8:26 AM

Varun Joshi
03/24/2021, 11:04 AM

Jacob Blanco
03/24/2021, 11:17 AM

Jeffery Newburn
03/24/2021, 2:35 PM

Aaron Richter
03/24/2021, 2:52 PM

Irfan Habib
03/24/2021, 4:16 PM

Will Milner
03/24/2021, 5:45 PM
for x in range(3):
    task = some_task(x)
final_task = another_task(upstream_tasks=task)
I see 3 tasks get created in the loop, but the final task only has 1 upstream task, instead of all the tasks created in the loop.
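The loop rebinds task on every iteration, so by the time another_task is called only the last some_task object is still referenced and only one upstream edge gets created. Collecting the loop's tasks in a list and passing the whole list to upstream_tasks gives the fan-in behaviour. A sketch with the same hypothetical task names:

from prefect import Flow, task

@task
def some_task(x):
    return x

@task
def another_task():
    pass

with Flow("fan-in") as flow:
    # Keep a handle on every task created in the loop...
    loop_tasks = [some_task(x) for x in range(3)]
    # ...and pass them all as upstream dependencies.
    final_task = another_task(upstream_tasks=loop_tasks)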

Charles Liu
03/24/2021, 6:56 PM

Charles Liu
03/24/2021, 7:20 PM

Adam Lewis
03/24/2021, 8:01 PM

Nathan Walker
03/24/2021, 8:07 PM

Alex Papanicolaou
03/24/2021, 10:57 PM

matta
03/25/2021, 12:28 AM

김응진
03/25/2021, 7:25 AM

Shin'ichiro Suzuki
03/25/2021, 8:07 AM

Varun Joshi
03/25/2021, 8:33 AM
Failed to load and execute Flow's environment: AttributeError("'str' object has no attribute 'keys'")
Any input will be much appreciated.

Dave Hirschfeld
03/25/2021, 8:51 AM

Dave Hirschfeld
03/25/2021, 8:51 AM

Amanda Wee
03/25/2021, 8:59 AM

Dave Hirschfeld
03/25/2021, 9:12 AM