matta
05/09/2020, 12:14 AM
ERROR - prefect.TaskRunner | Unexpected error: ValueError('Could not infer an active Flow context.')
matta
05/09/2020, 12:14 AM

matta
05/09/2020, 12:14 AM

Sainath
05/09/2020, 5:16 AM

Matthias
05/09/2020, 1:35 PM
__init__.py files everywhere, importing everything in the respective sub folder.
When I use an imported Task class in my flows, everything works until I run a flow. Then I get ModuleNotFoundError("No module named 'reporting'") from Dask, where reporting is the main package, but I cannot trace that back to anything; running with the LocalExecutor works. The debug utility also tells me that the flow is not serializable. As soon as I copy the Task class into the same module where the Flow is defined, everything works like a charm (the only difference being that the class is not imported).
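This behavior matches pickle's reference semantics for imported classes: only the module path and name are serialized, so a Dask worker has to be able to import the same package on its side. A minimal plain-Python illustration (not Prefect-specific; MyTask is a hypothetical stand-in):

```python
import pickle

class MyTask:
    """Stand-in for a Task subclass defined at the top level of a module."""

# pickle serializes classes by reference: it records the module path and
# qualified name, not the class body itself
payload = pickle.dumps(MyTask)

# unpickling re-imports the module by that recorded path, so a Dask worker
# raises ModuleNotFoundError when the package is not installed there
print(b"MyTask" in payload)
```

Copying the class next to the flow works because the class then travels inside the same serialized context instead of being looked up by import path on the worker.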
Question: Is there any pitfall I need to consider when defining Task classes and importing them?

Adrien Boutreau
05/10/2020, 3:52 PM
May 10th 2020 at 4:48:48pm | prefect.CloudTaskRunner
INFO
Task 'GetContainerLogs': finished task run for task with final state: 'Success'
which is cool 🙂 but I don't see any print output from my job itself. If I check the log file (in folder .prefect/results) in the console, I can see all the prints from my job. Do you know how I can get them into the UI?

Nate Atkins
05/10/2020, 9:11 PM

Brad
05/10/2020, 11:28 PM
yield out the data files for further mapping. Obviously I can't do this in Prefect, but I also don't want to do an enormous reduce, because the amount of data is too large. What I currently have is just a simple map over the files, but I'm really not getting the parallelism or granularity I'd like. I'm using Dask, so I thought about just grabbing a worker client and submitting tasks from tasks, but then I lose the benefits of having Prefect tasks. Is there anything anyone can suggest?

Kostas Chalikias
05/11/2020, 9:02 AM
May 11 04:02:26 bohr-prod app/prefect.1: [2020-05-11 03:02:25] CRITICAL - CloudHandler | Failed to write log with error: 413 Client Error: Request Entity Too Large for url: https://api.prefect.io/graphql/alpha
Are our individual log items too big?

Matias Godoy
05/11/2020, 2:17 PM

Ralph Willgoss
05/11/2020, 3:07 PM
TypeError: can't pickle _thread.RLock objects
I've omitted the stack trace for now, as I wanted some guidance on whether I'm using the context incorrectly. I understand that my client is going to be serialized when using Dask; when using the local executor all is fine. Is this a problem with the client not being serialized correctly, or a limitation of Prefect?
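The error itself can be reproduced in plain Python: any object holding an RLock internally, as many network clients do, cannot be pickled, so it fails the moment Dask ships it to a worker. A minimal demonstration (FakeClient is a hypothetical stand-in, not the actual client from this thread):

```python
import pickle
import threading

class FakeClient:
    """Stand-in for a network client that holds an internal lock."""
    def __init__(self):
        self._lock = threading.RLock()  # RLock instances cannot be pickled

pickle_failed = False
try:
    pickle.dumps(FakeClient())
except TypeError:
    pickle_failed = True  # TypeError: cannot pickle '_thread.RLock' object

print("pickling failed:", pickle_failed)
```

A common way around this, assuming the client is cheap to create, is to construct it inside the task body so that only picklable arguments cross the process boundary.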
thanks!

itay livni
05/11/2020, 5:21 PM
DaskCloudProviderEnvironment? I am following the docs here https://docs.prefect.io/orchestration/execution/dask_cloud_provider_environment.html#process and am getting an import error.
prefect.__version__
'0.10.7'
from prefect.environments import DaskCloudProviderEnvironment
ImportError: cannot import name 'DaskCloudProviderEnvironment' from 'prefect.environments' (.../miniconda3/envs/py37moc/lib/python3.7/site-packages/prefect/environments/__init__.py)
dherincx
05/11/2020, 8:05 PM
ifelse control flow to determine which task gets applied over a mapped task. Below I've attached a simple example. When I run the flow I don't get an error; rather, each iteration of the mapped task enters the true condition despite there being cases where the condition is clearly false. Can someone point me in the right direction? Am I using ifelse correctly?
from prefect import Flow, task
from prefect.tasks.control_flow import ifelse

param = [[], 2, 3, [], 4]

@task
def cond(val):
    if isinstance(val, int):
        return True
    else:
        return False

@task
def print_true(val):
    print(f"Value is {val}")

@task
def print_false(val):
    print("This is a list...")

with Flow('Insert Health System Data') as flow:
    flag = cond.map(param)
    true_task = print_true.map(param)
    false_task = print_false.map(param)
    ifelse(flag, true_task, false_task)

flow.run()
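One likely source of the behavior: ifelse switches whole tasks on a single condition, not individual children of a mapped task, so a mapped flag cannot route items one by one. A common workaround (a sketch, not necessarily the canonical Prefect pattern) is to move the branch inside a single function that gets mapped; the underlying logic in plain Python:

```python
# branch inside one function and map that over every item, instead of
# trying to route each mapped child through ifelse
def handle(val):
    if isinstance(val, int):
        return f"Value is {val}"
    return "This is a list..."

param = [[], 2, 3, [], 4]
results = [handle(v) for v in param]
print(results)
```

In a flow, handle would be a single @task applied with .map(param), keeping the per-item decision inside the task body.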
javier
05/11/2020, 10:25 PM

Lukas
05/12/2020, 8:51 AM

Lukas
05/12/2020, 8:52 AM

Darragh
05/12/2020, 10:20 AM

Matthias
05/12/2020, 12:23 PM
I have a task that returns a list. Using map I pass the list to another task, which for each item again returns a list. I want to map() this nested list to another task to iterate over all the items. Is there any intended way of doing so?
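One straightforward approach is to flatten the nested result in an intermediate task before mapping again (later Prefect releases added a flatten utility for exactly this case, though availability should be checked against the installed version); the core operation in plain Python:

```python
# collapse one level of nesting so the items can be mapped over directly
def flatten_once(nested):
    return [item for sublist in nested for item in sublist]

nested = [[1, 2], [3], [4, 5, 6]]
print(flatten_once(nested))  # [1, 2, 3, 4, 5, 6]
```

Wrapped in a @task, flatten_once sits between the two mapped tasks and turns the list-of-lists into a single mappable list.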
Thanks!

Bartek
05/12/2020, 3:30 PM

tkanas
05/12/2020, 4:28 PM

David N
05/12/2020, 6:01 PM

Scott Zelenka
05/12/2020, 6:07 PM

Matthias
05/12/2020, 7:12 PM

Jacob (he/him)
05/12/2020, 7:26 PM

Florian K. (He/Him)
05/12/2020, 8:49 PM
Unexpected error: TypeError("cannot pickle 'google.protobuf.pyext._message.MessageDescriptor' object")
Traceback (most recent call last):
  File "/home/fkluiben/bin/miniconda3/envs/RodeoEnv/lib/python3.8/site-packages/prefect/engine/runner.py", line 48, in inner
    new_state = method(self, state, *args, **kwargs)
  File "/home/fkluiben/bin/miniconda3/envs/RodeoEnv/lib/python3.8/site-packages/prefect/engine/task_runner.py", line 934, in get_task_run_state
    state._result.store_safe_value()
  File "/home/fkluiben/bin/miniconda3/envs/RodeoEnv/lib/python3.8/site-packages/prefect/engine/result/base.py", line 126, in store_safe_value
    value = self.result_handler.write(self.value)
  File "/home/fkluiben/bin/miniconda3/envs/RodeoEnv/lib/python3.8/site-packages/prefect/engine/result_handlers/local_result_handler.py", line 81, in write
    f.write(cloudpickle.dumps(result))
  File "/home/fkluiben/bin/miniconda3/envs/RodeoEnv/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 63, in dumps
    cp.dump(obj)
  File "/home/fkluiben/bin/miniconda3/envs/RodeoEnv/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 548, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle 'google.protobuf.pyext._message.MessageDescriptor' object
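The traceback shows the result handler calling cloudpickle.dumps on the task's return value, and raw protobuf message objects are not picklable. A common workaround (an assumption on my part, not something confirmed in the thread) is to return a plain-bytes serialization from the task instead of the raw message; sketched here with a hypothetical stand-in class, where serialize() plays the role that SerializeToString() plays on a real protobuf message:

```python
import pickle
import threading

class UnpicklableMessage:
    """Stand-in for a protobuf message wrapping unpicklable state."""
    def __init__(self, text):
        self._text = text
        self._lock = threading.RLock()  # makes the raw object unpicklable

    def serialize(self):
        # real protobuf messages expose SerializeToString() for this
        return self._text.encode("utf-8")

msg = UnpicklableMessage("hello")

# returning plain bytes keeps the result handler's pickling step happy
payload = msg.serialize()
roundtrip = pickle.loads(pickle.dumps(payload))
print(roundtrip)  # b'hello'
```

The receiving task can then parse the bytes back into a message object on its side of the process boundary.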
BTW, apologies for my way of describing the issue; I'm sure it's pretty clumsy and inaccurate. Looking forward to learning and raising my awareness of Prefect terminology!

Aiden Price
05/13/2020, 5:56 AM

Aiden Price
05/13/2020, 5:57 AM

David Ojeda
05/13/2020, 11:10 AM

Bartek
05/13/2020, 12:29 PM

Sandeep Aggarwal
05/13/2020, 12:44 PM