Vadym Dytyniak (11/23/2022, 1:01 PM):

Jared Robbins (11/23/2022, 1:16 PM):

Tibs (11/23/2022, 1:30 PM):

Slackbot (11/23/2022, 1:49 PM):

Tim-Oliver (11/23/2022, 2:11 PM):
asyncio exception when using DaskTaskRunner which leads to tasks hanging and not completing.

Khuyen Tran (11/23/2022, 3:28 PM):

Luca Schneider (11/23/2022, 4:29 PM):
PREFECT_LOGGING_EXTRA_LOGGERS -- does it have to be set in the flow, on the agent, or on the Orion server? Thanks

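For context, a minimal sketch of how this setting is typically supplied, assuming it needs to be visible to the process that actually executes the flow (the agent / flow-run environment) rather than the Orion server; the logger name my_module is only a placeholder:

import os

# Assumption: the setting must be present in the environment running the flow,
# e.g. exported in the agent's environment or set via
# `prefect config set PREFECT_LOGGING_EXTRA_LOGGERS=my_module`.
os.environ["PREFECT_LOGGING_EXTRA_LOGGERS"] = "my_module"

import logging
from prefect import flow

@flow
def demo():
    # Records from this standard-library logger should now be forwarded
    # into Prefect's flow run logs. "my_module" is just a placeholder name.
    logging.getLogger("my_module").warning("hello from an extra logger")

if __name__ == "__main__":
    demo()
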
Esdras Lopes Nani (11/23/2022, 4:30 PM):
Agent started! Looking for work from queue(s): xxxxx...
Don't know what else I can do 😅
Thanks!

Tim-Oliver (11/23/2022, 4:59 PM):
Permission denied errors with the GitHub storage blocks since today. Has anyone seen this as well? The error appears if the GitHub repo is already present from a previous run. If the repo is deleted, the repo is cloned without any error.

Chris Marchetti [Datateer] (11/23/2022, 6:31 PM):
Unknown Opcode error in our Prefect flow. I looked up the error and saw several issues relating to mismatched Python versions. I have verified that we are using Python 3.8 (3.8.7 and 3.8) for everything. Does anyone have any ideas why this error may be occurring? Other tasks in our flow ran successfully; only one failed.

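One way to confirm which interpreter each environment is actually using is to log it from inside a flow run; a minimal, hypothetical diagnostic (not from the original thread):

import sys
from prefect import flow, get_run_logger

@flow
def report_python_version():
    # Logs the exact interpreter version of the environment executing the flow,
    # which can be compared against the version used to build the deployment.
    get_run_logger().info("Running under Python %s", sys.version)

if __name__ == "__main__":
    report_python_version()
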
Ashley Felber (11/23/2022, 6:40 PM):

Braun Reyes (11/23/2022, 9:01 PM):

Santiago Gonzalez (11/23/2022, 9:45 PM):
• Main class from the jar could not be found
• The output directory does not exist, so it could not be synchronized to AWS S3
Do you have any idea why these types of issues happen from time to time?
BTW: I am using a boto3 SSM agent to handle EC2 instance creation, execution, and termination.

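For reference, a minimal sketch of the kind of boto3/SSM call being described -- running a shell command on an SSM-managed EC2 instance; the instance ID, bucket, and command are placeholders, not details from the thread:

import boto3

ssm = boto3.client("ssm")

# Run a shell command on an already-running, SSM-managed EC2 instance.
response = ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],
    DocumentName="AWS-RunShellScript",
    Parameters={"commands": ["aws s3 sync /tmp/output s3://my-bucket/output"]},
)
command_id = response["Command"]["CommandId"]

# Fetch the result of the command invocation.
# (In practice the invocation may not be visible immediately, so retries are common.)
result = ssm.get_command_invocation(
    CommandId=command_id,
    InstanceId="i-0123456789abcdef0",
)
print(result["Status"])
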
Ryan Sattler (11/24/2022, 4:42 AM):

Deepanshu Aggarwal (11/24/2022, 6:39 AM):

Deepanshu Aggarwal (11/24/2022, 7:02 AM):
07:01:33.898 | ERROR | Task run 'run_executor-a1954751-160' - Crash detected! Execution was interrupted by an unexpected exception: AssertionError

Eden (11/24/2022, 7:16 AM):
With unlimited concurrency it works perfectly fine. However, when I change the concurrency to, for example, 3, it fails to run jobs 😞

Deepanshu Aggarwal (11/24/2022, 8:31 AM):

iñigo (11/24/2022, 8:43 AM):

Deepanshu Aggarwal (11/24/2022, 9:06 AM):

Sylvain Hazard (11/24/2022, 10:10 AM):
run method. It felt like a good way to encapsulate complex tasks and improve code readability. Creating abstract tasks was also something I did sometimes. Is this behavior gone or has it evolved? I couldn't find much in the docs about this, unfortunately.

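For context, the pattern being asked about is the Prefect 1.x style of subclassing Task and implementing its run method; a minimal sketch of that older API (Prefect 2 replaces it with the @task decorator on plain functions):

# Prefect 1.x style -- shown only to illustrate the pattern in question.
from prefect import Flow, Task

class AddOne(Task):
    def run(self, x):
        # Complex task logic lives in the run method.
        return x + 1

with Flow("legacy-style-flow") as flow:
    result = AddOne()(1)
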
Tim Galvin (11/24/2022, 12:59 PM):
Encountered exception during execution:
Traceback (most recent call last):
  File "/software/projects/askaprt/tgalvin/setonix/miniconda3/envs/acesprefect2/lib/python3.9/site-packages/prefect/engine.py", line 612, in orchestrate_flow_run
    waited_for_task_runs = await wait_for_task_runs_and_report_crashes(
  File "/software/projects/askaprt/tgalvin/setonix/miniconda3/envs/acesprefect2/lib/python3.9/site-packages/prefect/engine.py", line 1317, in wait_for_task_runs_and_report_crashes
    if not state.type == StateType.CRASHED:
AttributeError: 'coroutine' object has no attribute 'type'
I am running a known version of my workflow on a known dataset, which has worked perfectly fine dozens of times before. It seems to be saying that the state above is not an Orion model but rather a coroutine. All my tasks are using the normal task decorator around normal non-async Python functions.

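For reference, a minimal sketch of the setup being described -- ordinary synchronous functions wrapped with the task decorator and called from a flow; the names are placeholders, not the original workflow:

from prefect import flow, task

@task
def load_measurement(path: str) -> str:
    # Ordinary synchronous function decorated with @task (no async/await anywhere).
    return f"data from {path}"

@flow
def process_dataset():
    data = load_measurement.submit("known_dataset.ms")
    return data.result()

if __name__ == "__main__":
    process_dataset()
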
Boris Tseytlin (11/24/2022, 4:28 PM):
I call .save on it, but when I try to retrieve it later with load I get a 404 error from Prefect:
ValueError: Unable to find block document named test-minio-url for block type string
import pytest
from prefect.blocks.system import String
from prefect.testing.utilities import prefect_test_harness
from prefect_aws import MinIOCredentials
# Config is the project's own settings object (its import was omitted in the original snippet)

@pytest.fixture(autouse=True, scope="session")
def prefect_test_fixture():
    with prefect_test_harness():
        yield

@pytest.fixture(scope="session")
def minio_blocks(prefect_test_fixture):
    minio_creds_block = MinIOCredentials(
        minio_root_user=Config.MINIO_USER,
        minio_root_password=Config.MINIO_PASSWORD,
    )
    minio_creds_block.save("test-minio-creds")
    minio_url_block = String(value=Config.MINIO_URL)
    minio_url_block.save("test-minio-url")
    return minio_creds_block, minio_url_block

@pytest.fixture
def dummy_mission(minio_blocks):
    minio_creds_block, minio_url_block = minio_blocks
    minio_url = String.load(minio_url_block).value  # <- ERROR HERE
    minio_url = minio_url.split("/")[-1:][0]
    minio_creds = MinIOCredentials.load(minio_creds_block)

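For comparison, blocks are normally loaded back by the name they were saved under rather than by passing the block object itself; a minimal sketch reusing the names from the snippet above:

from prefect.blocks.system import String
from prefect_aws import MinIOCredentials

# Load by the name used in .save(...), not by the block instance.
minio_url = String.load("test-minio-url").value
minio_creds = MinIOCredentials.load("test-minio-creds")
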
Sami Serbey (11/24/2022, 5:23 PM):

redsquare (11/24/2022, 5:46 PM):

davzucky (11/24/2022, 11:51 PM):
get_run_logger(), which is set from the context. You can find sample test code in the thread.
The test keeps failing with the error:
prefect.exceptions.MissingContextError: There is no active flow or task run context.

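For context, get_run_logger() only works while a flow or task run is active, so tests usually exercise the task through a flow (optionally under prefect_test_harness); a minimal sketch with placeholder names:

from prefect import flow, task, get_run_logger
from prefect.testing.utilities import prefect_test_harness

@task
def log_something():
    # Requires an active flow/task run context; otherwise MissingContextError is raised.
    get_run_logger().info("hello")

@flow
def wrapper_flow():
    log_something()

def test_log_something():
    # Running the task via a flow provides the run context the logger needs.
    with prefect_test_harness():
        wrapper_flow()
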
wonsun (11/25/2022, 7:21 AM):

Andrei Tulbure (11/25/2022, 7:30 AM):

Zinovev Daniil (11/25/2022, 10:16 AM):

roady (11/25/2022, 10:24 AM):
# Prefect 2.6.9
# Python 3.8
from prefect import flow, task, get_run_logger

@task
def add_one(x):
    if x == 1:
        raise Exception("Raised exception")
    return x + 1

@task
def do_something(dummy):
    get_run_logger().info("Doing something")
    return

@flow
def mapped_flow_not_dependent():
    a = list([0, 2, 3])
    b = add_one.map(a, return_state=True)
    c = add_one.map(b, return_state=True)
    d = do_something.map(a, return_state=True, wait_for=[c])
    print(c)
    print(d)
    return "Flow completes"

if __name__ == "__main__":
    mapped_flow_not_dependent()

One state in c being failed means none of the following do_something tasks run, whereas I would like all of the do_something tasks to run apart from the ones where c is failed. I can get the desired behaviour by linking the tasks explicitly: changing the argument of do_something from a to c (and removing the wait_for kwarg).

Tim-Oliver (11/25/2022, 10:27 AM):

Pekka (11/25/2022, 11:17 AM):
c, how should do_something proceed with a?

roady (11/25/2022, 11:22 AM):

Pekka (11/25/2022, 11:25 AM):

roady (11/25/2022, 11:32 AM):
If I replace d = do_something.map(a, return_state=True, wait_for=[c]) with d = do_something.map(c, return_state=True), then the desired behaviour is retrieved. But I specifically want to be able to enforce dependence for mapped tasks which would otherwise not be dependent.

Khuyen Tran (11/30/2022, 5:15 PM):
c as an argument of do_something. If you want this to be something that Prefect supports, I encourage you to create a GitHub issue for this on the Prefect GitHub page.

roady (12/01/2022, 8:35 AM):

Khuyen Tran (12/01/2022, 3:47 PM):
task.map(), didn't it? I'm not sure if I understood your question.

roady (12/01/2022, 4:25 PM):
1. I expect wait_for to not run downstream tasks if an upstream one does not reach a Completed state, even if the downstream tasks do not take the upstream state as an argument.
2. For mapped tasks which do take an upstream state as an argument, only the corresponding downstream tasks do not run if a given upstream task does not enter a Completed state.
3. For mapped tasks which use wait_for, all of the downstream tasks do not run if any one of the upstream tasks enters a failed state.
Here's an example of the three different cases which I hope will help you understand what I mean:
# Prefect 2.6.9
# Python 3.8
from prefect import flow, task, get_run_logger

@task
def add_one(x):
    if x == 1:
        raise Exception("Raised exception")
    return x + 1

@task
def do_something(dummy):
    get_run_logger().info("Doing something")
    return

@flow
def wait_for_mwe():
    # No mapping, using wait_for but not argument
    a = 1
    b = add_one(a, return_state=True)
    c_1 = do_something(a, return_state=True, wait_for=b)

    # Mapping, using argument
    a = list([1, 2, 3])
    b = add_one.map(a, return_state=True)
    c_2 = do_something.map(b, return_state=True)

    # Mapping, using wait_for but not argument
    a = list([1, 2, 3])
    b = add_one.map(a, return_state=True)
    c_3 = do_something.map(a, return_state=True, wait_for=b)

    return c_1, c_2, c_3

if __name__ == "__main__":
    c_1, c_2, c_3 = wait_for_mwe()

    # State is NotReady
    print("Expecting NotReady state:")
    print(c_1)

    # Two completed states and one NotReady state
    print("Expecting 1 NotReady and 2 Completed:")
    print(c_2)

    # All states are NotReady! :(
    print("Expecting 1 NotReady and 2 Completed:")
    print(c_3)

Anna Geller (12/01/2022, 7:05 PM):
wait_for = [allow_failure(c)], example:
from prefect import task, flow, get_run_logger, allow_failure

@task
def ingest_data():
    return 42

@task
def transform_data(x: int) -> int:
    if True:
        raise ValueError("Non-deterministic error has occurred.")
    else:
        return x * 42

@task
def clean_up_task():
    logger = get_run_logger()
    logger.info("Cleaning up 🧹")

@flow
def allow_flaky_transformation_to_pass():
    data = ingest_data.submit()
    result = transform_data.submit(data)
    clean_up_task.submit(wait_for=[allow_failure(result)])

if __name__ == "__main__":
    allow_flaky_transformation_to_pass()

roady (12/02/2022, 11:07 AM):
allow_failure in my MWE, but it means that all of the mapped downstream tasks run, even if there was a failure in a corresponding upstream task. 😞

Peyton Runyan (12/02/2022, 1:40 PM):

Khuyen Tran (12/02/2022, 4:12 PM):
from prefect import flow, task, get_run_logger

@task
def add_one(x):
    if x == 2:
        raise Exception("Raised exception")
    return x + 1

@task
def add_two(x):
    if x == 2:
        raise Exception("Raised exception")
    return x + 2

@task
def do_something(dummy):
    get_run_logger().info("Doing something")
    return

# Option 1: submit downstream tasks only for items whose upstream future completed
@flow
def mapped_flow_not_dependent(a=[1, 2, 3]):
    b = add_one.map(a)
    c = add_two.map(b)
    d = [
        do_something.submit(item)
        for future, item in zip(c, a)
        if future.wait().type == "COMPLETED"
    ]
    return "Flow completes"

# Option 2: pair each item with its corresponding upstream future via wait_for
@flow
def mapped_flow_not_dependent(a=[1, 2, 3]):
    b = add_one.map(a)
    c = add_two.map(b)
    d = [
        do_something.submit(item, return_state=True, wait_for=future)
        for item, future in zip(a, c)
    ]
    return "Flow completes"

Anna Geller (12/02/2022, 4:16 PM):
from prefect import task, flow

@task
def upstream_task(item):
    if item == "c":
        raise Exception("this upstream task failed")
    return str(item) + "+1"

@task
def downstream_task(item):
    return str(item) + "+2"

@flow
def demo():
    items = ["a", "b", "c", "d"]
    first = upstream_task.map(items)
    downstream_task.map(first)  # runs only for a, b, and d. c is in NotReady state

if __name__ == "__main__":
    demo()

from prefect import flow, task, get_run_logger, allow_failure

@task
def extract():
    return [1, 2, 3]

@task
def add_one(x):
    if x == 2:
        raise Exception("Something is not right")
    return x + 1

@task
def add_two(x):
    return x + 2

@task
def cleanup_task():
    get_run_logger().info("Cleaning up e.g. removing temp Ray cluster")

@flow
def map_with_cleanup_task():
    a = extract()
    b = add_one.map(a)
    c = add_two.map(b)
    cleanup_task.submit(wait_for=[allow_failure(c)])

if __name__ == "__main__":
    map_with_cleanup_task()

roady (12/05/2022, 10:03 AM):