Vipul
04/14/2022, 8:29 PM

Apoorva Desai
04/14/2022, 9:07 PM

Ken Nguyen
04/14/2022, 11:28 PM

Alexander Butler
04/14/2022, 11:55 PM
prefect deployment create says it should create or update a deployment, but it is failing when the deployment already exists (Prefect 2.0)?

Ryan R
04/15/2022, 12:11 AM

Apoorva Desai
04/15/2022, 3:05 AM

Subhajit Roy
04/15/2022, 4:42 AM
from prefect.engine.signals import SKIP
........
........
raise SKIP('Skipping all downstream dependencies.')
With this, all the downstream tasks are being skipped, and I was expecting the flow state to be Skipped as well. Although the downstream tasks are skipped, at the end the flow becomes Successful.
I have two questions around this:
1. Is this expected?
2. If this is expected, what's the remedy? Do I need to explicitly use a state handler on top of it to make the flow end up Skipped eventually?
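For context, here is a minimal Prefect 1.x sketch of the pattern described. In Prefect 1, Skipped is treated as a success by default, which is why the flow ends Successful. The flow-level state handler below is one possible remedy; its name and logic are illustrative, not a confirmed fix:

from prefect import Flow, task
from prefect.engine.signals import SKIP
from prefect.engine.state import Skipped

@task
def gate():
    # Raising SKIP skips this task and, by default, all downstream tasks.
    raise SKIP("Skipping all downstream dependencies.")

@task
def downstream(x):
    return x

def skip_flow_if_all_skipped(flow, old_state, new_state):
    # Hypothetical handler: if the flow "succeeded" but every task state
    # it reports is Skipped, surface the flow itself as Skipped instead.
    if new_state.is_successful() and isinstance(new_state.result, dict):
        states = list(new_state.result.values())
        if states and all(s.is_skipped() for s in states):
            return Skipped("All tasks were skipped.")
    return new_state

with Flow("skip-demo", state_handlers=[skip_flow_if_all_skipped]) as flow:
    downstream(gate())

flow.run()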
Andrey Vinogradov
04/15/2022, 9:14 AM

Brett Naul
04/15/2022, 1:06 PM

Patrick Tan
04/15/2022, 2:09 PM

Pedro Machado
04/15/2022, 2:52 PM

Domenico Di Gangi
04/15/2022, 3:09 PM

Ahmed Ezzat
04/15/2022, 5:29 PM
I'm on the prefect 1.2.0-python3.9 Docker image. Same as https://github.com/PrefectHQ/prefect/issues/3952. For the dev team: https://cloud.prefect.io/bitthebyte/flow-run/b30223e1-5308-48fe-aa0b-9326c6e48860 (this is the stuck workflow). I already tried restarting.

Melqui de Carvalho
04/15/2022, 6:46 PM

Mohan kancherla
04/15/2022, 7:41 PM

sidravic
04/16/2022, 8:59 AM
task_definition_arn with the containers named as flow.
While I'm able to trigger the flows, the flow crashes with the error:
copilot/flow/8d31faa7f1ba File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
copilot/flow/8d31faa7f1ba return _bootstrap._gcd_import(name[level:], package, level)
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 991, in _find_and_load
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 991, in _find_and_load
copilot/flow/8d31faa7f1ba File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
copilot/flow/8d31faa7f1ba ModuleNotFoundError: No module named '/root/'
However, I've ensured the flows folder from my project is on the PYTHONPATH, and I can't entirely figure out what (if anything) cloudpickle is trying to do to access those flows at the time of execution.
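For what it's worth, the shape of that error can be reproduced without Prefect or cloudpickle at all: importlib treats its argument as a dotted module name, so when a file path lands where a module name is expected, everything before the first dot gets looked up as a package. A minimal sketch with an illustrative path:

import importlib

# A file path is not a dotted module name: importlib splits on ".", so the
# segment before the first dot is looked up as a package, failing with
# ModuleNotFoundError: No module named '/root/'
importlib.import_module("/root/.prefect/flows/my-flow.prefect")

That matches the traceback above and suggests a stored flow location (a path under /root/) is being passed where a dotted module path is expected.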
Hash Lin
04/16/2022, 2:28 PM
Failed to load and execute flow run: ModuleNotFoundError("No module named '/Users/xxx/'")
Thanks for helping. 🙇

Blake
04/17/2022, 1:16 AM

MasatoShima
04/17/2022, 7:20 AM
import uuid

# Imports assumed for this Prefect 2.0 beta; module paths may vary by version.
from prefect.blocks.storage import S3StorageBlock
from prefect.client import get_client

s3_storage_block = S3StorageBlock(
    bucket="********",
    profile_name="default",
    region_name="ap-northeast-1",
)

# Run inside an async function (e.g. via asyncio.run).
async with get_client() as client:
    block_id = await client.create_block(
        name="********",
        block=s3_storage_block,
        block_spec_id=uuid.UUID("{12345678-1234-5678-1234-567891234567}"),
    )
Ken Nguyen
04/17/2022, 7:21 PM

Alexander Butler
04/17/2022, 8:52 PM
# Imports assumed for this Prefect 2.0 beta; module paths may vary by version.
from datetime import timedelta

from prefect.deployments import DeploymentSpec
from prefect.flow_runners import DockerFlowRunner
from prefect.orion.schemas.schedules import IntervalSchedule

DeploymentSpec(
    flow_location=str((FLOW_DIR / "salesforce.py").absolute()),
    flow_name="elt-salesforce",
    name="sf-production-elt-job",
    schedule=IntervalSchedule(interval=timedelta(hours=1)),
    tags=["pipeline"],
    flow_runner=DockerFlowRunner(image=f"{IMAGE_REPO}/{DBT_IMAGE}:{TAG}", stream_output=True),
)
It took a while to come to me as a requirement, but I essentially have two steps.
Step one requires Docker image A to do some data pipeline work; step two needs my custom dbt Docker image B to run transforms AFTER step one. So these two dependent tasks constitute one flow, with each step on an independent Docker image.
A flow runner is configured at the deployment level, but I don't see a way to configure it at the task or subflow level. Definitely a key requirement in the current state.
Please help!
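Not an official answer, but one workaround sketch under the current model, where the flow runner is fixed per deployment: split the two steps into two flows, each with its own deployment and image. All names, files, and images below are illustrative:

from prefect.deployments import DeploymentSpec
from prefect.flow_runners import DockerFlowRunner

# Step 1: the data pipeline flow runs on image A.
DeploymentSpec(
    flow_location="extract.py",        # hypothetical flow file
    flow_name="elt-salesforce-extract",
    name="sf-extract",
    flow_runner=DockerFlowRunner(image="image-a:latest", stream_output=True),
)

# Step 2: the dbt transform flow runs on image B, triggered after step 1
# completes (e.g. on its own schedule or by an external trigger).
DeploymentSpec(
    flow_location="transform.py",      # hypothetical flow file
    flow_name="elt-salesforce-transform",
    name="sf-transform",
    flow_runner=DockerFlowRunner(image="image-b:latest", stream_output=True),
)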
Ken Nguyen
04/17/2022, 10:17 PM
test_param = Parameter('test_param', default="default_val")
function(test_param)
This is where I got an AttributeError:
AttributeError: 'Parameter' object has no attribute
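For what it's worth, a minimal Prefect 1.x sketch of the likely cause, assuming function is a plain Python function: at flow-build time a Parameter is a task object, not its runtime value, so attribute access on it raises AttributeError. Wrapping the consumer in @task defers the call until the value is resolved:

from prefect import Flow, Parameter, task

@task
def function(value):
    # At run time this receives the resolved value ("default_val"),
    # not the Parameter task object.
    return value.upper()

with Flow("param-demo") as flow:
    test_param = Parameter("test_param", default="default_val")
    function(test_param)

state = flow.run()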
Stephen Lloyd
04/18/2022, 4:22 AM

Leanna Morinishi
04/18/2022, 6:07 AM
I want to use the any_successful trigger for several upstream tasks. When I write it like below, task_5 doesn't fail, even if task_3 and task_4 fail. Is it because the creation of the list [task_3, task_4] succeeds? How should I write this flow instead? Many thanks!
task_3 = my_task(
    "input1", task_args=dict(name="input1")
).set_upstream(task_1)
task_4 = my_task(
    "input2", task_args=dict(name="input2")
).set_upstream(task_2)
task_5 = my_task(
    "all_inputs",
    task_args=dict(name="all_inputs", trigger=any_successful),
).set_upstream([task_3, task_4])
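Not a confirmed answer, but two things stand out. First, if I'm reading the Prefect 1.x API right, set_upstream returns None, so the chained assignments above would leave task_3 and task_4 bound to None rather than to tasks. Second, passing a list to set_upstream can introduce an intermediate collection task between the trigger and the real upstreams. A sketch that avoids both, with my_task as an illustrative stand-in:

from prefect import Flow, task
from prefect.triggers import any_successful

@task
def my_task(x):
    return x

with Flow("any-successful-demo") as flow:
    task_1 = my_task("a")
    task_2 = my_task("b")
    task_3 = my_task("input1", task_args=dict(name="input1"))
    task_3.set_upstream(task_1)
    task_4 = my_task("input2", task_args=dict(name="input2"))
    task_4.set_upstream(task_2)
    task_5 = my_task(
        "all_inputs",
        task_args=dict(name="all_inputs", trigger=any_successful),
    )
    # Set each upstream individually so the trigger evaluates the real
    # task states instead of an auto-created collection task.
    task_5.set_upstream(task_3)
    task_5.set_upstream(task_4)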
Amir Shur
04/18/2022, 11:19 AM

Matthew Seligson
04/18/2022, 12:44 PM

Xavier Babu
04/18/2022, 3:45 PM

Jack Sundberg
04/18/2022, 4:42 PM
@flow decorator, I'm guessing this will be possible, but I don't see any docs on it.

Dylan
04/18/2022, 6:32 PM

Dylan
04/18/2022, 6:32 PM
[{
    "data": {
        "set_schedule_active": {
            "success": false,
            "__typename": "success_payload"
        }
    }
}]
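For reference, this payload has the shape returned by the Prefect 1 GraphQL API's set_schedule_active mutation. A hedged sketch of producing it from Python; the flow ID is illustrative, and the exact mutation shape is an assumption based on the __typename above:

from prefect.client import Client

client = Client()
# Hypothetical flow ID; "success": false means the schedule
# could not be activated for this flow.
result = client.graphql(
    """
    mutation {
      set_schedule_active(input: { flow_id: "00000000-0000-0000-0000-000000000000" }) {
        success
      }
    }
    """
)
print(result)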