Dominic Pham
11/08/2021, 10:07 PM

Ryan Sattler
11/08/2021, 11:50 PM
Failed to load and execute Flow's environment: ValueError('No flows found in file.')
Why is this?
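In Prefect 1.x this ValueError is raised when the file referenced by the flow's storage is executed and no Flow object is found at module scope. A minimal sketch of a file the loader can find a flow in (the flow and task names here are hypothetical):

from prefect import Flow, task

@task
def say_hello():
    print("hello")

# the storage loader execs this file and collects Flow objects defined at
# module scope; a flow built only inside a function, or behind an
# `if __name__ == "__main__":` guard, is invisible to it
with Flow("my-flow") as flow:
    say_hello()

Ovo Ojameruaye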
11/09/2021, 1:12 AM

Ajit Patel
11/09/2021, 3:16 AM

Ajit Patel
11/09/2021, 3:19 AM

Michael Hadorn
11/09/2021, 7:55 AM

Dan Zhao
11/09/2021, 11:39 AM

Thanh Minh
11/09/2021, 2:03 PM

Thanh Minh
11/09/2021, 2:04 PM

ale
11/09/2021, 2:11 PM

Thanh Minh
11/09/2021, 2:48 PM

Kevin
11/09/2021, 3:15 PM

Noam polak
11/09/2021, 3:52 PM

with Flow(
    flow_name,
    run_config=run_config_provider(flow_name),
    result=results_provider(os.environ["GCP_PROJECT"]),
    storage=storage_provider(flow_name),
) as flow:
    portfolio = fetch_portfolio(portfolio_uuid)
    parser_results, results_log, csv_log, input_row_size = run_parser(
        portfolio, excel_bucket_url, workflow_id
    )
    # ... more tasks ...
    # run the next flow
    with case(parser_results, "completed"):
        start_flow_run = StartFlowRun(
            flow_name=automation_flow_name,
            project_name="project_name",
        )
        start_flow_run(
            upstream_tasks=[updated, input_row_size],
            parameters={
                "portfolio_uuid": portfolio_uuid,
                "input_row_size": input_row_size,
            },
        )
portfolio_uuid is a string, input_row_size is an int.
When I run the flow it fails when attempting to run the new flow.
Here is the error message (in the message thread):
What did I do that was wrong?
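One pattern that may help, assuming Prefect 0.15+: the functional create_flow_run task from prefect.tasks.prefect is itself a task, so runtime results such as input_row_size can be passed into parameters like any other task input. A sketch reusing the names from the snippet above:

from prefect.tasks.prefect import create_flow_run

with case(parser_results, "completed"):
    # create_flow_run is a plain task, so the task result input_row_size
    # is resolved before the child run is created
    child_run_id = create_flow_run(
        flow_name=automation_flow_name,
        project_name="project_name",
        parameters={
            "portfolio_uuid": portfolio_uuid,
            "input_row_size": input_row_size,
        },
        upstream_tasks=[updated],
    )

Prashob Nair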
11/09/2021, 6:42 PM

Tim Enders
11/09/2021, 7:26 PM

Daniel Katz
11/09/2021, 8:17 PM
DaskExecutor and pass a parameter like num_dask_workers, which would be used when initializing the DaskExecutor at run-time. Is it possible to parametrize a flow's Executor properties?
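A Parameter can't feed the executor directly, since the executor is constructed before any tasks (and hence parameters) are resolved. One workaround sketch, assuming script-based storage so this module-level code re-runs in the flow-run job, with a hypothetical NUM_DASK_WORKERS environment variable set on the run config:

import os
from prefect.executors import DaskExecutor

# the executor is built when the flow is loaded from storage at run time,
# so an environment variable can stand in for a runtime parameter
flow.executor = DaskExecutor(
    cluster_kwargs={"n_workers": int(os.environ.get("NUM_DASK_WORKERS", "4"))}
)

matthew dickinson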
11/09/2021, 8:38 PM

from prefect import Flow, task
from foo import postMessage

@task
def postNewMessage(message):
    postMessage(message=message)

with Flow("My Flow") as flow:
    postNewMessage(message="Hello World")
or is there a way I can just use my existing function as a task instead of nesting it?
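If postMessage is an ordinary function, task can wrap it directly instead of decorating a nested wrapper. A sketch, assuming its arguments are plain values:

from prefect import Flow, task
from foo import postMessage

# task() applied to an existing callable returns a Task wrapping it
postNewMessage = task(postMessage)

with Flow("My Flow") as flow:
    postNewMessage(message="Hello World")

Manga Dhatrika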
11/09/2021, 9:00 PM
docker-compose build — it will make Prefect available at localhost:8080, instead of spinning up Prefect locally with the following steps:
pip3 install "prefect[GitHub,kubernetes,Snowflake]"
Make sure you have Docker installed before starting up Prefect Server.
1. Set the backend to Server instead of Cloud: prefect backend server
2. Start Prefect Server: prefect server start
3. Visit http://localhost:8080
Brett Naul
11/10/2021, 12:33 AM

Samil
11/10/2021, 1:18 AM
automation action without going to the UI?

Dan Zhao
11/10/2021, 9:53 AM

Chris Arderne
11/10/2021, 11:29 AM
DockerRun or VertexRun (I'm using prefect from the master branch) — is it possible to specify an image hosted somewhere other than DockerHub (along with secrets to access it)?
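For DockerRun at least, the image argument is just a reference that the agent's Docker daemon pulls, so any registry works as long as the agent host is authenticated against it (e.g. via docker login). A sketch with a hypothetical GCR image:

from prefect.run_configs import DockerRun

# authentication happens on the agent host, not in the run config;
# the image may live in any registry the daemon can pull from
flow.run_config = DockerRun(image="gcr.io/my-project/my-image:latest")

Matic Lubej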
11/10/2021, 1:03 PM

Alec Koumjian
11/10/2021, 2:07 PM

Pierre Monico
11/10/2021, 2:08 PM
README of a flow through a graphql call. I am able to do this by calling flow_group.description, but:
• What is a flow group? Is it all versions of a given flow?
• If so, why does flow_group.name return a uuid? (i.e. how can I simply relate a flow's name to its description?)
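A flow group does collect all versions of a flow, and its description holds the README shown in the UI. A sketch of relating a flow's name to that description, assuming the standard server schema where each flow row links to its flow_group:

from prefect import Client

client = Client()
result = client.graphql(
    """
    query {
      flow(where: { name: { _eq: "my-flow" } }, order_by: { version: desc }, limit: 1) {
        name
        version
        flow_group { id description }
      }
    }
    """
)
print(result)

Ken Nguyen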
11/10/2021, 4:37 PM
model = Parameter('model_name') — I want to announce the user input in Slack via my state handler, but when I do, it displays as <Parameter: model_name> rather than the actual user input. Does anyone have suggestions on how to display the user input?
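The Parameter object itself is being rendered there; the resolved value is available in prefect.context during the run. A sketch of a state handler reading it, with the Slack call left as a logger stand-in:

import prefect

def announce_model(task, old_state, new_state):
    # context["parameters"] holds the user-supplied values once the run
    # starts, unlike the Parameter task, which renders as "<Parameter: model_name>"
    model_name = prefect.context.get("parameters", {}).get("model_name")
    if new_state.is_running():
        prefect.context.get("logger").info(f"Running with model: {model_name}")
    return new_state

Jason Boorn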
11/10/2021, 5:08 PM
val = my_task(), where val is some json thing. I also want to set that task's downstream dependencies, though, which I had been doing with val.set_downstream(othertask) (although I was always a little confused as to why that would work). Well, now it's stopped working. What's the right way to do both of these things?
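One way to express both at call time, sketched with hypothetical task names: pass the data-producing task via upstream_tasks on the downstream call instead of calling set_downstream on the result.

from prefect import Flow, task

@task
def my_task():
    return {"status": "ok"}

@task
def other_task():
    pass

with Flow("ordering") as flow:
    val = my_task()
    # other_task runs after my_task without consuming its result,
    # replacing the val.set_downstream(othertask) pattern
    other_task(upstream_tasks=[val])

Tim Enders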
11/10/2021, 6:24 PM

Theo Platt
11/10/2021, 7:46 PM
AWSClientWait to monitor the status of each of those jobs. Here's the code for that mapped task:
import prefect
from prefect import task
from prefect.tasks.aws.client_waiter import AWSClientWait

@task
def wait_batch(job_id, delay, max_attempts):
    logger = prefect.context.get('logger')
    logger.info(f"Waiting for job to complete: {job_id}")
    waiter = AWSClientWait(
        client='batch',
        waiter_name='JobComplete',
    )
    waiter.run(
        waiter_kwargs={
            'jobs': [job_id],
            'WaiterConfig': {
                'Delay': delay,
                'MaxAttempts': max_attempts,
            },
        },
    )
    logger.info(f"Job complete: {job_id}")
    return job_id
But what we sometimes see is one or more batch jobs failing, which then somehow stops the other jobs from responding to this AWSClientWait call... and so the mapped task keeps running even though all the jobs have either failed or completed. Any ideas?
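A sketch of a fail-fast alternative, not Prefect's AWSClientWait: poll the Batch API directly with boto3 so a FAILED job fails its mapped child immediately rather than leaving the waiter hanging. Names mirror the task above; delay is in seconds:

import time

import boto3
from prefect import task
from prefect.engine.signals import FAIL

@task
def wait_batch_fail_fast(job_id, delay, max_attempts):
    client = boto3.client("batch")
    for _ in range(max_attempts):
        # describe_jobs reports FAILED/SUCCEEDED per job, so each mapped
        # child depends only on its own job's status
        job = client.describe_jobs(jobs=[job_id])["jobs"][0]
        if job["status"] == "FAILED":
            raise FAIL(f"Batch job failed: {job_id}")
        if job["status"] == "SUCCEEDED":
            return job_id
        time.sleep(delay)
    raise FAIL(f"Timed out waiting for job: {job_id}")

Tao Bian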
11/10/2021, 9:24 PM