Matheus Rocha
03/02/2023, 12:42 AM
pip3 install 'prefect[aws]'
WARNING: prefect 2.7.9 does not provide the extra 'aws'
Can anyone help me with this?
Jacob Bedard
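A hedged note on the pip warning above: in Prefect 2, AWS support moved out of the core package's extras into a separate collection package, so 2.7.9 has no 'aws' extra to install. Installing the prefect-aws collection is likely what's wanted here:

```shell
# Prefect 2 ships AWS support as the separate prefect-aws collection,
# not as an extra on the core package. Pin versions as needed.
pip3 install prefect-aws
```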
03/02/2023, 12:56 AM
Samuel Hinton
03/02/2023, 1:06 AM
return 3
) and it runs perfectly… but the flow continues on and keeps checking for the task, acknowledges it's already completed… but never actually terminates? If anyone has any ideas or has seen this before, please let me know! Note: I'm only seeing issues when using the dask task runners. If it all runs in process there's no issue, which leads me to believe there's something configured wrong in my dask cluster. Does anyone know of instructions on what should be configured (ports open, etc) for prefect and dask to play together? EDIT: I think this might actually be an issue with dask workers and ports, and prefect is interpreting some of the weird info coming back in a way that makes you think everything's finished when it might have failed
Muhammad Tariq
03/02/2023, 1:13 AM
deployment_id and flow_id. I even tried to do it through prefect_api - still it throws errors like Service Unavailable when deleting deployment and Cannot find flow when deleting flow. Can someone help me with this?
UI would just say failed to delete deployment
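For the deletion errors above, a hedged sketch of going straight at the Prefect 2 REST API (endpoint shapes as I understand them; verify against your server's interactive /docs page before relying on this). Only the URL helper below is concrete; API_URL, API_KEY, and the IDs are placeholders:

```python
def delete_url(api_url: str, kind: str, obj_id: str) -> str:
    # Build e.g. https://host/api/deployments/<id> or https://host/api/flows/<id>
    return f"{api_url.rstrip('/')}/{kind}/{obj_id}"

# Hedged usage sketch with the requests library (Prefect 2 exposes DELETE
# endpoints for deployments and flows; check your server's /docs):
# import requests
# resp = requests.delete(delete_url(API_URL, "deployments", deployment_id),
#                        headers={"Authorization": f"Bearer {API_KEY}"})
# resp.raise_for_status()
```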
Ethan Veres
03/02/2023, 2:06 AM
Stefan
03/02/2023, 9:59 AM
Lee Mendelowitz
03/02/2023, 2:51 PM
~/.prefect/profiles.toml is configured to connect to Prefect Cloud, but sometimes I just want to run flows locally without cloud tracking.
I was able to figure something out with environment variables, but it seems hacky:
import os

# To disable tracking by Prefect UI, set these environment variables to empty values.
os.environ['PREFECT_API_URL'] = ''
os.environ['PREFECT_API_KEY'] = ''
os.environ['PREFECT_PROFILES_PATH'] = 'dont'  # set to a path that doesn't exist

import prefect
import prefect.settings
from prefect import flow, get_run_logger

@flow
def test_run():
    logger = get_run_logger()
    st = prefect.settings.get_current_settings().dict()
    keys = sorted(st.keys())
    for k in keys:
        v = st[k]
        logger.info(f"{k}: {v}")

# This run won't be tracked by Prefect Cloud
test_run()
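A possibly less hacky alternative to the environment-variable approach above (a sketch, assuming Prefect 2's profile CLI): keep a dedicated profile with no API URL set, and switch to it when you want local, untracked runs:

```shell
# Sketch: a separate profile for local, untracked runs.
# ("local" is an arbitrary profile name.)
prefect profile create local
prefect profile use local
# With no PREFECT_API_URL in the active profile, flows run against
# a local ephemeral API instead of Prefect Cloud.
```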
Rob Hardwick
03/02/2023, 2:57 PM
cache_result_in_memory=False in prefect 2, but does anybody know if there is a way to disable memory caching in prefect 1?
Jai P
03/02/2023, 4:01 PM
RuntimeError: <asyncio.locks.Event object at 0x11447f760 [unset]> is bound to a different event loop
when attempting to run subflows in parallel, but only when running against Prefect Cloud. Details in thread
Jehan Abduljabbar
03/02/2023, 5:42 PM
kasteph
03/02/2023, 7:36 PM
flows module, would it still be best to build a deployment for each of them?
Josh Paulin
03/02/2023, 7:41 PM
Devin Flake
03/02/2023, 9:50 PM
Ofir
03/02/2023, 11:47 PM
POST /model REST API call to the Node.js server, which in turn calls Prefect 2.0 using the REST API to start a new run (of an existing deployment).
The workflow will train the model, run inference on some data and then persist the output to a database.
This is obviously an asynchronous operation that may take a few minutes (or more) to complete.
Assuming the workflow succeeded, I would like to notify the users (those who have Chrome opened to my web app) that something has happened, i.e. training completed.
How should the Node.js app be notified when the flow has finished (either successfully or with a failure)?
Is there a middleware / RabbitMQ / other message queue that the Node.js app can subscribe to, onto which Prefect publishes events?
If not, does Prefect expose other broadcast events? And if not, should I poll periodically from my app and maintain a state diff?
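Re the notification question above: Prefect Cloud automations can reportedly fire webhooks on flow-run state changes, which would push rather than poll; failing that, a poll loop against the flow-run endpoint is a workable fallback. A minimal sketch with the state fetcher injected so the loop itself is testable; the commented usage assumes the Prefect 2 REST API's GET /flow_runs/{id} with a state_type field (verify against your server's /docs):

```python
import time

TERMINAL_STATES = {"COMPLETED", "FAILED", "CRASHED", "CANCELLED"}

def wait_for_terminal_state(get_state, poll_interval=5.0, timeout=600.0, sleep=time.sleep):
    """Poll get_state() until it returns a terminal state name, then return it."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()
        if state in TERMINAL_STATES:
            return state
        sleep(poll_interval)
    raise TimeoutError("flow run did not reach a terminal state in time")

# Hedged usage sketch against the Prefect 2 REST API (API_URL, API_KEY,
# and flow_run_id are placeholders):
# import requests
# def get_state():
#     r = requests.get(f"{API_URL}/flow_runs/{flow_run_id}",
#                      headers={"Authorization": f"Bearer {API_KEY}"})
#     r.raise_for_status()
#     return r.json()["state_type"]
# final = wait_for_terminal_state(get_state)
```

The Node.js server could run the equivalent loop itself, then notify browsers over its existing channel (e.g. WebSocket) once a terminal state comes back.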
Thanks!
Ofir
03/02/2023, 11:53 PM
Mahendra Zadafiya
03/03/2023, 9:42 AMwith Flow("floww_3.5", storage=Azure(container="prefect-flows")) as flow:
input_folder_path = Parameter("input_folder_path", default='demo_path')
utput_folder_path = Parameter("output_folder_path", default='')
.......
if __name__ == "__main__":
client = Client()
client.create_project(project_name="floww_3")
project_path = str(Path.cwd())
flow.run_config = KubernetesRun(.....)
flow.executor = LocalDaskExecutor()
flow.register(add_default_labels=False, project_name="floww_3")
Every time I execute the above code it gets registered properly without any error, and the version is also upgraded.
But if I change the default values of a Parameter, it's not getting updated.
Could someone please help with it?
Slackbot
03/03/2023, 10:16 AM
Tibs
03/03/2023, 10:44 AM
prefect.exceptions.MissingResult: State data is missing. Typically, this occurs when result persistence is disabled and the state has been retrieved from the API.
I am running the same deployment multiple times with different parameters via API call using python requests.
4/5 runs of the deployment failed with this error. They start running and crash halfway.
Prefect 2.8.2, prefect-aws 0.2.4.
My flows are running in ECSTasks.
Any ideas why this happens?
Sebastian Gay
03/03/2023, 12:52 PM
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/prefect/engine.py", line 246, in retrieve_flow_then_begin_flow_run
2023-03-03 13:24:50 flow = await load_flow_from_flow_run(flow_run, client=client)
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/prefect/client.py", line 105, in with_injected_client
2023-03-03 13:24:50 return await fn(*args, **kwargs)
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/prefect/deployments.py", line 73, in load_flow_from_flow_run
2023-03-03 13:24:50 flow = await run_sync_in_worker_thread(import_object, str(import_path))
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/prefect/utilities/asyncutils.py", line 56, in run_sync_in_worker_thread
2023-03-03 13:24:50 return await anyio.to_thread.run_sync(call, cancellable=True)
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/anyio/to_thread.py", line 32, in run_sync
2023-03-03 13:24:50 func, *args, cancellable=cancellable, limiter=limiter
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
2023-03-03 13:24:50 return await future
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/anyio/_backends/_asyncio.py", line 867, in run
2023-03-03 13:24:50 result = context.run(func, *args)
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/prefect/utilities/importtools.py", line 193, in import_object
2023-03-03 13:24:50 module = load_script_as_module(script_path)
2023-03-03 13:24:50 File "/usr/local/lib/python3.7/site-packages/prefect/utilities/importtools.py", line 156, in load_script_as_module
2023-03-03 13:24:50 raise ScriptError(user_exc=exc, path=path) from exc
2023-03-03 13:24:50 prefect.exceptions.ScriptError: Script at '<Flow script location on local machine>' encountered an exception
This seems related to this issue; however, as I am running prefect 2.1.0, DockerFlowRunner
has been deprecated. I tried the COPY $FLOWSCRIPT.py .
before building the docker image as well, but it didn't help, and it isn't a solution that will work for me. The link in there to the s3 volume parameters for the credentials also doesn't seem relevant, as I have already loaded and authorised the storage block using SAS.
The same issue occurs even when I deploy using the CLI according to the prompt in the second to last row of this table.
Could someone please help me get the docker container to look for the flow script in the storage block, where it is successfully uploaded on deployment, rather than at the path of the flow script on the machine that initiates the deployment?
Kyle Austin
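One thing that may help with the storage-block question above (a hedged sketch: flag spellings are from later 2.x CLIs, so check prefect deployment build --help on 2.1.0, and all names are placeholders): let the build command upload the flow to the storage block and record a storage-relative path, instead of baking in the local path:

```shell
# Sketch: build the deployment against the Azure storage block so the
# container pulls the flow from storage, not from the local filesystem.
prefect deployment build ./my_flow.py:my_flow \
    --name my-deployment \
    --storage-block azure/prefect-flows \
    --infra docker-container
prefect deployment apply my_flow-deployment.yaml
```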
03/03/2023, 3:02 PM
Ian
03/03/2023, 3:08 PM
Charlie Henry
03/03/2023, 3:38 PM
Guy Altman
03/03/2023, 4:42 PM
Mike O'Connor
03/03/2023, 5:48 PM
Ofir
03/03/2023, 6:06 PM
David Steiner Sand
03/03/2023, 7:54 PM
Leon Kozlowski
03/03/2023, 8:29 PM
allow_failure utility - is there something similar to invoke only when an upstream task fails?
My use case is a flow running dbt models; I want to run failed models when the original task fails
Albert Wong
03/03/2023, 10:23 PM
Albert Wong
03/04/2023, 1:48 AM
prefect server start --host=0.0.0.0
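A hedged note to go with the command above: binding the server to 0.0.0.0 exposes it on all interfaces, but remote clients still need to be pointed at it. 4200 is Prefect 2's default server port, and the hostname below is a placeholder:

```shell
# On each client machine: point the CLI and flows at the exposed server.
prefect config set PREFECT_API_URL="http://server-host:4200/api"
```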
Michał Augoff
03/04/2023, 2:56 AM