function. I would like to use a Parameter to configure the top-level flow, but it seems parameters are only really meant to be passed into tasks. Is it possible to use the value of a Parameter within a Flow?
The problem seems to be that GitHub storage only clones the single flow file, not the entire project, which causes my import to fail (`ModuleNotFoundError("No module named 'tasks'")`):

```python
from tasks import my_task  # local module living in the same repo
from prefect.storage import GitHub
from prefect import Flow
from prefect.run_configs import ECSRun

storage = GitHub(
    repo="repo",                              # name of repo
    path="path/to/myflow.py",                 # location of flow file in repo
    access_token_secret="GITHUB_ACCESS_KEY",  # name of personal access token secret
)

with Flow(name="foobar", run_config=ECSRun(), storage=storage) as flow:
    my_task()
```

I've seen that there has been some discussion around this issue, but it hasn't really helped me solve it. Is my only option to clone the repo into the custom image that I use for my ECS task? That would mean I'd have to rebuild the image every time I change something in my underlying modules, right?
I was investigating a bit, and it seems that the state contains the results of all tasks when running it locally (`Some reference tasks failed.`), while on the server it is empty (I printed it using the logger). Any idea how to address this?
again for those new changes to take effect? 2) If the above is true, do I need to rerun the tasks again in the UI? 3) Sometimes I inadvertently press run twice and end up with two running processes. Is there any way to stop a process after it has been started? 4) When I delete the workspace to start over, I notice that after I type
prefect deployment create deployment_name
the Python processes are still running, and I have to run
ps aux | grep python | wc -l
(which only counts the matches) and then kill the Python processes manually. Is there any way to ensure that once a workspace is killed, all of its Python processes are killed along with it?
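On the manual-cleanup part of question 4: the `ps aux | grep python | wc -l` pipeline only counts matching processes; `pgrep`/`pkill` both match on the command line and can actually list or signal them (a narrower pattern such as `prefect` would avoid killing unrelated Python processes). A sketch using a harmless `sleep` stand-in instead of real worker processes:

```shell
# pgrep/pkill match against process command lines, like `ps | grep` but
# without the extra grep process, and pkill sends a signal to every match.
sleep 300 &               # stand-in for a lingering python process
pgrep -af "sleep 3[0]0"   # list matches; [0] stops the pattern matching this shell
pkill -f "sleep 3[0]0"    # send SIGTERM to every match
```

Whether orphaned processes can be tied to the workspace lifecycle automatically is a separate question for the Prefect side; the above only covers the manual cleanup.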