Jan Nitschke
04/04/2022, 9:52 AM
from tasks import my_task
from prefect.storage import GitHub
from prefect import Flow
from prefect.run_configs import ECSRun

storage = GitHub(
    repo="repo",                              # name of repo
    path="path/to/myflow.py",                 # location of flow file in repo
    access_token_secret="GITHUB_ACCESS_KEY",  # name of personal access token secret
)

with Flow(name="foobar", run_config=ECSRun(), storage=storage) as flow:
    my_task()
The problem seems to be that the GitHub storage only clones the single flow file, not the entire project, which causes my import to fail (ModuleNotFoundError: No module named 'tasks').
I've seen that there has been some discussion around this issue, but it hasn't really helped me solve it. Is my only option to clone the repo into the custom image that I use for my ECS task? But that would mean I'd have to rebuild that image every time I change something in my underlying modules, right?
Andres
04/04/2022, 11:54 AM
Some reference tasks failed.
I was investigating a bit, and it seems that the state contains the results of all tasks when running locally (state.result), while on the server it is empty (I printed it using the logger).
Any idea on how to address this?
Atul Anand
04/04/2022, 12:37 PM
Tom Klein
04/04/2022, 1:16 PM
Shuchita Tripathi
04/04/2022, 1:59 PM
Rajan Subramanian
04/04/2022, 2:08 PM
Joshua Weber
04/04/2022, 2:09 PM
Rajan Subramanian
04/04/2022, 2:44 PMprefect deployment create deployment_name
again for those new changes to take affect?
2) if above is true, then do i need to rerun the tasks again on the UI?
3) sometimes i inadverently press run twice and i have two running processes. Is there anyway to stop a process after it has been started?
4) When I delete the workspace to start over, I notice that when I type
ps aux | grep python | wc -l
the Python processes are still running, and I have to do a
pkill python
to kill all of them. Is there any way that once a workspace is killed, all the Python processes are killed along with it?
Shuchita Tripathi
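On point 4 above: rather than pkill python, which kills every Python process on the host, matching the full command line with pkill -f is a narrower option (a sketch; the agent's exact command-line pattern is an assumption):

```shell
# List candidate processes first (-a shows the full command line);
# "|| true" keeps the script going when nothing matches.
pgrep -af "prefect agent" || true

# Kill only processes whose full command line matches, not all of Python.
pkill -f "prefect agent start" || true
```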
04/04/2022, 2:55 PM
Bernd Grolig
04/04/2022, 3:02 PM
Donnchadh McAuliffe
04/04/2022, 3:52 PM
Dockerfile for the agent:
FROM prefecthq/prefect:2.0b2-python3.9
RUN prefect cloud login --key cloud_api_key
ENTRYPOINT prefect agent start 'queue_id'
However, the second line requires me to specify the workspace. Is there some flag I can add (like --workspace {workspace})? There probably is a much better way to set up an agent; any other docs would be appreciated! Thank you
joshua mclellan
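On the agent Dockerfile question above: instead of running prefect cloud login at build time, Prefect 2 agents can usually be configured through environment variables, which also avoids baking the key into an image layer. A sketch (the URL, key, and queue values are placeholders; the workspace is selected by the workspace path inside PREFECT_API_URL, so no login-time flag is needed):

```dockerfile
FROM prefecthq/prefect:2.0b2-python3.9

# Supply credentials at runtime (e.g. `docker run -e ...`) rather than
# hard-coding them; shown as ENV here only for illustration.
ENV PREFECT_API_URL=https://api.prefect.cloud/api/accounts/<account_id>/workspaces/<workspace_id>
ENV PREFECT_API_KEY=<cloud_api_key>

ENTRYPOINT ["prefect", "agent", "start", "<queue_id>"]
```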
04/04/2022, 5:13 PM
Madison Schott
04/04/2022, 5:21 PM
Atul Anand
04/04/2022, 7:09 PM
Chris Reuter
04/04/2022, 7:36 PM
Ethan Veres
04/04/2022, 8:56 PM
Shiyu Gan
04/05/2022, 2:24 AM
Eddie Atkinson
04/05/2022, 3:39 AM
Shiyu Gan
04/05/2022, 4:10 AM
Jonathan Mathews
04/05/2022, 8:43 AM
Muhammad Daniyal
04/05/2022, 10:36 AM
def fun1(): <some code here>
def fun2(): <some code here>
def fun3(): <some code here>
def fun4(): <some code here>
def fun5(): <some code here>
def fun6(): <some code here>

def execute_flow():
    @task
    def t1():
        fun1()

    @task
    def t2():
        fun2()

    @task
    def t3():
        fun3()

    @task
    def t4():
        fun4()

    @task
    def t5():
        fun5()

    @task
    def t6():
        fun6()

    with Flow('my flow') as f:
        a = t1()
        b = t2()
        t3(a)
        t4(b)
        c = t5()
        d = t6(c)

    output = f.run()
    result = output.result[d]._result.value
    return result
The expected behaviour of the flow would be t1 -> t2 -> t3 -> t4 -> t5 -> t6,
but it is not working this way.
Instead, this is what is happening:
t1 -> t5 {fails but shows 'success'} -> t6 {fails since the result from t5 is not appropriate} -> t2 -> t3 {shows a success message without even getting halfway through the function} -> t4 {same result as t3}
Except for t5 and t6, every task is time-consuming.
Hawkar Mahmod
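On the ordering question above: in Prefect 1.x, tasks inside a with Flow(...) block run in dependency order, not in the order they are called; t1, t2, and t5 have no upstream links, so the scheduler is free to interleave them. Also, a task is only marked Failed if it raises, so functions that swallow exceptions will show 'success'. A sketch of forcing the intended chain with explicit upstream_tasks (Prefect 1.x, not tested here; data passed between tasks would additionally need the task functions to accept parameters):

```python
# Sketch (Prefect 1.x): explicit upstream links force the order
# t1 -> t2 -> t3 -> t4 -> t5 -> t6 instead of leaving independent
# tasks free to run in any order.
with Flow("my flow") as f:
    a = t1()
    b = t2(upstream_tasks=[a])
    c3 = t3(upstream_tasks=[b])
    c4 = t4(upstream_tasks=[c3])
    c = t5(upstream_tasks=[c4])
    d = t6(upstream_tasks=[c])
```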
04/05/2022, 10:54 AM
Unexpected error: TypeError("can't pickle sqlalchemy.cprocessors.UnicodeResultProcessor objects")
Christian
04/05/2022, 10:59 AM
Jonathan Mathews
04/05/2022, 11:00 AM
prefect auth login --key <replaced with my key>
I get the following error:
Unauthorized. Invalid Prefect Cloud API key.
The Cloud UI says my key has been created.
Patrick Koch
04/05/2022, 2:35 PM└── 14:27:40 | INFO | Entered state <Failed>: Failed to load and execute Flow's environment: ModuleNotFoundError("No module named '/home/prefect-flow-example'")
Flow run failed!
The file "prefect-flow-example.py" is available at my local host, I've set the proper ENV variables for the connection to the Prefect-Server.
Do I need to define some Storage Options before? An Azure subscription would be available.
I'm curious about what you would suggest as best practice 🙂
Thank you a lot in advance!
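One possible angle on the error above (a sketch, not from the thread): in Prefect 1.x, Local storage treats the flow as an importable module by default, which can produce exactly this kind of ModuleNotFoundError when a file path is handed over; stored_as_script tells it to load the file as a script instead. Azure storage would be the alternative if the file should not live on the agent's host.

```python
# Sketch (Prefect 1.x, untested here): load the flow file as a script
# rather than importing it as a module.
from prefect.storage import Local

flow.storage = Local(
    path="/home/prefect-flow-example.py",  # path from the error message
    stored_as_script=True,
)
```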
Patrick
Matthew Roeschke
04/05/2022, 3:10 PM
prefect get flows
CLI command?
Matt Brown
04/05/2022, 3:25 PM
Ken Nguyen
04/05/2022, 3:43 PM
schema
within the profiles.yml), and have the dbt task within that flow output to a schema of the user's choice.
Jonathan Mathews
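On parameterizing the dbt schema above: one common pattern (a sketch; the profile name, adapter, and env-var names are assumptions) is to read the target schema from an environment variable in profiles.yml via dbt's env_var jinja function, so the flow can set it per run before invoking dbt:

```yaml
# profiles.yml sketch: the schema comes from DBT_SCHEMA, with a fallback.
my_profile:
  target: default
  outputs:
    default:
      type: postgres
      host: localhost
      user: dbt_user
      password: "{{ env_var('DBT_PASSWORD', '') }}"
      port: 5432
      dbname: analytics
      schema: "{{ env_var('DBT_SCHEMA', 'public') }}"
      threads: 4
```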
04/05/2022, 3:50 PM
DbtShellTask
because dbt wouldn't have been installed). It's coming back with an error where I need to specify an execution_role_arn in the ECSRun, which I've now done, but can I use the previously defined role, or do I need to define something else? Thanks!
Rhys Mansal
04/05/2022, 6:24 PM
Rhys Mansal
04/05/2022, 6:24 PM
Kevin Kho
04/05/2022, 6:31 PM
scheduled_start_time
from the context
Rhys Mansal
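The context lookup mentioned above can be sketched like this (Prefect 1.x, untested here; the task name is hypothetical):

```python
from prefect import task, context

@task
def log_start_time():
    # scheduled_start_time is populated in the run context at execution time
    scheduled = context.get("scheduled_start_time")
    context.get("logger").info("Scheduled for: %s", scheduled)
```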
04/05/2022, 7:02 PM