Ben Collier07/05/2021, 10:08 AM
Evgenii07/05/2021, 3:55 PM
Ben Muller07/05/2021, 10:37 PM
wiretrack07/05/2021, 11:04 PM
. Tried every variation for apollo: `prefect-apollo-service` (my svc name),
[Errno -2] Name or service not known
address, can’t make it work. Any ideas on what I should be doing to get the agent to talk to apollo? If I remove the
I get a different error:
No connection adapters were found for 'prefect-apollo-service:4200/graphql'
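That second error is the exact message `requests` raises when a URL has no scheme, so the fix may be as simple as prefixing `http://` on the apollo endpoint. A minimal repro of the error shape (the hostname is a placeholder and is never actually resolved):

```python
import requests

# Without a scheme, "prefect-apollo-service" is parsed as the scheme itself,
# and requests raises InvalidSchema("No connection adapters were found ...")
# before any network I/O happens.
try:
    requests.get("prefect-apollo-service:4200/graphql")
except requests.exceptions.InvalidSchema as exc:
    print(exc)

# With a scheme the URL at least parses; the request then only fails if the
# host is genuinely unreachable:
# requests.get("http://prefect-apollo-service:4200/graphql")
```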
Laura Vaida07/06/2021, 7:48 AM
@task(log_stdout=True)
def write_order_data(dataframe):
    current_date = dt.today().strftime("%Y_%m_%d")
    # GCS_Result = GCSResult(bucket='uwg-mail', location='orders_import_sf' + '_' + current_date + '.csv')
    dataframe.to_csv('gs://uwg-mail/orders_import_sf.csv', dataframe, header=True)
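For reference: `DataFrame.to_csv` takes only the target path positionally (its second positional parameter is `sep`), so passing the frame again as a second argument is likely the bug here. A minimal local sketch with placeholder data (writing to a `gs://` path additionally requires `gcsfs`):

```python
import pandas as pd

# Placeholder frame standing in for the real orders data.
df = pd.DataFrame({"order_id": [1, 2], "total": [9.99, 4.50]})

# The path is the only positional argument; the frame is not passed again.
df.to_csv("orders_import_sf.csv", header=True, index=False)

print(open("orders_import_sf.csv").read())
```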
Gabriel Santos07/06/2021, 4:20 PM
Did I do something wrong?
Failed to load and execute Flow's environment: ModuleNotFoundError("No module named '/app/'
Joseph Loss07/06/2021, 5:34 PM
Madison Schott07/06/2021, 7:53 PM
fatal: Not a dbt project (or any of the parent directories). Missing dbt_project.yml file
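That dbt error just means dbt was invoked outside a directory containing `dbt_project.yml` (it also searches parent directories), so running from the project root, or creating the file, resolves it. A minimal sketch of the file, with placeholder names:

```yaml
# dbt_project.yml (placeholder values)
name: my_project
version: "1.0.0"
config-version: 2
profile: my_profile
```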
wiretrack07/06/2021, 8:25 PM
Suchindra07/06/2021, 8:26 PM
Zach Schumacher07/06/2021, 8:27 PM
isn’t the default executor, if dask is a dep of prefect anyways.
Samuel Kohlleffel07/06/2021, 9:07 PM
is there any way to set
so the files in the container are overwritten with the new uploaded files when my flow runs?
Peyton Murray07/06/2021, 11:45 PM
What's the right way to structure this to specify
import prefect as pf
from prefect.engine.results import LocalResult

@pf.task(checkpoint=True, result=LocalResult(dir=path_to_result))
def my_task(a, b, c):
    return do_stuff(a, b, c)

with pf.Flow('my flow') as flow:
    my_task(1, 2, 'foo')  # <--- I want to be able to specify path_to_result here

flow.run()
at the indicated location?
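One generic way to bind the result directory at flow-assembly time is a task factory. The sketch below uses a hypothetical stand-in decorator rather than Prefect's API, since it is the closure pattern, not any particular library call, that is being illustrated:

```python
import functools
import os
import pickle

def checkpointed(result_dir):
    """Hypothetical stand-in for @pf.task(checkpoint=True, result=LocalResult(dir=result_dir))."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            out = fn(*args, **kwargs)
            # Persist the result where the factory said to, mimicking LocalResult.
            with open(os.path.join(result_dir, fn.__name__ + ".pkl"), "wb") as f:
                pickle.dump(out, f)
            return out
        return wrapper
    return decorator

def make_task(path_to_result):
    # The directory is bound when the task is built, i.e. where the flow is assembled.
    @checkpointed(path_to_result)
    def my_task(a, b, c):
        return (a, b, c)
    return my_task
```

Inside the `with pf.Flow(...)` block you would then call `make_task(some_dir)(1, 2, 'foo')` instead of a module-level task.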
matta07/07/2021, 12:54 AM
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/.prefect/auth.toml'
Brad I07/07/2021, 1:12 AM
. It works if I set the key in both variables, is this expected?
failed to authenticate, missing token
env:
  - name: PREFECT__CLOUD__AGENT__AUTH_TOKEN
    value: XXXXXXXX
  - name: PREFECT__CLOUD__API
    value: https://api.prefect.io
  - name: PREFECT__BACKEND
    value: cloud
  - name: PREFECT__CLOUD__API_KEY
    value: XXXXXXXX
  - name: PREFECT__CLOUD__TENANT_ID
    value: TTTTTTTT
image: prefecthq/prefect:0.15.0-python3.7
Ben Muller07/07/2021, 2:58 AM
Sean True07/07/2021, 8:31 AM
Krzysztof Nawara07/07/2021, 10:52 AM
Ben Collier07/07/2021, 10:58 AM
nib07/07/2021, 12:14 PM
) and error details. But it’s empty and I can get only “Some reference tasks failed.” from
. Is it possible to extract this kind of detail?
Madison Schott07/07/2021, 2:30 PM
Also is it ok if I just have the
defined before this with the parameters needed?
user_profile_w_campaign = Flow("User Profile with Campaign")
user_profile_w_campaign.set_dependencies(
    task=dbt_task,
    upstream_tasks=[FivetranSyncTask()]
)
user_profile_w_campaign.run()
Mike Wochner07/07/2021, 3:36 PM
with Flow('Example') as flow:
    today_date = datetime.date.today().strftime("%Y-%m-%d")
    data = extract_data(security_list, today_date)
    load_data(data)
    ...
    more_data = extract_more_data(security_list)
    load_more_data(more_data, today_date)
Amit07/07/2021, 5:17 PM
ale07/07/2021, 5:33 PM
Madison Schott07/07/2021, 6:09 PM
wiretrack07/07/2021, 8:40 PM
), I was wondering if the large amount of rows on other tables will start to get in the way of the frontend performance (and hasura’s, and apollo’s) . Putting
in mongodb or something should completely solve the challenge (not really sure if it’s really a challenge), but it seems that this would be a huge change, since the code is really tightly coupled. I was wondering how you guys see scalability on the server, and I’m curious about what approaches the cloud version uses to overcome potential scalability issues in the long term.
Aric Huang07/07/2021, 9:55 PM
error running the flow, when using the
Failed to load and execute Flow's environment: FileNotFoundError(2, 'No such file or directory')
option for KubernetesRun. I can successfully register the flow, and when running the flow it seems to be respecting the
I passed in (I see my Kubernetes cluster running an appropriate pod based on my template) - but after pulling the image I get the
. Thoughts on what is going on? My flow basically looks like this:
with Flow("Test") as test_flow:
    ...

test_flow.run_config = KubernetesRun(
    job_template_path="template.yml"
)
test_flow.storage = GitHub(
    repo="<path>",
    path="flows/test_flow.py",
    access_token_secret="GITHUB_ACCESS_TOKEN"
)
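One thing worth checking: with GitHub storage only the flow file is pulled at run time, so `template.yml` may simply not exist in the job's working directory, which would explain that FileNotFoundError. If your Prefect version's `KubernetesRun` accepts a `job_template` dict, inlining the template at registration sidesteps the path entirely. A sketch, where the template body is a placeholder:

```python
import yaml

# Placeholder standing in for the contents of template.yml.
template_text = """
apiVersion: batch/v1
kind: Job
spec:
  template:
    spec:
      containers:
        - name: flow
"""
job_template = yaml.safe_load(template_text)

# Hypothetical usage, assuming KubernetesRun(job_template=...) is supported
# by your Prefect version:
# test_flow.run_config = KubernetesRun(job_template=job_template)
print(job_template["kind"])
```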
Joseph Loss07/07/2021, 10:02 PM
I had registered the flow and previously used the flow on LocalRun, now all of a sudden it's failing but it was working a few hours ago?
Flow run 478adfa1-5f4f-4dec-a121-a2443bc0a253 has a `run_config` of type `LocalRun`, only `DockerRun` is supported