Nikolaus Landgraf
07/19/2022, 7:15 AM
2.0b8
we have experienced 2 things:
• performance of single flows became around 2x slower (we are using the Sequential task runner), and we have a lot of database operations in our tasks
• Some flows get interrupted by a 403 Forbidden error and do not pick up from where they left off. Is there a way of increasing the lifetime of the token?
Mike Geeves
07/19/2022, 7:57 AM
Vlad Tudor
07/19/2022, 10:11 AM
prefect server start
I get this error:
OSError: [Errno 8] Exec format error: 'docker-compose'
I suspect it might be a docker-compose version issue, since the latest version is invoked by running docker compose (without the -). Should I downgrade? (Currently Docker Compose version v2.6.0)
xyzz
07/19/2022, 10:22 AM
Riccardo Tesselli
07/19/2022, 1:16 PM
Matthew Seligson
07/19/2022, 2:48 PM
Yiwei Hou
07/19/2022, 3:06 PM
Michael Reynolds
07/19/2022, 3:43 PM
Will the checkpoint = False flag be introduced into Orion / Prefect 2.0?
Alvaro Durán Tovar
07/19/2022, 3:47 PM
prefect.exceptions.ClientError: [{'path': ['create_project'], 'message': 'Uniqueness violation.', 'extensions': {'code': 'INTERNAL_SERVER_ERROR'}}]
Michelle Brochmann
07/19/2022, 4:50 PM
Michael Reynolds
07/19/2022, 5:32 PM
Amogh Kulkarni
07/19/2022, 5:48 PM
Chu
07/19/2022, 7:35 PM
with Flow as flow:
for i in id_list:
dbt_run_function(i)
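A note on the loop above: in Prefect 1.x, task mapping (e.g. dbt_run_function.map(id_list)) is the idiomatic way to fan a task out across inputs so a parallel executor can run them concurrently. As a library-free sketch of the same fan-out pattern using only the standard library (dbt_run_function here is a stand-in, not the real dbt task):

```python
from concurrent.futures import ThreadPoolExecutor

def dbt_run_function(i):
    # stand-in for the real dbt-run task; just echoes its input
    return f"ran {i}"

id_list = ["model_a", "model_b", "model_c"]

# fan the calls out concurrently instead of looping sequentially;
# pool.map preserves the input order of id_list
with ThreadPoolExecutor() as pool:
    results = list(pool.map(dbt_run_function, id_list))
# results == ["ran model_a", "ran model_b", "ran model_c"]
```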
(I’m wondering if a simple for loop would achieve parallelism?)
Sebastián Montoya Tapia
07/19/2022, 8:38 PM
Seth Goodman
07/19/2022, 9:32 PM
Ilya Galperin
07/19/2022, 9:50 PM
list_sum
function in the flow. Is there a configuration I might be missing that enables this, or has there been any progress in getting the output of DaskExecutor logs into Prefect Cloud?
Matthew Seligson
07/19/2022, 10:14 PM
Maikel Penz
07/20/2022, 2:22 AM
Emma Rizzi
07/20/2022, 7:22 AM
Riccardo Tesselli
07/20/2022, 8:08 AM
config = CustomConfig.load('my_block')
Deployment(
name="My deployment",
flow=my_flow,
parameters={
"password": config.password
}
)
when I run this command from CLI
prefect deployment create my_deployment.py
I get this error
AttributeError: 'coroutine' object has no attribute 'password'
Failed to load deployments from 'my_deployment.py'
sys:1: RuntimeWarning: coroutine 'Block.load' was never awaited
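The RuntimeWarning above is the clue: in Prefect 2, Block.load is async under the hood, so in some contexts it hands back a coroutine that must be awaited before its attributes exist. A library-free sketch of the pattern (FakeBlock is an illustrative stand-in, not Prefect's implementation):

```python
import asyncio

class FakeBlock:
    # stand-in for a Prefect Block subclass; illustrative only
    def __init__(self, password):
        self.password = password

    @classmethod
    async def load(cls, name):
        # pretend to fetch the stored block document by name
        return cls(password=f"secret-for-{name}")

async def build():
    # await before touching .password; calling .password on the raw
    # coroutine is what raises the AttributeError above
    config = await FakeBlock.load("my_block")
    return config.password

password = asyncio.run(build())
# password == "secret-for-my_block"
```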
How can I do that?
xyzz
07/20/2022, 8:49 AM
fs = RemoteFileSystem(basepath="s3://my-bucket/folder/")
fs.write_path("foo", b"hello")
fs.save("dev-s3")
What does fs.write_path do? Store actual content on the fs?
Also, is the secret key stored in the Prefect database if you pass it to the settings parameter of RemoteFileSystem and then call fs.save? If so, how is it ensured that only I can read it and not e.g. an admin at Prefect?
Andreas Nigg
07/20/2022, 10:54 AM
Isara Ovin
07/20/2022, 11:30 AM
add_json_index() missing 1 required positional argument: 'output'
output = filter_empty_responses(data)
# Parsing Jsons
indexed_output = add_json_index(output, upstream_tasks=[filter_empty_responses])
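One possible reading of the traceback above (an assumption, since the task's definition isn't shown): add_json_index declares a second positional parameter named output that the call never supplies, while upstream_tasks is consumed by Prefect rather than forwarded to the function. A library-free sketch of that mismatch:

```python
# hypothetical signature mirroring the error message, not the user's code
def add_json_index(data, output):
    # append an index field to each row, collecting into `output`
    output.extend({"index": i, **row} for i, row in enumerate(data))
    return output

rows = [{"a": 1}, {"a": 2}]
# add_json_index(rows)  # would raise: missing 1 required positional argument: 'output'
indexed = add_json_index(rows, output=[])
# indexed == [{"index": 0, "a": 1}, {"index": 1, "a": 2}]
```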
David
07/20/2022, 11:35 AM
400 List of found errors: 1.Field: job_spec.worker_pool_specs[0].container_spec.env[12].value; Message: Required field is not set. 2.Field: job_spec.worker_pool_specs[0].container_spec.env[3].value; Message: Required field is not set. 3.Field: job_spec.worker_pool_specs[0].container_spec.env[4].value; Message: Required field is not set. [field_violations {
field: "job_spec.worker_pool_specs[0].container_spec.env[12].value"
description: "Required field is not set."
}
field_violations {
field: "job_spec.worker_pool_specs[0].container_spec.env[3].value"
description: "Required field is not set."
}
field_violations {
field: "job_spec.worker_pool_specs[0].container_spec.env[4].value"
description: "Required field is not set."
}
]
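The validation errors above all complain that env entries lack a value, which typically means some environment variables were supplied with None or empty values. One common workaround is to drop unset variables before building the container spec; a library-free sketch (the dict-of-env-entries shape mirrors the error output, not Prefect's actual code):

```python
env = {"GOOD": "1", "UNSET": None, "EMPTY": ""}

# keep only variables that actually have a value, so no env[...].value
# field is left unset in the resulting container spec
container_env = [{"name": k, "value": v} for k, v in env.items() if v]
# container_env == [{"name": "GOOD", "value": "1"}]
```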
Chu
07/20/2022, 12:56 PM
Matthew Seligson
07/20/2022, 1:35 PM
haris khan
07/20/2022, 2:00 PM
Matt Delacour
07/20/2022, 2:37 PM
LocalDaskExecutor
and I cannot understand the problem.
I am trying to run parallel calls for Rest API endpoints. The problem is that the Dask executor will run some calls twice for no apparent reason. Here is the logic I have
Then I can see that some Dask tasks run on the same endpoint, while when I log all the endpoints, they are all unique...
I will post the logic as a snippet in the thread 🧵
Denys Volokh
07/20/2022, 2:39 PM
PREFECT__CONTEXT__SECRETS_GITHUB_ACCESS_TOKEN
Registered flow with GitHubStorage
flow.storage = GitHub(
repo="company/prefect-workflows",
path="flows/benchmarks/flow_import_index_data.py",
ref="master",
access_token_secret="GITHUB_ACCESS_TOKEN",
)
but when I run the flow from cloud.prefect.com I get this error
Failed to load and execute flow run: ValueError('Local Secret "GITHUB_ACCESS_TOKEN" was not found.')
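A likely cause, if memory serves: Prefect 1.x maps environment variables onto config keys with a double underscore between levels, so the secret name needs SECRETS__ (two underscores) rather than SECRETS_, and it has to be set in the environment where the flow actually runs (the agent), not just where it was registered. A sketch, with a placeholder token value:

```shell
# note the double underscore after SECRETS; set this where the agent runs
export PREFECT__CONTEXT__SECRETS__GITHUB_ACCESS_TOKEN="<token>"
# then start the agent in that same environment, e.g.:
# prefect agent local start
```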
alex
07/20/2022, 2:46 PM