Constantino Schillebeeckx
06/23/2022, 2:34 PM
rectalogic
06/23/2022, 3:13 PM
Scott Aefsky
06/23/2022, 5:27 PM
capture_ID_batch, all_capture_metadata = get_unprocessed_capture_ID_batches(snowflake_tbl, experiment_id=10, batch_size=500)
capture_loc_map = get_locations_for_captures(all_capture_metadata, s3_bucket)
# read the files from local disk given IDs, process them, and record the processed JSONs
process_batch.map(capture_ID_batch, unmapped(snowflake_tbl), unmapped(capture_loc_map), unmapped(intrazone_extractor), unmapped(dynamic_zone_finder))
The Prefect schematic shows that my first task `get_unprocessed_capture_ID_batches` is getting duplicated, and my flow eventually fails because of a Dask scheduler timeout.
The last task starts running, but won't complete because the timeout kills it. I'm trying to understand a few things:
• Why is my first task showing up as if it's a mapped task?
• Why is the base node of that task not completing if each of the 2 instances of it are complete?
• Why is the scheduler timing out? If I run this flow with a small amount of data, it runs fine, but with a larger dataset I get this failure.
Thanks for any help you can provide!
Nicholas Kan
06/23/2022, 5:29 PM
Javier Ochoa
06/23/2022, 6:48 PM
botocore.exceptions.ClientError: An error occurred (AccessDeniedException) when calling the GetParameter operation: User: arn:aws:sts::999999999999:assumed-role/MyRole-dev/12345459ba45458183ed3d1aa5112341 is not authorized to perform: ssm:GetParameter on resource: arn:aws:ssm:region:999999999999:parameter/XXX/value because no identity-based policy allows the ssm:GetParameter action
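[Editor's note] The error states that no identity-based policy on the assumed role allows `ssm:GetParameter`. A minimal policy statement granting it might look like the sketch below; the `Resource` ARN simply mirrors the one in the error message and would need the real region, account, and parameter path.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "ssm:GetParameter",
      "Resource": "arn:aws:ssm:region:999999999999:parameter/XXX/value"
    }
  ]
}
```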
What is your suggestion here, or a possible solution for this?
Jehan Abduljabbar
06/23/2022, 7:32 PM
Amit Gupta
06/23/2022, 7:40 PM
Guoying Qi
06/23/2022, 8:37 PM
prefect.exceptions.PrefectHTTPStatusError: Client error '400 Bad Request' for url 'https://*******************************'
Response: {'detail': 'Request specified API version 0.5.0 but this server only supports version 0.6.0 and below.'}
For more information check: https://httpstatuses.com/400
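[Editor's note] A sketch of the kind of version-range check behind this 400. The server evidently enforces a minimum supported API version as well, since 0.5.0 is rejected despite being "below" 0.6.0; the bounds used here are illustrative, and the practical fix is to install a local Prefect matching the server's version.

```python
def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '0.5.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def is_supported(requested: str, minimum: str, maximum: str) -> bool:
    """Check whether the requested API version falls inside the server's range."""
    return parse_version(minimum) <= parse_version(requested) <= parse_version(maximum)

# A client speaking API version 0.5.0 against a server requiring at least 0.6.0:
print(is_supported("0.5.0", minimum="0.6.0", maximum="0.6.0"))  # False
print(is_supported("0.6.0", minimum="0.6.0", maximum="0.6.0"))  # True
```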
Which version of Prefect should I install on my local machine? Thanks.
Guoying Qi
06/23/2022, 8:54 PM
{
"detail": "Not Found"
}
Guoying Qi
06/23/2022, 9:02 PM
Guoying Qi
06/23/2022, 9:38 PM
When will Prefect 2.0 be released?
The Prefect 2.0 beta period is expected to last for at least the second quarter of 2022.
Tilé
06/23/2022, 10:26 PM
PrefectSecret class.
The problem is that there is a secret I have to pass to every task. It works properly, but when I try to visualize my flow, the secret shows up as an input to every task, which basically ruins the whole diagram and makes it hard to read. So:
1. Is there any other way to pass my secret to the tasks when running locally or using the agent from the cloud, or
2. Is there any way to remove the secret box from the visualized flow?
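[Editor's note] One workaround, sketched below: instead of passing the secret into every task as an upstream input (which draws an edge to each task in the visualization), read it inside the task body at run time, e.g. from an environment variable. The variable name `MY_API_KEY` is hypothetical.

```python
import os

def load_api_key() -> str:
    """Read the secret inside the task body rather than receiving it as a
    task input, so it never appears as an upstream edge in flow.visualize()."""
    key = os.environ.get("MY_API_KEY")  # hypothetical variable name
    if key is None:
        raise RuntimeError("MY_API_KEY is not set")
    return key
```

The same pattern applies with Prefect 1.x secrets: calling something like `Secret("NAME").get()` inside the task body, rather than wiring a `PrefectSecret` task into the flow, keeps the secret out of the DAG entirely.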
Thanks
Josh
06/23/2022, 11:21 PM
Jeff Kehler
06/24/2022, 4:57 AM
Darren Fleetwood
06/24/2022, 6:07 AM
PREFECT__LOGGING__EXTRA_LOGGERS="['ray']"
As well as this within the flow:
prefect.config.logging.extra_loggers = ['ray']
Plus a few other things, none of which have worked. This is the case for all logging levels (error, info, etc.)
What’s the proper way of doing this?
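[Editor's note] What an "extra loggers" setting ultimately has to accomplish is attaching a handler to the named third-party logger. The effect can be sketched with the stdlib directly; the logger name "ray" comes from the question above, while the handler and format here are placeholders.

```python
import io
import logging

# Attach our own handler to the third-party logger by name, which is the
# general mechanism an "extra loggers" setting relies on under the hood.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(name)s | %(levelname)s | %(message)s"))

ray_logger = logging.getLogger("ray")
ray_logger.setLevel(logging.INFO)
ray_logger.addHandler(handler)

ray_logger.info("worker started")
print(stream.getvalue().strip())  # ray | INFO | worker started
```

If records still do not appear, it is worth checking whether the library disables propagation or resets handlers on its loggers after this configuration runs.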
Thanks!
Balveer Singh
06/24/2022, 6:47 AM
Zheng Cheng
06/24/2022, 9:16 AM
Guoying Qi
06/24/2022, 11:06 AM
$ prefect -v
2.0b7
$ prefect orion kubernetes-manifest
Error: No such command 'kubernetes-manifest'.
$ prefect orion --help
Commands:
  database  Commands for interacting with the database.
  start     Start an Orion server
Do the docs need to be updated?
https://orion-docs.prefect.io/tutorials/kubernetes-flow-runner/
Joshua Greenhalgh
06/24/2022, 11:29 AM
Maverick Humbert
06/24/2022, 11:55 AM
Balveer Singh
06/24/2022, 12:36 PM
Jelle Vegter
06/24/2022, 1:15 PM
Sander
06/24/2022, 1:38 PM
Benjamin Bonhomme
06/24/2022, 2:19 PM
redsquare
06/24/2022, 5:28 PM
redsquare
06/24/2022, 5:29 PM
redsquare
06/24/2022, 5:32 PM
komal azram
06/25/2022, 8:30 AM
import prefect
from prefect import task, Flow
@task
def claims_func():
    airbyte_server_host = "localhost",
    airbyte_server_port = 8000,
    airbyte_api_version = "v1",
    connection_id = conn_id,

with Flow("fhir-flow") as flow:
    claims_func()

flow.run()
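[Editor's note] Two things stand out in the snippet above: the trailing commas make each "configuration" assignment a one-element tuple rather than a scalar, and the task body never actually calls Airbyte, so nothing triggers a sync. The tuple pitfall in isolation:

```python
# A trailing comma turns an assignment into a one-element tuple:
airbyte_server_port = 8000,
print(type(airbyte_server_port))  # <class 'tuple'>
print(airbyte_server_port)        # (8000,)

# Without the comma it is the intended scalar:
airbyte_server_port = 8000
print(type(airbyte_server_port))  # <class 'int'>
```

It is also worth looking at Prefect 1.x's Airbyte task (`AirbyteConnectionTask` in `prefect.tasks.airbyte`), which takes the host, port, API version, and connection ID and actually triggers the sync for that connection.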
As per my understanding, when I run this flow it should trigger the connection and automatically sync data from GCP -> Snowflake. I don't get any error, but no data is synced.
Guoying Qi
06/25/2022, 3:58 PM
prefect config set PREFECT_API_URL="http://prefect.example.com/api"
The flow run states are being stored in the database successfully; you can query them in the database table.
But nothing shows up in the UI, because all the API calls from the UI page still go to the local server: http://127.0.0.1:4200/api/
Zheng Cheng
06/25/2022, 4:01 PM
Zheng Cheng
06/25/2022, 4:01 PM
Michael Adkins
06/25/2022, 4:22 PM
app = create_app()
should get you there
prefect orion start
and visiting localhost/api/docs
and we have all the API documentation hosted in the docs you're currently viewing.
Zheng Cheng
06/25/2022, 4:27 PM
Michael Adkins
06/25/2022, 5:43 PM
Zheng Cheng
06/27/2022, 1:34 PM
from prefect.orion.api.server import create_app
app = create_app()
openapi_doc = app.openapi()
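[Editor's note] The value returned by `app.openapi()` is a plain dict, so it can be made human-readable with the stdlib alone. Sketched here with a small stand-in dict in place of the real schema:

```python
import json

# Stand-in for the dict returned by app.openapi()
openapi_doc = {"openapi": "3.0.2", "info": {"title": "Prefect Orion", "version": "0.1.0"}}

# indent and sort_keys turn the machine-oriented JSON into something readable
readable = json.dumps(openapi_doc, indent=2, sort_keys=True)
print(readable)
```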
Unreadable, but I guess it's just JSON meant for machines.
Michael Adkins
06/27/2022, 3:03 PM
Zheng Cheng
06/27/2022, 3:18 PM
Kevin Kho
06/27/2022, 3:59 PM