Hrm, `BigQueryLoadGoogleCloudStorage` seems to not...
# ask-community
m: Hrm, `BigQueryLoadGoogleCloudStorage` seems to not be inferring my GCP credentials? Not sure how to pass it, either (I can grab it as a `PrefectSecret` fine, and can pass it directly to GCP functions).
k: Hey @matta, how did you store it? `GCP_CREDENTIALS`? What is the error you get? Maybe your Flow is looking for local secrets?
m: Yeah, I grab it from `GCP_CREDENTIALS`. Oh, hrm, maybe I need to use the `GCPSecret` task instead of `PrefectSecret("GCP_CREDENTIALS")`?
k: Try setting `"PREFECT__CLOUD__USE_LOCAL_SECRETS": "false"`? As for `GCPSecret`, I don't think so.
m: No luck. Is there an example lying around of how it's supposed to be called?
k: Just this and this, but no complete example. I think it might be meant to be a local secret for those tasks, but not sure. Could you try it as a local secret on the agent? Set `PREFECT__CLOUD__USE_LOCAL_SECRETS` back to `"true"`.
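For reference, a minimal sketch of how that task might be invoked with a credentials secret in a Prefect 1.x flow; this assumes the GCP tasks accept a `credentials` dict at run time, and the project, dataset, table, and URI values here are placeholders, not from the thread:

from prefect import Flow
from prefect.tasks.gcp import BigQueryLoadGoogleCloudStorage
from prefect.tasks.secrets import PrefectSecret

load_to_bq = BigQueryLoadGoogleCloudStorage()

with Flow("gcs-to-bigquery") as flow:
    # Pulls the service-account info stored under the GCP_CREDENTIALS secret.
    gcp_credentials = PrefectSecret("GCP_CREDENTIALS")
    load_to_bq(
        uri="gs://my-bucket/data.jsonl",  # placeholder GCS URI
        project="my-project",             # placeholder project
        dataset_id="my_dataset",          # placeholder dataset
        table="my_table",                 # placeholder table
        credentials=gcp_credentials,      # assumed run-time kwarg on the GCP tasks
    )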
m: This worked:
from google.cloud import bigquery
from google.oauth2 import service_account
from prefect import task
from prefect.tasks.secrets import PrefectSecret


@task
def storage_to_bigquery(
    secret: PrefectSecret, project: str, dataset_name: str, table: str, file_uri: str
) -> str:
    # At run time, `secret` resolves to the service-account info dict stored in GCP_CREDENTIALS.
    full_dataset_id = f"{project}.{dataset_name}"
    credentials = service_account.Credentials.from_service_account_info(secret)
    bigquery_client = bigquery.Client(credentials=credentials)

    dataset_ref = bigquery.dataset.Dataset.from_string(full_dataset_id)

    # Truncate the destination table and let BigQuery infer the schema from newline-delimited JSON.
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        autodetect=True,
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    )

    load_job = bigquery_client.load_table_from_uri(
        file_uri, dataset_ref.table(table), job_config=job_config,
    )
    print(f"Starting job {load_job.job_id}")
    load_job.result()  # Waits for table load to complete.
    return "Done"