# prefect-community
k
Hi, trying to upload files generated from my extraction step to GCS. I found the examples in `prefect-gcp` to do it, but when I combine this with block information it seems like the info is not in the correct format.
```python
from prefect import flow
from prefect.filesystems import GCS
from prefect_gcp import GcpCredentials
from prefect_gcp.cloud_storage import cloud_storage_upload_blob_from_file

gcs_block = GCS.load("gcs-dev")

@flow()
def example_cloud_storage_upload_blob_from_file_flow():
    gcp_credentials = GcpCredentials(service_account_info=gcs_block.service_account_info)
    test_upload_file = "test_upload.txt"
    blob = cloud_storage_upload_blob_from_file(test_upload_file, gcs_block.bucket_path, "test_upload.txt", gcp_credentials)
    return blob
```
Running this produces the following error:
```
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for GcpCredentials
service_account_info
  JSON object must be str, bytes or bytearray (type=type_error.json)
```
a
It might be that the block is not configured correctly. Could you check if you can use the block, e.g. with a deployment as described here: https://discourse.prefect.io/t/how-to-deploy-prefect-2-0-flows-to-gcp/1251, to confirm the block is working?
k
Deployment files arrived in GCS
Figured out the issue. Looks like you need two things:
1. Call `get_secret_value()` after the `service_account_info` is loaded from the block.
2. Pass that object through `json.dumps()`, because the `GcpCredentials` class is expecting a string.
```python
import json

from prefect import flow
from prefect.filesystems import GCS
from prefect_gcp import GcpCredentials
from prefect_gcp.cloud_storage import cloud_storage_upload_blob_from_file

gcs_block = GCS.load("gcs-dev")

@flow()
def example_cloud_storage_upload_blob_from_file_flow():
    # Unwrap the Secret field, then serialize the dict to the JSON string
    # that GcpCredentials expects
    gcp_credentials = GcpCredentials(
        service_account_info=json.dumps(gcs_block.service_account_info.get_secret_value())
    )
    test_upload_file = "test_upload.txt"
    blob = cloud_storage_upload_blob_from_file(test_upload_file, gcs_block.bucket_path, "test_upload.txt", gcp_credentials)
    return blob
```
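The conversion step can be sketched outside of Prefect with a hypothetical stand-in for the block's Secret field (`FakeSecret` and the sample dict below are illustrative, not Prefect's actual API):

```python
import json

# Hypothetical stand-in for a Prefect Secret field: it hides the value
# and only reveals it via get_secret_value(), like the block attribute above.
class FakeSecret:
    def __init__(self, value):
        self._value = value

    def get_secret_value(self):
        return self._value

# The block stores the service account info as a dict behind the Secret wrapper.
service_account_info = FakeSecret({"type": "service_account", "project_id": "my-project"})

# Step 1: unwrap the secret. Step 2: serialize the dict to a JSON string,
# since a pydantic Json field only accepts str, bytes, or bytearray.
as_string = json.dumps(service_account_info.get_secret_value())

print(type(as_string).__name__)  # str
```

Passing the raw dict (or the Secret wrapper itself) skips the serialization step, which is exactly what triggers the `type_error.json` validation error above.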
a
That makes sense, since that value is a Secret field. Nice work figuring this out!