# ask-community
Hi folks, I'm having issues using the Prefect backend key-value store (please see the exception in the thread).
The task calls
```python
kv_store.set_key_value("filenames", filenames)  # filenames is a list of strings
```
And I get the following error:
```
prefect.utilities.exceptions.ClientError: [{'path': ['set_key_value'], 'message': '[{\'extensions\': {\'path\': \'$.selectionSet.key_value_aggregate\', \'code\': \'validation-failed\'}, \'message\': \'field "key_value_aggregate" not found in type: \\\'query_root\\\'\'}]', 'extensions': {'code': 'INTERNAL_SERVER_ERROR'}}]
```
I fail to reproduce this when running the flow locally.
We are running a DaskExecutor on EKS using Prefect Cloud. Is there any additional config we should set to be able to use the Prefect backend KV store?
We are using Prefect version 0.14.22 both locally and remotely.
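For anyone following along, the call in question can be exercised locally against a minimal in-memory stand-in. This is a sketch only: the function names mirror the `prefect.backend` KV helpers (`set_key_value` / `get_key_value`) as used above, but the dict storage and the JSON-serializability check are illustrative assumptions, not Prefect's actual implementation.

```python
# In-memory stand-in for the Prefect backend KV helpers (sketch; not
# Prefect's real implementation, which talks to the Cloud API).
import json

_KV: dict = {}


def set_key_value(key: str, value) -> str:
    """Store a JSON-serializable value under key; return the key."""
    json.dumps(value)  # mimic the backend's JSON-serializability requirement
    _KV[key] = value
    return key


def get_key_value(key: str):
    """Return the value previously stored under key."""
    return _KV[key]


filenames = ["a.csv", "b.csv"]
set_key_value("filenames", filenames)
assert get_key_value("filenames") == ["a.csv", "b.csv"]
```

This only helps isolate whether the failure is in the flow logic or in the backend call itself; it does not reproduce the auth behavior discussed below.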
Googling this, it seems like an authorization issue (https://stackoverflow.com/questions/56702835/graphqlerror-field-not-found-in-type-query-root-after-merge-schema). Should I be calling client login as part of the flow?
Hmm, running locally without being logged in raises a different error which is much more informative:
```
...
_request(self, method, path, params, server, headers, token, retry_on_api_error)
    466         except JSONDecodeError as exc:
    467             if prefect.config.backend == "cloud" and "Authorization" not in headers:
--> 468                 raise ClientError(
    469                     "Malformed response received from Cloud - please ensure that you "
    470                     "have an API token properly configured."

ClientError: Malformed response received from Cloud - please ensure that you have an API token properly configured.
```
Searching through the Prefect Slack, this thread looks the most relevant: https://prefect-community.slack.com/archives/CL09KU1K7/p1633396064249300
If I understand this correctly, what was missing is the `as_user=False` flag.
Seems like one workaround is to log in with a user-scoped token on the Dask worker, but that doesn't sound ideal.
OK, I replicated this locally by logging out and then using the worker's non-user-scoped token.
This workaround works, but nowhere in the docs does it state that a user-scoped token is required to interact with the KV store. Additionally, for flows executed remotely on Dask workers, one has to pass the user-scoped API key to `Client()` and use GraphQL directly to make this work.
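The workaround above can be sketched as follows. Treat the details as assumptions taken from this thread rather than a verified recipe for 0.14.22: `Client(api_key=...)` is the constructor argument the thread implies, and the environment-variable name and lookup order in the helper are hypothetical.

```python
# Sketch of the workaround: authenticate explicitly on the Dask worker with a
# user-scoped key, then talk to the KV store through the client.
import os


def user_scoped_key(explicit_key=None):
    """Resolve the user-scoped API key to hand to the Prefect Client.

    Hypothetical lookup order for this sketch: an explicit argument,
    then the PREFECT__CLOUD__API_KEY environment variable.
    """
    key = explicit_key or os.environ.get("PREFECT__CLOUD__API_KEY")
    if not key:
        raise RuntimeError("no user-scoped API key available for the KV store")
    return key


# On the worker (requires prefect and a real key, so left commented here):
# from prefect import Client
# client = Client(api_key=user_scoped_key())
# client.graphql(...)  # KV mutations, as described in the thread
```

Distributing a user-scoped key to every worker is exactly the part that "doesn't sound ideal" above, which is why the reply below suggests minting a new key with the right permissions instead.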
Hey @Marwan Sarieddine, sorry for the delayed response. Am at KubeCon this week. I think though that in general, the KV Store requires a key that has the permissions to read and write. The issue is that keys and tokens created before this feature was released don't have those permissions by default. If you create a new key with those permissions and pass it, I think this should work. Does that make sense? Basically, moving forward I don't think that anyone should have to worry about this once they make a new key, unless you have specific RBAC permissions. Do you have custom RBAC roles?