# ask-community
c
When I try to use
from pydantic.v1 import SecretStr
as an input arg type to my flow I'm getting:
pydantic.errors.PydanticUserError: The `__modify_schema__` method is not supported in Pydantic v2. Use `__get_pydantic_json_schema__` instead in class `SecretStr`.
Looks like this use case was called out here, but never implemented. Should I file a bug?
✅ 1
n
hi @Constantino Schillebeeckx - can you confirm your setup? sounds like you're using pydantic 2 but still pydantic 1 objects in some cases. the trace might be long, but could you paste it in a snippet here in this thread?
c
right, that's what I'm doing: I've got Pydantic 2 installed, but I'm using the v1 SecretStr
n
got it, thanks - will look into this! as an aside, in case it's time sensitive,
SecretStr
doesn't actually hide anything (except visually in the UI), so practically speaking you can likely replace that param type with
str
or the name of a
Secret
block to unblock yourself
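The point that SecretStr is only a visual mask can be sketched with a plain-Python stand-in (MaskedStr below is a hypothetical class for illustration, not pydantic's actual implementation):

```python
# Minimal stdlib sketch of why a SecretStr-style wrapper only hides the value
# visually: repr/str are masked, but the raw string is still fully readable,
# so it still travels over the wire in plaintext.
# `MaskedStr` is a hypothetical stand-in, not pydantic's SecretStr.

class MaskedStr:
    def __init__(self, value: str) -> None:
        self._value = value

    def __repr__(self) -> str:
        return "MaskedStr('**********')"

    def __str__(self) -> str:
        # What a UI rendering the parameter would typically show.
        return "**********"

    def get_secret_value(self) -> str:
        # No encryption anywhere -- any code or serializer can read the secret.
        return self._value


token = MaskedStr("dapi-example-token")
print(token)                     # masked in display
print(token.get_secret_value())  # but the raw value is one call away
```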
c
right, that's what I'm trying to avoid right now - I'm showing tokens in the UI and I don't want to do that 🙂
n
my recommendation would be to always avoid passing secrets in plaintext (since
SecretStr
is just a python type, it doesn't stop the secret from going over the wire raw) and instead store your secret in a
Secret
block that is actually encrypted - then you never have to worry about it being in the UI, and you just pass the name of that block as the param. that said, I will look into this typing issue
c
We store our secrets in AWS Parameter Store, so I'd rather not use a Secret block
n
I see - you're passing the token for authing to AWS to the flow?
c
no - prefect is running in ECS which has an associated IAM role which can access the param store
n
sorry, I just mean to ask what the token is for
c
ah - token for databricks to execute a notebook
n
what about using this? then you can pass the name of the aws secret to the flow and load it during the flow. you can leave
AwsCredentials
empty, as it will scoop up the details as boto would if your runtime is already authed
c
I'm a bit confused, I've got no issues reading the token from AWS param store
n
ultimately I'm saying you should avoid passing the secret in plaintext. so instead I'm suggesting you may want to store that databricks token where you store secrets and load it at runtime in the flow
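A sketch of that pattern: pass only the secret's name as a parameter and resolve it at runtime. Here fetch_secret and the in-memory store are hypothetical stand-ins for whatever backend holds the token (AWS Parameter Store via boto3, a Prefect Secret block, etc.):

```python
# Sketch of "pass the secret's name, resolve it at runtime": only the name is
# ever a flow parameter, so the raw token never shows up in the UI.
# `fetch_secret` and `_SECRET_STORE` are hypothetical stand-ins for a real
# lookup (e.g. boto3 SSM get_parameter or a Prefect Secret block).

_SECRET_STORE = {"databricks-token": "dapi-example-token"}

def fetch_secret(name: str) -> str:
    # Real code would call your secret backend here.
    return _SECRET_STORE[name]

def run_notebook_subflow(secret_name: str) -> str:
    # The parameter is just a name; the token is resolved inside the flow.
    token = fetch_secret(secret_name)
    return f"authenticated with a token of length {len(token)}"

print(run_notebook_subflow("databricks-token"))
```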
c
I've got a flow that does that, then starts a subflow with that token to execute a databricks notebook; that subflow takes the secret and shows it as plain text in the UI. I want to change that subflow to accept a
SecretStr
The prefect-databricks integration does something similar: https://github.com/PrefectHQ/prefect/pull/13609/files. There it takes as input a databricks_credentials object which is composed of
SecretStr
n
databricks credentials is a block 🙂 so its entire value, including its fields, is encrypted (if saved). taking a step back, can you say why you have to use
SecretStr
from the
v1
backport in the first place? (if you have v2 installed)
c
I'm calling a subflow with the signature shown in the screenshot: •
JobTaskSettings
is a pydantic v1 object (source) • therefore databricks_token must also be a v1 object
šŸ‘ 1
I know you've got a check that won't allow for mixed pydantic types.
n
I see - you should be able to do
dbricks_flow.with_options(validate_parameters=False)(**kwargs)
right?
c
as a workaround? yeah, sure - that's what I ended up doing in the interim. I understand all of this pydantic version juggling is a bit silly, alas.
n
yep, it is quite annoying - it's a lot nicer in prefect 3.x where there's no more weirdness / conditional imports. here's a PR for 2.x that should fix validation of
v1.SecretStr
in 2.x
🙌 1
c
omg - thank you so much for a quick turn around.