# prefect-community
Hi all, wondering if anyone can point me to an official task for pulling a git repo? I want to use such a task in the context of running dbt. Can I perhaps use something from the underlying GitHub tasks? I don't see an explicit clone task...
There is no separate task because it's a single line of code. Check this blog post if you need an example:

```python
pygit2.clone_repository(url=repo_url, path=DBT_PROJECT, checkout_branch=branch)
```
Ahh ok, I thought maybe there would be something that deals with secrets etc., but this works fine for me!
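For a private repo, one way to combine the one-liner above with a Prefect Secret is to embed a token into the HTTPS clone URL. A minimal sketch; the helper name `authenticated_url` and the `GITHUB_TOKEN` secret name are hypothetical, not from the original thread:

```python
def authenticated_url(repo_url: str, token: str) -> str:
    """Embed an access token (e.g. fetched via PrefectSecret) into an HTTPS
    clone URL: https://github.com/org/repo.git -> https://<token>@github.com/org/repo.git
    """
    scheme, rest = repo_url.split("://", 1)
    return f"{scheme}://{token}@{rest}"

# Hypothetical usage inside a flow (assumes pygit2 and a GITHUB_TOKEN secret):
# token = PrefectSecret("GITHUB_TOKEN").run()
# pygit2.clone_repository(
#     url=authenticated_url(repo_url, token),
#     path=DBT_PROJECT,
#     checkout_branch=branch,
# )
```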
@Anna Geller just coming back to this. Do you have any thoughts on the best way to get the dbt profile set up when running on k8s? Just bake a profiles.yml into my container?
Using OAuth so it just picks up the service account of the pod?
I don't know how to do that with OAuth, but regarding the dbt profile information, you could set it directly on the task, and for the Secrets, you can use PrefectSecret. Maybe this post can help?
Thanks! I was literally just reading that!! It's all a little involved, I feel. I think I need to create a dbt user service account, mount the key to the pod via k8s secrets, then point to the key file...
I kind of just hoped I could rely on the pod's default service account (probably not good practice though).
Does dbt support that? I didn't even know that. What credential does it even store, Snowflake/BigQuery credentials?
You also need to consider that the dbt command runs as a subprocess, so even if you mount it to the underlying pod, I'm not sure whether this would be available to the dbt subprocess. Doing this via args and Prefect Secrets is easier, and I can confirm that it works.
So I would create a profiles.yml which points to a service account key, bake the profiles.yml into the container (nothing really secret there), mount the service account key as a k8s secret, and then pass the profile and profiles_dir to DbtShellTask.
At least that's the plan!
Using Prefect Secrets, how do I then get the service account key into a file so that dbt can read it and authenticate to BQ? Just have a task that writes it to disk?
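The "task that writes it to disk" idea could look like the sketch below: take the service-account JSON string (e.g. returned by PrefectSecret), write it to a keyfile, and point profiles.yml's `keyfile:` entry at the returned path. The helper name `write_keyfile` is hypothetical, and in a real flow this function body would sit inside a Prefect task:

```python
import json
import os
import tempfile


def write_keyfile(service_account_json, directory=None):
    """Write a service-account JSON string (e.g. from a Prefect Secret) to a
    keyfile on disk and return its path, for profiles.yml's `keyfile:` entry."""
    directory = directory or tempfile.mkdtemp()
    path = os.path.join(directory, "bq_keyfile.json")
    # Parse first so an invalid secret fails loudly before anything is written.
    parsed = json.loads(service_account_json)
    with open(path, "w") as f:
        json.dump(parsed, f)
    # Restrict the credential file to the owner only.
    os.chmod(path, 0o600)
    return path
```

Note the caveat above: because dbt runs as a subprocess, the file must be written to a path the dbt subprocess can also read, on the same pod.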
You should have mentioned earlier you are asking about BigQuery 🙂 BQ is indeed tricky as it requires:
```yaml
keyfile: your_service_account_file.json
```
In the worst case, you could bake the file and profiles.yml into your Docker image, but this is not a security best practice. This looks interesting: if you use keyfile_json instead, you could use the same approach with Prefect Secrets as described in the post, without mounting files to containers:
```yaml
[profile name]:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account-json
      project: [GCP project id]
      dataset: [the name of your dbt dataset]
      threads: [1 or more]
      <optional_config>: <value>

      # These fields come from the service account json keyfile
      keyfile_json:
        type: xxx
        project_id: xxx
        private_key_id: xxx
        private_key: xxx
        client_email: xxx
        client_id: xxx
        auth_uri: xxx
        token_uri: xxx
        auth_provider_x509_cert_url: xxx
        client_x509_cert_url: xxx
```
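With the keyfile_json method, the profile fields above can be built in Python from the secret string instead of a mounted file, and then passed to DbtShellTask. A minimal sketch; the helper name `bq_profile_kwargs` is hypothetical, and it assumes DbtShellTask accepts a dict of profile fields via its dbt_kwargs argument as described in the post:

```python
import json


def bq_profile_kwargs(service_account_json, project, dataset, threads=4):
    """Build the BigQuery profile fields shown in the profiles.yml snippet,
    with keyfile_json parsed from a secret string instead of a file on disk."""
    return {
        "type": "bigquery",
        "method": "service-account-json",
        "project": project,
        "dataset": dataset,
        "threads": threads,
        "keyfile_json": json.loads(service_account_json),
    }

# Hypothetical usage (assumes a BQ_SERVICE_ACCOUNT_JSON Prefect Secret):
# dbt_kwargs = bq_profile_kwargs(
#     PrefectSecret("BQ_SERVICE_ACCOUNT_JSON").run(),
#     project="my-gcp-project",
#     dataset="analytics",
# )
```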