Stephen Herron
01/20/2023, 11:49 AM
Raul Maldonado
01/25/2023, 7:47 PM
Farid
02/06/2023, 10:31 PM
ERROR: Runtime Error
Credentials in profile "snowflake_dbt_transformations", target "dev" invalid: 'database' is a required property
Defined profiles:
- snowflake_dbt_transformations
For more information on configuring profiles, please consult the dbt docs:
<https://docs.getdbt.com/docs/configure-your-profile>
Upon investigation, it seems like the dbtCliProfile passed to the dbtTrigger task gets saved to ~/.dbt/profiles.yml first and then used inside the dbtCli shell task. The problem is that the method that saves the dbtCliProfile to a YAML file does not save the TargetConfigs, resulting in an incomplete profile file.
Has anyone had experience with this before and know of any workarounds?
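One possible workaround, sketched below and untested: write a complete profiles.yml yourself before triggering dbt, so the run doesn't depend on the task's serialization. Every credential value here is a placeholder.
import pathlib
import yaml

# Untested workaround sketch: write a complete profiles.yml by hand,
# including the target configs that the task's YAML dump drops.
# All values below are placeholders.
profile = {
    "snowflake_dbt_transformations": {
        "target": "dev",
        "outputs": {
            "dev": {
                "type": "snowflake",
                "account": "<account>",
                "user": "<user>",
                "password": "<password>",
                "role": "<role>",
                "database": "<database>",  # the field dbt reported as missing
                "warehouse": "<warehouse>",
                "schema": "<schema>",
                "threads": 4,
            }
        },
    }
}
profiles_dir = pathlib.Path.home() / ".dbt"
profiles_dir.mkdir(parents=True, exist_ok=True)
(profiles_dir / "profiles.yml").write_text(yaml.safe_dump(profile))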
Donny Flynn
02/09/2023, 5:45 PM
I'm running dbt docs generate, as the Flow Run is outputting:
Catalog written to /opt/prefect/target/catalog.json
I checked S3 and it's definitely not there, and I'm guessing that /opt is a directory tied to the ECS Task's container? Is there a way the dbt CLI command can output the docs files (specifically index.html) into the S3 bucket so we can host our dbt docs as a static site?
Flow code is in the 🧵. I really appreciate any help or pointers 🙂
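One approach, assuming boto3 is available in the flow's environment: upload the generated docs files from the container-local target directory to S3 at the end of the flow. Bucket name and key prefix below are placeholders.
import boto3

# Untested sketch: after `dbt docs generate` writes into the container's
# /opt/prefect/target, push the docs files to S3. Bucket and key prefix
# are placeholders.
s3 = boto3.client("s3")
for filename in ["index.html", "catalog.json", "manifest.json"]:
    s3.upload_file(
        f"/opt/prefect/target/{filename}",
        "my-dbt-docs-bucket",
        f"docs/{filename}",
    )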
Oluremi Akinwale
02/13/2023, 8:24 AM
Jaime Raldua Veuthey
02/15/2023, 4:58 PM
Andrew Huang
02/16/2023, 10:46 PM
Shaun Fender
02/24/2023, 10:59 AM
Marius Vollmer
02/27/2023, 10:32 AM
Shaun Fender
03/01/2023, 9:19 AM
Chris Reuter
03/02/2023, 8:06 PM
Marius Vollmer
03/08/2023, 11:36 AM
RuntimeError: Command failed with exit code 133:
/tmp/prefect-4zqou1u7: line 1: 299284 Trace/breakpoint trap (core dumped) dbt build --select dl_nicc_ecom__order_header --profiles-dir /tmp/dbt --project-dir /tmp/dbt
12:14:43 PM
Finished in state Failed('Flow run encountered an exception. RuntimeError: Command failed with exit code 133:\n/tmp/prefect-4zqou1u7: line 1: 299284 Trace/breakpoint trap (core dumped) dbt build --select dl_nicc_ecom__order_header --profiles-dir /tmp/dbt --project-dir /tmp/dbt\n\n')
Many thanks in advance!
Vincenzo
03/21/2023, 1:32 PM
Stephen Lloyd
04/10/2023, 9:22 AM
When using DbtCoreOperation, we are receiving the following error. It suggests that our dbt CLI profile block, which we have defined in dbt Cloud, does not have a name, target, or target_configs, but it does:
pydantic.error_wrappers.ValidationError: 3 validation errors for DbtCoreOperation
dbt_cli_profile -> name
  field required (type=value_error.missing)
dbt_cli_profile -> target
  field required (type=value_error.missing)
dbt_cli_profile -> target_configs
  field required (type=value_error.missing)
I was able to use the profiles_dir parameter to use my local credentials, but this only verifies there aren't obvious problems with the flow code.
prefect-dbt==0.3.1
prefect-snowflake==0.26.0
Any ideas?
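These missing-field errors typically appear when dbt_cli_profile receives something other than a fully populated DbtCliProfile block. A minimal sketch, with a placeholder block name: load the block first and pass the object itself.
from prefect_dbt.cli import DbtCliProfile, DbtCoreOperation

# Untested sketch: pass the loaded block object, not the block's name string.
dbt_cli_profile = DbtCliProfile.load("MY-PROFILE-BLOCK-PLACEHOLDER")
dbt_operation = DbtCoreOperation(
    commands=["dbt build"],
    dbt_cli_profile=dbt_cli_profile,
    overwrite_profiles=True,  # write this profile out for the run
)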
Stephen Lloyd
04/12/2023, 10:06 AM
Following up on my issue with dbtCoreOperation.
I am running...
with DbtCoreOperation(
    commands=["dbt deps", "dbt build " + dbt_build_arg + " --target " + DBT_ENV],
    project_dir="edw-dbt",
    # working_directory="edw-dbt",
    # overwrite_profiles=True,
    profiles_dir="~/.dbt",
    # dbt_cli_profile=dbt_cli_profile,
    stream_output=True,
) as dbt_operation:
    dbt_process = dbt_operation.trigger()
    dbt_process.wait_for_completion()
    result = dbt_process.fetch_result()
Currently failing with:
15:35:09.705 | INFO | Flow run 'furry-carp' - PID 98611 stream output:
09:50:09 Error importing adapter: No module named 'dbt.adapters.snowflake'
15:35:09.706 | INFO | Flow run 'furry-carp' - PID 98611 stream output:
09:50:09 Encountered an error while reading profiles:
ERROR: Runtime Error
Credentials in profile "edw", target "dev" invalid: Runtime Error
Could not find adapter type snowflake!
Defined profiles:
- edw
For more information on configuring profiles, please consult the dbt docs:
<https://docs.getdbt.com/docs/configure-your-profile>
15:35:09.720 | INFO | Flow run 'furry-carp' - PID 98611 stream output:
09:50:09 Encountered an error:
Runtime Error
Could not run dbt
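The "No module named 'dbt.adapters.snowflake'" line usually means the dbt-snowflake adapter isn't installed in the interpreter that executes the flow, even if it exists elsewhere; installing it there (for example, pip install "prefect-dbt[snowflake]") normally resolves it. A quick check to run in that environment:
# Untested sketch: confirm the snowflake adapter is importable in the exact
# interpreter the flow runs under; an ImportError here means dbt-snowflake
# is missing from this environment.
import dbt.adapters.snowflake  # noqa: F401
print("snowflake adapter available")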
fozan talat
04/13/2023, 5:22 PM
Finished in state Failed('Flow run encountered an exception. RuntimeError: PID 17072 failed with return code 1.\n')
ale
05/08/2023, 4:33 PM
Looking at prefect-dbt, I see the available targets are Snowflake, BigQuery, and Postgres, but not Redshift.
Should I use the Postgres target?
Any hint is much appreciated! 🙌
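For Redshift, one option (a sketch, not verified against prefect-dbt's docs) is the generic TargetConfigs with type set to redshift, passing the adapter's connection fields through extras; all values below are placeholders.
from prefect_dbt.cli.configs import TargetConfigs

# Untested sketch: dbt's Redshift adapter reads these keys from profiles.yml,
# so pass them through the generic TargetConfigs `extras`. Placeholders only.
target_configs = TargetConfigs(
    type="redshift",
    schema="<schema>",
    threads=4,
    extras={
        "host": "<redshift-host>",
        "user": "<user>",
        "password": "<password>",
        "port": 5439,
        "dbname": "<database>",
    },
)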
Ali Marques
05/17/2023, 2:29 PM
Tyler Simpson
05/18/2023, 11:52 PM
Giacomo Chiarella
05/26/2023, 9:38 AM
Tom A
06/08/2023, 2:59 PM
I'm on a 3.10 environment and am hitting some errors. Is there anyone who can take a look and let me know if I'm doing something wrong? Going to put some more info in the thread to not clog up the channel...
Justin
06/24/2023, 7:34 PM
RuntimeError: PID 29636 failed with return code 1.
and with another section saying:
OneDrive\Documents\WindowsPowerShell\profile.ps1 cannot be loaded because running scripts is disabled
on this system. For more information, see about_Execution_Policies at https://go.microsoft.com/fwlink/?LinkID=135170.
My code and more info in this thread:
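On the PowerShell section: Windows is declining to load the profile script because script execution is disabled by policy, and DbtCoreOperation shells out through PowerShell on Windows. Assuming a standard setup, running Set-ExecutionPolicy -Scope CurrentUser RemoteSigned in PowerShell, as described on the linked about_Execution_Policies page, typically clears it.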
06/29/2023, 12:09 AMFile "/usr/lib/python3.10/subprocess.py", line 1845, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'bash'
I'm hitting this error any time that I try to trigger a DbtCoreOperation from a deployment. I'm running the most recent version of the prefect server and task runner, and most recent versions of the prefect_dbt and prefect_airbyte pip packages. This is running on an Ubuntu 22.04 server. It's using the default local file storage for the flow, and a local agent running on the same server as the Prefect server process. I'm hitting the above exception as soon as I try to kick off any DbtCoreOperation, as soon as either "run()" or "trigger()" is called. The workflow runs perfectly when running the script manually, but fails on the dbt stuff as soon as I try running it through a deployment. The Airbyte operation runs fine from both environments, so this is specific to the dbt operations.
Any help would be much appreciated! Code below:
import sys

from prefect import flow
from prefect_dbt.cli import DbtCoreOperation
from prefect_airbyte.server import AirbyteServer
from prefect_airbyte.connections import AirbyteConnection

DBT_PROJECT_DIR = "<path to dbt project>"
DBT_PROFILE_DIR = "<path to dbt project>"
AIRBYTE_CONNECTION = "<connection id>"

airbyte_server = AirbyteServer(server_host="<host>", server_port=<port>)

@flow(log_prints=True)
def dbt_snapshot():
    with DbtCoreOperation(commands=["dbt snapshot"],
                          project_dir=DBT_PROJECT_DIR,
                          profiles_dir=DBT_PROFILE_DIR) as dbt_run:
        dbt_process = dbt_run.trigger()
        dbt_process.wait_for_completion()
        dbt_output = dbt_process.fetch_result()
    return dbt_output

@flow(log_prints=True)
def dbt_model():
    result = DbtCoreOperation(
        commands=["dbt run --select models/*"],
        project_dir=DBT_PROJECT_DIR,
        profiles_dir=DBT_PROFILE_DIR,
    ).run()
    return result

@flow(log_prints=True)
def airbyte_sync():
    connection = AirbyteConnection(
        connection_id=AIRBYTE_CONNECTION,
        airbyte_server=airbyte_server,
    )
    job_run = connection.trigger()
    job_run.wait_for_completion()
    res = job_run.fetch_result()
    return res

@flow(log_prints=True)
def dbt_airbyte_sync():
    dbt_snapshot()
    dbt_model()
    airbyte_sync()

if __name__ == "__main__":
    try:
        dbt_airbyte_sync()
    except Exception as err:
        print("Exception while running workflow", err)
        sys.exit(1)
    sys.exit(0)
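A likely cause of the FileNotFoundError for 'bash' is that the agent's service process runs with a minimal environment where bash isn't on the PATH, while your interactive shell has it; DbtCoreOperation runs its commands through a shell (bash by default on Linux). A sketch of one workaround, assuming the shell field inherited from prefect-shell's ShellOperation and reusing the constants from the snippet above:
from prefect_dbt.cli import DbtCoreOperation

# Untested sketch: point the operation at an absolute shell path so it
# doesn't depend on the agent's PATH. Verify the path with `which bash`
# on the agent host first.
dbt_run = DbtCoreOperation(
    commands=["dbt snapshot"],
    project_dir=DBT_PROJECT_DIR,
    profiles_dir=DBT_PROFILE_DIR,
    shell="/bin/bash",
)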
Tyson Chavarie
07/06/2023, 6:40 PM
from prefect_dbt.cloud import DbtCloudCredentials
dbt_cloud_credentials = DbtCloudCredentials.load("dbt-cloud")
Asterios Pantousas
07/27/2023, 3:41 PM
We have multiple tasks using trigger_dbt_cli_command, each performing various CLI commands. More specifically, we run different dbt models based on certain tags.
The issue we encounter is within the Prefect UI, where these dbt tasks are automatically assigned default names (such as trigger_dbt_cli_command-0, trigger_dbt_cli_command-1, etc.). This automatic naming becomes problematic when a task fails, as it is difficult to discern precisely which task has encountered the error.
From the docs I didn't manage to find any solution to this. We would greatly appreciate your insight and any potential solutions to this naming issue. 😁
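One way to get readable names in the UI, sketched here with illustrative tag names: Prefect tasks support with_options, so each invocation can carry its own task name.
from prefect_dbt.cli.commands import trigger_dbt_cli_command

# Untested sketch: give each invocation a descriptive name so failures are
# easy to spot in the UI. Tag and task names are illustrative.
run_staging = trigger_dbt_cli_command.with_options(name="dbt-run-tag-staging")
run_marts = trigger_dbt_cli_command.with_options(name="dbt-run-tag-marts")

# Inside a flow:
# run_staging("dbt run --select tag:staging")
# run_marts("dbt run --select tag:marts")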
Giacomo Chiarella
07/28/2023, 2:42 PM
Giacomo Chiarella
07/31/2023, 11:03 AM
kodai
08/08/2023, 9:17 AM
Python Version: 3.11.4
prefect-dbt Version: 0.3.1
Code:
from prefect_dbt.cloud import DbtCloudCredentials, DbtCloudJob

dbt_cloud_credentials = DbtCloudCredentials.load("BLOCK-NAME-PLACEHOLDER")
dbt_cloud_job = DbtCloudJob.load(
    dbt_cloud_credentials=dbt_cloud_credentials,
    job_id="JOB-ID-PLACEHOLDER"
)
Error Message:
$ python create_job_block.py
Traceback (most recent call last):
File "/Users/k/work/prefect/dbt-snowflake/create_job_block.py", line 4, in <module>
dbt_cloud_job = DbtCloudJob.load(
^^^^^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/utilities/asyncutils.py", line 255, in coroutine_wrapper
return call()
^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 383, in __call__
return self.result()
^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 283, in result
return self.future.result(timeout=timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 169, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 346, in _run_async
result = await coro
^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/client/utilities.py", line 51, in with_injected_client
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
TypeError: Block.load() got an unexpected keyword argument 'dbt_cloud_credentials'
$
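Block.load() only takes the name of a previously saved block, which is why the dbt_cloud_credentials keyword is rejected. A sketch of creating and then loading the job block, with placeholder names:
from prefect_dbt.cloud import DbtCloudCredentials, DbtCloudJob

# Untested sketch: construct the block with its fields, save it under a
# name, and load it by that name later. All names are placeholders.
dbt_cloud_credentials = DbtCloudCredentials.load("BLOCK-NAME-PLACEHOLDER")
dbt_cloud_job = DbtCloudJob(
    dbt_cloud_credentials=dbt_cloud_credentials,
    job_id="JOB-ID-PLACEHOLDER",
)
dbt_cloud_job.save("JOB-BLOCK-NAME-PLACEHOLDER")

# Later, elsewhere:
dbt_cloud_job = DbtCloudJob.load("JOB-BLOCK-NAME-PLACEHOLDER")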
Triet Le
08/22/2023, 5:16 AM
Do I have to create a BigQueryTargetConfigs block in order to register a DbtCliProfile block? Let's say I have 30 datasets on prod; do I have to do this step 30 times? This sounds quite cumbersome to me, so I'd appreciate any pointers on this:
from prefect_gcp.credentials import GcpCredentials
from prefect_dbt.cli import BigQueryTargetConfigs, DbtCliProfile

credentials = GcpCredentials.load("CREDENTIALS-BLOCK-NAME-PLACEHOLDER")
target_configs = BigQueryTargetConfigs(
    schema="SCHEMA-NAME-PLACEHOLDER",  # also known as dataset
    credentials=credentials,
)
target_configs.save("TARGET-CONFIGS-BLOCK-NAME-PLACEHOLDER")

dbt_cli_profile = DbtCliProfile(
    name="PROFILE-NAME-PLACEHOLDER",
    target="TARGET-NAME-placeholder",
    target_configs=target_configs,
)
dbt_cli_profile.save("DBT-CLI-PROFILE-BLOCK-NAME-PLACEHOLDER")
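Since blocks are created from plain Python, one way to avoid repeating this 30 times is to loop over the datasets; a sketch with illustrative dataset names and block-naming scheme:
from prefect_gcp.credentials import GcpCredentials
from prefect_dbt.cli import BigQueryTargetConfigs, DbtCliProfile

# Untested sketch: register one target-configs block and one profile block
# per dataset. Dataset names and block names are illustrative.
credentials = GcpCredentials.load("CREDENTIALS-BLOCK-NAME-PLACEHOLDER")

for dataset in ["dataset_a", "dataset_b", "dataset_c"]:
    slug = dataset.replace("_", "-")
    target_configs = BigQueryTargetConfigs(schema=dataset, credentials=credentials)
    target_configs.save(f"target-configs-{slug}", overwrite=True)
    DbtCliProfile(
        name="PROFILE-NAME-PLACEHOLDER",
        target=dataset,
        target_configs=target_configs,
    ).save(f"dbt-cli-profile-{slug}", overwrite=True)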
Camila Caleones
09/27/2023, 10:23 PM