Marius Vollmer
03/08/2023, 11:36 AM
RuntimeError: Command failed with exit code 133:
/tmp/prefect-4zqou1u7: line 1: 299284 Trace/breakpoint trap (core dumped) dbt build --select dl_nicc_ecom__order_header --profiles-dir /tmp/dbt --project-dir /tmp/dbt
Finished in state Failed('Flow run encountered an exception. RuntimeError: Command failed with exit code 133:\n/tmp/prefect-4zqou1u7: line 1: 299284 Trace/breakpoint trap (core dumped) dbt build --select dl_nicc_ecom__order_header --profiles-dir /tmp/dbt --project-dir /tmp/dbt\n\n')
Many thanks in advance!
Vincenzo
03/21/2023, 1:32 PM
Stephen Lloyd
04/10/2023, 9:22 AM
When using DbtCoreOperation, we are receiving the following error...
This suggests that our dbt CLI profile block, which we have defined in dbt Cloud, does not have a name, target, or target_configs, but it does.
pydantic.error_wrappers.ValidationError: 3 validation errors for DbtCoreOperation
dbt_cli_profile -> name
field required (type=value_error.missing)
dbt_cli_profile -> target
field required (type=value_error.missing)
dbt_cli_profile -> target_configs
field required (type=value_error.missing)
I was able to use the profile_dir parameter to use my local credentials, but this only verifies there aren't obvious problems with the flow code.
prefect-dbt==0.3.1
prefect-snowflake==0.26.0
Any ideas?
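These "field required" errors usually mean the value bound to dbt_cli_profile is not an actual DbtCliProfile instance (for example, a bare block name or an unloaded block document). A minimal sketch, assuming a saved profile block whose name here ("dbt-cli-profile") is hypothetical:

from prefect_dbt.cli import DbtCliProfile, DbtCoreOperation

# Load the saved block first; passing anything other than a DbtCliProfile
# instance produces exactly these three missing-field errors.
dbt_cli_profile = DbtCliProfile.load("dbt-cli-profile")  # hypothetical block name

DbtCoreOperation(
    commands=["dbt debug"],
    dbt_cli_profile=dbt_cli_profile,
    overwrite_profiles=True,  # write profiles.yml from the block at runtime
).run()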
Stephen Lloyd
04/12/2023, 10:06 AM
Still working with dbtCoreOperation. I am running...
with DbtCoreOperation(
    commands=["dbt deps", "dbt build " + dbt_build_arg + " --target " + DBT_ENV],
    project_dir="edw-dbt",
    # working_directory="edw-dbt",
    # overwrite_profiles=True,
    profiles_dir="~/.dbt",
    # dbt_cli_profile=dbt_cli_profile,
    stream_output=True,
) as dbt_operation:
    dbt_process = dbt_operation.trigger()
    dbt_process.wait_for_completion()
    result = dbt_process.fetch_result()
Currently failing with:
15:35:09.705 | INFO | Flow run 'furry-carp' - PID 98611 stream output:
09:50:09 Error importing adapter: No module named 'dbt.adapters.snowflake'
15:35:09.706 | INFO | Flow run 'furry-carp' - PID 98611 stream output:
09:50:09 Encountered an error while reading profiles:
ERROR: Runtime Error
Credentials in profile "edw", target "dev" invalid: Runtime Error
Could not find adapter type snowflake!
Defined profiles:
- edw
For more information on configuring profiles, please consult the dbt docs:
<https://docs.getdbt.com/docs/configure-your-profile>
15:35:09.720 | INFO | Flow run 'furry-carp' - PID 98611 stream output:
09:50:09 Encountered an error:
Runtime Error
Could not run dbt
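"No module named 'dbt.adapters.snowflake'" means the dbt-snowflake adapter isn't installed in the interpreter that executes the flow, so dbt cannot resolve type: snowflake in the profile. A quick sketch to confirm from that same interpreter (if it raises, pip install dbt-snowflake into that environment):

import importlib

# This must succeed in the exact environment the Prefect flow runs in;
# installing dbt-core alone does not provide any adapters.
try:
    importlib.import_module("dbt.adapters.snowflake")
except ModuleNotFoundError:
    print("dbt-snowflake is missing from this environment")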
fozan talat
04/13/2023, 5:22 PM
Finished in state Failed('Flow run encountered an exception. RuntimeError: PID 17072 failed with return code 1.\n')
ale
05/08/2023, 4:33 PM
Looking at prefect-dbt, I see the available targets are Snowflake, BigQuery, and Postgres, but not Redshift. Should I use the Postgres target?
Any hint is much appreciated! 🙌
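There is no dedicated Redshift block, but the generic TargetConfigs block accepts any adapter type, with adapter-specific keys passed through extras. A minimal sketch, assuming dbt-redshift is installed (all connection values are placeholders):

from prefect_dbt.cli import DbtCliProfile
from prefect_dbt.cli.configs import TargetConfigs

target_configs = TargetConfigs(
    type="redshift",  # any adapter type dbt can resolve
    schema="SCHEMA-PLACEHOLDER",
    extras={
        "host": "HOST-PLACEHOLDER",
        "user": "USER-PLACEHOLDER",
        "password": "PASSWORD-PLACEHOLDER",
        "port": 5439,
        "dbname": "DBNAME-PLACEHOLDER",
    },
)

dbt_cli_profile = DbtCliProfile(
    name="PROFILE-NAME-PLACEHOLDER",
    target="TARGET-NAME-PLACEHOLDER",
    target_configs=target_configs,
)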
Ali Marques
05/17/2023, 2:29 PM
Tyler Simpson
05/18/2023, 11:52 PM
Giacomo Chiarella
05/26/2023, 9:38 AM
Tom A
06/08/2023, 2:59 PM
I'm working in a 3.10 environment and am hitting some errors. Is there anyone who can take a look and let me know if I'm doing something wrong? Going to put some more info in the thread to not clog up the channel.
Justin
06/24/2023, 7:34 PM
RuntimeError: PID 29636 failed with return code 1.
and with another section saying:
OneDrive\Documents\WindowsPowerShell\profile.ps1 cannot be loaded because running scripts is disabled
on this system. For more information, see about_Execution_Policies at https:/go.microsoft.com/fwlink/?LinkID=135170.
My code and more info in this thread:
Mike B
06/29/2023, 12:09 AM
File "/usr/lib/python3.10/subprocess.py", line 1845, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'bash'
I'm hitting this error any time I try to trigger a DbtCoreOperation from a deployment. I'm running the most recent versions of the Prefect server, task runner, and the prefect_dbt and prefect_airbyte pip packages on an Ubuntu 22.04 server, using the default local file storage for the flow and a local agent on the same machine as the Prefect server process. The exception is raised the moment any DbtCoreOperation starts, i.e., as soon as either run() or trigger() is called. The workflow runs perfectly when I execute the script manually, but fails on the dbt steps when run through a deployment. The Airbyte operation runs fine in both environments, so this is specific to the dbt operations.
Any help would be much appreciated! Code below:
import sys
from prefect import flow
from prefect_dbt.cli import DbtCoreOperation
from prefect_airbyte.server import AirbyteServer
from prefect_airbyte.connections import AirbyteConnection

DBT_PROJECT_DIR = "<path to dbt project>"
DBT_PROFILE_DIR = "<path to dbt project>"
AIRBYTE_CONNECTION = "<connection id>"

airbyte_server = AirbyteServer(server_host="<host>", server_port=<port>)

@flow(log_prints=True)
def dbt_snapshot():
    with DbtCoreOperation(commands=["dbt snapshot"],
                          project_dir=DBT_PROJECT_DIR,
                          profiles_dir=DBT_PROFILE_DIR) as dbt_run:
        dbt_process = dbt_run.trigger()
        dbt_process.wait_for_completion()
        dbt_output = dbt_process.fetch_result()
    return dbt_output

@flow(log_prints=True)
def dbt_model():
    result = DbtCoreOperation(
        commands=["dbt run --select models/*"],
        project_dir=DBT_PROJECT_DIR,
        profiles_dir=DBT_PROFILE_DIR
    ).run()
    return result

@flow(log_prints=True)
def airbyte_sync():
    connection = AirbyteConnection(
        connection_id=AIRBYTE_CONNECTION,
        airbyte_server=airbyte_server
    )
    job_run = connection.trigger()
    job_run.wait_for_completion()
    res = job_run.fetch_result()
    return res

@flow(log_prints=True)
def dbt_airbyte_sync():
    dbt_snapshot()
    dbt_model()
    airbyte_sync()

if __name__ == "__main__":
    try:
        dbt_airbyte_sync()
    except Exception as err:
        print("Exception while running workflow", err)
        sys.exit(1)
    sys.exit(0)
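FileNotFoundError: 'bash' means the deployment's flow run process cannot find bash on its PATH (agents launched from systemd units or slim containers often inherit a minimal PATH), even though an interactive shell can. Since DbtCoreOperation inherits ShellOperation's shell setting, one hedged workaround is pinning an absolute shell path; "/bin/bash" below is an assumption to be confirmed with `which bash`:

from prefect_dbt.cli import DbtCoreOperation

with DbtCoreOperation(
    commands=["dbt snapshot"],
    project_dir="<path to dbt project>",
    profiles_dir="<path to dbt project>",
    shell="/bin/bash",  # assumed location; bypasses the PATH lookup that fails
) as dbt_run:
    dbt_process = dbt_run.trigger()
    dbt_process.wait_for_completion()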
Tyson Chavarie
07/06/2023, 6:40 PM
from prefect_dbt.cloud import DbtCloudCredentials
dbt_cloud_credentials = DbtCloudCredentials.load("dbt-cloud")
Asterios Pantousas
07/27/2023, 3:41 PM
We have several tasks built on trigger_dbt_cli_command, each performing various CLI commands. More specifically, we run different dbt models based on certain tags.
The issue we encounter is within the Prefect UI, where these dbt tasks are automatically assigned default names (such as trigger_dbt_cli_command-0, trigger_dbt_cli_command-1, etc.). This automatic naming becomes problematic when a task fails, as it is difficult to discern precisely which task has encountered the error.
From the docs I didn't manage to find any solution to that. We would greatly appreciate your insight and any potential solutions to this naming issue. 😁
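Prefect tasks can be renamed per call with .with_options, so each invocation of trigger_dbt_cli_command can carry the tag it runs. A hedged sketch (the tag names are hypothetical):

from prefect import flow
from prefect_dbt.cli.commands import trigger_dbt_cli_command

@flow
def run_tagged_models():
    # .with_options returns a copy of the task under a new name, so each
    # dbt invocation shows up distinctly in the UI instead of as
    # trigger_dbt_cli_command-0, trigger_dbt_cli_command-1, ...
    for tag in ["staging", "marts"]:  # hypothetical tags
        trigger_dbt_cli_command.with_options(name=f"dbt-run-{tag}")(
            command=f"dbt run --select tag:{tag}"
        )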
Giacomo Chiarella
07/28/2023, 2:42 PM
Giacomo Chiarella
07/31/2023, 11:03 AM
kodai
08/08/2023, 9:17 AM
Python Version: 3.11.4
prefect-dbt Version: 0.3.1
Code:
from prefect_dbt.cloud import DbtCloudCredentials, DbtCloudJob
dbt_cloud_credentials = DbtCloudCredentials.load("BLOCK-NAME-PLACEHOLDER")
dbt_cloud_job = DbtCloudJob.load(
    dbt_cloud_credentials=dbt_cloud_credentials,
    job_id="JOB-ID-PLACEHOLDER"
)
Error Message:
$ python create_job_block.py
Traceback (most recent call last):
File "/Users/k/work/prefect/dbt-snowflake/create_job_block.py", line 4, in <module>
dbt_cloud_job = DbtCloudJob.load(
^^^^^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/utilities/asyncutils.py", line 255, in coroutine_wrapper
return call()
^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 383, in __call__
return self.result()
^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 283, in result
return self.future.result(timeout=timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 169, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 346, in _run_async
result = await coro
^^^^^^^^^^
File "/usr/local/Caskroom/miniconda/base/lib/python3.11/site-packages/prefect/client/utilities.py", line 51, in with_injected_client
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
TypeError: Block.load() got an unexpected keyword argument 'dbt_cloud_credentials'
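The TypeError comes from calling load() with constructor arguments: Block.load() takes only the saved block's name. The job block is first constructed and saved, then loaded by name. A sketch (the block name "dbt-cloud-job" is hypothetical):

from prefect_dbt.cloud import DbtCloudCredentials, DbtCloudJob

dbt_cloud_credentials = DbtCloudCredentials.load("BLOCK-NAME-PLACEHOLDER")

# Construct with keyword arguments and save under a name...
dbt_cloud_job = DbtCloudJob(
    dbt_cloud_credentials=dbt_cloud_credentials,
    job_id="JOB-ID-PLACEHOLDER",
)
dbt_cloud_job.save("dbt-cloud-job")  # hypothetical block name

# ...then load elsewhere by that name alone.
dbt_cloud_job = DbtCloudJob.load("dbt-cloud-job")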
Triet Le
08/22/2023, 5:16 AM
Do we need to create a BigQueryTargetConfig block in order to register a DbtCliProfile block? Let's say I have 30 datasets on prod: do I have to do this step 30 times? This sounds quite cumbersome, so I'd appreciate any pointers on this:
from prefect_gcp.credentials import GcpCredentials
from prefect_dbt.cli import BigQueryTargetConfigs, DbtCliProfile

credentials = GcpCredentials.load("CREDENTIALS-BLOCK-NAME-PLACEHOLDER")
target_configs = BigQueryTargetConfigs(
    schema="SCHEMA-NAME-PLACEHOLDER",  # also known as dataset
    credentials=credentials,
)
target_configs.save("TARGET-CONFIGS-BLOCK-NAME-PLACEHOLDER")

dbt_cli_profile = DbtCliProfile(
    name="PROFILE-NAME-PLACEHOLDER",
    target="TARGET-NAME-placeholder",
    target_configs=target_configs,
)
dbt_cli_profile.save("DBT-CLI-PROFILE-BLOCK-NAME-PLACEHOLDER")
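Since blocks are ordinary Python objects, the 30 registrations can be scripted instead of repeated by hand. A hedged sketch looping over an assumed list of dataset names and saving one target-configs block and one profile per dataset:

from prefect_gcp.credentials import GcpCredentials
from prefect_dbt.cli import BigQueryTargetConfigs, DbtCliProfile

credentials = GcpCredentials.load("CREDENTIALS-BLOCK-NAME-PLACEHOLDER")

for dataset in ["dataset_a", "dataset_b"]:  # hypothetical dataset names
    target_configs = BigQueryTargetConfigs(schema=dataset, credentials=credentials)
    target_configs.save(f"target-{dataset}", overwrite=True)
    DbtCliProfile(
        name="PROFILE-NAME-PLACEHOLDER",
        target=dataset,
        target_configs=target_configs,
    ).save(f"dbt-cli-profile-{dataset}", overwrite=True)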
Camila Caleones
09/27/2023, 10:23 PM
Chris Reuter
10/17/2023, 3:22 PM
Jakob Bluhm
10/24/2023, 6:56 PM
Daniel Lomartra
10/24/2023, 10:37 PM
I'm using the .deploy() method to create a Docker image and push it to an Azure container registry, then running my deployment from K8s.
Seems to be some type of import error:
PID 147 stream output:
22:33:16 Finished running in 0 hours 0 minutes and 0.31 seconds (0.31s).
22:33:16 Encountered an error:
Runtime Error
Database error while listing schemas in database "<OMITTED>"
Database Error
cannot import name 'SnowflakeOCSPAsn1Crypto' from 'snowflake.connector.ocsp_asn1crypto' (/usr/local/lib/python3.11/site-packages/snowflake/connector/ocsp_asn1crypto.py)
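The missing SnowflakeOCSPAsn1Crypto class usually points at a broken or mismatched snowflake-connector-python install inside the image rather than at dbt itself. A small diagnostic sketch to run inside the container; pinning or reinstalling the connector in the Dockerfile would follow from what it prints:

import importlib.metadata

# Confirm which connector version made it into the image; a mismatch with
# dbt-snowflake's requirements is a plausible cause of this import error.
print(importlib.metadata.version("snowflake-connector-python"))

# Reproduce the failing import directly, outside of dbt.
from snowflake.connector.ocsp_asn1crypto import SnowflakeOCSPAsn1Crypto  # noqa: F401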
Vishnu Duggirala
10/31/2023, 4:40 PM
trigger_dbt_cloud_job_run_and_wait_for_completion 🧵
Eric Ti Yu Chiang
10/31/2023, 6:45 PM
Robert
11/08/2023, 3:11 AM
from prefect import flow
from prefect_dbt.cloud import DbtCloudCredentials, DbtCloudJob

@flow
def staging_job(JOB_ID=123456):
    dbt_cloud_credentials = DbtCloudCredentials.load("dbt-cloud-credentials")
    dbt_cloud_job = DbtCloudJob(dbt_cloud_credentials=dbt_cloud_credentials, job_id=JOB_ID)
    dbt_cloud_job_run = dbt_cloud_job.trigger()
    dbt_cloud_job_run.wait_for_completion()
    dbt_cloud_job_run.fetch_result()
    return dbt_cloud_job_run

staging_job()
David Anderson
12/07/2023, 4:23 PM
Importing flow code from 'flows/testing/test-flow-dbt-deps.py:run_dbt_deps'
Starting 'ConcurrentTaskRunner'; submitted tasks will be run concurrently...
Executing flow 'run-dbt-deps' for flow run 'versatile-seriema'...
Beginning execution...
Writing the following commands to '/tmp/prefect-igdjymm3.sh':
dbt deps --profiles-dir /home/ec2-user/.dbt --project-dir /home/ec2-user/data-dbt-projects
PID 31482 triggered with 1 commands running inside the PosixPath('/home/ec2-user/data-dbt-projects') directory.
Waiting for PID 31482 to complete.
Encountered exception during execution:
Traceback (most recent call last):
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/engine.py", line 841, in orchestrate_flow_run
result = await flow_call.aresult()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 293, in aresult
return await asyncio.wrap_future(self.future)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 318, in _run_sync
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/tmpb9mm9vjuprefect/data-prefect-main/flows/testing/test-flow-dbt-deps.py", line 17, in run_dbt_deps
).run()
^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/utilities/asyncutils.py", line 255, in coroutine_wrapper
return call()
^^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 398, in __call__
return self.result()
^^^^^^^^^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 284, in result
return self.future.result(timeout=timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 168, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/home/ec2-user/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 355, in _run_async
result = await coro
^^^^^^^^^^
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect_shell/commands.py", line 396, in run
await shell_process.wait_for_completion()
File "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/lib/python3.11/site-packages/prefect_shell/commands.py", line 177, in wait_for_completion
raise RuntimeError(
RuntimeError: PID 31482 failed with return code 127.
Finished in state Failed('Flow run encountered an exception. RuntimeError: PID 31482 failed with return code 127.')
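Return code 127 is the shell's "command not found", which suggests the dbt executable isn't on the PATH inherited by the flow run process (the traceback shows the flow itself running from a pyenv virtualenv). A hedged sketch that prepends that virtualenv's bin directory, a path assumed from the traceback, so the spawned shell can resolve dbt:

import os
from prefect_dbt.cli import DbtCoreOperation

# Assumed from the traceback; adjust to wherever `which dbt` points.
venv_bin = "/home/ec2-user/.pyenv/versions/venv_3-11-3_prefect/bin"

DbtCoreOperation(
    commands=["dbt deps"],
    project_dir="/home/ec2-user/data-dbt-projects",
    profiles_dir="/home/ec2-user/.dbt",
    # ShellOperation merges `env` into the inherited environment.
    env={"PATH": f"{venv_bin}:{os.environ.get('PATH', '')}"},
).run()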
Matthew Jeffs
12/19/2023, 7:45 PM
Henning Holgersen
02/06/2024, 8:41 PM
Ethan Veres
03/04/2024, 7:06 PM
trigger_dbt_cli_command?
clesa
03/07/2024, 10:49 AM
from pathlib import Path
from prefect import flow
from prefect_dbt.cli.commands import trigger_dbt_cli_command, DbtCliProfile, DbtCoreOperation

PROJECT_DIR = Path(__file__).parent.parent / 'datawarehouse'
DBT_PROJECT_DIR = PROJECT_DIR / 'dbt_project.yml'
PROFILES_DIR = PROJECT_DIR / 'profiles.yml'

@flow(log_prints=True)
def dbt_model():
    result = DbtCoreOperation(
        commands=["dbt debug"],
        project_dir=DBT_PROJECT_DIR,
        profiles_path=PROFILES_DIR,
    ).run()
    return result

if __name__ == "__main__":
    dbt_model()
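Two things stand out in this snippet: DbtCoreOperation has no profiles_path option (the field is profiles_dir), and both project_dir and profiles_dir expect the directories containing dbt_project.yml and profiles.yml, not the .yml files themselves. A hedged correction:

from pathlib import Path
from prefect import flow
from prefect_dbt.cli.commands import DbtCoreOperation

# Pass the directory, not the .yml files inside it.
PROJECT_DIR = Path(__file__).parent.parent / 'datawarehouse'

@flow(log_prints=True)
def dbt_model():
    return DbtCoreOperation(
        commands=["dbt debug"],
        project_dir=PROJECT_DIR,   # directory containing dbt_project.yml
        profiles_dir=PROJECT_DIR,  # directory containing profiles.yml
    ).run()

if __name__ == "__main__":
    dbt_model()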