dbt integration: technical reference

This page explains technical details of Adverity’s dbt integration.

Git repository connection

Adverity uses SSH key-based authentication to connect to Git repositories. When you create a dbt project, Adverity generates a unique SSH key pair for that project. The public key is displayed in the project settings and must be added to your Git repository as a deploy key. Read-only access is sufficient.

The public key uses the ed25519 algorithm, for example: ssh-ed25519 AAAAC3NzaC1... adverity.

Private keys are encrypted at rest and are never exposed in logs or the UI.

Repository cloning

For each job run, Adverity performs a fresh clone of your repository into an isolated temporary directory. After the job completes, all temporary files are automatically cleaned up.

The clone uses the branch configured in your project settings. If no branch is specified, the default branch of your repository is used.

To locate dbt_project.yml, Adverity walks the repository tree until it finds the file. dbt then runs from the directory where dbt_project.yml is located. If you have specified a path to dbt_project.yml in your project settings, the tree search is skipped and Adverity uses that path directly. For example, if the file is located in an /analytics subdirectory, execution happens from that directory rather than the repository root.
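As an illustration, with a repository layout like the following (paths hypothetical), dbt would run from the /analytics directory rather than the repository root:

```text
repo-root/
├── README.md
└── analytics/
    ├── dbt_project.yml   <- dbt runs from this directory
    ├── packages.yml
    └── models/
```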

For troubleshooting repository connection issues, see dbt troubleshooting.

profiles.yml configuration

There are two ways to set up profiles when executing dbt jobs in Adverity:

Adverity-managed profile

When you select an Adverity destination as the warehouse for your dbt project, Adverity automatically generates a profiles.yml file at runtime. The --profiles-dir parameter is always passed to dbt, pointing to the location of this generated file.

The profile name is read from the profile property in your dbt_project.yml file. The target name is generated based on the destination type, for example snowflake or databricks.

Warning

The profile property must be present in your dbt_project.yml, even when using an Adverity-managed profile. If this property is missing, the job run will fail.
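For example, a minimal dbt_project.yml that satisfies this requirement might look like the sketch below (the project and profile names are placeholders):

```yaml
# dbt_project.yml - minimal sketch; names are placeholders
name: my_project
version: "1.0.0"
profile: my_project   # required even with an Adverity-managed profile
model-paths: ["models"]
```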

For troubleshooting, see dbt troubleshooting.

Own profiles.yml

If you select Use own profiles.yml in your project settings, Adverity searches for a profiles.yml file in your repository and passes its path as the --profiles-dir parameter. In this case, you are responsible for ensuring that your profiles.yml is complete and contains valid credentials.
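A minimal own profiles.yml might look like the sketch below. The warehouse type and all field values are illustrative; consult dbt's profile documentation for the fields your warehouse requires. The top-level key must match the profile property in your dbt_project.yml:

```yaml
# profiles.yml - illustrative sketch for a Snowflake target; all values are placeholders
my_project:
  target: prod
  outputs:
    prod:
      type: snowflake
      account: my_account
      user: my_user
      password: "{{ env_var('DBT_PASSWORD') }}"  # avoid committing plaintext credentials
      database: analytics
      schema: reporting
      warehouse: transforming
```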

Credential handling

When using an Adverity-managed profile, credentials are injected at runtime using dbt’s native env_var() function. This ensures that sensitive values are never written to disk or included in logs.
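Conceptually, the generated profile references credentials through environment variables rather than literal values, along the lines of the following sketch (the variable names are illustrative, not Adverity's actual names):

```yaml
# Sketch of env_var()-based credential injection; variable names are hypothetical
my_project:
  target: snowflake
  outputs:
    snowflake:
      type: snowflake
      user: "{{ env_var('ADVERITY_DBT_USER') }}"
      password: "{{ env_var('ADVERITY_DBT_PASSWORD') }}"
```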

The authentication method depends on the destination type:

| Destination | Authentication method |
| --- | --- |
| Snowflake | Username and password |
| Google BigQuery | OAuth2 or service account |
| Databricks | Personal access token |
| Redshift | Username and password |
| PostgreSQL | Username and password |

When using your own profiles.yml, you are responsible for credential management.

Job execution and supported commands

When a dbt job runs, commands execute in the following order:

  1. dbt deps (if enabled)

  2. dbt run

  3. dbt docs (if enabled)
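A full run with both optional steps enabled is therefore roughly equivalent to the following command sequence (the documentation step corresponds to dbt's docs generate command):

```shell
dbt deps           # install packages from packages.yml (if enabled)
dbt run            # execute models against the target warehouse
dbt docs generate  # build the documentation artifacts (if enabled)
```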

dbt deps

Installs packages defined in your packages.yml before the run. This command does not accept extra parameters.

dbt run

Executes your dbt models against the target warehouse. Possible outcomes:

  • Success — all models executed successfully.

  • Failed — one or more models failed.

  • Cancelled — the job was manually cancelled.

dbt docs

Generates documentation for your dbt project, including project metadata and catalog information. The output is available as a downloadable archive in the job run logs. This command does not accept extra parameters.

Runtime arguments

You can pass additional command-line arguments to dbt run by selecting Run custom resources when configuring the job run. Extra parameters can be set at three levels:

  • Schedule — applied to all runs on that schedule.

  • Trigger — applied when dbt is triggered by a datastream load.

  • Single run — applied to a specific manual run.

The following arguments are supported:

| Argument | Shortcut | Use case |
| --- | --- | --- |
| --select | -s | Run specific models, tags, or folders. |
| --full-refresh | -f | Rebuild incremental models from scratch. |
| --fail-fast | -x | Stop execution on the first model failure. |
| --vars | | Pass variables to dbt models. |
| --exclude | | Skip specific models from the run. |
| --target | | Switch between environments. |

Note the following when using extra parameters:

  • Arguments are case-sensitive.

  • When using --select, model paths are resolved relative to model-paths in your dbt_project.yml. If model-paths is not configured, dbt defaults to looking in a models/ folder in the same directory as the dbt_project.yml file used for the run.

  • --vars values must be enclosed in single quotes: --vars '{"key": "value"}'

  • --target has no effect when using an Adverity-managed warehouse. Adverity uses its own generated target name, which cannot be overridden.

  • --profiles-dir and --project-dir are managed by Adverity and should not be specified.

  • For security purposes, extra parameters are validated before execution. Certain special characters are not permitted.
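For example, a valid set of extra parameters for a nightly schedule might look like this (the tag, folder, and variable names are placeholders):

```shell
--select tag:daily --exclude staging --vars '{"run_date": "2024-01-01"}' --fail-fast
```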