Merge branch 'current' into mwong-add-visual-editor
mirnawong1 authored Jun 26, 2024
2 parents 301335c + b8da686 commit e61a855
Showing 12 changed files with 122 additions and 44 deletions.
@@ -17,7 +17,75 @@ The following fields are required when creating a Postgres, Redshift, or AlloyDB

<Lightbox src="/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/postgres-redshift-connection.png" width="70%" title="Configuring a Redshift connection"/>

For dbt Cloud users, please log in using the default Database username and password. This is because [`IAM` authentication](https://docs.aws.amazon.com/redshift/latest/mgmt/generating-user-credentials.html) is not compatible with dbt Cloud.
### Authentication Parameters

For authentication, dbt Cloud users can use either a **Database username and password**, or they can now use **IAM User authentication** to Redshift via [extended attributes](/docs/dbt-cloud-environments#extended-attributes).

<Tabs
defaultValue="database"
values={[
{label: 'Database', value: 'database'},
{label: 'IAM User', value: 'iam-user-inline'},
]}
>
<TabItem value="database">

The following table contains the parameters for the database (password-based) connection method.


| Field | Description | Examples |
| ------------- | ------- | ------------ |
| `user` | Account username to log into your cluster | myuser |
| `password` | Password for authentication | _password1! |

<br/>

</TabItem>

<TabItem value="iam-user-inline">

In dbt Cloud, IAM User authentication is currently supported only via [extended attributes](/docs/dbt-cloud-environments#extended-attributes). Once the project is created, update the development and deployment environments to use extended attributes to pass the fields described below, as some of them aren't supported through the connection text boxes.

You will need to create an IAM user and generate an [access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_CreateAccessKey). Then:
- On a provisioned cluster, provide a database user in the `user` field. The IAM user is used only for authentication; the database user is used for authorization.
- On Serverless, grant the IAM user permission in Redshift. The `user` field is ignored (but still required).
- In both cases, the `password` field is ignored.


| Profile field | Example | Description |
| ------------- | ------- | ------------ |
| `method` | IAM | Use IAM User authentication |
| `cluster_id` | CLUSTER_ID | Required for IAM authentication on a provisioned cluster; not needed for Serverless |
| `user` | username | User querying the database; ignored for Serverless (but still required) |
| `region` | us-east-1 | Region of your Redshift instance |
| `access_key_id` | ACCESS_KEY_ID | IAM user [access key id](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_CreateAccessKey) |
| `secret_access_key` | SECRET_ACCESS_KEY | IAM user secret access key |

<br/>

#### Example Extended Attributes for IAM User on Redshift Serverless

To avoid pasting secrets in extended attributes, leverage [environment variables](/docs/build/environment-variables#handling-secrets):

<File name='~/.dbt/profiles.yml'>

```yaml
host: my-production-instance.myregion.redshift-serverless.amazonaws.com
method: iam
region: us-east-2
access_key_id: '{{ env_var(''DBT_ENV_ACCESS_KEY_ID'') }}'
secret_access_key: '{{ env_var(''DBT_ENV_SECRET_ACCESS_KEY'') }}'
```
</File>
Both `DBT_ENV_ACCESS_KEY_ID` and `DBT_ENV_SECRET_ACCESS_KEY` will need [to be assigned](/docs/build/environment-variables) in every environment that uses extended attributes this way.
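
#### Example Extended Attributes for IAM User on a provisioned cluster

As a complementary sketch (an assumption, not copied from the docs), the same approach for a provisioned cluster would also pass `cluster_id`; the host, cluster, and region values below are placeholders:

```yaml
# Sketch only — replace the placeholder host, cluster_id, and region with your own values.
host: my-production-cluster.abc123xyz789.us-east-2.redshift.amazonaws.com
method: iam
cluster_id: my-production-cluster
region: us-east-2
access_key_id: '{{ env_var(''DBT_ENV_ACCESS_KEY_ID'') }}'
secret_access_key: '{{ env_var(''DBT_ENV_SECRET_ACCESS_KEY'') }}'
```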

</TabItem>

</Tabs>


### Connecting via an SSH Tunnel

46 changes: 19 additions & 27 deletions website/docs/docs/core/connect-data-platform/redshift-setup.md
@@ -42,22 +42,20 @@ import SetUpPages from '/snippets/_setup-pages-intro.md';

## Authentication Parameters

The authentication methods that dbt Core supports are:

- `database` &mdash; Password-based authentication (default, will be used if `method` is not provided)
- `IAM` &mdash; IAM
The authentication methods that dbt Core supports on Redshift are:

For dbt Cloud users, log in using the default **Database username** and **password**. This is necessary because dbt Cloud does not support `IAM` authentication.
- `Database` &mdash; Password-based authentication (default, will be used if `method` is not provided)
- `IAM User` &mdash; IAM User authentication via AWS Profile

Click on one of these authentication methods for further details on how to configure your connection profile. Each tab also includes an example `profiles.yml` configuration file for you to review.

<Tabs
defaultValue="database"
values={[
{label: 'database', value: 'database'},
{label: 'IAM', value: 'IAM'},
]}
>
{label: 'Database', value: 'database'},
{label: 'IAM User via AWS Profile (Core)', value: 'iam-user-profile'}]
}>

<TabItem value="database">

@@ -66,7 +64,6 @@ The following table contains the parameters for the database (password-based) co
| Profile field | Example | Description |
| ------------- | ------- | ------------ |
| `method` | database| Leave this parameter unconfigured, or set this to database |
| `host` | hostname.region.redshift.amazonaws.com| Host of cluster |
| `user` | username | Account username to log into your cluster |
| `password` | password1 | Password for authentication |
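
For reference, a minimal `profiles.yml` sketch for the database method (the file's own full example is collapsed in the hunk below) might look like the following — every value is a placeholder:

```yaml
# Sketch only — all values are placeholders.
company-name:
  target: dev
  outputs:
    dev:
      type: redshift
      host: hostname.region.redshift.amazonaws.com
      user: username
      password: password1
      dbname: analytics
      schema: dbt_username
      port: 5439
      threads: 4
```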

@@ -103,28 +100,22 @@ company-name:

</TabItem>

<TabItem value="IAM">
<TabItem value="iam-user-profile">

The following table lists the authentication parameters to use IAM authentication.

To set up a Redshift profile using IAM Authentication, set the `method` parameter to `iam` as shown below. Note that a password is not required when using IAM Authentication. For more information on this type of authentication,
consult the [Redshift Documentation](https://docs.aws.amazon.com/redshift/latest/mgmt/generating-user-credentials.html)
and [boto3
docs](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/redshift.html#Redshift.Client.get_cluster_credentials)
on generating user credentials with IAM Auth.
To set up a Redshift profile using IAM Authentication, set the `method` parameter to `iam` as shown below. Note that a password is not required when using IAM Authentication. For more information on this type of authentication, consult the [Redshift Documentation](https://docs.aws.amazon.com/redshift/latest/mgmt/generating-user-credentials.html) and [boto3 docs](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/redshift.html#Redshift.Client.get_cluster_credentials) on generating user credentials with IAM Auth.

If you receive the "You must specify a region" error when using IAM
Authentication, then your aws credentials are likely misconfigured. Try running
`aws configure` to set up AWS access keys, and pick a default region. If you have any questions,
please refer to the official AWS documentation on [Configuration and credential file settings](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html).
If you receive the "You must specify a region" error when using IAM Authentication, then your AWS credentials are likely misconfigured. Try running `aws configure` to set up AWS access keys and pick a default region. If you have any questions, refer to the official AWS documentation on [Configuration and credential file settings](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html).

| Profile field | Example | Description |
| ------------- | ------- | ------------ |
| `method` |IAM| use IAM to authenticate |
| `method` | IAM | Use IAM User authentication |
| `iam_profile` | analyst | dbt will use the specified profile from your ~/.aws/config file |
| `cluster_id` | CLUSTER_ID| Required for IAM |
| `user` | username | Account user to log into your cluster |
| `region` | us-east-1 | Required for IAM authentication |
| `cluster_id` | CLUSTER_ID | Required for IAM authentication on a provisioned cluster; not needed for Serverless |
| `user` | username | User querying the database; ignored for Serverless (but the field is still required) |
| `region` | us-east-1 | Region of your Redshift instance |


<br/>
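
A minimal `profiles.yml` sketch for this method (the file's own example is collapsed in the hunk below) might look like the following — every value is a placeholder, and `cluster_id` would be omitted on Serverless:

```yaml
# Sketch only — replace the placeholder values; omit cluster_id on Redshift Serverless.
company-name:
  target: dev
  outputs:
    dev:
      type: redshift
      method: iam
      iam_profile: analyst                          # named profile from ~/.aws/config
      cluster_id: CLUSTER_ID                        # provisioned clusters only
      host: hostname.region.redshift.amazonaws.com
      user: username                                # ignored on Serverless, but still required
      region: us-east-1
      dbname: analytics
      schema: dbt_username
      port: 5439
      threads: 4
```

With `iam_profile`, dbt reads credentials from the named AWS profile rather than from explicit access keys passed in the profile itself.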

@@ -163,13 +154,14 @@ please refer to the official AWS documentation on [Configuration and credential

</File>

</TabItem>
#### Specifying an IAM Profile

</Tabs>
When the `iam_profile` configuration is set, dbt will use the specified profile from your `~/.aws/config` file instead of using the profile name `default`.

### Specifying an IAM Profile
</TabItem>

When the `iam_profile` configuration is set, dbt will use the specified profile from your `~/.aws/config` file instead of using the profile name `default`

</Tabs>

## Redshift notes

2 changes: 2 additions & 0 deletions website/docs/docs/dbt-versions/release-notes.md
@@ -20,6 +20,8 @@ Release notes are grouped by month for both multi-tenant and virtual private clo

## June 2024

- **New:** [Job warnings](/docs/deploy/job-notifications) are now GA. Previously, you could receive email or Slack alerts about your jobs when they succeeded, failed, or were canceled. Now with the new **Warns** option, you can also receive alerts when jobs have encountered warnings from tests or source freshness checks during their run. This gives you more flexibility on _when_ to be notified.

- **New:** A [preview](/docs/dbt-versions/product-lifecycles#dbt-cloud) of the dbt Snowflake Native App is now available. With this app, you can access dbt Explorer, the **Ask dbt** chatbot, and orchestration observability features, extending your dbt Cloud experience into the Snowflake UI. To learn more, check out [About the dbt Snowflake Native App](/docs/cloud-integrations/snowflake-native-app) and [Set up the dbt Snowflake Native App](/docs/cloud-integrations/set-up-snowflake-native-app).

## May 2024
18 changes: 12 additions & 6 deletions website/docs/docs/deploy/job-notifications.md
@@ -4,7 +4,13 @@ id: "job-notifications"
description: "Set up notifications in dbt Cloud to receive email or Slack alerts about job run status."
---

Set up notifications in dbt Cloud to receive email or Slack alerts when a job run succeeds, fails, or is cancelled.

Set up notifications in dbt Cloud to receive email or Slack alerts about the status of a job run. You can choose to be notified by one or more of the following job run statuses:

- **Succeeds** option &mdash; A job run completed successfully.
- **Warns** option &mdash; A job run encountered warnings from [tests](/docs/build/data-tests) or [source freshness](/docs/deploy/source-freshness) checks (if applicable).
- **Fails** option &mdash; A job run failed to complete.
- **Is canceled** option &mdash; A job run is canceled.

## Email notifications

@@ -27,7 +33,7 @@ You can receive email alerts about jobs by configuring the dbt Cloud email notif

1. Select the **Environment** for the jobs you want to receive notifications about from the dropdown.

1. Click **Edit** to configure the email notification settings. Choose one or more of the run statuses (**Succeeds**, **Fails**, **Is Canceled**) for each job you want to receive notifications about.
1. Click **Edit** to configure the email notification settings. Choose one or more of the run statuses for each job you want to receive notifications about.

1. When you're done with the settings, click **Save**.

@@ -69,19 +75,19 @@ If you're already logged in to Slack, the handshake only requires allowing the a
If you're logged out or the Slack app/website is closed, you must authenticate before completing the integration.

1. Complete the field defining the Slack workspace you want to integrate with dbt Cloud.
<Lightbox src="/img/docs/dbt-cloud/define-workspace.png" width="75%" title="Define the workspace"/>
<Lightbox src="/img/docs/dbt-cloud/define-workspace.png" width="60%" title="Define the workspace"/>
2. Sign in with an existing identity or use an email address and password.
3. Once you have authenticated successfully, accept the permissions.
<Lightbox src="/img/docs/dbt-cloud/accept-permissions.png" width="75%" title="Allow dbt access to Slack"/>
<Lightbox src="/img/docs/dbt-cloud/accept-permissions.png" width="65%" title="Allow dbt access to Slack"/>

### Configure Slack notifications

1. From the gear menu, choose **Notification settings**.
1. Select **Slack notifications** in the left sidebar.
1. Select the **Notification channel** you want to receive the job run notifications from the dropdown.
<Lightbox src="/img/docs/deploy/example-notification-slack-channels.png" width="75%" title="Example of the Notification channel dropdown"/>
<Lightbox src="/img/docs/deploy/example-notification-slack-channels.png" width="100%" title="Example of the Notification channel dropdown"/>
1. Select the **Environment** for the jobs you want to receive notifications about from the dropdown.
1. Click **Edit** to configure the Slack notification settings. Choose one or more of the run statuses (**Succeeds**, **Fails**, **Is Canceled**) for each job you want to receive notifications about.
1. Click **Edit** to configure the Slack notification settings. Choose one or more of the run statuses for each job you want to receive notifications about.
1. When you're done with the settings, click **Save**.

To send alerts to another Slack channel, select another **Notification channel** from the dropdown, **Edit** those job notification settings, and **Save** the changes.
2 changes: 1 addition & 1 deletion website/docs/docs/deploy/monitor-jobs.md
@@ -12,7 +12,7 @@ This portion of our documentation will go over dbt Cloud's various capabilities

- [Run visibility](/docs/deploy/run-visibility) &mdash; View your run history to help identify where improvements can be made to scheduled jobs.
- [Retry jobs](/docs/deploy/retry-jobs) &mdash; Rerun your errored jobs from start or the failure point.
- [Job notifications](/docs/deploy/job-notifications) &mdash; Receive email or slack notifications when a job run succeeds, fails, or is canceled.
- [Job notifications](/docs/deploy/job-notifications) &mdash; Receive email or Slack notifications when a job run succeeds, encounters warnings, fails, or is canceled.
- [Webhooks](/docs/deploy/webhooks) &mdash; Use webhooks to send events about your dbt jobs' statuses to other systems.
- [Leverage artifacts](/docs/deploy/artifacts) &mdash; dbt Cloud generates and saves artifacts for your project, which it uses to power features like creating docs for your project and reporting freshness of your sources.
- [Source freshness](/docs/deploy/source-freshness) &mdash; Monitor data governance by enabling snapshots to capture the freshness of your data sources.
2 changes: 1 addition & 1 deletion website/docs/docs/deploy/webhooks.md
@@ -549,5 +549,5 @@ DELETE https://{your access URL}/api/v3/accounts/{account_id}/webhooks/subscript

## Related docs
- [dbt Cloud CI](/docs/deploy/continuous-integration)
- [Use dbt Cloud's webhooks with other SaaS apps](/guides)
- [Use dbt Cloud's webhooks with other SaaS apps](https://docs.getdbt.com/guides?tags=Webhooks)

4 changes: 2 additions & 2 deletions website/docs/docs/use-dbt-semantic-layer/exports.md
@@ -95,9 +95,9 @@ dbt sl export --saved-query sq_number1 --export-as table --alias new_export

## Exports in production

Enabling and executing exports in dbt Cloud optimizes data workflows and ensures real-time data access. It enhances efficiency and governance for smarter decisions.

To enable exports in production to run saved queries and write them within your data platform, you'll need to set up dbt Cloud job scheduler and perform the following steps:
Exports use the default credentials of the production environment. To enable exports to run saved queries and write them within your data platform, perform the following steps:

1. [Set an environment variable](#set-environment-variable) in dbt Cloud.
2. [Create and execute export](#create-and-execute-exports) job run.
12 changes: 6 additions & 6 deletions website/docs/guides/mesh-qs.md
@@ -30,9 +30,9 @@ You can also watch the [YouTube video on dbt and Snowflake](https://www.youtube.

### Related content:
- [Data mesh concepts: What it is and how to get started](https://www.getdbt.com/blog/data-mesh-concepts-what-it-is-and-how-to-get-started)
- [Deciding how to structure your dbt Mesh](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-2-structures)
- [dbt Mesh best practices guide](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-3-implementation)
- [dbt Mesh FAQs](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-4-faqs)
- [Deciding how to structure your dbt Mesh](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-3-structures)
- [dbt Mesh best practices guide](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-4-implementation)
- [dbt Mesh FAQs](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-5-faqs)

## Prerequisites

@@ -553,7 +553,7 @@ A member of the Finance team would like to schedule a dbt Cloud job for their cu

1. In the “Jaffle | Finance” project, go to the **Jobs** page by navigating to **Deploy** and then **Jobs**.
2. Then click **Create job** and then **Deploy job**.
3. Add a name for the job, then scroll to the bottom to the **Job completion** section.
3. Add a name for the job, then scroll to the bottom of the **Job completion** section.
4. In **Job completion** section, configure the job to **Run when another job finishes** and select the upstream job from the “Jaffle | Data Analytics” project.
<Lightbox src="/img/guides/dbt-mesh/trigger_on_completion.png" title="Trigger job on completion" />

@@ -626,12 +626,12 @@ Congratulations 🎉! You're ready to bring the benefits of dbt Mesh to your org
- How to establish a foundational project "Jaffle | Data Analytics."
- Create a downstream project "Jaffle | Finance."
- Implement model access, versions, and contracts.
- Set up up dbt Cloud jobs triggered by upstream job completions.
- Set up dbt Cloud jobs triggered by upstream job completions.

Here are some additional resources to help you continue your journey:

- [How we build our dbt mesh projects](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-1-intro)
- [dbt Mesh FAQs](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-4-faqs)
- [dbt Mesh FAQs](https://docs.getdbt.com/best-practices/how-we-mesh/mesh-5-faqs)
- [Cross-project references](/docs/collaborate/govern/project-dependencies#how-to-write-cross-project-ref)
- [dbt Explorer](/docs/collaborate/explore-projects)

(Three image files changed in this commit cannot be displayed in the diff view.)
10 changes: 10 additions & 0 deletions website/vercel.json
@@ -2,6 +2,11 @@
"cleanUrls": true,
"trailingSlash": false,
"redirects": [
{
"source": "/best-practices/how-we-mesh/mesh-4-faqs",
"destination": "/best-practices/how-we-mesh/mesh-5-faqs",
"permanent": true
},
{
"source": "/docs/collaborate/cloud-build-and-view-your-docs",
"destination": "/docs/collaborate/build-and-view-your-docs",
@@ -3175,6 +3180,11 @@
"destination": "/guides/orchestration/set-up-ci/multiple-environments",
"permanent": true
},
{
"source": "/best-practices/how-we-mesh/mesh-2-structures",
"destination": "/best-practices/how-we-mesh/mesh-3-structures",
"permanent": true
},
{
"source": "/guides/advanced/adapter-development/:path*",
"destination": "/guides/adapter-creation",
