diff --git a/MIGRATION_GUIDE.md b/MIGRATION_GUIDE.md
index 148dc15e45..b02fcedef1 100644
--- a/MIGRATION_GUIDE.md
+++ b/MIGRATION_GUIDE.md
@@ -7,6 +7,36 @@ across different versions.
 > [!TIP]
 > We highly recommend upgrading the versions one by one instead of bulk upgrades.
 
+## v0.99.0 ➞ v1.0.0
+
+### Removed deprecated objects
+All deprecated objects have been removed in the v1 release. This includes:
+
+- Resources
+  - `snowflake_database_old`
+  - `snowflake_role`
+  - `snowflake_oauth_integration`
+  - `snowflake_saml_integration`
+  - `snowflake_session_parameter`
+  - `snowflake_stream`
+  - `snowflake_tag_masking_policy_association`
+- Data sources
+  - `snowflake_role`
+- Fields in the provider configuration:
+  - `account`
+  - `oauth_access_token`
+  - `oauth_client_id`
+  - `oauth_client_secret`
+  - `oauth_endpoint`
+  - `oauth_redirect_url`
+  - `oauth_refresh_token`
+  - `private_key_path`
+  - `region`
+  - `session_params`
+  - `username`
+
+Additionally, the `JWT` value is no longer available for the `authenticator` field in the provider configuration.
+
 ## v0.98.0 ➞ v0.99.0
 
 ### *(new feature)* snowflake_tags datasource
@@ -102,7 +132,8 @@ We have added new fields to match the ones in [the driver](https://pkg.go.dev/gi
 To be more consistent with other configuration options, we have decided to add `driver_tracing` to the configuration schema. This value can also be configured by the `SNOWFLAKE_DRIVER_TRACING` environment variable and by the `drivertracing` field in the TOML file. The previous `SF_TF_GOSNOWFLAKE_LOG_LEVEL` environment variable is no longer supported and has been removed from the provider.
 
 #### *(behavior change)* deprecated fields
-Because of new fields `account_name` and `organization_name`, `account` is now deprecated. It will be removed with the v1 release. Please adjust your configurations from
+Because of the new fields `account_name` and `organization_name`, `account` is now deprecated. It will be removed with the v1 release.
+If you use a Terraform configuration file, adjust it from
 ```terraform
 provider "snowflake" {
   account = "ORGANIZATION-ACCOUNT"
@@ -117,6 +148,31 @@ provider "snowflake" {
 }
 ```
 
+If you use a TOML configuration file, adjust it from
+```toml
+[default]
+  account = "ORGANIZATION-ACCOUNT"
+```
+
+to
+```toml
+[default]
+  organizationname = "ORGANIZATION"
+  accountname = "ACCOUNT"
+```
+
+If you use environment variables, adjust them from
+```bash
+SNOWFLAKE_ACCOUNT="ORGANIZATION-ACCOUNT"
+```
+
+to
+```bash
+SNOWFLAKE_ORGANIZATION_NAME="ORGANIZATION"
+SNOWFLAKE_ACCOUNT_NAME="ACCOUNT"
+```
+
 #### *(behavior change)* changed behavior of some fields
 For the fields that are not deprecated, we focused on improving validations and documentation. Also, we adjusted some fields to match our [driver's](https://github.com/snowflakedb/gosnowflake) defaults. Specifically:
 - Relaxed validations for enum fields like `protocol` and `authenticator`. Now, the case on such fields is ignored.
diff --git a/docs/data-sources/role.md b/docs/data-sources/role.md
deleted file mode 100644
index ddb3bcc22e..0000000000
--- a/docs/data-sources/role.md
+++ /dev/null
@@ -1,30 +0,0 @@
----
-page_title: "snowflake_role Data Source - terraform-provider-snowflake"
-subcategory: ""
-description: |-
-  
----
-
-# snowflake_role (Data Source)
-
-~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use [snowflake_roles](./roles) instead.
-
-## Example Usage
-
-```terraform
-data "snowflake_role" "this" {
-  name = "role1"
-}
-```
-
-
-## Schema
-
-### Required
-
-- `name` (String) The role for which to return metadata.
-
-### Read-Only
-
-- `comment` (String) The comment on the role
-- `id` (String) The ID of this resource.
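Taken together, the removals above mean a pre-1.0 provider block that relies on deprecated fields must be rewritten against the v1 schema. A hedged sketch of that migration, using only replacement fields that appear in this changeset (`organization_name`, `account_name`, `user`, `authenticator`, `private_key`); the user name and key path are illustrative placeholders, not values from this diff:

```terraform
# Pre-1.0 configuration relying on fields removed in v1:
#
#   provider "snowflake" {
#     account          = "ORGANIZATION-ACCOUNT" # removed, split into two fields
#     username         = "tf_user"              # removed, renamed to `user`
#     private_key_path = "/secrets/rsa_key.p8"  # removed, inline the key instead
#     authenticator    = "JWT"                  # removed alias
#   }

# Equivalent v1 configuration using the replacement fields:
provider "snowflake" {
  organization_name = "ORGANIZATION"
  account_name      = "ACCOUNT"
  user              = "tf_user"
  authenticator     = "SNOWFLAKE_JWT"
  private_key       = file("/secrets/rsa_key.p8")
}
```

`SNOWFLAKE_JWT` is the surviving spelling of the removed `JWT` alias, per the updated `authenticator` documentation later in this diff; `file()` is a standard Terraform function used here to replace the removed path-based field.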
diff --git a/docs/index.md b/docs/index.md
index 518af87d7c..f6e9f2cfd6 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -70,10 +70,8 @@ provider "snowflake" {
 
 ### Optional
 
-- `account` (String, Deprecated) Use `account_name` and `organization_name` instead. Specifies your Snowflake account identifier assigned, by Snowflake. The [account locator](https://docs.snowflake.com/en/user-guide/admin-account-identifier#format-2-account-locator-in-a-region) format is not supported. For information about account identifiers, see the [Snowflake documentation](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html). Required unless using `profile`. Can also be sourced from the `SNOWFLAKE_ACCOUNT` environment variable.
 - `account_name` (String) Specifies your Snowflake account name assigned by Snowflake. For information about account identifiers, see the [Snowflake documentation](https://docs.snowflake.com/en/user-guide/admin-account-identifier#account-name). Required unless using `profile`. Can also be sourced from the `SNOWFLAKE_ACCOUNT_NAME` environment variable.
-- `authenticator` (String) Specifies the [authentication type](https://pkg.go.dev/github.com/snowflakedb/gosnowflake#AuthType) to use when connecting to Snowflake. Valid options are: `SNOWFLAKE` | `OAUTH` | `EXTERNALBROWSER` | `OKTA` | `JWT` | `SNOWFLAKE_JWT` | `TOKENACCESSOR` | `USERNAMEPASSWORDMFA`. Value `JWT` is deprecated and will be removed in future releases. Can also be sourced from the `SNOWFLAKE_AUTHENTICATOR` environment variable.
-- `browser_auth` (Boolean, Deprecated) Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_USE_BROWSER_AUTH` environment variable.
+- `authenticator` (String) Specifies the [authentication type](https://pkg.go.dev/github.com/snowflakedb/gosnowflake#AuthType) to use when connecting to Snowflake. Valid options are: `SNOWFLAKE` | `OAUTH` | `EXTERNALBROWSER` | `OKTA` | `SNOWFLAKE_JWT` | `TOKENACCESSOR` | `USERNAMEPASSWORDMFA`. Can also be sourced from the `SNOWFLAKE_AUTHENTICATOR` environment variable.
 - `client_ip` (String) IP address for network checks. Can also be sourced from the `SNOWFLAKE_CLIENT_IP` environment variable.
 - `client_request_mfa_token` (String) When true the MFA token is cached in the credential manager. True by default in Windows/OSX. False for Linux. Can also be sourced from the `SNOWFLAKE_CLIENT_REQUEST_MFA_TOKEN` environment variable.
 - `client_store_temporary_credential` (String) When true the ID token is cached in the credential manager. True by default in Windows/OSX. False for Linux. Can also be sourced from the `SNOWFLAKE_CLIENT_STORE_TEMPORARY_CREDENTIAL` environment variable.
@@ -91,12 +89,6 @@ provider "snowflake" {
 - `keep_session_alive` (Boolean) Enables the session to persist even after the connection is closed. Can also be sourced from the `SNOWFLAKE_KEEP_SESSION_ALIVE` environment variable.
 - `login_timeout` (Number) Login retry timeout in seconds EXCLUDING network roundtrip and read out http response. Can also be sourced from the `SNOWFLAKE_LOGIN_TIMEOUT` environment variable.
 - `max_retry_count` (Number) Specifies how many times non-periodic HTTP request can be retried by the driver. Can also be sourced from the `SNOWFLAKE_MAX_RETRY_COUNT` environment variable.
-- `oauth_access_token` (String, Sensitive, Deprecated) Token for use with OAuth. Generating the token is left to other tools. Cannot be used with `browser_auth`, `private_key_path`, `oauth_refresh_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_ACCESS_TOKEN` environment variable.
-- `oauth_client_id` (String, Sensitive, Deprecated) Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_CLIENT_ID` environment variable.
-- `oauth_client_secret` (String, Sensitive, Deprecated) Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_CLIENT_SECRET` environment variable.
-- `oauth_endpoint` (String, Sensitive, Deprecated) Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_ENDPOINT` environment variable.
-- `oauth_redirect_url` (String, Sensitive, Deprecated) Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_REDIRECT_URL` environment variable.
-- `oauth_refresh_token` (String, Sensitive, Deprecated) Token for use with OAuth. Setup and generation of the token is left to other tools. Should be used in conjunction with `oauth_client_id`, `oauth_client_secret`, `oauth_endpoint`, `oauth_redirect_url`. Cannot be used with `browser_auth`, `private_key_path`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_REFRESH_TOKEN` environment variable.
 - `ocsp_fail_open` (String) True represents OCSP fail open mode. False represents OCSP fail closed mode. Fail open true by default. Can also be sourced from the `SNOWFLAKE_OCSP_FAIL_OPEN` environment variable.
 - `okta_url` (String) The URL of the Okta server. e.g. https://example.okta.com. Okta URL host needs to have the suffix `okta.com`. Read more in Snowflake [docs](https://docs.snowflake.com/en/user-guide/oauth-okta). Can also be sourced from the `SNOWFLAKE_OKTA_URL` environment variable.
 - `organization_name` (String) Specifies your Snowflake organization name assigned by Snowflake. For information about account identifiers, see the [Snowflake documentation](https://docs.snowflake.com/en/user-guide/admin-account-identifier#organization-name). Required unless using `profile`. Can also be sourced from the `SNOWFLAKE_ORGANIZATION_NAME` environment variable.
@@ -105,20 +97,16 @@ provider "snowflake" {
 - `passcode_in_password` (Boolean) False by default. Set to true if the MFA passcode is embedded to the configured password. Can also be sourced from the `SNOWFLAKE_PASSCODE_IN_PASSWORD` environment variable.
 - `password` (String, Sensitive) Password for user + password auth. Cannot be used with `browser_auth` or `private_key_path`. Can also be sourced from the `SNOWFLAKE_PASSWORD` environment variable.
 - `port` (Number) Specifies a custom port value used by the driver for privatelink connections. Can also be sourced from the `SNOWFLAKE_PORT` environment variable.
-- `private_key` (String, Sensitive) Private Key for username+private-key auth. Cannot be used with `browser_auth` or `password`. Can also be sourced from the `SNOWFLAKE_PRIVATE_KEY` environment variable.
+- `private_key` (String, Sensitive) Private Key for username+private-key auth. Cannot be used with `password`. Can also be sourced from the `SNOWFLAKE_PRIVATE_KEY` environment variable.
 - `private_key_passphrase` (String, Sensitive) Supports the encryption ciphers aes-128-cbc, aes-128-gcm, aes-192-cbc, aes-192-gcm, aes-256-cbc, aes-256-gcm, and des-ede3-cbc. Can also be sourced from the `SNOWFLAKE_PRIVATE_KEY_PASSPHRASE` environment variable.
-- `private_key_path` (String, Sensitive, Deprecated) Path to a private key for using keypair authentication. Cannot be used with `browser_auth`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_PRIVATE_KEY_PATH` environment variable.
 - `profile` (String) Sets the profile to read from ~/.snowflake/config file. Can also be sourced from the `SNOWFLAKE_PROFILE` environment variable.
 - `protocol` (String) A protocol used in the connection. Valid options are: `http` | `https`. Can also be sourced from the `SNOWFLAKE_PROTOCOL` environment variable.
-- `region` (String, Deprecated) Snowflake region, such as "eu-central-1", with this parameter. However, since this parameter is deprecated, it is best to specify the region as part of the account parameter. For details, see the description of the account parameter. [Snowflake region](https://docs.snowflake.com/en/user-guide/intro-regions.html) to use. Required if using the [legacy format for the `account` identifier](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#format-2-legacy-account-locator-in-a-region) in the form of `.`. Can also be sourced from the `SNOWFLAKE_REGION` environment variable.
 - `request_timeout` (Number) Request retry timeout in seconds EXCLUDING network roundtrip and read out http response. Can also be sourced from the `SNOWFLAKE_REQUEST_TIMEOUT` environment variable.
 - `role` (String) Specifies the role to use by default for accessing Snowflake objects in the client session. Can also be sourced from the `SNOWFLAKE_ROLE` environment variable.
-- `session_params` (Map of String, Deprecated) Sets session parameters. [Parameters](https://docs.snowflake.com/en/sql-reference/parameters)
 - `tmp_directory_path` (String) Sets temporary directory used by the driver for operations like encrypting, compressing etc. Can also be sourced from the `SNOWFLAKE_TMP_DIRECTORY_PATH` environment variable.
 - `token` (String, Sensitive) Token to use for OAuth and other forms of token based auth. Can also be sourced from the `SNOWFLAKE_TOKEN` environment variable.
 - `token_accessor` (Block List, Max: 1) (see [below for nested schema](#nestedblock--token_accessor))
 - `user` (String) Username. Required unless using `profile`. Can also be sourced from the `SNOWFLAKE_USER` environment variable.
-- `username` (String, Deprecated) Username for user + password authentication. Required unless using `profile`. Can also be sourced from the `SNOWFLAKE_USERNAME` environment variable.
 - `validate_default_parameters` (String) True by default. If false, disables the validation checks for Database, Schema, Warehouse and Role at the time a connection is established. Can also be sourced from the `SNOWFLAKE_VALIDATE_DEFAULT_PARAMETERS` environment variable.
 - `warehouse` (String) Specifies the virtual warehouse to use by default for queries, loading, etc. in the client session. Can also be sourced from the `SNOWFLAKE_WAREHOUSE` environment variable.
@@ -359,15 +347,6 @@ provider "snowflake" {
 }
 ```
 
-## Currently deprecated resources
+
-- [snowflake_database_old](./docs/resources/database_old)
-- [snowflake_oauth_integration](./docs/resources/oauth_integration)
-- [snowflake_role](./docs/resources/role) - use [snowflake_account_role](./docs/resources/account_role) instead
-- [snowflake_saml_integration](./docs/resources/saml_integration) - use [snowflake_saml2_integration](./docs/resources/saml2_integration) instead
-- [snowflake_stream](./docs/resources/stream)
-- [snowflake_tag_masking_policy_association](./docs/resources/tag_masking_policy_association)
-
-## Currently deprecated datasources
-
-- [snowflake_role](./docs/data-sources/role) - use [snowflake_roles](./docs/data-sources/roles) instead
+
diff --git a/docs/resources/database_old.md b/docs/resources/database_old.md
deleted file mode 100644
index a0ee5bb8c1..0000000000
--- a/docs/resources/database_old.md
+++ /dev/null
@@ -1,88 +0,0 @@
----
-page_title: "snowflake_database_old Resource - terraform-provider-snowflake"
-subcategory: ""
-description: |-
-  
----
-
-# snowflake_database_old (Resource)
-
-~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use snowflake_database or snowflake_shared_database or snowflake_secondary_database instead.
- -## Example Usage - -```terraform -resource "snowflake_database_old" "simple" { - name = "testing" - comment = "test comment" - data_retention_time_in_days = 3 -} - -resource "snowflake_database_old" "with_replication" { - name = "testing_2" - comment = "test comment 2" - replication_configuration { - accounts = ["test_account1", "test_account_2"] - ignore_edition_check = true - } -} - -resource "snowflake_database_old" "from_replica" { - name = "testing_3" - comment = "test comment" - data_retention_time_in_days = 3 - from_replica = "\"org1\".\"account1\".\"primary_db_name\"" -} - -resource "snowflake_database_old" "from_share" { - name = "testing_4" - comment = "test comment" - from_share = { - provider = "account1_locator" - share = "share1" - } -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `name` (String) Specifies the identifier for the database; must be unique for your account. - -### Optional - -- `comment` (String) Specifies a comment for the database. -- `data_retention_time_in_days` (Number) Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel). -- `from_database` (String) Specify a database to create a clone from. -- `from_replica` (String) Specify a fully-qualified path to a database to create a replica from. A fully qualified path follows the format of `""."".""`. 
An example would be: `"myorg1"."account1"."db1"` -- `from_share` (Map of String) Specify a provider and a share in this map to create a database from a share. As of version 0.87.0, the provider field is the account locator. -- `is_transient` (Boolean) Specifies a database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss. -- `replication_configuration` (Block List, Max: 1) When set, specifies the configurations for database replication. (see [below for nested schema](#nestedblock--replication_configuration)) - -### Read-Only - -- `id` (String) The ID of this resource. - - -### Nested Schema for `replication_configuration` - -Required: - -- `accounts` (List of String) - -Optional: - -- `ignore_edition_check` (Boolean) - -## Import - -Import is supported using the following syntax: - -```shell -terraform import snowflake_database_old.example 'database_name' -``` diff --git a/docs/resources/oauth_integration.md b/docs/resources/oauth_integration.md deleted file mode 100644 index 2038424c66..0000000000 --- a/docs/resources/oauth_integration.md +++ /dev/null @@ -1,58 +0,0 @@ ---- -page_title: "snowflake_oauth_integration Resource - terraform-provider-snowflake" -subcategory: "" -description: |- - ---- - -# snowflake_oauth_integration (Resource) - -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use snowflake_oauth_integration_for_custom_clients or snowflake_oauth_integration_for_partner_applications instead. 
- -## Example Usage - -```terraform -resource "snowflake_oauth_integration" "tableau_desktop" { - name = "TABLEAU_DESKTOP" - oauth_client = "TABLEAU_DESKTOP" - enabled = true - oauth_issue_refresh_tokens = true - oauth_refresh_token_validity = 3600 - blocked_roles_list = ["SYSADMIN"] -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `name` (String) Specifies the name of the OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. -- `oauth_client` (String) Specifies the OAuth client type. - -### Optional - -- `blocked_roles_list` (Set of String) List of roles that a user cannot explicitly consent to using after authenticating. Do not include ACCOUNTADMIN, ORGADMIN or SECURITYADMIN as they are already implicitly enforced and will cause in-place updates. -- `comment` (String) Specifies a comment for the OAuth integration. -- `enabled` (Boolean) Specifies whether this OAuth integration is enabled or disabled. -- `oauth_client_type` (String) Specifies the type of client being registered. Snowflake supports both confidential and public clients. -- `oauth_issue_refresh_tokens` (Boolean) Specifies whether to allow the client to exchange a refresh token for an access token when the current access token has expired. -- `oauth_redirect_uri` (String) Specifies the client URI. After a user is authenticated, the web browser is redirected to this URI. -- `oauth_refresh_token_validity` (Number) Specifies how long refresh tokens should be valid (in seconds). OAUTH_ISSUE_REFRESH_TOKENS must be set to TRUE. 
-- `oauth_use_secondary_roles` (String) Specifies whether default secondary roles set in the user properties are activated by default in the session being opened. - -### Read-Only - -- `created_on` (String) Date and time when the OAuth integration was created. -- `id` (String) The ID of this resource. - -## Import - -Import is supported using the following syntax: - -```shell -terraform import snowflake_oauth_integration.example name -``` diff --git a/docs/resources/role.md b/docs/resources/role.md deleted file mode 100644 index cf79b2cdf3..0000000000 --- a/docs/resources/role.md +++ /dev/null @@ -1,71 +0,0 @@ ---- -page_title: "snowflake_role Resource - terraform-provider-snowflake" -subcategory: "" -description: |- - The resource is used for role management, where roles can be assigned privileges and, in turn, granted to users and other roles. When granted to roles they can create hierarchies of privilege structures. For more details, refer to the official documentation https://docs.snowflake.com/en/user-guide/security-access-control-overview. ---- - -# snowflake_role (Resource) - -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use [snowflake_account_role](./account_role) instead. - -The resource is used for role management, where roles can be assigned privileges and, in turn, granted to users and other roles. When granted to roles they can create hierarchies of privilege structures. For more details, refer to the [official documentation](https://docs.snowflake.com/en/user-guide/security-access-control-overview). 
- -## Example Usage - -```terraform -## Minimal -resource "snowflake_role" "minimal" { - name = "role_name" -} - -## Complete (with every optional set) -resource "snowflake_role" "complete" { - name = "role_name" - comment = "my account role" -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `name` (String) Identifier for the role; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` - -### Optional - -- `comment` (String) - -### Read-Only - -- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). -- `id` (String) The ID of this resource. -- `show_output` (List of Object) Outputs the result of `SHOW ROLES` for the given role. 
(see [below for nested schema](#nestedatt--show_output)) - - -### Nested Schema for `show_output` - -Read-Only: - -- `assigned_to_users` (Number) -- `comment` (String) -- `created_on` (String) -- `granted_roles` (Number) -- `granted_to_roles` (Number) -- `is_current` (Boolean) -- `is_default` (Boolean) -- `is_inherited` (Boolean) -- `name` (String) -- `owner` (String) - -## Import - -Import is supported using the following syntax: - -```shell -terraform import snowflake_role.example "name" -``` diff --git a/docs/resources/saml_integration.md b/docs/resources/saml_integration.md deleted file mode 100644 index 9ec4415251..0000000000 --- a/docs/resources/saml_integration.md +++ /dev/null @@ -1,66 +0,0 @@ ---- -page_title: "snowflake_saml_integration Resource - terraform-provider-snowflake" -subcategory: "" -description: |- - ---- - -# snowflake_saml_integration (Resource) - -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use [snowflake_saml2_integration](./saml2_integration) instead. 
- -## Example Usage - -```terraform -resource "snowflake_saml_integration" "saml_integration" { - name = "saml_integration" - saml2_provider = "CUSTOM" - saml2_issuer = "test_issuer" - saml2_sso_url = "https://testsamlissuer.com" - saml2_x509_cert = "MIICYzCCAcygAwIBAgIBADANBgkqhkiG9w0BAQUFADAuMQswCQYDVQQGEwJVUzEMMAoGA1UEChMDSUJNMREwDwYDVQQLEwhMb2NhbCBDQTAeFw05OTEyMjIwNTAwMDBaFw0wMDEyMjMwNDU5NTlaMC4xCzAJBgNVBAYTAlVTMQwwCgYDVQQKEwNJQk0xETAPBgNVBAsTCExvY2FsIENBMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQD2bZEo7xGaX2/0GHkrNFZvlxBou9v1Jmt/PDiTMPve8r9FeJAQ0QdvFST/0JPQYD20rH0bimdDLgNdNynmyRoS2S/IInfpmf69iyc2G0TPyRvmHIiOZbdCd+YBHQi1adkj17NDcWj6S14tVurFX73zx0sNoMS79q3tuXKrDsxeuwIDAQABo4GQMIGNMEsGCVUdDwGG+EIBDQQ+EzxHZW5lcmF0ZWQgYnkgdGhlIFNlY3VyZVdheSBTZWN1cml0eSBTZXJ2ZXIgZm9yIE9TLzM5MCAoUkFDRikwDgYDVR0PAQH/BAQDAgAGMA8GA1UdEwEB/wQFMAMBAf8wHQYDVR0OBBYEFJ3+ocRyCTJw067dLSwr/nalx6YMMA0GCSqGSIb3DQEBBQUAA4GBAMaQzt+zaj1GU77yzlr8iiMBXgdQrwsZZWJo5exnAucJAEYQZmOfyLiMD6oYq+ZnfvM0n8G/Y79q8nhwvuxpYOnRSAXFp6xSkrIOeZtJMY1h00LKp/JX3Ng1svZ2agE126JHsQ0bhzN5TKsYfbwfTwfjdWAGy6Vf1nYi/rO+ryMO" - enabled = true -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `name` (String) Specifies the name of the SAML2 integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. -- `saml2_issuer` (String) The string containing the IdP EntityID / Issuer. -- `saml2_provider` (String) The string describing the IdP. One of the following: OKTA, ADFS, Custom. -- `saml2_sso_url` (String) The string containing the IdP SSO URL, where the user should be redirected by Snowflake (the Service Provider) with a SAML AuthnRequest message. 
-- `saml2_x509_cert` (String) The Base64 encoded IdP signing certificate on a single line without the leading -----BEGIN CERTIFICATE----- and ending -----END CERTIFICATE----- markers. - -### Optional - -- `enabled` (Boolean) Specifies whether this security integration is enabled or disabled. -- `saml2_enable_sp_initiated` (Boolean) The Boolean indicating if the Log In With button will be shown on the login page. TRUE: displays the Log in WIth button on the login page. FALSE: does not display the Log in With button on the login page. -- `saml2_force_authn` (Boolean) The Boolean indicating whether users, during the initial authentication flow, are forced to authenticate again to access Snowflake. When set to TRUE, Snowflake sets the ForceAuthn SAML parameter to TRUE in the outgoing request from Snowflake to the identity provider. TRUE: forces users to authenticate again to access Snowflake, even if a valid session with the identity provider exists. FALSE: does not force users to authenticate again to access Snowflake. -- `saml2_post_logout_redirect_url` (String) The endpoint to which Snowflake redirects users after clicking the Log Out button in the classic Snowflake web interface. Snowflake terminates the Snowflake session upon redirecting to the specified endpoint. -- `saml2_requested_nameid_format` (String) The SAML NameID format allows Snowflake to set an expectation of the identifying attribute of the user (i.e. SAML Subject) in the SAML assertion from the IdP to ensure a valid authentication to Snowflake. If a value is not specified, Snowflake sends the urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress value in the authentication request to the IdP. 
NameID must be one of the following values: urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified, urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress, urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName, urn:oasis:names:tc:SAML:1.1:nameid-format:WindowsDomainQualifiedName, urn:oasis:names:tc:SAML:2.0:nameid-format:kerberos, urn:oasis:names:tc:SAML:2.0:nameid-format:persistent, urn:oasis:names:tc:SAML:2.0:nameid-format:transient . -- `saml2_sign_request` (Boolean) The Boolean indicating whether SAML requests are signed. TRUE: allows SAML requests to be signed. FALSE: does not allow SAML requests to be signed. -- `saml2_snowflake_acs_url` (String) The string containing the Snowflake Assertion Consumer Service URL to which the IdP will send its SAML authentication response back to Snowflake. This property will be set in the SAML authentication request generated by Snowflake when initiating a SAML SSO operation with the IdP. If an incorrect value is specified, Snowflake returns an error message indicating the acceptable values to use. Default: https://..snowflakecomputing.com/fed/login -- `saml2_snowflake_issuer_url` (String) The string containing the EntityID / Issuer for the Snowflake service provider. If an incorrect value is specified, Snowflake returns an error message indicating the acceptable values to use. -- `saml2_snowflake_x509_cert` (String) The Base64 encoded self-signed certificate generated by Snowflake for use with Encrypting SAML Assertions and Signed SAML Requests. You must have at least one of these features (encrypted SAML assertions or signed SAML responses) enabled in your Snowflake account to access the certificate value. -- `saml2_sp_initiated_login_page_label` (String) The string containing the label to display after the Log In With button on the login page. - -### Read-Only - -- `created_on` (String) Date and time when the SAML integration was created. -- `id` (String) The ID of this resource. 
-- `saml2_digest_methods_used` (String) -- `saml2_signature_methods_used` (String) -- `saml2_snowflake_metadata` (String) Metadata created by Snowflake to provide to SAML2 provider. - -## Import - -Import is supported using the following syntax: - -```shell -terraform import snowflake_saml_integration.example name -``` diff --git a/docs/resources/session_parameter.md b/docs/resources/session_parameter.md deleted file mode 100644 index 7f41515856..0000000000 --- a/docs/resources/session_parameter.md +++ /dev/null @@ -1,54 +0,0 @@ ---- -page_title: "snowflake_session_parameter Resource - terraform-provider-snowflake" -subcategory: "" -description: |- - ---- - -# snowflake_session_parameter (Resource) - - - -## Example Usage - -```terraform -resource "snowflake_session_parameter" "s" { - key = "AUTOCOMMIT" - value = "false" - user = "TEST_USER" -} - -resource "snowflake_session_parameter" "s2" { - key = "BINARY_OUTPUT_FORMAT" - value = "BASE64" - on_account = true -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `key` (String) Name of session parameter. Valid values are those in [session parameters](https://docs.snowflake.com/en/sql-reference/parameters.html#session-parameters). -- `value` (String) Value of session parameter, as a string. Constraints are the same as those for the parameters in Snowflake documentation. - -### Optional - -- `on_account` (Boolean) If true, the session parameter will be set on the account level. -- `user` (String) The user to set the session parameter for. Required if on_account is false - -### Read-Only - -- `id` (String) The ID of this resource. 
- -## Import - -Import is supported using the following syntax: - -```shell -terraform import snowflake_session_parameter.s -``` diff --git a/docs/resources/stream.md b/docs/resources/stream.md deleted file mode 100644 index 1a36f91eb3..0000000000 --- a/docs/resources/stream.md +++ /dev/null @@ -1,76 +0,0 @@ ---- -page_title: "snowflake_stream Resource - terraform-provider-snowflake" -subcategory: "" -description: |- - ---- - -# snowflake_stream (Resource) - -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_stream_on_directory_table` | `snowflake_stream_on_external_table` | `snowflake_stream_on_table` | `snowflake_stream_on_view` - -## Example Usage - -```terraform -resource "snowflake_table" "table" { - database = "database" - schema = "schema" - name = "name" - - column { - type = "NUMBER(38,0)" - name = "id" - } -} - -resource "snowflake_stream" "stream" { - comment = "A stream." - - database = "database" - schema = "schema" - name = "stream" - - on_table = snowflake_table.table.fully_qualified_name - append_only = false - insert_only = false - - owner = "role1" -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `database` (String) The database in which to create the stream. -- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. -- `schema` (String) The schema in which to create the stream. - -### Optional - -- `append_only` (Boolean) Type of the stream that will be created. -- `comment` (String) Specifies a comment for the stream. 
-- `insert_only` (Boolean) Create an insert only stream type. -- `on_stage` (String) Specifies an identifier for the stage the stream will monitor. -- `on_table` (String) Specifies an identifier for the table the stream will monitor. -- `on_view` (String) Specifies an identifier for the view the stream will monitor. -- `show_initial_rows` (Boolean) Specifies whether to return all existing rows in the source table as row inserts the first time the stream is consumed. - -### Read-Only - -- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). -- `id` (String) The ID of this resource. -- `owner` (String) Name of the role that owns the stream. - -## Import - -Import is supported using the following syntax: - -```shell -# format is database name | schema name | stream name -terraform import snowflake_stream.example 'dbName|schemaName|streamName' -``` diff --git a/docs/resources/tag_masking_policy_association.md b/docs/resources/tag_masking_policy_association.md deleted file mode 100644 index 36d6c8943a..0000000000 --- a/docs/resources/tag_masking_policy_association.md +++ /dev/null @@ -1,85 +0,0 @@ ---- -page_title: "snowflake_tag_masking_policy_association Resource - terraform-provider-snowflake" -subcategory: "" -description: |- - Attach a masking policy to a tag. Requires a current warehouse to be set. Either with SNOWFLAKE_WAREHOUSE env variable or in current session. If no warehouse is provided, a temporary warehouse will be created. ---- - -# snowflake_tag_masking_policy_association (Resource) - -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_tag` - -Attach a masking policy to a tag. Requires a current warehouse to be set. Either with SNOWFLAKE_WAREHOUSE env variable or in current session. 
If no warehouse is provided, a temporary warehouse will be created. - -## Example Usage - -```terraform -# Note: Currently this feature is only available to accounts that are Enterprise Edition (or higher) - -resource "snowflake_database" "test" { - name = "TEST_DB1" - data_retention_time_in_days = 1 -} - -resource "snowflake_database" "test2" { - name = "TEST_DB2" - data_retention_time_in_days = 1 -} - - -resource "snowflake_schema" "test2" { - database = snowflake_database.test2.name - name = "FOOBAR2" - data_retention_days = snowflake_database.test2.data_retention_time_in_days -} - -resource "snowflake_schema" "test" { - database = snowflake_database.test.name - name = "FOOBAR" - data_retention_days = snowflake_database.test.data_retention_time_in_days -} - -resource "snowflake_tag" "this" { - name = upper("test_tag") - database = snowflake_database.test2.name - schema = snowflake_schema.test2.name -} - -resource "snowflake_masking_policy" "example_masking_policy" { - name = "EXAMPLE_MASKING_POLICY" - database = snowflake_database.test.name - schema = snowflake_schema.test.name - value_data_type = "string" - masking_expression = "case when current_role() in ('ACCOUNTADMIN') then val else sha2(val, 512) end" - return_data_type = "string" -} - -resource "snowflake_tag_masking_policy_association" "name" { - tag_id = snowflake_tag.this.fully_qualified_name - masking_policy_id = snowflake_masking_policy.example_masking_policy.fully_qualified_name -} -``` - --> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). - - - -## Schema - -### Required - -- `masking_policy_id` (String) The resource id of the masking policy -- `tag_id` (String) Specifies the identifier for the tag. 
Note: format must follow: "databaseName"."schemaName"."tagName" or "databaseName.schemaName.tagName" or "databaseName|schemaName.tagName" (snowflake_tag.tag.id) - -### Read-Only - -- `id` (String) The ID of this resource. - -## Import - -Import is supported using the following syntax: - -```shell -# format is tag database name | tag schema name | tag name | masking policy database | masking policy schema | masking policy name -terraform import snowflake_tag_masking_policy_association.example 'tag_db|tag_schema|tag_name|mp_db|mp_schema|mp_name' -``` diff --git a/docs/technical-documentation/identifiers_rework_design_decisions.md b/docs/technical-documentation/identifiers_rework_design_decisions.md index b60aad2183..b0edcc776a 100644 --- a/docs/technical-documentation/identifiers_rework_design_decisions.md +++ b/docs/technical-documentation/identifiers_rework_design_decisions.md @@ -18,7 +18,7 @@ * [Conclusions](#conclusions) -This document summarises work done in the [identifiers rework](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#identifiers-rework) and future plans for further identifier improvements. +This document summarises work done in the [identifiers rework](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#identifiers-rework) and future plans for further identifier improvements. But before we dive into results and design decisions, here’s the list of reasons why we decided to rework the identifiers in the first place: - Common issues with identifiers with arguments (identifiers for functions, procedures, and external functions). - Meaningless error messages whenever an invalid identifier is specified. 
@@ -31,31 +31,31 @@ Now, knowing the issues we wanted to solve, we would like to present the changes ## Topics ### New identifier parser -To resolve many of our underlying problems with parsing identifiers, we decided to go with the new one that will be able to correctly parse fully qualified names of objects. -In addition to a better parsing function, we made sure it will return user-friendly error messages that will be able to find the root cause of a problem when specifying invalid identifiers. +To resolve many of our underlying problems with parsing identifiers, we decided to implement a new parser that correctly handles fully qualified names of objects. +In addition to a better parsing function, we made sure it returns user-friendly error messages that point to the root cause of the problem when an invalid identifier is specified. Previously, the error looked like [this](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2091). ### Using the recommended format for account identifiers -Previously, the use of account identifiers was mixed across the resources, in many cases causing confusion ([commonly known issues reference](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/CREATING_ISSUES.md#incorrect-account-identifier-snowflake_databasefrom_share)). -Some of them required an account locator format (that was not fully supported and is currently deprecated), and some of the new recommended ones. -We decided to unify them and use the new account identifier format everywhere. +Previously, the use of account identifiers was mixed across the resources, in many cases causing confusion ([commonly known issues reference](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/CREATING_ISSUES.md#incorrect-account-identifier-snowflake_databasefrom_share)). +Some of them required an account locator format (that was not fully supported), and some required the new recommended format.
+We decided to unify them and use the new account identifier format everywhere. The account locator format is not supported in v1. ### Better handling for identifiers with arguments Previously, the handling of identifiers with arguments was not done fully correctly, causing many issues and confusion on how to use them ([commonly known issues reference](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/CREATING_ISSUES.md#granting-on-functions-or-procedures)). -The main pain point was using them with privilege-granting resources. To address this we had to make two steps. -The first one was adding a dedicated representation of an identifier containing arguments and using it in our SDK. -The second one was additional parsing for the output of SHOW GRANTS in our SDK which was only necessary for functions, +The main pain point was using them with privilege-granting resources. To address this, we had to take two steps. +The first one was adding a dedicated representation of an identifier containing arguments and using it in our SDK. +The second one was additional parsing for the output of SHOW GRANTS in our SDK, which was only necessary for functions, procedures, and external functions that returned non-valid identifier formats. ### Quoting differences -There are many reported issues on identifier quoting and how it is inconsistent across resources and causes plan diffs to enforce certain format (e.g. [#2982](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2982), [#2236](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2236)). -To address that, we decided to add diff suppress on identifier fields that ignore changes related to differences in quotes. -The main root cause of such differences was that Snowflake has specific rules when a given identifier (or part of an identifier) is quoted and when it’s not.
+There are many reported issues on identifier quoting and how it is inconsistent across resources and causes plan diffs to enforce a certain format (e.g. [#2982](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2982), [#2236](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2236)). +To address that, we decided to add diff suppress on identifier fields that ignore changes related to differences in quotes. +The main root cause of such differences was that Snowflake has specific rules when a given identifier (or part of an identifier) is quoted and when it’s not. The diff suppression should make those rules irrelevant whether or not identifiers in your Terraform configuration contain quotes. ### New computed fully qualified name field in resources -With the combination of quotes, old parsing methods, and other factors, it was a struggle to specify the fully qualified name of an object needed (e.g. [#2164](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2164), [#2754](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2754)). -Now, with v0.95.0, every resource that represents an object in Snowflake (e.g. user, role), and not an association (e.g. grants) will have a new computed field named `fully_qualified_name`. +With the combination of quotes, old parsing methods, and other factors, it was a struggle to specify the fully qualified name of an object needed (e.g. [#2164](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2164), [#2754](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2754)). +Now, with v0.95.0, every resource that represents an object in Snowflake (e.g. user, role), and not an association (e.g. grants) will have a new computed field named `fully_qualified_name`.
With the new computed field, it will be much easier to use resources requiring fully qualified names. For examples of usage, head over to the [documentation for granting privileges to account role](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/grant_privileges_to_account_role). ### New resource identifier format @@ -70,11 +70,11 @@ The main limitations around identifiers are strictly connected to what characters - Avoid parentheses ‘(’ and ‘)’ when specifying identifiers for functions, procedures, external functions. Parentheses as part of their identifiers could potentially make our parser split the identifier into wrong parts causing issues. - Do not use double quotes as part of identifiers (in Snowflake you can have double quotes inside identifiers by escaping them with the second double quote, e.g. `create database “test””identifier”` will create a database with name `test"identifier`). -As a general recommendation, please lean toward simple names without any special characters, and if word separation is needed, use underscores. -This also applies to other “identifiers” like column names in tables or argument names in functions. +As a general recommendation, please lean toward simple names without any special characters, and if word separation is needed, use underscores. +This also applies to other “identifiers” like column names in tables or argument names in functions. If you are currently using complex identifiers, we recommend considering migration to simpler identifiers for a more straightforward and less error-prone experience. Also, we want to make it clear that every field specifying an identifier (or its part, e.g. `name`, `database`, `schema`) is always case-sensitive. By specifying -an identifier with lowercase characters in Terraform, you also have to refer to them with lowercase names in quotes in Snowflake.
+an identifier with lowercase characters in Terraform, you also have to refer to it with a lowercase name in quotes in Snowflake. For example, if you specify an account role with `name = "test"`, then to check the privileges granted to that role in Snowflake you have to call: ```sql show grants to role "test"; @@ -82,18 +82,18 @@ show grants to role test; -- this won't work, because unquoted identifiers are c ``` ### New identifier conventions -Although, we are closing the identifiers rework, some resources won’t have the mentioned improvements. -They were mostly applied to the objects that were already prepared for v1 ([essential objects](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/v1-preparations/ESSENTIAL_GA_OBJECTS.MD)). -The remaining resources (and newly created ones) will receive these improvements [during v1 preparation](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1) following our internal guidelines that contain those new rules regarding identifiers. +Although we are closing the identifiers rework, some resources won’t have the mentioned improvements. +They were mostly applied to the objects that were already prepared for v1 ([essential objects](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/v1-preparations/ESSENTIAL_GA_OBJECTS.MD)). +The remaining resources (and newly created ones) will receive these improvements [during v1 preparation](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1) following our internal guidelines that contain those new rules regarding identifiers. No matter if the resource has been refactored or not, the same recommendations mentioned above apply. ## Next steps -While we have completed the identifiers rework for now, we plan to revisit these topics in the future to ensure continued improvements.
+While we have completed the identifiers rework for now, we plan to revisit these topics in the future to ensure continued improvements. In the upcoming phases, we will focus on addressing the following key areas: - Implementing better validations for identifiers. - Providing support for new identifier formats in our resources (e.g. [instance roles](https://docs.snowflake.com/en/sql-reference/snowflake-db-classes#instance-roles)). ## Conclusions -We have concluded the identifiers rework, implementing significant improvements to address common issues and inconsistencies in identifier handling. +We have concluded the identifiers rework, implementing significant improvements to address common issues and inconsistencies in identifier handling. Moving forward, we aim to continue enhancing our identifier functionalities to provide a smoother experience. We value your feedback on the recent changes made to the identifiers. Please share your thoughts and suggestions to help us refine our identifier management further. 
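The identifier rules discussed in the design-decisions document above (parts separated by unquoted dots, optional double quotes around a part, and a literal double quote escaped as `""`, as in the `test"identifier` example) can be sketched with a small parser. Note this is a hypothetical illustration, not the provider's actual implementation, which additionally validates part counts and produces the user-friendly error messages the document describes:

```go
package main

import (
	"fmt"
	"strings"
)

// parseIdentifier splits a fully qualified Snowflake identifier into its
// parts, honoring double-quoted parts and the "" escape for a literal
// double quote. Illustrative sketch only.
func parseIdentifier(s string) ([]string, error) {
	var parts []string
	var cur strings.Builder
	inQuotes := false
	runes := []rune(s)
	for i := 0; i < len(runes); i++ {
		switch c := runes[i]; {
		case c == '"' && inQuotes && i+1 < len(runes) && runes[i+1] == '"':
			cur.WriteRune('"') // escaped quote inside a quoted part
			i++
		case c == '"':
			inQuotes = !inQuotes // enter or leave a quoted part
		case c == '.' && !inQuotes:
			parts = append(parts, cur.String()) // unquoted dot separates parts
			cur.Reset()
		default:
			cur.WriteRune(c)
		}
	}
	if inQuotes {
		return nil, fmt.Errorf("unterminated double quote in identifier %s", s)
	}
	return append(parts, cur.String()), nil
}

func main() {
	parts, err := parseIdentifier(`"my db"."my schema"."test""identifier"`)
	if err != nil {
		panic(err)
	}
	fmt.Println(len(parts), parts[2]) // prints: 3 test"identifier
}
```

Note how `"test""identifier"` resolves to the single part `test"identifier`, matching the escaping example given in the limitations section above.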
diff --git a/examples/additional/deprecated_datasources.MD b/examples/additional/deprecated_datasources.MD index 935ebcfd54..393ab71209 100644 --- a/examples/additional/deprecated_datasources.MD +++ b/examples/additional/deprecated_datasources.MD @@ -1,3 +1,3 @@ -## Currently deprecated datasources + + -- [snowflake_role](./docs/data-sources/role) - use [snowflake_roles](./docs/data-sources/roles) instead diff --git a/examples/additional/deprecated_resources.MD b/examples/additional/deprecated_resources.MD index 557f4af44a..3b52465691 100644 --- a/examples/additional/deprecated_resources.MD +++ b/examples/additional/deprecated_resources.MD @@ -1,8 +1,3 @@ -## Currently deprecated resources + + -- [snowflake_database_old](./docs/resources/database_old) -- [snowflake_oauth_integration](./docs/resources/oauth_integration) -- [snowflake_role](./docs/resources/role) - use [snowflake_account_role](./docs/resources/account_role) instead -- [snowflake_saml_integration](./docs/resources/saml_integration) - use [snowflake_saml2_integration](./docs/resources/saml2_integration) instead -- [snowflake_stream](./docs/resources/stream) -- [snowflake_tag_masking_policy_association](./docs/resources/tag_masking_policy_association) diff --git a/examples/data-sources/snowflake_role/data-source.tf b/examples/data-sources/snowflake_role/data-source.tf deleted file mode 100644 index 49dea38bc1..0000000000 --- a/examples/data-sources/snowflake_role/data-source.tf +++ /dev/null @@ -1,3 +0,0 @@ -data "snowflake_role" "this" { - name = "role1" -} diff --git a/examples/resources/snowflake_database_old/import.sh b/examples/resources/snowflake_database_old/import.sh deleted file mode 100644 index 3ea61a2c21..0000000000 --- a/examples/resources/snowflake_database_old/import.sh +++ /dev/null @@ -1 +0,0 @@ -terraform import snowflake_database_old.example 'database_name' diff --git a/examples/resources/snowflake_database_old/resource.tf b/examples/resources/snowflake_database_old/resource.tf 
deleted file mode 100644 index 2219295495..0000000000 --- a/examples/resources/snowflake_database_old/resource.tf +++ /dev/null @@ -1,30 +0,0 @@ -resource "snowflake_database_old" "simple" { - name = "testing" - comment = "test comment" - data_retention_time_in_days = 3 -} - -resource "snowflake_database_old" "with_replication" { - name = "testing_2" - comment = "test comment 2" - replication_configuration { - accounts = ["test_account1", "test_account_2"] - ignore_edition_check = true - } -} - -resource "snowflake_database_old" "from_replica" { - name = "testing_3" - comment = "test comment" - data_retention_time_in_days = 3 - from_replica = "\"org1\".\"account1\".\"primary_db_name\"" -} - -resource "snowflake_database_old" "from_share" { - name = "testing_4" - comment = "test comment" - from_share = { - provider = "account1_locator" - share = "share1" - } -} diff --git a/examples/resources/snowflake_oauth_integration/import.sh b/examples/resources/snowflake_oauth_integration/import.sh deleted file mode 100644 index cbbb03d1ea..0000000000 --- a/examples/resources/snowflake_oauth_integration/import.sh +++ /dev/null @@ -1 +0,0 @@ -terraform import snowflake_oauth_integration.example name diff --git a/examples/resources/snowflake_oauth_integration/resource.tf b/examples/resources/snowflake_oauth_integration/resource.tf deleted file mode 100644 index d28900d9ce..0000000000 --- a/examples/resources/snowflake_oauth_integration/resource.tf +++ /dev/null @@ -1,8 +0,0 @@ -resource "snowflake_oauth_integration" "tableau_desktop" { - name = "TABLEAU_DESKTOP" - oauth_client = "TABLEAU_DESKTOP" - enabled = true - oauth_issue_refresh_tokens = true - oauth_refresh_token_validity = 3600 - blocked_roles_list = ["SYSADMIN"] -} diff --git a/examples/resources/snowflake_role/import.sh b/examples/resources/snowflake_role/import.sh deleted file mode 100644 index e0d1439a34..0000000000 --- a/examples/resources/snowflake_role/import.sh +++ /dev/null @@ -1 +0,0 @@ -terraform import 
snowflake_role.example "name" diff --git a/examples/resources/snowflake_role/resource.tf b/examples/resources/snowflake_role/resource.tf deleted file mode 100644 index 97d8851c29..0000000000 --- a/examples/resources/snowflake_role/resource.tf +++ /dev/null @@ -1,10 +0,0 @@ -## Minimal -resource "snowflake_role" "minimal" { - name = "role_name" -} - -## Complete (with every optional set) -resource "snowflake_role" "complete" { - name = "role_name" - comment = "my account role" -} diff --git a/examples/resources/snowflake_saml_integration/import.sh b/examples/resources/snowflake_saml_integration/import.sh deleted file mode 100644 index a2356ebbdf..0000000000 --- a/examples/resources/snowflake_saml_integration/import.sh +++ /dev/null @@ -1 +0,0 @@ -terraform import snowflake_saml_integration.example name \ No newline at end of file diff --git a/examples/resources/snowflake_saml_integration/resource.tf b/examples/resources/snowflake_saml_integration/resource.tf deleted file mode 100644 index fbd9bbb6c2..0000000000 --- a/examples/resources/snowflake_saml_integration/resource.tf +++ /dev/null @@ -1,8 +0,0 @@ -resource "snowflake_saml_integration" "saml_integration" { - name = "saml_integration" - saml2_provider = "CUSTOM" - saml2_issuer = "test_issuer" - saml2_sso_url = "https://testsamlissuer.com" - saml2_x509_cert = 
"MIICYzCCAcygAwIBAgIBADANBgkqhkiG9w0BAQUFADAuMQswCQYDVQQGEwJVUzEMMAoGA1UEChMDSUJNMREwDwYDVQQLEwhMb2NhbCBDQTAeFw05OTEyMjIwNTAwMDBaFw0wMDEyMjMwNDU5NTlaMC4xCzAJBgNVBAYTAlVTMQwwCgYDVQQKEwNJQk0xETAPBgNVBAsTCExvY2FsIENBMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQD2bZEo7xGaX2/0GHkrNFZvlxBou9v1Jmt/PDiTMPve8r9FeJAQ0QdvFST/0JPQYD20rH0bimdDLgNdNynmyRoS2S/IInfpmf69iyc2G0TPyRvmHIiOZbdCd+YBHQi1adkj17NDcWj6S14tVurFX73zx0sNoMS79q3tuXKrDsxeuwIDAQABo4GQMIGNMEsGCVUdDwGG+EIBDQQ+EzxHZW5lcmF0ZWQgYnkgdGhlIFNlY3VyZVdheSBTZWN1cml0eSBTZXJ2ZXIgZm9yIE9TLzM5MCAoUkFDRikwDgYDVR0PAQH/BAQDAgAGMA8GA1UdEwEB/wQFMAMBAf8wHQYDVR0OBBYEFJ3+ocRyCTJw067dLSwr/nalx6YMMA0GCSqGSIb3DQEBBQUAA4GBAMaQzt+zaj1GU77yzlr8iiMBXgdQrwsZZWJo5exnAucJAEYQZmOfyLiMD6oYq+ZnfvM0n8G/Y79q8nhwvuxpYOnRSAXFp6xSkrIOeZtJMY1h00LKp/JX3Ng1svZ2agE126JHsQ0bhzN5TKsYfbwfTwfjdWAGy6Vf1nYi/rO+ryMO" - enabled = true -} \ No newline at end of file diff --git a/examples/resources/snowflake_session_parameter/import.sh b/examples/resources/snowflake_session_parameter/import.sh deleted file mode 100644 index 387e43c1ab..0000000000 --- a/examples/resources/snowflake_session_parameter/import.sh +++ /dev/null @@ -1 +0,0 @@ -terraform import snowflake_session_parameter.s diff --git a/examples/resources/snowflake_session_parameter/resource.tf b/examples/resources/snowflake_session_parameter/resource.tf deleted file mode 100644 index c7c0dc2b30..0000000000 --- a/examples/resources/snowflake_session_parameter/resource.tf +++ /dev/null @@ -1,11 +0,0 @@ -resource "snowflake_session_parameter" "s" { - key = "AUTOCOMMIT" - value = "false" - user = "TEST_USER" -} - -resource "snowflake_session_parameter" "s2" { - key = "BINARY_OUTPUT_FORMAT" - value = "BASE64" - on_account = true -} diff --git a/examples/resources/snowflake_stream/import.sh b/examples/resources/snowflake_stream/import.sh deleted file mode 100644 index e8086527ab..0000000000 --- a/examples/resources/snowflake_stream/import.sh +++ /dev/null @@ -1,2 +0,0 @@ -# format is database name | schema name | stream 
name -terraform import snowflake_stream.example 'dbName|schemaName|streamName' diff --git a/examples/resources/snowflake_stream/resource.tf b/examples/resources/snowflake_stream/resource.tf deleted file mode 100644 index aba5459816..0000000000 --- a/examples/resources/snowflake_stream/resource.tf +++ /dev/null @@ -1,24 +0,0 @@ -resource "snowflake_table" "table" { - database = "database" - schema = "schema" - name = "name" - - column { - type = "NUMBER(38,0)" - name = "id" - } -} - -resource "snowflake_stream" "stream" { - comment = "A stream." - - database = "database" - schema = "schema" - name = "stream" - - on_table = snowflake_table.table.fully_qualified_name - append_only = false - insert_only = false - - owner = "role1" -} diff --git a/examples/resources/snowflake_tag_masking_policy_association/import.sh b/examples/resources/snowflake_tag_masking_policy_association/import.sh deleted file mode 100644 index 69a2971cdf..0000000000 --- a/examples/resources/snowflake_tag_masking_policy_association/import.sh +++ /dev/null @@ -1,2 +0,0 @@ -# format is tag database name | tag schema name | tag name | masking policy database | masking policy schema | masking policy name -terraform import snowflake_tag_masking_policy_association.example 'tag_db|tag_schema|tag_name|mp_db|mp_schema|mp_name' \ No newline at end of file diff --git a/examples/resources/snowflake_tag_masking_policy_association/resource.tf b/examples/resources/snowflake_tag_masking_policy_association/resource.tf deleted file mode 100644 index 94f080aa35..0000000000 --- a/examples/resources/snowflake_tag_masking_policy_association/resource.tf +++ /dev/null @@ -1,44 +0,0 @@ -# Note: Currently this feature is only available to accounts that are Enterprise Edition (or higher) - -resource "snowflake_database" "test" { - name = "TEST_DB1" - data_retention_time_in_days = 1 -} - -resource "snowflake_database" "test2" { - name = "TEST_DB2" - data_retention_time_in_days = 1 -} - - -resource "snowflake_schema" "test2" 
{ - database = snowflake_database.test2.name - name = "FOOBAR2" - data_retention_days = snowflake_database.test2.data_retention_time_in_days -} - -resource "snowflake_schema" "test" { - database = snowflake_database.test.name - name = "FOOBAR" - data_retention_days = snowflake_database.test.data_retention_time_in_days -} - -resource "snowflake_tag" "this" { - name = upper("test_tag") - database = snowflake_database.test2.name - schema = snowflake_schema.test2.name -} - -resource "snowflake_masking_policy" "example_masking_policy" { - name = "EXAMPLE_MASKING_POLICY" - database = snowflake_database.test.name - schema = snowflake_schema.test.name - value_data_type = "string" - masking_expression = "case when current_role() in ('ACCOUNTADMIN') then val else sha2(val, 512) end" - return_data_type = "string" -} - -resource "snowflake_tag_masking_policy_association" "name" { - tag_id = snowflake_tag.this.fully_qualified_name - masking_policy_id = snowflake_masking_policy.example_masking_policy.fully_qualified_name -} diff --git a/framework/provider/provider.go b/framework/provider/provider.go index 235a6f3dd1..2ad2a26cd9 100644 --- a/framework/provider/provider.go +++ b/framework/provider/provider.go @@ -77,7 +77,6 @@ type snowflakeProviderModelV0 struct { OauthEndpoint types.String `tfsdk:"oauth_endpoint"` OauthRedirectURL types.String `tfsdk:"oauth_redirect_url"` BrowserAuth types.Bool `tfsdk:"browser_auth"` - PrivateKeyPath types.String `tfsdk:"private_key_path"` SessionParams types.Map `tfsdk:"session_params"` } @@ -111,11 +110,11 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque DeprecationMessage: "Use `user` instead", }, "password": schema.StringAttribute{ - Description: "Password for username+password auth. Cannot be used with `browser_auth` or `private_key_path`. Can also be sourced from the `SNOWFLAKE_PASSWORD` environment variable.", + Description: "Password for username+password auth. Cannot be used with `browser_auth`. 
Can also be sourced from the `SNOWFLAKE_PASSWORD` environment variable.", Optional: true, Sensitive: true, Validators: []validator.String{ - stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("oauth_access_token"), path.MatchRoot("oauth_refresh_token")), + stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("oauth_access_token"), path.MatchRoot("oauth_refresh_token")), }, }, "warehouse": schema.StringAttribute{ @@ -225,7 +224,7 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque Optional: true, Sensitive: true, Validators: []validator.String{ - stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("password"), path.MatchRoot("private_key_path"), path.MatchRoot("oauth_access_token"), path.MatchRoot("oauth_refresh_token")), + stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token"), path.MatchRoot("oauth_refresh_token")), }, }, "private_key_passphrase": schema.StringAttribute{ @@ -233,7 +232,7 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque Optional: true, Sensitive: true, Validators: []validator.String{ - stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("password"), path.MatchRoot("private_key_path"), path.MatchRoot("oauth_access_token"), path.MatchRoot("oauth_refresh_token")), + stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token"), path.MatchRoot("oauth_refresh_token")), }, }, "disable_telemetry": schema.BoolAttribute{ @@ -277,20 +276,20 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque DeprecationMessage: "Use `params` instead", }, 
 		"oauth_access_token": schema.StringAttribute{
-			Description: "Token for use with OAuth. Generating the token is left to other tools. Cannot be used with `browser_auth`, `private_key_path`, `oauth_refresh_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_ACCESS_TOKEN` environment variable.",
+			Description: "Token for use with OAuth. Generating the token is left to other tools. Cannot be used with `browser_auth`, `oauth_refresh_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_ACCESS_TOKEN` environment variable.",
 			Optional:  true,
 			Sensitive: true,
 			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_refresh_token")),
+				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_refresh_token")),
 			},
 			DeprecationMessage: "Use `token` instead",
 		},
 		"oauth_refresh_token": schema.StringAttribute{
-			Description: "Token for use with OAuth. Setup and generation of the token is left to other tools. Should be used in conjunction with `oauth_client_id`, `oauth_client_secret`, `oauth_endpoint`, `oauth_redirect_url`. Cannot be used with `browser_auth`, `private_key_path`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_REFRESH_TOKEN` environment variable.",
+			Description: "Token for use with OAuth. Setup and generation of the token is left to other tools. Should be used in conjunction with `oauth_client_id`, `oauth_client_secret`, `oauth_endpoint`, `oauth_redirect_url`. Cannot be used with `browser_auth`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_REFRESH_TOKEN` environment variable.",
 			Optional:  true,
 			Sensitive: true,
 			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
+				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
 				stringvalidator.AlsoRequires(path.MatchRoot("oauth_client_id"), path.MatchRoot("oauth_client_secret"), path.MatchRoot("oauth_endpoint"), path.MatchRoot("oauth_redirect_url")),
 			},
 			DeprecationMessage: "Use `token_accessor.0.refresh_token` instead",
@@ -300,7 +299,7 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque
 			Optional:  true,
 			Sensitive: true,
 			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
+				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
 				stringvalidator.AlsoRequires(path.MatchRoot("oauth_refresh_token"), path.MatchRoot("oauth_client_secret"), path.MatchRoot("oauth_endpoint"), path.MatchRoot("oauth_redirect_url")),
 			},
 			DeprecationMessage: "Use `token_accessor.0.client_id` instead",
@@ -310,7 +309,7 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque
 			Optional:  true,
 			Sensitive: true,
 			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
+				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
 				stringvalidator.AlsoRequires(path.MatchRoot("oauth_refresh_token"), path.MatchRoot("oauth_client_id"), path.MatchRoot("oauth_endpoint"), path.MatchRoot("oauth_redirect_url")),
 			},
 			DeprecationMessage: "Use `token_accessor.0.client_secret` instead",
@@ -320,7 +319,7 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque
 			Optional:  true,
 			Sensitive: true,
 			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
+				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
 				stringvalidator.AlsoRequires(path.MatchRoot("oauth_refresh_token"), path.MatchRoot("oauth_client_id"), path.MatchRoot("oauth_client_secret"), path.MatchRoot("oauth_redirect_url")),
 			},
 			DeprecationMessage: "Use `token_accessor.0.token_endpoint` instead",
@@ -330,7 +329,7 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque
 			Optional:  true,
 			Sensitive: true,
 			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key_path"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
+				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("private_key"), path.MatchRoot("private_key_passphrase"), path.MatchRoot("password"), path.MatchRoot("oauth_access_token")),
 				stringvalidator.AlsoRequires(path.MatchRoot("oauth_refresh_token"), path.MatchRoot("oauth_client_id"), path.MatchRoot("oauth_client_secret"), path.MatchRoot("oauth_endpoint")),
 			},
 			DeprecationMessage: "Use `token_accessor.0.redirect_uri` instead",
@@ -341,15 +340,6 @@ func (p *SnowflakeProvider) Schema(ctx context.Context, req provider.SchemaReque
 			Sensitive:          false,
 			DeprecationMessage: "Use `authenticator` instead",
 		},
-		"private_key_path": schema.StringAttribute{
-			Description: "Path to a private key for using keypair authentication. Cannot be used with `browser_auth`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_PRIVATE_KEY_PATH` environment variable.",
-			Optional:  true,
-			Sensitive: true,
-			Validators: []validator.String{
-				stringvalidator.ConflictsWith(path.MatchRoot("browser_auth"), path.MatchRoot("oauth_access_token"), path.MatchRoot("password")),
-			},
-			DeprecationMessage: "use the [file Function](https://developer.hashicorp.com/terraform/language/functions/file) instead",
-		},
 	},
 	Blocks: map[string]schema.Block{
 		"token_accessor": schema.ListNestedBlock{
@@ -612,16 +602,12 @@ func (p *SnowflakeProvider) Configure(ctx context.Context, req provider.Configur
 	if data.PrivateKey.ValueString() != "" {
 		privateKey = data.PrivateKey.ValueString()
 	}
-	privateKeyPath := os.Getenv("SNOWFLAKE_PRIVATE_KEY_PATH")
-	if data.PrivateKeyPath.ValueString() != "" {
-		privateKeyPath = data.PrivateKeyPath.ValueString()
-	}
 	privateKeyPassphrase := os.Getenv("SNOWFLAKE_PRIVATE_KEY_PASSPHRASE")
 	if data.PrivateKeyPassphrase.ValueString() != "" {
 		privateKeyPassphrase = data.PrivateKeyPassphrase.ValueString()
 	}
-	if privateKey != "" || privateKeyPath != "" {
-		if v, err := getPrivateKey(privateKeyPath, privateKey, privateKeyPassphrase); err != nil && v != nil {
+	if privateKey != "" {
+		if v, err := getPrivateKey(privateKey, privateKeyPassphrase); err != nil && v != nil {
 			config.PrivateKey = v
 		}
 	}
diff --git a/framework/provider/provider_helpers.go b/framework/provider/provider_helpers.go
index 76dac3c27c..424a57c435 100644
--- a/framework/provider/provider_helpers.go
+++ b/framework/provider/provider_helpers.go
@@ -13,21 +13,13 @@ import (
 	"strconv"
 	"strings"
 
-	"github.com/mitchellh/go-homedir"
 	"github.com/snowflakedb/gosnowflake"
 	"github.com/youmark/pkcs8"
 	"golang.org/x/crypto/ssh"
 )
 
-func getPrivateKey(privateKeyPath, privateKeyString, privateKeyPassphrase string) (*rsa.PrivateKey, error) {
+func getPrivateKey(privateKeyString, privateKeyPassphrase string) (*rsa.PrivateKey, error) {
 	privateKeyBytes := []byte(privateKeyString)
-	var err error
-	if len(privateKeyBytes) == 0 && privateKeyPath != "" {
-		privateKeyBytes, err = readFile(privateKeyPath)
-		if err != nil {
-			return nil, fmt.Errorf("private Key file could not be read err = %w", err)
-		}
-	}
 	return parsePrivateKey(privateKeyBytes, []byte(privateKeyPassphrase))
 }
@@ -79,24 +71,6 @@ func getBoolEnv(key string, defaultValue bool) bool {
 	}
 }
 
-func readFile(privateKeyPath string) ([]byte, error) {
-	expandedPrivateKeyPath, err := homedir.Expand(privateKeyPath)
-	if err != nil {
-		return nil, fmt.Errorf("invalid Path to private key err = %w", err)
-	}
-
-	privateKeyBytes, err := os.ReadFile(expandedPrivateKeyPath)
-	if err != nil {
-		return nil, fmt.Errorf("could not read private key err = %w", err)
-	}
-
-	if len(privateKeyBytes) == 0 {
-		return nil, errors.New("private key is empty")
-	}
-
-	return privateKeyBytes, nil
-}
-
 func parsePrivateKey(privateKeyBytes []byte, passhrase []byte) (*rsa.PrivateKey, error) {
 	privateKeyBlock, _ := pem.Decode(privateKeyBytes)
 	if privateKeyBlock == nil {
diff --git a/go.mod b/go.mod
index 7f42d3fbd8..7f39239bc6 100644
--- a/go.mod
+++ b/go.mod
@@ -19,7 +19,6 @@ require (
 	github.com/hashicorp/terraform-plugin-testing v1.6.0
 	github.com/jmoiron/sqlx v1.3.5
 	github.com/luna-duclos/instrumentedsql v1.1.3
-	github.com/mitchellh/go-homedir v1.1.0
 	github.com/pelletier/go-toml/v2 v2.1.1
 	github.com/snowflakedb/gosnowflake v1.10.0
 	github.com/stretchr/testify v1.8.4
diff --git a/go.sum b/go.sum
index f73906bfd7..322bf89a27 100644
--- a/go.sum
+++ b/go.sum
@@ -229,8 +229,6 @@ github.com/mattn/go-sqlite3 v1.14.6 h1:dNPt6NO46WmLVt2DLNpwczCmdV5boIZ6g/tlDrlRU
 github.com/mattn/go-sqlite3 v1.14.6/go.mod h1:NyWgC/yNuGj7Q9rpYnZvas74GogHl5/Z4A/KQRfk6bU=
 github.com/mitchellh/copystructure v1.2.0 h1:vpKXTN4ewci03Vljg/q9QvCGUDttBOGBIa15WveJJGw=
 github.com/mitchellh/copystructure v1.2.0/go.mod h1:qLl+cE2AmVv+CoeAwDPye/v+N2HKCj9FbZEVFJRxO9s=
-github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=
-github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
 github.com/mitchellh/go-testing-interface v1.14.1 h1:jrgshOhYAUVNMAJiKbEu7EqAwgJJ2JqpQmpLJOu07cU=
 github.com/mitchellh/go-testing-interface v1.14.1/go.mod h1:gfgS7OtZj6MA4U1UrDRp04twqAjfvlZyCfX3sDjEym8=
 github.com/mitchellh/go-wordwrap v1.0.1 h1:TLuKupo69TCn6TQSyGxwI1EblZZEsQ0vMlAFQflz0v0=
diff --git a/pkg/acceptance/check_destroy.go b/pkg/acceptance/check_destroy.go
index 57145b726f..6660908b23 100644
--- a/pkg/acceptance/check_destroy.go
+++ b/pkg/acceptance/check_destroy.go
@@ -111,9 +111,6 @@ var showByIdFunctions = map[resources.Resource]showByIdFunc{
 	resources.Database: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
 		return runShowById(ctx, id, client.Databases.ShowByID)
 	},
-	resources.DatabaseOld: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
-		return runShowById(ctx, id, client.Databases.ShowByID)
-	},
 	resources.DatabaseRole: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
 		return runShowById(ctx, id, client.DatabaseRoles.ShowByID)
 	},
@@ -183,9 +180,6 @@ var showByIdFunctions = map[resources.Resource]showByIdFunc{
 	resources.ResourceMonitor: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
 		return runShowById(ctx, id, client.ResourceMonitors.ShowByID)
 	},
-	resources.Role: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
-		return runShowById(ctx, id, client.Roles.ShowByID)
-	},
 	resources.RowAccessPolicy: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
 		return runShowById(ctx, id, client.RowAccessPolicies.ShowByID)
 	},
@@ -234,9 +228,6 @@ var showByIdFunctions = map[resources.Resource]showByIdFunc{
 	resources.StorageIntegration: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
 		return runShowById(ctx, id, client.StorageIntegrations.ShowByID)
 	},
-	resources.Stream: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
-		return runShowById(ctx, id, client.Streams.ShowByID)
-	},
 	resources.StreamOnDirectoryTable: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error {
 		return runShowById(ctx, id, client.Streams.ShowByID)
 	},
diff --git a/pkg/datasources/parameters_acceptance_test.go b/pkg/datasources/parameters_acceptance_test.go
index 29c9dfba7a..9fbef94d26 100644
--- a/pkg/datasources/parameters_acceptance_test.go
+++ b/pkg/datasources/parameters_acceptance_test.go
@@ -6,8 +6,6 @@ import (
 	acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance"
 
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs"
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk"
 	"github.com/hashicorp/terraform-plugin-testing/helper/resource"
 	"github.com/hashicorp/terraform-plugin-testing/tfversion"
 )
@@ -99,38 +97,6 @@ func TestAcc_Parameters_TransactionAbortOnErrorCanBeSet(t *testing.T) {
 	})
 }
 
-// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2353 is fixed
-// done on user, to not interfere with other parallel tests on the same account
-func TestAcc_Parameters_QuotedIdentifiersIgnoreCaseCanBeSet(t *testing.T) {
-	_ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance)
-
-	user, userCleanup := acc.TestClient().User.CreateUser(t)
-	t.Cleanup(userCleanup)
-
-	resource.Test(t, resource.TestCase{
-		ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories,
-		PreCheck:                 func() { acc.TestAccPreCheck(t) },
-		TerraformVersionChecks: []tfversion.TerraformVersionCheck{
-			tfversion.RequireAbove(tfversion.Version1_5_0),
-		},
-		Steps: []resource.TestStep{
-			{
-				Config: sessionParameterOnUser(user.ID()),
-			},
-		},
-	})
-}
-
-func sessionParameterOnUser(userId sdk.AccountObjectIdentifier) string {
-	return fmt.Sprintf(
-		`
-	resource "snowflake_session_parameter" "test" {
-		key   = "QUOTED_IDENTIFIERS_IGNORE_CASE"
-		value = "true"
-		user  = %[1]s
-	}`, userId.FullyQualifiedName())
-}
-
 func parametersConfigOnAccount() string {
 	return `data "snowflake_parameters" "p" {
 		parameter_type = "ACCOUNT"
diff --git a/pkg/datasources/role.go b/pkg/datasources/role.go
deleted file mode 100644
index 4bf1ddbb1b..0000000000
--- a/pkg/datasources/role.go
+++ /dev/null
@@ -1,65 +0,0 @@
-package datasources
-
-import (
-	"context"
-	"log"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk"
-	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
-)
-
-var roleSchema = map[string]*schema.Schema{
-	"name": {
-		Type:        schema.TypeString,
-		Required:    true,
-		Description: "The role for which to return metadata.",
-	},
-	"comment": {
-		Type:        schema.TypeString,
-		Computed:    true,
-		Description: "The comment on the role",
-	},
-}
-
-// Role Snowflake Role resource.
-func Role() *schema.Resource {
-	return &schema.Resource{
-		Read:               ReadRole,
-		Schema:             roleSchema,
-		DeprecationMessage: "This resource is deprecated and will be removed in a future major version release. Please use snowflake_roles instead.",
-		Importer: &schema.ResourceImporter{
-			StateContext: schema.ImportStatePassthroughContext,
-		},
-	}
-}
-
-// ReadRole Reads the database metadata information.
-func ReadRole(d *schema.ResourceData, meta interface{}) error {
-	client := meta.(*provider.Context).Client
-	ctx := context.Background()
-
-	roleId, err := sdk.ParseAccountObjectIdentifier(d.Get("name").(string))
-	if err != nil {
-		return err
-	}
-
-	role, err := client.Roles.ShowByID(ctx, roleId)
-	if err != nil {
-		log.Printf("[DEBUG] role (%s) not found", roleId.Name())
-		d.SetId("")
-		return nil
-	}
-
-	d.SetId(helpers.EncodeResourceIdentifier(role.ID()))
-	if err := d.Set("name", role.Name); err != nil {
-		return err
-	}
-	if err := d.Set("comment", role.Comment); err != nil {
-		return err
-	}
-	return nil
-}
diff --git a/pkg/datasources/role_acceptance_test.go b/pkg/datasources/role_acceptance_test.go
deleted file mode 100644
index b79c0d162d..0000000000
--- a/pkg/datasources/role_acceptance_test.go
+++ /dev/null
@@ -1,48 +0,0 @@
-package datasources_test
-
-import (
-	"fmt"
-	"testing"
-
-	acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random"
-	"github.com/hashicorp/terraform-plugin-testing/helper/resource"
-	"github.com/hashicorp/terraform-plugin-testing/tfversion"
-)
-
-func TestAcc_Role(t *testing.T) {
-	roleName := acc.TestClient().Ids.Alpha()
-	comment := random.Comment()
-
-	resource.Test(t, resource.TestCase{
-		ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories,
-		PreCheck:                 func() { acc.TestAccPreCheck(t) },
-		TerraformVersionChecks: []tfversion.TerraformVersionCheck{
-			tfversion.RequireAbove(tfversion.Version1_5_0),
-		},
-		CheckDestroy: nil,
-		Steps: []resource.TestStep{
-			{
-				Config: role(roleName, comment),
-				Check: resource.ComposeTestCheckFunc(
-					resource.TestCheckResourceAttr("data.snowflake_role.t", "name", roleName),
-					resource.TestCheckResourceAttr("data.snowflake_role.t", "comment", comment),
-				),
-			},
-		},
-	})
-}
-
-func role(roleName, comment string) string {
-	return fmt.Sprintf(`
-	resource snowflake_role "test_role" {
-		name    = "%v"
-		comment = "%v"
-	}
-	data snowflake_role "t" {
-		depends_on = [snowflake_role.test_role]
-		name       = "%v"
-	}
-	`, roleName, comment, roleName)
-}
diff --git a/pkg/internal/snowflakeenvs/snowflake_environment_variables.go b/pkg/internal/snowflakeenvs/snowflake_environment_variables.go
index 91f817c655..8440f0092c 100644
--- a/pkg/internal/snowflakeenvs/snowflake_environment_variables.go
+++ b/pkg/internal/snowflakeenvs/snowflake_environment_variables.go
@@ -1,11 +1,9 @@
 package snowflakeenvs
 
 const (
-	Account          = "SNOWFLAKE_ACCOUNT"
 	AccountName      = "SNOWFLAKE_ACCOUNT_NAME"
 	OrganizationName = "SNOWFLAKE_ORGANIZATION_NAME"
 	User             = "SNOWFLAKE_USER"
-	Username         = "SNOWFLAKE_USERNAME"
 	Password         = "SNOWFLAKE_PASSWORD"
 	Warehouse        = "SNOWFLAKE_WAREHOUSE"
 	Role             = "SNOWFLAKE_ROLE"
diff --git a/pkg/internal/tools/doc-gen-helper/main.go b/pkg/internal/tools/doc-gen-helper/main.go
index d476136dea..887dafbc56 100644
--- a/pkg/internal/tools/doc-gen-helper/main.go
+++ b/pkg/internal/tools/doc-gen-helper/main.go
@@ -2,7 +2,6 @@ package main
 
 import (
 	"bytes"
-	"io"
 	"log"
 	"os"
 	"path/filepath"
@@ -72,25 +71,22 @@ func main() {
 		}
 	}
 
-	var deprecatedResourcesBuffer bytes.Buffer
-	printTo(&deprecatedResourcesBuffer, DeprecatedResourcesTemplate, DeprecatedResourcesContext{deprecatedResources})
-
-	var deprecatedDatasourcesBuffer bytes.Buffer
-	printTo(&deprecatedDatasourcesBuffer, DeprecatedDatasourcesTemplate, DeprecatedDatasourcesContext{deprecatedDatasources})
-
-	err := os.WriteFile(filepath.Join(additionalExamplesPath, deprecatedResourcesFilename), deprecatedResourcesBuffer.Bytes(), 0o600)
+	err := printTo(DeprecatedResourcesTemplate, DeprecatedResourcesContext{deprecatedResources}, filepath.Join(additionalExamplesPath, deprecatedResourcesFilename))
 	if err != nil {
-		log.Panicln(err)
+		log.Fatal(err)
 	}
-	err = os.WriteFile(filepath.Join(additionalExamplesPath, deprecatedDatasourcesFilename), deprecatedDatasourcesBuffer.Bytes(), 0o600)
+
+	err = printTo(DeprecatedDatasourcesTemplate, DeprecatedDatasourcesContext{deprecatedDatasources}, filepath.Join(additionalExamplesPath, deprecatedDatasourcesFilename))
 	if err != nil {
-		log.Panicln(err)
+		log.Fatal(err)
 	}
 }
 
-func printTo(writer io.Writer, template *template.Template, model any) {
-	err := template.Execute(writer, model)
+func printTo(template *template.Template, model any, filepath string) error {
+	var writer bytes.Buffer
+	err := template.Execute(&writer, model)
 	if err != nil {
-		log.Panicln(err)
+		return err
 	}
+	return os.WriteFile(filepath, writer.Bytes(), 0o600)
 }
diff --git a/pkg/internal/tools/doc-gen-helper/templates.go b/pkg/internal/tools/doc-gen-helper/templates.go
index d4dbc16f86..7121145a68 100644
--- a/pkg/internal/tools/doc-gen-helper/templates.go
+++ b/pkg/internal/tools/doc-gen-helper/templates.go
@@ -3,7 +3,8 @@ package main
 import "text/template"
 
 var DeprecatedResourcesTemplate, _ = template.New("deprecatedResourcesTemplate").Parse(
-	`## Currently deprecated resources
+	`
+{{if gt (len .Resources) 0}} ## Currently deprecated resources {{end}}
 {{ range .Resources -}}
 - {{ .NameRelativeLink }}{{ if .ReplacementRelativeLink }} - use {{ .ReplacementRelativeLink }} instead{{ end }}
@@ -11,7 +12,8 @@ var DeprecatedResourcesTemplate, _ = template.New("deprecatedResourcesTemplate")
 )
 
 var DeprecatedDatasourcesTemplate, _ = template.New("deprecatedDatasourcesTemplate").Parse(
-	`## Currently deprecated datasources
+	`
+{{if gt (len .Datasources) 0}} ## Currently deprecated data sources {{end}}
 {{ range .Datasources -}}
 - {{ .NameRelativeLink }}{{ if .ReplacementRelativeLink }} - use {{ .ReplacementRelativeLink }} instead{{ end }}
diff --git a/pkg/provider/provider.go b/pkg/provider/provider.go
index 94318dc77f..e787fd546b 100644
--- a/pkg/provider/provider.go
+++ b/pkg/provider/provider.go
@@ -75,7 +75,7 @@ func Provider() *schema.Provider {
 			Optional:    true,
 			Sensitive:   true,
 			DefaultFunc: schema.EnvDefaultFunc(snowflakeenvs.Password, nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "oauth_access_token", "oauth_refresh_token"},
+			ConflictsWith: []string{"private_key", "private_key_passphrase"},
 		},
 		"warehouse": {
 			Type: schema.TypeString,
@@ -133,7 +133,7 @@ func Provider() *schema.Provider {
 		},
 		"authenticator": {
 			Type:             schema.TypeString,
-			Description:      envNameFieldDescription(fmt.Sprintf("Specifies the [authentication type](https://pkg.go.dev/github.com/snowflakedb/gosnowflake#AuthType) to use when connecting to Snowflake. Valid options are: %v. Value `JWT` is deprecated and will be removed in future releases.", docs.PossibleValuesListed(sdk.AllAuthenticationTypes)), snowflakeenvs.Authenticator),
+			Description:      envNameFieldDescription(fmt.Sprintf("Specifies the [authentication type](https://pkg.go.dev/github.com/snowflakedb/gosnowflake#AuthType) to use when connecting to Snowflake. Valid options are: %v.", docs.PossibleValuesListed(sdk.AllAuthenticationTypes)), snowflakeenvs.Authenticator),
 			Optional:         true,
 			DefaultFunc:      schema.EnvDefaultFunc(snowflakeenvs.Authenticator, string(sdk.AuthenticationTypeSnowflake)),
 			ValidateDiagFunc: validators.NormalizeValidation(sdk.ToAuthenticatorType),
@@ -274,11 +274,11 @@ func Provider() *schema.Provider {
 		},
 		"private_key": {
 			Type:        schema.TypeString,
-			Description: envNameFieldDescription("Private Key for username+private-key auth. Cannot be used with `browser_auth` or `password`.", snowflakeenvs.PrivateKey),
+			Description: envNameFieldDescription("Private Key for username+private-key auth. Cannot be used with `password`.", snowflakeenvs.PrivateKey),
 			Optional:    true,
 			Sensitive:   true,
 			DefaultFunc: schema.EnvDefaultFunc(snowflakeenvs.PrivateKey, nil),
-			ConflictsWith: []string{"browser_auth", "password", "oauth_access_token", "private_key_path", "oauth_refresh_token"},
+			ConflictsWith: []string{"password"},
 		},
 		"private_key_passphrase": {
 			Type: schema.TypeString,
@@ -286,7 +286,7 @@ func Provider() *schema.Provider {
 			Optional:    true,
 			Sensitive:   true,
 			DefaultFunc: schema.EnvDefaultFunc(snowflakeenvs.PrivateKeyPassphrase, nil),
-			ConflictsWith: []string{"browser_auth", "password", "oauth_access_token", "oauth_refresh_token"},
+			ConflictsWith: []string{"password"},
 		},
 		"disable_telemetry": {
 			Type: schema.TypeBool,
@@ -356,110 +356,6 @@ func Provider() *schema.Provider {
 			Optional:    true,
 			DefaultFunc: schema.EnvDefaultFunc(snowflakeenvs.Profile, "default"),
 		},
-		// Deprecated attributes
-		"account": {
-			Type:        schema.TypeString,
-			Description: envNameFieldDescription("Use `account_name` and `organization_name` instead. Specifies your Snowflake account identifier assigned, by Snowflake. The [account locator](https://docs.snowflake.com/en/user-guide/admin-account-identifier#format-2-account-locator-in-a-region) format is not supported. For information about account identifiers, see the [Snowflake documentation](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html). Required unless using `profile`.", snowflakeenvs.Account),
-			Optional:    true,
-			DefaultFunc: schema.EnvDefaultFunc(snowflakeenvs.Account, nil),
-			Deprecated:  "Use `account_name` and `organization_name` instead of `account`",
-		},
-		"username": {
-			Type:        schema.TypeString,
-			Description: envNameFieldDescription("Username for user + password authentication. Required unless using `profile`.", snowflakeenvs.Username),
-			Optional:    true,
-			DefaultFunc: schema.EnvDefaultFunc(snowflakeenvs.Username, nil),
-			Deprecated:  "Use `user` instead of `username`",
-		},
-		"region": {
-			Type:        schema.TypeString,
-			Description: "Snowflake region, such as \"eu-central-1\", with this parameter. However, since this parameter is deprecated, it is best to specify the region as part of the account parameter. For details, see the description of the account parameter. [Snowflake region](https://docs.snowflake.com/en/user-guide/intro-regions.html) to use. Required if using the [legacy format for the `account` identifier](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#format-2-legacy-account-locator-in-a-region) in the form of `.`. Can also be sourced from the `SNOWFLAKE_REGION` environment variable. ",
-			Optional:    true,
-			DefaultFunc: schema.EnvDefaultFunc("SNOWFLAKE_REGION", nil),
-			Deprecated:  "Specify the region as part of the account parameter",
-		},
-		"session_params": {
-			Type:        schema.TypeMap,
-			Description: "Sets session parameters. [Parameters](https://docs.snowflake.com/en/sql-reference/parameters)",
-			Optional:    true,
-			Deprecated:  "Use `params` instead",
-		},
-		"oauth_access_token": {
-			Type:          schema.TypeString,
-			Description:   "Token for use with OAuth. Generating the token is left to other tools. Cannot be used with `browser_auth`, `private_key_path`, `oauth_refresh_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_ACCESS_TOKEN` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_OAUTH_ACCESS_TOKEN", nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "password", "oauth_refresh_token"},
-			Deprecated:    "Use `token` instead",
-		},
-		"oauth_refresh_token": {
-			Type:          schema.TypeString,
-			Description:   "Token for use with OAuth. Setup and generation of the token is left to other tools. Should be used in conjunction with `oauth_client_id`, `oauth_client_secret`, `oauth_endpoint`, `oauth_redirect_url`. Cannot be used with `browser_auth`, `private_key_path`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_OAUTH_REFRESH_TOKEN` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_OAUTH_REFRESH_TOKEN", nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "password", "oauth_access_token"},
-			RequiredWith:  []string{"oauth_client_id", "oauth_client_secret", "oauth_endpoint", "oauth_redirect_url"},
-			Deprecated:    "Use `token_accessor.0.refresh_token` instead",
-		},
-		"oauth_client_id": {
-			Type:          schema.TypeString,
-			Description:   "Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_CLIENT_ID` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_OAUTH_CLIENT_ID", nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "password", "oauth_access_token"},
-			RequiredWith:  []string{"oauth_refresh_token", "oauth_client_secret", "oauth_endpoint", "oauth_redirect_url"},
-			Deprecated:    "Use `token_accessor.0.client_id` instead",
-		},
-		"oauth_client_secret": {
-			Type:          schema.TypeString,
-			Description:   "Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_CLIENT_SECRET` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_OAUTH_CLIENT_SECRET", nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "password", "oauth_access_token"},
-			RequiredWith:  []string{"oauth_client_id", "oauth_refresh_token", "oauth_endpoint", "oauth_redirect_url"},
-			Deprecated:    "Use `token_accessor.0.client_secret` instead",
-		},
-		"oauth_endpoint": {
-			Type:          schema.TypeString,
-			Description:   "Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_ENDPOINT` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_OAUTH_ENDPOINT", nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "password", "oauth_access_token"},
-			RequiredWith:  []string{"oauth_client_id", "oauth_client_secret", "oauth_refresh_token", "oauth_redirect_url"},
-			Deprecated:    "Use `token_accessor.0.token_endpoint` instead",
-		},
-		"oauth_redirect_url": {
-			Type:          schema.TypeString,
-			Description:   "Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_OAUTH_REDIRECT_URL` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_OAUTH_REDIRECT_URL", nil),
-			ConflictsWith: []string{"browser_auth", "private_key_path", "private_key", "private_key_passphrase", "password", "oauth_access_token"},
-			RequiredWith:  []string{"oauth_client_id", "oauth_client_secret", "oauth_endpoint", "oauth_refresh_token"},
-			Deprecated:    "Use `token_accessor.0.redirect_uri` instead",
-		},
-		"browser_auth": {
-			Type:        schema.TypeBool,
-			Description: "Required when `oauth_refresh_token` is used. Can also be sourced from `SNOWFLAKE_USE_BROWSER_AUTH` environment variable.",
-			Optional:    true,
-			Sensitive:   false,
-			DefaultFunc: schema.EnvDefaultFunc("SNOWFLAKE_USE_BROWSER_AUTH", nil),
-			Deprecated:  "Use `authenticator` instead",
-		},
-		"private_key_path": {
-			Type:          schema.TypeString,
-			Description:   "Path to a private key for using keypair authentication. Cannot be used with `browser_auth`, `oauth_access_token` or `password`. Can also be sourced from `SNOWFLAKE_PRIVATE_KEY_PATH` environment variable.",
-			Optional:      true,
-			Sensitive:     true,
-			DefaultFunc:   schema.EnvDefaultFunc("SNOWFLAKE_PRIVATE_KEY_PATH", nil),
-			ConflictsWith: []string{"browser_auth", "password", "oauth_access_token", "private_key"},
-			Deprecated:    "use the [file Function](https://developer.hashicorp.com/terraform/language/functions/file) instead",
-		},
 	},
 	ResourcesMap:   getResources(),
 	DataSourcesMap: getDataSources(),
@@ -482,7 +378,6 @@ func getResources() map[string]*schema.Resource {
 		"snowflake_api_integration":       resources.APIIntegration(),
 		"snowflake_authentication_policy": resources.AuthenticationPolicy(),
 		"snowflake_cortex_search_service": resources.CortexSearchService(),
-		"snowflake_database_old":          resources.DatabaseOld(),
 		"snowflake_database":              resources.Database(),
 		"snowflake_database_role":         resources.DatabaseRole(),
 		"snowflake_dynamic_table":         resources.DynamicTable(),
@@ -509,7 +404,6 @@ func getResources() map[string]*schema.Resource {
 		"snowflake_network_policy_attachment": resources.NetworkPolicyAttachment(),
 		"snowflake_network_rule":              resources.NetworkRule(),
 		"snowflake_notification_integration":  resources.NotificationIntegration(),
-		"snowflake_oauth_integration":         resources.OAuthIntegration(),
 		"snowflake_oauth_integration_for_partner_applications": resources.OauthIntegrationForPartnerApplications(),
 		"snowflake_oauth_integration_for_custom_clients":       resources.OauthIntegrationForCustomClients(),
 		"snowflake_object_parameter":                           resources.ObjectParameter(),
@@ -518,9 +412,7 @@ func getResources() map[string]*schema.Resource {
 		"snowflake_primary_connection": resources.PrimaryConnection(),
 		"snowflake_procedure":          resources.Procedure(),
 		"snowflake_resource_monitor":   resources.ResourceMonitor(),
-		"snowflake_role":               resources.Role(),
 		"snowflake_row_access_policy":  resources.RowAccessPolicy(),
-		"snowflake_saml_integration":   resources.SAMLIntegration(),
 		"snowflake_saml2_integration":  resources.SAML2Integration(),
 		"snowflake_schema":             resources.Schema(),
 		"snowflake_scim_integration":   resources.SCIMIntegration(),
@@ -532,12 +424,10 @@ func getResources() map[string]*schema.Resource {
 		"snowflake_secret_with_generic_string": resources.SecretWithGenericString(),
 		"snowflake_sequence":                   resources.Sequence(),
 		"snowflake_service_user":               resources.ServiceUser(),
-		"snowflake_session_parameter":          resources.SessionParameter(),
 		"snowflake_share":                      resources.Share(),
 		"snowflake_shared_database":            resources.SharedDatabase(),
 		"snowflake_stage":                      resources.Stage(),
 		"snowflake_storage_integration":        resources.StorageIntegration(),
-		"snowflake_stream":                     resources.Stream(),
 		"snowflake_stream_on_directory_table":  resources.StreamOnDirectoryTable(),
 		"snowflake_stream_on_external_table":   resources.StreamOnExternalTable(),
 		"snowflake_stream_on_table":            resources.StreamOnTable(),
@@ -548,7 +438,6 @@ func getResources() map[string]*schema.Resource {
 		"snowflake_table_constraint":  resources.TableConstraint(),
 		"snowflake_tag":               resources.Tag(),
 		"snowflake_tag_association":   resources.TagAssociation(),
-		"snowflake_tag_masking_policy_association": resources.TagMaskingPolicyAssociation(),
 		"snowflake_task":           resources.Task(),
 		"snowflake_unsafe_execute": resources.UnsafeExecute(),
 		"snowflake_user":           resources.User(),
@@ -592,7 +481,6 @@ func getDataSources() map[string]*schema.Resource {
 		"snowflake_pipes":             datasources.Pipes(),
 		"snowflake_procedures":        datasources.Procedures(),
 		"snowflake_resource_monitors": datasources.ResourceMonitors(),
-		"snowflake_role":              datasources.Role(),
"snowflake_roles": datasources.Roles(), "snowflake_row_access_policies": datasources.RowAccessPolicies(), "snowflake_schemas": datasources.Schemas(), @@ -794,14 +682,6 @@ func getDriverConfigFromTerraform(s *schema.ResourceData) (*gosnowflake.Config, handleBooleanStringAttribute(s, "disable_console_login", &config.DisableConsoleLogin), // profile is handled in the calling function // TODO(SNOW-1761318): handle DisableSamlURLCheck after upgrading the driver to at least 1.10.1 - - // deprecated - handleStringField(s, "account", &config.Account), - handleStringField(s, "username", &config.User), - handleStringField(s, "region", &config.Region), - // session params are handled below - // browser auth is handled below - // private key path is handled below ) if err != nil { return nil, err @@ -819,11 +699,6 @@ func getDriverConfigFromTerraform(s *schema.ResourceData) (*gosnowflake.Config, m = v.(map[string]interface{}) } - // backwards compatibility until we can remove this - if v, ok := s.GetOk("session_params"); ok { - m = v.(map[string]interface{}) - } - params := make(map[string]*string) for key, value := range m { strValue := value.(string) @@ -831,11 +706,6 @@ func getDriverConfigFromTerraform(s *schema.ResourceData) (*gosnowflake.Config, } config.Params = params - // backwards compatibility until we can remove this - if v, ok := s.GetOk("browser_auth"); ok && v.(bool) { - config.Authenticator = gosnowflake.AuthTypeExternalBrowser - } - if v, ok := s.GetOk("token_accessor"); ok { if len(v.([]any)) > 0 { tokenAccessor := v.([]any)[0].(map[string]any) @@ -853,10 +723,9 @@ func getDriverConfigFromTerraform(s *schema.ResourceData) (*gosnowflake.Config, } } - privateKeyPath := s.Get("private_key_path").(string) privateKey := s.Get("private_key").(string) privateKeyPassphrase := s.Get("private_key_passphrase").(string) - v, err := getPrivateKey(privateKeyPath, privateKey, privateKeyPassphrase) + v, err := getPrivateKey(privateKey, privateKeyPassphrase) if err != nil { 
return nil, fmt.Errorf("could not retrieve private key: %w", err) } diff --git a/pkg/provider/provider_acceptance_test.go b/pkg/provider/provider_acceptance_test.go index 0d0daf7b67..9b8dcca9fa 100644 --- a/pkg/provider/provider_acceptance_test.go +++ b/pkg/provider/provider_acceptance_test.go @@ -30,7 +30,6 @@ func TestAcc_Provider_configHierarchy(t *testing.T) { user := acc.DefaultConfig(t).User pass := acc.DefaultConfig(t).Password - account := acc.DefaultConfig(t).Account role := acc.DefaultConfig(t).Role host := acc.DefaultConfig(t).Host @@ -104,7 +103,11 @@ func TestAcc_Provider_configHierarchy(t *testing.T) { testenvs.AssertEnvSet(t, snowflakeenvs.ConfigPath) t.Setenv(snowflakeenvs.User, user) t.Setenv(snowflakeenvs.Password, pass) - t.Setenv(snowflakeenvs.Account, account) + + accountId := configAccountId(t, acc.DefaultConfig(t)) + t.Setenv(snowflakeenvs.OrganizationName, accountId.OrganizationName()) + t.Setenv(snowflakeenvs.AccountName, accountId.AccountName()) + t.Setenv(snowflakeenvs.Role, role) t.Setenv(snowflakeenvs.Host, host) }, @@ -117,7 +120,8 @@ func TestAcc_Provider_configHierarchy(t *testing.T) { testenvs.AssertEnvSet(t, snowflakeenvs.ConfigPath) testenvs.AssertEnvSet(t, snowflakeenvs.User) testenvs.AssertEnvSet(t, snowflakeenvs.Password) - testenvs.AssertEnvSet(t, snowflakeenvs.Account) + testenvs.AssertEnvSet(t, snowflakeenvs.OrganizationName) + testenvs.AssertEnvSet(t, snowflakeenvs.AccountName) testenvs.AssertEnvSet(t, snowflakeenvs.Role) testenvs.AssertEnvSet(t, snowflakeenvs.Host) }, @@ -130,6 +134,13 @@ func TestAcc_Provider_configHierarchy(t *testing.T) { }) } +func configAccountId(t *testing.T, cfg *gosnowflake.Config) sdk.AccountIdentifier { + t.Helper() + accountIdRaw := cfg.Account + parts := strings.SplitN(accountIdRaw, "-", 2) + return sdk.NewAccountIdentifier(parts[0], parts[1]) +} + func TestAcc_Provider_configureClientOnceSwitching(t *testing.T) { t.Setenv(string(testenvs.ConfigureClientOnce), "") @@ -567,11 +578,6 @@ func 
TestAcc_Provider_JwtAuth(t *testing.T) { { Config: providerConfigWithAuthenticator(testprofiles.JwtAuth, sdk.AuthenticationTypeJwt), }, - // authenticate with unencrypted private key with a legacy authenticator value - // solves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2983 - { - Config: providerConfigWithAuthenticator(testprofiles.JwtAuth, sdk.AuthenticationTypeJwtLegacy), - }, // authenticate with encrypted private key { Config: providerConfigWithAuthenticator(testprofiles.EncryptedJwtAuth, sdk.AuthenticationTypeJwt), diff --git a/pkg/provider/provider_helpers.go b/pkg/provider/provider_helpers.go index dd6aa9ab92..0a412e1d62 100644 --- a/pkg/provider/provider_helpers.go +++ b/pkg/provider/provider_helpers.go @@ -3,12 +3,10 @@ package provider import ( "crypto/rsa" "encoding/json" - "errors" "fmt" "io" "net/http" "net/url" - "os" "strconv" "strings" "time" @@ -16,7 +14,6 @@ import ( "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/mitchellh/go-homedir" "github.com/snowflakedb/gosnowflake" ) @@ -44,39 +41,14 @@ func toProtocol(s string) (protocol, error) { } } -func getPrivateKey(privateKeyPath, privateKeyString, privateKeyPassphrase string) (*rsa.PrivateKey, error) { - if privateKeyPath == "" && privateKeyString == "" { +func getPrivateKey(privateKeyString, privateKeyPassphrase string) (*rsa.PrivateKey, error) { + if privateKeyString == "" { return nil, nil } privateKeyBytes := []byte(privateKeyString) - var err error - if len(privateKeyBytes) == 0 && privateKeyPath != "" { - privateKeyBytes, err = readFile(privateKeyPath) - if err != nil { - return nil, fmt.Errorf("private Key file could not be read err = %w", err) - } - } return sdk.ParsePrivateKey(privateKeyBytes, []byte(privateKeyPassphrase)) } -func readFile(privateKeyPath string) ([]byte, error) { - 
expandedPrivateKeyPath, err := homedir.Expand(privateKeyPath) - if err != nil { - return nil, fmt.Errorf("invalid Path to private key err = %w", err) - } - - privateKeyBytes, err := os.ReadFile(expandedPrivateKeyPath) - if err != nil { - return nil, fmt.Errorf("could not read private key err = %w", err) - } - - if len(privateKeyBytes) == 0 { - return nil, errors.New("private key is empty") - } - - return privateKeyBytes, nil -} - type GetRefreshTokenResponseBody struct { AccessToken string `json:"access_token"` TokenType string `json:"token_type"` diff --git a/pkg/provider/resources/resources.go b/pkg/provider/resources/resources.go index dc4de69296..9c547caf34 100644 --- a/pkg/provider/resources/resources.go +++ b/pkg/provider/resources/resources.go @@ -12,7 +12,6 @@ const ( ApiIntegration resource = "snowflake_api_integration" AuthenticationPolicy resource = "snowflake_authentication_policy" CortexSearchService resource = "snowflake_cortex_search_service" - DatabaseOld resource = "snowflake_database_old" Database resource = "snowflake_database" DatabaseRole resource = "snowflake_database_role" DynamicTable resource = "snowflake_dynamic_table" @@ -45,7 +44,6 @@ const ( PrimaryConnection resource = "snowflake_primary_connection" Procedure resource = "snowflake_procedure" ResourceMonitor resource = "snowflake_resource_monitor" - Role resource = "snowflake_role" RowAccessPolicy resource = "snowflake_row_access_policy" Saml2SecurityIntegration resource = "snowflake_saml2_integration" Schema resource = "snowflake_schema" @@ -62,7 +60,6 @@ const ( SharedDatabase resource = "snowflake_shared_database" Stage resource = "snowflake_stage" StorageIntegration resource = "snowflake_storage_integration" - Stream resource = "snowflake_stream" StreamOnDirectoryTable resource = "snowflake_stream_on_directory_table" StreamOnExternalTable resource = "snowflake_stream_on_external_table" StreamOnTable resource = "snowflake_stream_on_table" diff --git a/pkg/resources/database_old.go 
b/pkg/resources/database_old.go deleted file mode 100644 index 15d4fca440..0000000000 --- a/pkg/resources/database_old.go +++ /dev/null @@ -1,370 +0,0 @@ -package resources - -import ( - "context" - "fmt" - "log" - "slices" - "strconv" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" -) - -var databaseOldSchema = map[string]*schema.Schema{ - "name": { - Type: schema.TypeString, - Required: true, - Description: "Specifies the identifier for the database; must be unique for your account.", - }, - "comment": { - Type: schema.TypeString, - Optional: true, - Default: "", - Description: "Specifies a comment for the database.", - }, - "is_transient": { - Type: schema.TypeBool, - Optional: true, - Default: false, - Description: "Specifies a database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss.", - ForceNew: true, - }, - "data_retention_time_in_days": { - Type: schema.TypeInt, - Optional: true, - Default: IntDefault, - Description: "Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. 
For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).", - ValidateFunc: validation.IntBetween(-1, 90), - }, - "from_share": { - Type: schema.TypeMap, - Elem: &schema.Schema{Type: schema.TypeString}, - Description: "Specify a provider and a share in this map to create a database from a share. As of version 0.87.0, the provider field is the account locator.", - Optional: true, - ForceNew: true, - ConflictsWith: []string{"from_database", "from_replica"}, - }, - "from_database": { - Type: schema.TypeString, - Description: "Specify a database to create a clone from.", - Optional: true, - ForceNew: true, - ConflictsWith: []string{"from_share", "from_replica"}, - }, - "from_replica": { - Type: schema.TypeString, - Description: "Specify a fully-qualified path to a database to create a replica from. A fully qualified path follows the format of `\"\".\"\".\"\"`. An example would be: `\"myorg1\".\"account1\".\"db1\"`", - Optional: true, - ForceNew: true, - ConflictsWith: []string{"from_share", "from_database"}, - }, - "replication_configuration": { - Type: schema.TypeList, - Description: "When set, specifies the configurations for database replication.", - Optional: true, - MaxItems: 1, - Elem: &schema.Resource{ - Schema: map[string]*schema.Schema{ - "accounts": { - Type: schema.TypeList, - Required: true, - MinItems: 1, - Elem: &schema.Schema{Type: schema.TypeString}, - }, - "ignore_edition_check": { - Type: schema.TypeBool, - Default: true, - Optional: true, - }, - }, - }, - }, -} - -// Database returns a pointer to the resource representing a database. -func DatabaseOld() *schema.Resource { - return &schema.Resource{ - Create: CreateDatabaseOld, - Read: ReadDatabaseOld, - Delete: DeleteDatabaseOld, - Update: UpdateDatabaseOld, - DeprecationMessage: "This resource is deprecated and will be removed in a future major version release. 
Please use snowflake_database or snowflake_shared_database or snowflake_secondary_database instead.", - - Schema: databaseOldSchema, - Importer: &schema.ResourceImporter{ - StateContext: TrackingImportWrapper(resources.DatabaseOld, ImportName[sdk.AccountObjectIdentifier]), - }, - } -} - -// CreateDatabase implements schema.CreateFunc. -func CreateDatabaseOld(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - ctx := context.Background() - name := d.Get("name").(string) - id := sdk.NewAccountObjectIdentifier(name) - - // Is it a Shared Database? - if fromShare, ok := d.GetOk("from_share"); ok { - account := fromShare.(map[string]interface{})["provider"].(string) - share := fromShare.(map[string]interface{})["share"].(string) - shareID := sdk.NewExternalObjectIdentifier(sdk.NewAccountIdentifierFromAccountLocator(account), sdk.NewAccountObjectIdentifier(share)) - opts := &sdk.CreateSharedDatabaseOptions{} - if v, ok := d.GetOk("comment"); ok { - opts.Comment = sdk.String(v.(string)) - } - err := client.Databases.CreateShared(ctx, id, shareID, opts) - if err != nil { - return fmt.Errorf("error creating database %v: %w", name, err) - } - d.SetId(name) - return ReadDatabaseOld(d, meta) - } - // Is it a Secondary Database? 
- if primaryName, ok := d.GetOk("from_replica"); ok { - primaryID := sdk.NewExternalObjectIdentifierFromFullyQualifiedName(primaryName.(string)) - opts := &sdk.CreateSecondaryDatabaseOptions{} - if v := d.Get("data_retention_time_in_days"); v.(int) != IntDefault { - opts.DataRetentionTimeInDays = sdk.Int(v.(int)) - } - err := client.Databases.CreateSecondary(ctx, id, primaryID, opts) - if err != nil { - return fmt.Errorf("error creating database %v: %w", name, err) - } - d.SetId(name) - // todo: add failover_configuration block - return ReadDatabaseOld(d, meta) - } - - // Otherwise it is a Standard Database - opts := sdk.CreateDatabaseOptions{} - if v, ok := d.GetOk("comment"); ok { - opts.Comment = sdk.String(v.(string)) - } - - if v, ok := d.GetOk("is_transient"); ok && v.(bool) { - opts.Transient = sdk.Bool(v.(bool)) - } - - if v, ok := d.GetOk("from_database"); ok { - opts.Clone = &sdk.Clone{ - SourceObject: sdk.NewAccountObjectIdentifier(v.(string)), - } - } - - if v := d.Get("data_retention_time_in_days"); v.(int) != IntDefault { - opts.DataRetentionTimeInDays = sdk.Int(v.(int)) - } - - err := client.Databases.Create(ctx, id, &opts) - if err != nil { - return fmt.Errorf("error creating database %v: %w", name, err) - } - d.SetId(name) - - if v, ok := d.GetOk("replication_configuration"); ok { - replicationConfiguration := v.([]interface{})[0].(map[string]interface{}) - accounts := replicationConfiguration["accounts"].([]interface{}) - accountIDs := make([]sdk.AccountIdentifier, len(accounts)) - for i, account := range accounts { - accountIDs[i] = sdk.NewAccountIdentifierFromAccountLocator(account.(string)) - } - opts := &sdk.AlterDatabaseReplicationOptions{ - EnableReplication: &sdk.EnableReplication{ - ToAccounts: accountIDs, - }, - } - if ignoreEditionCheck, ok := replicationConfiguration["ignore_edition_check"]; ok { - opts.EnableReplication.IgnoreEditionCheck = sdk.Bool(ignoreEditionCheck.(bool)) - } - err := client.Databases.AlterReplication(ctx, id, 
opts) - if err != nil { - return fmt.Errorf("error enabling replication for database %v: %w", name, err) - } - } - - return ReadDatabaseOld(d, meta) -} - -func ReadDatabaseOld(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - ctx := context.Background() - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - - database, err := client.Databases.ShowByID(ctx, id) - if err != nil { - d.SetId("") - log.Printf("Database %s not found, err = %s", id.Name(), err) - return nil - } - - if err := d.Set("comment", database.Comment); err != nil { - return err - } - - dataRetention, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameterDataRetentionTimeInDays) - if err != nil { - return err - } - paramDataRetention, err := strconv.Atoi(dataRetention.Value) - if err != nil { - return err - } - - if dataRetentionDays := d.Get("data_retention_time_in_days"); dataRetentionDays.(int) != IntDefault || database.RetentionTime != paramDataRetention { - if err := d.Set("data_retention_time_in_days", database.RetentionTime); err != nil { - return err - } - } - - if err := d.Set("is_transient", database.Transient); err != nil { - return err - } - - return nil -} - -func UpdateDatabaseOld(d *schema.ResourceData, meta interface{}) error { - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - client := meta.(*provider.Context).Client - ctx := context.Background() - - if d.HasChange("name") { - newName := d.Get("name").(string) - newId := sdk.NewAccountObjectIdentifier(newName) - opts := &sdk.AlterDatabaseOptions{ - NewName: &newId, - } - err := client.Databases.Alter(ctx, id, opts) - if err != nil { - return fmt.Errorf("error updating database name on %v err = %w", d.Id(), err) - } - d.SetId(helpers.EncodeSnowflakeID(newId)) - id = newId - } - - if d.HasChange("comment") { - comment := "" - if c := d.Get("comment"); c != nil { - comment = c.(string) - } - opts := &sdk.AlterDatabaseOptions{ - 
Set: &sdk.DatabaseSet{ - Comment: sdk.String(comment), - }, - } - err := client.Databases.Alter(ctx, id, opts) - if err != nil { - return fmt.Errorf("error updating database comment on %v err = %w", d.Id(), err) - } - } - - if d.HasChange("data_retention_time_in_days") { - if days := d.Get("data_retention_time_in_days"); days.(int) != IntDefault { - err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - DataRetentionTimeInDays: sdk.Int(days.(int)), - }, - }) - if err != nil { - return fmt.Errorf("error when setting database data retention time on %v err = %w", d.Id(), err) - } - } else { - err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ - Unset: &sdk.DatabaseUnset{ - DataRetentionTimeInDays: sdk.Bool(true), - }, - }) - if err != nil { - return fmt.Errorf("error when usetting database data retention time on %v err = %w", d.Id(), err) - } - } - } - - // If replication configuration changes, need to update accounts that have permission to replicate database - if d.HasChange("replication_configuration") { - oldConfig, newConfig := d.GetChange("replication_configuration") - - newAccountIDs := make([]sdk.AccountIdentifier, 0) - ignoreEditionCheck := false - if len(newConfig.([]interface{})) != 0 { - newAccounts := newConfig.([]interface{})[0].(map[string]interface{})["accounts"].([]interface{}) - for _, account := range newAccounts { - newAccountIDs = append(newAccountIDs, sdk.NewAccountIdentifierFromAccountLocator(account.(string))) - } - ignoreEditionCheck = newConfig.([]interface{})[0].(map[string]interface{})["ignore_edition_check"].(bool) - } - - oldAccountIDs := make([]sdk.AccountIdentifier, 0) - if len(oldConfig.([]interface{})) != 0 { - oldAccounts := oldConfig.([]interface{})[0].(map[string]interface{})["accounts"].([]interface{}) - for _, account := range oldAccounts { - oldAccountIDs = append(oldAccountIDs, sdk.NewAccountIdentifierFromAccountLocator(account.(string))) - } - } - - accountsToRemove := 
make([]sdk.AccountIdentifier, 0) - accountsToAdd := make([]sdk.AccountIdentifier, 0) - // Find accounts to remove - for _, oldAccountID := range oldAccountIDs { - if !slices.Contains(newAccountIDs, oldAccountID) { - accountsToRemove = append(accountsToRemove, oldAccountID) - } - } - - // Find accounts to add - for _, newAccountID := range newAccountIDs { - if !slices.Contains(oldAccountIDs, newAccountID) { - accountsToAdd = append(accountsToAdd, newAccountID) - } - } - if len(accountsToAdd) > 0 { - opts := &sdk.AlterDatabaseReplicationOptions{ - EnableReplication: &sdk.EnableReplication{ - ToAccounts: accountsToAdd, - }, - } - if ignoreEditionCheck { - opts.EnableReplication.IgnoreEditionCheck = sdk.Bool(ignoreEditionCheck) - } - err := client.Databases.AlterReplication(ctx, id, opts) - if err != nil { - return fmt.Errorf("error enabling replication configuration on %v err = %w", d.Id(), err) - } - } - - if len(accountsToRemove) > 0 { - opts := &sdk.AlterDatabaseReplicationOptions{ - DisableReplication: &sdk.DisableReplication{ - ToAccounts: accountsToRemove, - }, - } - err := client.Databases.AlterReplication(ctx, id, opts) - if err != nil { - return fmt.Errorf("error disabling replication configuration on %v err = %w", d.Id(), err) - } - } - } - - return ReadDatabaseOld(d, meta) -} - -func DeleteDatabaseOld(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - ctx := context.Background() - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - err := client.Databases.Drop(ctx, id, &sdk.DropDatabaseOptions{ - IfExists: sdk.Bool(true), - }) - if err != nil { - return err - } - d.SetId("") - return nil -} diff --git a/pkg/resources/database_old_acceptance_test.go b/pkg/resources/database_old_acceptance_test.go deleted file mode 100644 index 31d5e46702..0000000000 --- a/pkg/resources/database_old_acceptance_test.go +++ /dev/null @@ -1,448 +0,0 @@ -package resources_test - -import ( - "context" - "fmt" - 
"strconv" - "testing" - - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" - r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-testing/config" - "github.com/hashicorp/terraform-plugin-testing/helper/resource" - "github.com/hashicorp/terraform-plugin-testing/plancheck" - "github.com/hashicorp/terraform-plugin-testing/terraform" - "github.com/hashicorp/terraform-plugin-testing/tfversion" -) - -func TestAcc_DatabaseWithUnderscore(t *testing.T) { - prefix := acc.TestClient().Ids.AlphaWithPrefix("_") - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), - Steps: []resource.TestStep{ - { - Config: dbConfig(prefix), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), - resource.TestCheckResourceAttrSet("snowflake_database_old.db", "data_retention_time_in_days"), - ), - }, - }, - }) -} - -func TestAcc_Database(t *testing.T) { - prefix := acc.TestClient().Ids.Alpha() - prefix2 := acc.TestClient().Ids.Alpha() - - secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - 
tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), - Steps: []resource.TestStep{ - { - Config: dbConfig(prefix), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), - resource.TestCheckResourceAttrSet("snowflake_database_old.db", "data_retention_time_in_days"), - ), - }, - // RENAME - { - Config: dbConfig(prefix2), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix2), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), - resource.TestCheckResourceAttrSet("snowflake_database_old.db", "data_retention_time_in_days"), - ), - }, - // CHANGE PROPERTIES - { - Config: dbConfig2(prefix2), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix2), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment 2"), - resource.TestCheckResourceAttr("snowflake_database_old.db", "data_retention_time_in_days", "3"), - ), - }, - // ADD REPLICATION - // proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2369 error - { - Config: dbConfigWithReplication(prefix2, secondaryAccountName), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix2), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment 2"), - resource.TestCheckResourceAttr("snowflake_database_old.db", "data_retention_time_in_days", "3"), - resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.#", "1"), - resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.#", "1"), - resource.TestCheckResourceAttr("snowflake_database_old.db", 
"replication_configuration.0.accounts.0", secondaryAccountName), - ), - }, - // IMPORT - { - ResourceName: "snowflake_database_old.db", - ImportState: true, - ImportStateVerify: true, - ImportStateVerifyIgnore: []string{"replication_configuration"}, - }, - }, - }) -} - -func TestAcc_DatabaseRemovedOutsideOfTerraform(t *testing.T) { - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - name := id.Name() - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), - Steps: []resource.TestStep{ - { - ConfigDirectory: config.TestNameDirectory(), - ConfigVariables: map[string]config.Variable{ - "db": config.StringVariable(name), - }, - ConfigPlanChecks: resource.ConfigPlanChecks{ - PreApply: []plancheck.PlanCheck{plancheck.ExpectNonEmptyPlan()}, - }, - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", name), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), - testAccCheckDatabaseExistence(t, id, true), - ), - }, - { - PreConfig: func() { acc.TestClient().Database.DropDatabaseFunc(t, id)() }, - ConfigDirectory: config.TestNameDirectory(), - ConfigVariables: map[string]config.Variable{ - "db": config.StringVariable(name), - }, - ConfigPlanChecks: resource.ConfigPlanChecks{ - PreApply: []plancheck.PlanCheck{plancheck.ExpectNonEmptyPlan()}, - }, - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", name), - resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), - testAccCheckDatabaseExistence(t, id, true), - ), - }, - }, - }) -} - -// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2021 
-func TestAcc_Database_issue2021(t *testing.T) { - name := acc.TestClient().Ids.Alpha() - - secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), - Steps: []resource.TestStep{ - { - Config: dbConfigWithReplication(name, secondaryAccountName), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.db", "name", name), - resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.#", "1"), - resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.#", "1"), - resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.0", secondaryAccountName), - testAccCheckIfDatabaseIsReplicated(t, name), - ), - }, - }, - }) -} - -// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356 issue is fixed. 
-func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - - configVariablesWithoutDatabaseDataRetentionTime := func() config.Variables { - return config.Variables{ - "database": config.StringVariable(id.Name()), - } - } - - configVariablesWithDatabaseDataRetentionTime := func(databaseDataRetentionTime int) config.Variables { - vars := configVariablesWithoutDatabaseDataRetentionTime() - vars["database_data_retention_time"] = config.IntegerVariable(databaseDataRetentionTime) - return vars - } - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), - Steps: []resource.TestStep{ - { - PreConfig: func() { - revertParameter := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "5") - t.Cleanup(revertParameter) - }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), - checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), - ), - }, - { - PreConfig: func() { - _ = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "10") - }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", 
r.IntDefaultString), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), - ), - }, - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(5), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "5"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 5), - ), - }, - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(15), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "15"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 15), - ), - }, - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), - ), - }, - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(0), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "0"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 0), - ), - }, - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(3), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", 
"data_retention_time_in_days", "3"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 3), - ), - }, - }, - }) -} - -// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356 issue is fixed. -func TestAcc_Database_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) { - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - - configVariablesWithoutDatabaseDataRetentionTime := func() config.Variables { - return config.Variables{ - "database": config.StringVariable(id.Name()), - } - } - - configVariablesWithDatabaseDataRetentionTime := func(databaseDataRetentionTime int) config.Variables { - vars := configVariablesWithoutDatabaseDataRetentionTime() - vars["database_data_retention_time"] = config.IntegerVariable(databaseDataRetentionTime) - return vars - } - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), - Steps: []resource.TestStep{ - { - PreConfig: func() { - revertParameter := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "5") - t.Cleanup(revertParameter) - }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), - checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), - ), - }, - { - PreConfig: func() { acc.TestClient().Database.UpdateDataRetentionTime(t, id, 20) }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: 
configVariablesWithoutDatabaseDataRetentionTime(), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), - checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), - ), - }, - { - PreConfig: func() { - _ = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "10") - }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(3), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "3"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 3), - ), - ConfigPlanChecks: resource.ConfigPlanChecks{ - PostApplyPostRefresh: []plancheck.PlanCheck{ - plancheck.ExpectEmptyPlan(), - }, - }, - }, - }, - }) -} - -func dbConfig(prefix string) string { - s := ` -resource "snowflake_database_old" "db" { - name = "%s" - comment = "test comment" -} -` - return fmt.Sprintf(s, prefix) -} - -func dbConfig2(prefix string) string { - s := ` -resource "snowflake_database_old" "db" { - name = "%s" - comment = "test comment 2" - data_retention_time_in_days = 3 -} -` - return fmt.Sprintf(s, prefix) -} - -func dbConfigWithReplication(prefix string, secondaryAccountName string) string { - s := ` -resource "snowflake_database_old" "db" { - name = "%s" - comment = "test comment 2" - data_retention_time_in_days = 3 - replication_configuration { - accounts = [ - "%s" - ] - } -} -` - return fmt.Sprintf(s, prefix, secondaryAccountName) -} - -// TODO [SNOW-936093]: this is used mostly as check for unsafe execute, not as normal check destroy in other resources. Handle with the helpers cleanup. 
-func testAccCheckDatabaseExistence(t *testing.T, id sdk.AccountObjectIdentifier, shouldExist bool) func(state *terraform.State) error { - t.Helper() - return func(state *terraform.State) error { - _, err := acc.TestClient().Database.Show(t, id) - if shouldExist { - if err != nil { - return fmt.Errorf("error while retrieving database %s, err = %w", id, err) - } - } else { - if err == nil { - return fmt.Errorf("database %v still exists", id) - } - } - return nil - } -} - -func testAccCheckIfDatabaseIsReplicated(t *testing.T, id string) func(state *terraform.State) error { - t.Helper() - return func(state *terraform.State) error { - replicationDatabases, err := acc.TestClient().Database.ShowAllReplicationDatabases(t) - if err != nil { - return err - } - - var exists bool - for _, o := range replicationDatabases { - if o.Name == id { - exists = true - break - } - } - - if !exists { - return fmt.Errorf("database %s should be replicated", id) - } - - return nil - } -} - -func checkAccountAndDatabaseDataRetentionTime(t *testing.T, id sdk.AccountObjectIdentifier, expectedAccountRetentionDays int, expectedDatabaseRetentionsDays int) func(state *terraform.State) error { - t.Helper() - return func(state *terraform.State) error { - providerContext := acc.TestAccProvider.Meta().(*provider.Context) - client := providerContext.Client - ctx := context.Background() - - database, err := acc.TestClient().Database.Show(t, id) - if err != nil { - return err - } - - if database.RetentionTime != expectedDatabaseRetentionsDays { - return fmt.Errorf("invalid database retention time, expected: %d, got: %d", expectedDatabaseRetentionsDays, database.RetentionTime) - } - - param, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameterDataRetentionTimeInDays) - if err != nil { - return err - } - accountRetentionDays, err := strconv.Atoi(param.Value) - if err != nil { - return err - } - - if accountRetentionDays != expectedAccountRetentionDays { - return fmt.Errorf("invalid 
account retention time, expected: %d, got: %d", expectedAccountRetentionDays, accountRetentionDays) - } - - return nil - } -} diff --git a/pkg/resources/database_state_upgraders.go b/pkg/resources/database_state_upgraders.go index 06ff004771..91bcd24c1a 100644 --- a/pkg/resources/database_state_upgraders.go +++ b/pkg/resources/database_state_upgraders.go @@ -17,15 +17,15 @@ func v092DatabaseStateUpgrader(ctx context.Context, rawState map[string]any, met } if v, ok := rawState["from_share"]; ok && v != nil && len(v.(map[string]any)) > 0 { - return nil, fmt.Errorf("failed to upgrade the state with database created from share, please use snowflake_shared_database or deprecated snowflake_database_old instead") + return nil, fmt.Errorf("failed to upgrade the state with database created from share, please use snowflake_shared_database instead") } if v, ok := rawState["from_replica"]; ok && v != nil && len(v.(string)) > 0 { - return nil, fmt.Errorf("failed to upgrade the state with database created from replica, please use snowflake_secondary_database or deprecated snowflake_database_old instead") + return nil, fmt.Errorf("failed to upgrade the state with database created from replica, please use snowflake_secondary_database instead") } if v, ok := rawState["from_database"]; ok && v != nil && len(v.(string)) > 0 { - return nil, fmt.Errorf("failed to upgrade the state with database created from database, please use snowflake_database or deprecated snowflake_database_old instead. Dislaimer: Right now, database cloning is not supported. They can be imported into mentioned resources, but any differetnce in behavior from standard database won't be handled (and can result in errors)") + return nil, fmt.Errorf("failed to upgrade the state with database created from database, please use snowflake_database instead. Disclaimer: Right now, database cloning is not supported. 
Databases can be imported into the mentioned resource, but any difference in behavior from a standard database won't be handled (and can result in errors)") } if replicationConfigurations, ok := rawState["replication_configuration"]; ok && len(replicationConfigurations.([]any)) == 1 { diff --git a/pkg/resources/deprecated_helpers_test.go b/pkg/resources/deprecated_helpers_test.go deleted file mode 100644 index 34601967d1..0000000000 --- a/pkg/resources/deprecated_helpers_test.go +++ /dev/null @@ -1,31 +0,0 @@ -package resources_test - -import ( - "testing" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/stretchr/testify/require" -) - -/** - * Will be removed while adding security integrations to the SDK. - */ - -func samlIntegration(t *testing.T, id string, params map[string]interface{}) *schema.ResourceData { - t.Helper() - r := require.New(t) - d := schema.TestResourceDataRaw(t, resources.SAMLIntegration().Schema, params) - r.NotNil(d) - d.SetId(id) - return d -} - -func oauthIntegration(t *testing.T, id string, params map[string]interface{}) *schema.ResourceData { - t.Helper() - r := require.New(t) - d := schema.TestResourceDataRaw(t, resources.OAuthIntegration().Schema, params) - r.NotNil(d) - d.SetId(id) - return d -} diff --git a/pkg/resources/manual_tests/upgrade_cloned_database/step_2.tf b/pkg/resources/manual_tests/upgrade_cloned_database/step_2.tf index f6446ce3a2..eb556b849e 100644 --- a/pkg/resources/manual_tests/upgrade_cloned_database/step_2.tf +++ b/pkg/resources/manual_tests/upgrade_cloned_database/step_2.tf @@ -1,6 +1,6 @@ # Commands to run # - terraform init - upgrade -# - terraform plan (should observe upgrader errors similar to: failed to upgrade the state with database created from database, please use snowflake_database or deprecated snowflake_database_old instead...) 
+# - terraform plan (should observe upgrader errors similar to: failed to upgrade the state with database created from database, please use snowflake_database instead...) # - terraform state rm snowflake_database.cloned (remove cloned database from the state) terraform { @@ -15,12 +15,12 @@ provider "snowflake" {} resource "snowflake_database" "test" { - name = "test" + name = "test" data_retention_time_in_days = 0 # to avoid in-place update to -1 } resource "snowflake_database" "cloned" { - name = "cloned" - from_database = snowflake_database.test.name + name = "cloned" + from_database = snowflake_database.test.name data_retention_time_in_days = 0 # to avoid in-place update to -1 } diff --git a/pkg/resources/manual_tests/upgrade_secondary_database/step_2.tf b/pkg/resources/manual_tests/upgrade_secondary_database/step_2.tf index 599e971a51..1f48b0e6ae 100644 --- a/pkg/resources/manual_tests/upgrade_secondary_database/step_2.tf +++ b/pkg/resources/manual_tests/upgrade_secondary_database/step_2.tf @@ -1,6 +1,6 @@ # Commands to run # - terraform init - upgrade -# - terraform plan (should observe upgrader errors similar to: failed to upgrade the state with database created from replica, please use snowflake_secondary_database or deprecated snowflake_database_old instead) +# - terraform plan (should observe upgrader errors similar to: failed to upgrade the state with database created from replica, please use snowflake_secondary_database instead) # - terraform state rm snowflake_database.secondary (remove secondary database from the state) terraform { @@ -16,12 +16,12 @@ provider "snowflake" {} provider "snowflake" { profile = "secondary_test_account" - alias = second_account + alias = "second_account" # the alias argument must be a literal string } resource "snowflake_database" "primary" { - provider = snowflake.second_account - name = "test" + provider = snowflake.second_account + name = "test" data_retention_time_in_days = 0 # to avoid in-place update to -1 replication_configuration { accounts = [""] # 
TODO: Replace @@ -30,7 +30,7 @@ resource "snowflake_database" "primary" { } resource "snowflake_database" "secondary" { - name = "test" - data_retention_time_in_days = 0 # to avoid in-place update to -1 - from_replica = ".\"${snowflake_database.primary.name}\"" # TODO: Replace + name = "test" + data_retention_time_in_days = 0 # to avoid in-place update to -1 + from_replica = ".\"${snowflake_database.primary.name}\"" # TODO: Replace } diff --git a/pkg/resources/manual_tests/upgrade_shared_database/step_2.tf b/pkg/resources/manual_tests/upgrade_shared_database/step_2.tf index ed8bce3132..0ab2f37c8c 100644 --- a/pkg/resources/manual_tests/upgrade_shared_database/step_2.tf +++ b/pkg/resources/manual_tests/upgrade_shared_database/step_2.tf @@ -1,6 +1,6 @@ # Commands to run # - terraform init - upgrade -# - terraform plan (should observe upgrader errors similar to: failed to upgrade the state with database created from share, please use snowflake_shared_database or deprecated snowflake_database_old instead) +# - terraform plan (should observe upgrader errors similar to: failed to upgrade the state with database created from share, please use snowflake_shared_database instead) # - terraform state rm snowflake_database.from_share (remove shared database from the state) terraform { @@ -16,32 +16,32 @@ provider "snowflake" {} provider "snowflake" { profile = "secondary_test_account" - alias = second_account + alias = "second_account" # the alias argument must be a literal string } resource "snowflake_share" "test" { provider = snowflake.second_account - name = "test_share" + name = "test_share" accounts = ["."] # TODO: Replace } resource "snowflake_database" "test" { provider = snowflake.second_account - name = "test_database" + name = "test_database" } resource "snowflake_grant_privileges_to_share" "test" { - provider = snowflake.second_account - privileges = ["USAGE"] + provider = snowflake.second_account + privileges = ["USAGE"] on_database = snowflake_database.test.name - to_share = snowflake_share.test.name + 
to_share = snowflake_share.test.name } resource "snowflake_database" "from_share" { - depends_on = [ snowflake_grant_privileges_to_share.test ] - name = snowflake_database.test.name + depends_on = [snowflake_grant_privileges_to_share.test] + name = snowflake_database.test.name from_share = { provider = "" # TODO: Replace - share = snowflake_share.test.name + share = snowflake_share.test.name } } diff --git a/pkg/resources/oauth_integration.go b/pkg/resources/oauth_integration.go deleted file mode 100644 index 2d7e4152ae..0000000000 --- a/pkg/resources/oauth_integration.go +++ /dev/null @@ -1,343 +0,0 @@ -package resources - -import ( - "fmt" - "log" - "strconv" - "strings" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" -) - -var oauthIntegrationSchema = map[string]*schema.Schema{ - "name": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "Specifies the name of the OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account.", - }, - "oauth_client": { - Type: schema.TypeString, - Required: true, - Description: "Specifies the OAuth client type.", - ValidateFunc: validation.StringInSlice([]string{ - "TABLEAU_DESKTOP", "TABLEAU_SERVER", "LOOKER", "CUSTOM", - }, false), - }, - "oauth_redirect_uri": { - Type: schema.TypeString, - Optional: true, - Description: "Specifies the client URI. After a user is authenticated, the web browser is redirected to this URI.", - }, - "oauth_client_type": { - Type: schema.TypeString, - Optional: true, - Description: "Specifies the type of client being registered. 
Snowflake supports both confidential and public clients.", - ValidateFunc: validation.StringInSlice([]string{ - "CONFIDENTIAL", "PUBLIC", - }, false), - }, - "oauth_issue_refresh_tokens": { - Type: schema.TypeBool, - Optional: true, - Description: "Specifies whether to allow the client to exchange a refresh token for an access token when the current access token has expired.", - }, - "oauth_refresh_token_validity": { - Type: schema.TypeInt, - Optional: true, - Description: "Specifies how long refresh tokens should be valid (in seconds). OAUTH_ISSUE_REFRESH_TOKENS must be set to TRUE.", - }, - "oauth_use_secondary_roles": { - Type: schema.TypeString, - Optional: true, - Default: "NONE", - Description: "Specifies whether default secondary roles set in the user properties are activated by default in the session being opened.", - ValidateFunc: validation.StringInSlice([]string{ - "IMPLICIT", "NONE", - }, false), - }, - "blocked_roles_list": { - Type: schema.TypeSet, - Elem: &schema.Schema{Type: schema.TypeString}, - Optional: true, - Description: "List of roles that a user cannot explicitly consent to using after authenticating. Do not include ACCOUNTADMIN, ORGADMIN or SECURITYADMIN as they are already implicitly enforced and will cause in-place updates.", - }, - "comment": { - Type: schema.TypeString, - Optional: true, - Description: "Specifies a comment for the OAuth integration.", - }, - "enabled": { - Type: schema.TypeBool, - Optional: true, - Description: "Specifies whether this OAuth integration is enabled or disabled.", - }, - "created_on": { - Type: schema.TypeString, - Computed: true, - Description: "Date and time when the OAuth integration was created.", - }, -} - -// OAuthIntegration returns a pointer to the resource representing an OAuth integration. 
-func OAuthIntegration() *schema.Resource { - return &schema.Resource{ - Create: CreateOAuthIntegration, - Read: ReadOAuthIntegration, - Update: UpdateOAuthIntegration, - Delete: DeleteOAuthIntegration, - DeprecationMessage: "This resource is deprecated and will be removed in a future major version release. Please use snowflake_oauth_integration_for_custom_clients or snowflake_oauth_integration_for_partner_applications instead.", - - Schema: oauthIntegrationSchema, - Importer: &schema.ResourceImporter{ - StateContext: schema.ImportStatePassthroughContext, - }, - } -} - -// CreateOAuthIntegration implements schema.CreateFunc. -func CreateOAuthIntegration(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - db := client.GetConn().DB - name := d.Get("name").(string) - - stmt := snowflake.NewOAuthIntegrationBuilder(name).Create() - - // Set required fields - stmt.SetRaw(`TYPE=OAUTH`) - stmt.SetString(`OAUTH_CLIENT`, d.Get("oauth_client").(string)) - // Set optional fields - if _, ok := d.GetOk("oauth_redirect_uri"); ok { - stmt.SetString(`OAUTH_REDIRECT_URI`, d.Get("oauth_redirect_uri").(string)) - } - if _, ok := d.GetOk("oauth_client_type"); ok { - stmt.SetString(`OAUTH_CLIENT_TYPE`, d.Get("oauth_client_type").(string)) - } - if _, ok := d.GetOk("oauth_issue_refresh_tokens"); ok { - stmt.SetBool(`OAUTH_ISSUE_REFRESH_TOKENS`, d.Get("oauth_issue_refresh_tokens").(bool)) - } - if _, ok := d.GetOk("oauth_refresh_token_validity"); ok { - stmt.SetInt(`OAUTH_REFRESH_TOKEN_VALIDITY`, d.Get("oauth_refresh_token_validity").(int)) - } - if _, ok := d.GetOk("oauth_use_secondary_roles"); ok { - stmt.SetString(`OAUTH_USE_SECONDARY_ROLES`, d.Get("oauth_use_secondary_roles").(string)) - } - if _, ok := d.GetOk("blocked_roles_list"); ok { - stmt.SetStringList(`BLOCKED_ROLES_LIST`, expandStringList(d.Get("blocked_roles_list").(*schema.Set).List())) - } - if _, ok := d.GetOk("enabled"); ok { - stmt.SetBool(`ENABLED`, d.Get("enabled").(bool)) - 
} - if _, ok := d.GetOk("comment"); ok { - stmt.SetString(`COMMENT`, d.Get("comment").(string)) - } - - if err := snowflake.Exec(db, stmt.Statement()); err != nil { - return fmt.Errorf("error creating security integration err = %w", err) - } - - d.SetId(name) - - return ReadOAuthIntegration(d, meta) -} - -// ReadOAuthIntegration implements schema.ReadFunc. -func ReadOAuthIntegration(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - db := client.GetConn().DB - id := d.Id() - - stmt := snowflake.NewOAuthIntegrationBuilder(id).Show() - row := snowflake.QueryRow(db, stmt) - - // Some properties can come from the SHOW INTEGRATION call - - s, err := snowflake.ScanOAuthIntegration(row) - if err != nil { - return fmt.Errorf("could not show security integration err = %w", err) - } - - // Note: category must be Security or something is broken - if c := s.Category.String; c != "SECURITY" { - return fmt.Errorf("expected %v to be an Security integration, got %v err = %w", id, c, err) - } - - if err := d.Set("oauth_client", strings.TrimPrefix(s.IntegrationType.String, "OAUTH - ")); err != nil { - return err - } - - if err := d.Set("name", s.Name.String); err != nil { - return err - } - - if err := d.Set("enabled", s.Enabled.Bool); err != nil { - return err - } - - if err := d.Set("comment", s.Comment.String); err != nil { - return err - } - - if err := d.Set("created_on", s.CreatedOn.String); err != nil { - return err - } - - // Some properties come from the DESCRIBE INTEGRATION call - // We need to grab them in a loop - var k, pType string - var v, unused interface{} - stmt = snowflake.NewOAuthIntegrationBuilder(id).Describe() - rows, err := db.Query(stmt) - if err != nil { - return fmt.Errorf("could not describe security integration err = %w", err) - } - defer rows.Close() - for rows.Next() { - if err := rows.Scan(&k, &pType, &v, &unused); err != nil { - return fmt.Errorf("unable to parse security integration rows err = %w", err) - 
} - switch k { - case "ENABLED": - // We set this using the SHOW INTEGRATION call so let's ignore it here - case "COMMENT": - // We set this using the SHOW INTEGRATION call so let's ignore it here - case "OAUTH_ISSUE_REFRESH_TOKENS": - b, err := strconv.ParseBool(v.(string)) - if err != nil { - return fmt.Errorf("returned OAuth issue refresh tokens that is not boolean err = %w", err) - } - if err := d.Set("oauth_issue_refresh_tokens", b); err != nil { - return fmt.Errorf("unable to set OAuth issue refresh tokens for security integration err = %w", err) - } - case "OAUTH_REFRESH_TOKEN_VALIDITY": - i, err := strconv.Atoi(v.(string)) - if err != nil { - return fmt.Errorf("returned OAuth refresh token validity that is not integer err = %w", err) - } - if err := d.Set("oauth_refresh_token_validity", i); err != nil { - return fmt.Errorf("unable to set OAuth refresh token validity for security integration err = %w", err) - } - case "OAUTH_USE_SECONDARY_ROLES": - if err := d.Set("oauth_use_secondary_roles", v.(string)); err != nil { - return fmt.Errorf("unable to set OAuth use secondary roles for security integration err = %w", err) - } - case "BLOCKED_ROLES_LIST": - blockedRolesAll := strings.Split(v.(string), ",") - - // Only roles other than ACCOUNTADMIN, ORGADMIN and SECURITYADMIN can be specified custom, - // those three are enforced with no option to remove them - blockedRolesCustom := []string{} - for _, role := range blockedRolesAll { - if role != "ACCOUNTADMIN" && role != "ORGADMIN" && role != "SECURITYADMIN" { - blockedRolesCustom = append(blockedRolesCustom, role) - } - } - - if err := d.Set("blocked_roles_list", blockedRolesCustom); err != nil { - return fmt.Errorf("unable to set blocked roles list for security integration err = %w", err) - } - case "OAUTH_REDIRECT_URI": - if err := d.Set("oauth_redirect_uri", v.(string)); err != nil { - return fmt.Errorf("unable to set OAuth redirect URI for security integration err = %w", err) - } - case "OAUTH_CLIENT_TYPE": 
- isTableau := strings.HasSuffix(s.IntegrationType.String, "TABLEAU_DESKTOP") || - strings.HasSuffix(s.IntegrationType.String, "TABLEAU_SERVER") - if !isTableau { - if err = d.Set("oauth_client_type", v.(string)); err != nil { - return fmt.Errorf("unable to set OAuth client type for security integration err = %w", err) - } - } - case "OAUTH_ENFORCE_PKCE": - // Only used for custom OAuth clients (not supported yet) - case "OAUTH_AUTHORIZATION_ENDPOINT": - // Only used for custom OAuth clients (not supported yet) - case "OAUTH_TOKEN_ENDPOINT": - // Only used for custom OAuth clients (not supported yet) - case "OAUTH_ALLOWED_AUTHORIZATION_ENDPOINTS": - // Only used for custom OAuth clients (not supported yet) - case "OAUTH_ALLOWED_TOKEN_ENDPOINTS": - // Only used for custom OAuth clients (not supported yet) - case "PRE_AUTHORIZED_ROLES_LIST": - // Only used for custom OAuth clients (not supported yet) - - default: - log.Printf("[WARN] unexpected security integration property %v returned from Snowflake", k) - } - } - - return err -} - -// UpdateOAuthIntegration implements schema.UpdateFunc. 
-func UpdateOAuthIntegration(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - db := client.GetConn().DB - id := d.Id() - - stmt := snowflake.NewOAuthIntegrationBuilder(id).Alter() - - var runSetStatement bool - - if d.HasChange("oauth_client") { - runSetStatement = true - stmt.SetString(`OAUTH_CLIENT`, d.Get("oauth_client").(string)) - } - - if d.HasChange("oauth_redirect_uri") { - runSetStatement = true - stmt.SetString(`OAUTH_REDIRECT_URI`, d.Get("oauth_redirect_uri").(string)) - } - - if d.HasChange("oauth_client_type") { - runSetStatement = true - stmt.SetString(`OAUTH_CLIENT_TYPE`, d.Get("oauth_client_type").(string)) - } - - if d.HasChange("oauth_issue_refresh_tokens") { - runSetStatement = true - stmt.SetBool(`OAUTH_ISSUE_REFRESH_TOKENS`, d.Get("oauth_issue_refresh_tokens").(bool)) - } - - if d.HasChange("oauth_refresh_token_validity") { - runSetStatement = true - stmt.SetInt(`OAUTH_REFRESH_TOKEN_VALIDITY`, d.Get("oauth_refresh_token_validity").(int)) - } - - if d.HasChange("oauth_use_secondary_roles") { - runSetStatement = true - stmt.SetString(`OAUTH_USE_SECONDARY_ROLES`, d.Get("oauth_use_secondary_roles").(string)) - } - - if d.HasChange("blocked_roles_list") { - runSetStatement = true - stmt.SetStringList(`BLOCKED_ROLES_LIST`, expandStringList(d.Get("blocked_roles_list").(*schema.Set).List())) - } - - if d.HasChange("enabled") { - runSetStatement = true - stmt.SetBool(`ENABLED`, d.Get("enabled").(bool)) - } - - if d.HasChange("comment") { - runSetStatement = true - stmt.SetString(`COMMENT`, d.Get("comment").(string)) - } - - if runSetStatement { - if err := snowflake.Exec(db, stmt.Statement()); err != nil { - return fmt.Errorf("error updating security integration err = %w", err) - } - } - - return ReadOAuthIntegration(d, meta) -} - -// DeleteOAuthIntegration implements schema.DeleteFunc. 
-func DeleteOAuthIntegration(d *schema.ResourceData, meta interface{}) error { - return DeleteResource("", snowflake.NewOAuthIntegrationBuilder)(d, meta) -} diff --git a/pkg/resources/oauth_integration_acceptance_test.go b/pkg/resources/oauth_integration_acceptance_test.go deleted file mode 100644 index 553913c7ba..0000000000 --- a/pkg/resources/oauth_integration_acceptance_test.go +++ /dev/null @@ -1,113 +0,0 @@ -package resources_test - -import ( - "fmt" - "testing" - - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" - - "github.com/hashicorp/terraform-plugin-testing/helper/resource" - "github.com/hashicorp/terraform-plugin-testing/tfversion" -) - -func TestAcc_OAuthIntegration(t *testing.T) { - name := acc.TestClient().Ids.Alpha() - oauthClient := "CUSTOM" - clientType := "PUBLIC" - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - Config: oauthIntegrationConfig(name, oauthClient, clientType, "SYSADMIN"), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "name", name), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "oauth_client", oauthClient), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "oauth_client_type", clientType), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "oauth_issue_refresh_tokens", "true"), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "oauth_refresh_token_validity", "3600"), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "blocked_roles_list.#", "1"), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "blocked_roles_list.0", "SYSADMIN"), - ), - }, - { - // role 
change proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2358 issue - Config: oauthIntegrationConfig(name, oauthClient, clientType, "USERADMIN"), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "name", name), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "blocked_roles_list.#", "1"), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "blocked_roles_list.0", "USERADMIN"), - ), - }, - { - ResourceName: "snowflake_oauth_integration.test", - ImportState: true, - ImportStateVerify: true, - }, - }, - }) -} - -func oauthIntegrationConfig(name, oauthClient, clientType string, blockedRole string) string { - return fmt.Sprintf(` - resource "snowflake_oauth_integration" "test" { - name = "%s" - oauth_client = "%s" - oauth_client_type = "%s" - oauth_redirect_uri = "https://www.example.com/oauth2/callback" - enabled = true - oauth_issue_refresh_tokens = true - oauth_refresh_token_validity = 3600 - blocked_roles_list = ["%s"] - } - `, name, oauthClient, clientType, blockedRole) -} - -func TestAcc_OAuthIntegrationTableau(t *testing.T) { - name := acc.TestClient().Ids.Alpha() - oauthClient := "TABLEAU_DESKTOP" - clientType := "PUBLIC" // not used, but left to fail the test - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - Config: oauthIntegrationConfigTableau(name, oauthClient, clientType), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "name", name), - resource.TestCheckResourceAttr("snowflake_oauth_integration.test", "oauth_client", oauthClient), - // resource.TestCheckResourceAttr("snowflake_oauth_integration.test", 
"oauth_client_type", clientType), - ), - }, - { - ResourceName: "snowflake_oauth_integration.test", - ImportState: true, - ImportStateVerify: true, - }, - }, - }) -} - -func oauthIntegrationConfigTableau(name, oauthClient, clientType string) string { - return fmt.Sprintf(` - resource "snowflake_oauth_integration" "test" { - name = "%s" - oauth_client = "%s" - # oauth_client_type = "%s" # this cannot be set for TABLEAU - enabled = true - oauth_refresh_token_validity = 36000 - oauth_issue_refresh_tokens = true - blocked_roles_list = ["SYSADMIN"] - } - `, name, oauthClient, clientType) -} diff --git a/pkg/resources/oauth_integration_test.go b/pkg/resources/oauth_integration_test.go deleted file mode 100644 index c8e2d9a126..0000000000 --- a/pkg/resources/oauth_integration_test.go +++ /dev/null @@ -1,90 +0,0 @@ -package resources_test - -import ( - "database/sql" - "testing" - - internalprovider "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - - sqlmock "github.com/DATA-DOG/go-sqlmock" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" - . 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/testhelpers/mock" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/stretchr/testify/require" -) - -func TestOAuthIntegration(t *testing.T) { - r := require.New(t) - err := resources.OAuthIntegration().InternalValidate(provider.Provider().Schema, true) - r.NoError(err) -} - -func TestOAuthIntegrationCreate(t *testing.T) { - r := require.New(t) - - in := map[string]interface{}{ - "name": "test_oauth_integration", - "oauth_client": "TABLEAU_DESKTOP", - } - d := schema.TestResourceDataRaw(t, resources.OAuthIntegration().Schema, in) - r.NotNil(d) - - WithMockDb(t, func(db *sql.DB, mock sqlmock.Sqlmock) { - mock.ExpectExec( - `^CREATE SECURITY INTEGRATION "test_oauth_integration" TYPE=OAUTH OAUTH_CLIENT='TABLEAU_DESKTOP' OAUTH_USE_SECONDARY_ROLES='NONE'$`, - ).WillReturnResult(sqlmock.NewResult(1, 1)) - expectReadOAuthIntegration(mock) - - err := resources.CreateOAuthIntegration(d, &internalprovider.Context{ - Client: sdk.NewClientFromDB(db), - }) - r.NoError(err) - }) -} - -func TestOAuthIntegrationRead(t *testing.T) { - r := require.New(t) - - d := oauthIntegration(t, "test_oauth_integration", map[string]interface{}{"name": "test_oauth_integration"}) - - WithMockDb(t, func(db *sql.DB, mock sqlmock.Sqlmock) { - expectReadOAuthIntegration(mock) - - err := resources.ReadOAuthIntegration(d, &internalprovider.Context{ - Client: sdk.NewClientFromDB(db), - }) - r.NoError(err) - }) -} - -func TestOAuthIntegrationDelete(t *testing.T) { - r := require.New(t) - - d := oauthIntegration(t, "drop_it", map[string]interface{}{"name": "drop_it"}) - - WithMockDb(t, func(db *sql.DB, mock sqlmock.Sqlmock) { - mock.ExpectExec(`DROP SECURITY INTEGRATION "drop_it"`).WillReturnResult(sqlmock.NewResult(1, 1)) - err := resources.DeleteOAuthIntegration(d, &internalprovider.Context{ - Client: sdk.NewClientFromDB(db), - }) - r.NoError(err) - }) -} - -func expectReadOAuthIntegration(mock sqlmock.Sqlmock) { - 
showRows := sqlmock.NewRows([]string{ - "name", "type", "category", "enabled", "comment", "created_on", - }, - ).AddRow("test_oauth_integration", "OAUTH - TABLEAU_DESKTOP", "SECURITY", true, nil, "now") - mock.ExpectQuery(`^SHOW SECURITY INTEGRATIONS LIKE 'test_oauth_integration'$`).WillReturnRows(showRows) - - descRows := sqlmock.NewRows([]string{ - "property", "property_type", "property_value", "property_default", - }).AddRow("OAUTH_ISSUE_REFRESH_TOKENS", "Boolean", "true", "true"). - AddRow("OAUTH_REFRESH_TOKEN_VALIDITY", "Integer", "86400", "7776000"). - AddRow("BLOCKED_ROLES_LIST", "List", "ACCOUNTADMIN,SECURITYADMIN", nil) - - mock.ExpectQuery(`DESCRIBE SECURITY INTEGRATION "test_oauth_integration"$`).WillReturnRows(descRows) -} diff --git a/pkg/resources/role.go b/pkg/resources/role.go deleted file mode 100644 index 443de76734..0000000000 --- a/pkg/resources/role.go +++ /dev/null @@ -1,9 +0,0 @@ -package resources - -import "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - -func Role() *schema.Resource { - accountRole := AccountRole() - accountRole.DeprecationMessage = "This resource is deprecated and will be removed in a future major version release. Please use snowflake_account_role instead." 
- return accountRole -} diff --git a/pkg/resources/saml_integration.go b/pkg/resources/saml_integration.go deleted file mode 100644 index 8062a9e40d..0000000000 --- a/pkg/resources/saml_integration.go +++ /dev/null @@ -1,480 +0,0 @@ -package resources - -import ( - "fmt" - "log" - "strconv" - "strings" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" -) - -var samlIntegrationSchema = map[string]*schema.Schema{ - "name": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "Specifies the name of the SAML2 integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account.", - }, - "enabled": { - Type: schema.TypeBool, - Optional: true, - Default: true, - Description: "Specifies whether this security integration is enabled or disabled.", - }, - "saml2_issuer": { - Type: schema.TypeString, - Required: true, - Description: "The string containing the IdP EntityID / Issuer.", - }, - "saml2_sso_url": { - Type: schema.TypeString, - Required: true, - Description: "The string containing the IdP SSO URL, where the user should be redirected by Snowflake (the Service Provider) with a SAML AuthnRequest message.", - }, - "saml2_provider": { - Type: schema.TypeString, - Required: true, - Description: "The string describing the IdP. 
One of the following: OKTA, ADFS, Custom.", - ValidateFunc: validation.StringInSlice([]string{ - "OKTA", "ADFS", "CUSTOM", - }, true), - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - normalize := func(s string) string { - return strings.ToUpper(strings.ReplaceAll(s, "-", "")) - } - return normalize(old) == normalize(new) - }, - }, - "saml2_x509_cert": { - Type: schema.TypeString, - Required: true, - Description: "The Base64 encoded IdP signing certificate on a single line without the leading -----BEGIN CERTIFICATE----- and ending -----END CERTIFICATE----- markers.", - }, - "saml2_sp_initiated_login_page_label": { - Type: schema.TypeString, - Optional: true, - Description: "The string containing the label to display after the Log In With button on the login page.", - }, - "saml2_enable_sp_initiated": { - Type: schema.TypeBool, - Optional: true, - Description: "The Boolean indicating if the Log In With button will be shown on the login page. TRUE: displays the Log in WIth button on the login page. FALSE: does not display the Log in With button on the login page.", - }, - // Computed and Optionally Settable. Info you get by issuing a 'DESCRIBE INTEGRATION ' command (SAML2_SNOWFLAKE_METADATA) - "saml2_snowflake_x509_cert": { - Type: schema.TypeString, - Optional: true, - Computed: true, - Description: "The Base64 encoded self-signed certificate generated by Snowflake for use with Encrypting SAML Assertions and Signed SAML Requests. You must have at least one of these features (encrypted SAML assertions or signed SAML responses) enabled in your Snowflake account to access the certificate value.", - }, - "saml2_sign_request": { - Type: schema.TypeBool, - Optional: true, - Description: "The Boolean indicating whether SAML requests are signed. TRUE: allows SAML requests to be signed. 
FALSE: does not allow SAML requests to be signed.", - }, - "saml2_requested_nameid_format": { - Type: schema.TypeString, - Optional: true, - Description: "The SAML NameID format allows Snowflake to set an expectation of the identifying attribute of the user (i.e. SAML Subject) in the SAML assertion from the IdP to ensure a valid authentication to Snowflake. If a value is not specified, Snowflake sends the urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress value in the authentication request to the IdP. NameID must be one of the following values: urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified, urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress, urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName, urn:oasis:names:tc:SAML:1.1:nameid-format:WindowsDomainQualifiedName, urn:oasis:names:tc:SAML:2.0:nameid-format:kerberos, urn:oasis:names:tc:SAML:2.0:nameid-format:persistent, urn:oasis:names:tc:SAML:2.0:nameid-format:transient .", - ValidateFunc: validation.StringInSlice([]string{ - "urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified", - "urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress", - "urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName", - "urn:oasis:names:tc:SAML:1.1:nameid-format:WindowsDomainQualifiedName", - "urn:oasis:names:tc:SAML:2.0:nameid-format:kerberos", - "urn:oasis:names:tc:SAML:2.0:nameid-format:persistent", - "urn:oasis:names:tc:SAML:2.0:nameid-format:transient", - }, true), - }, - "saml2_post_logout_redirect_url": { - Type: schema.TypeString, - Optional: true, - Description: "The endpoint to which Snowflake redirects users after clicking the Log Out button in the classic Snowflake web interface. Snowflake terminates the Snowflake session upon redirecting to the specified endpoint.", - }, - "saml2_force_authn": { - Type: schema.TypeBool, - Optional: true, - Description: "The Boolean indicating whether users, during the initial authentication flow, are forced to authenticate again to access Snowflake. 
When set to TRUE, Snowflake sets the ForceAuthn SAML parameter to TRUE in the outgoing request from Snowflake to the identity provider. TRUE: forces users to authenticate again to access Snowflake, even if a valid session with the identity provider exists. FALSE: does not force users to authenticate again to access Snowflake.", - }, - // Computed and Optionally Settable. Info you get by issuing a 'DESCRIBE INTEGRATION ' command (SAML2_SNOWFLAKE_METADATA) - "saml2_snowflake_issuer_url": { - Type: schema.TypeString, - Optional: true, - Computed: true, - Description: "The string containing the EntityID / Issuer for the Snowflake service provider. If an incorrect value is specified, Snowflake returns an error message indicating the acceptable values to use.", - }, - // Computed and Optionally Settable. Info you get by issuing a 'DESCRIBE INTEGRATION ' command (SAML2_SNOWFLAKE_METADATA) - "saml2_snowflake_acs_url": { - Type: schema.TypeString, - Optional: true, - Computed: true, - Description: "The string containing the Snowflake Assertion Consumer Service URL to which the IdP will send its SAML authentication response back to Snowflake. This property will be set in the SAML authentication request generated by Snowflake when initiating a SAML SSO operation with the IdP. If an incorrect value is specified, Snowflake returns an error message indicating the acceptable values to use. Default: https://..snowflakecomputing.com/fed/login", - }, - // Computed. Info you get by issuing a 'DESCRIBE INTEGRATION ' command (SAML2_SNOWFLAKE_METADATA) - "saml2_snowflake_metadata": { - Type: schema.TypeString, - Computed: true, - Description: "Metadata created by Snowflake to provide to SAML2 provider.", - }, - // Computed. Info you get by issuing a 'DESCRIBE INTEGRATION ' command (SAML2_DIGEST_METHODS_USED) - "saml2_digest_methods_used": { - Type: schema.TypeString, - Computed: true, - }, - // Computed. 
Info you get by issuing a 'DESCRIBE INTEGRATION ' command (SAML2_SIGNATURE_METHODS_USED) - "saml2_signature_methods_used": { - Type: schema.TypeString, - Computed: true, - }, - "created_on": { - Type: schema.TypeString, - Computed: true, - Description: "Date and time when the SAML integration was created.", - }, -} - -// SAMLIntegration returns a pointer to the resource representing a SAML2 security integration. -func SAMLIntegration() *schema.Resource { - return &schema.Resource{ - Create: CreateSAMLIntegration, - Read: ReadSAMLIntegration, - Update: UpdateSAMLIntegration, - Delete: DeleteSAMLIntegration, - DeprecationMessage: "This resource is deprecated and will be removed in a future major version release. Please use snowflake_saml2_integration instead.", - - Schema: samlIntegrationSchema, - Importer: &schema.ResourceImporter{ - StateContext: schema.ImportStatePassthroughContext, - }, - } -} - -// CreateSAMLIntegration implements schema.CreateFunc. -func CreateSAMLIntegration(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - db := client.GetConn().DB - name := d.Get("name").(string) - - stmt := snowflake.NewSamlIntegrationBuilder(name).Create() - - // Set required fields - stmt.SetRaw(`TYPE=SAML2`) - stmt.SetBool(`ENABLED`, d.Get("enabled").(bool)) - stmt.SetString(`SAML2_ISSUER`, d.Get("saml2_issuer").(string)) - stmt.SetString(`SAML2_SSO_URL`, d.Get("saml2_sso_url").(string)) - stmt.SetString(`SAML2_PROVIDER`, d.Get("saml2_provider").(string)) - - // Set optional fields - if _, ok := d.GetOk("saml2_x509_cert"); ok { - stmt.SetString(`SAML2_X509_CERT`, d.Get("saml2_x509_cert").(string)) - } - - if _, ok := d.GetOk("saml2_sp_initiated_login_page_label"); ok { - stmt.SetString(`SAML2_SP_INITIATED_LOGIN_PAGE_LABEL`, d.Get("saml2_sp_initiated_login_page_label").(string)) - } - - if _, ok := d.GetOk("saml2_enable_sp_initiated"); ok { - stmt.SetBool(`SAML2_ENABLE_SP_INITIATED`, d.Get("saml2_enable_sp_initiated").(bool)) 
- } - - if _, ok := d.GetOk("saml2_snowflake_x509_cert"); ok { - stmt.SetString(`SAML2_SNOWFLAKE_X509_CERT`, d.Get("saml2_snowflake_x509_cert").(string)) - } - - if _, ok := d.GetOk("saml2_sign_request"); ok { - stmt.SetBool(`SAML2_SIGN_REQUEST`, d.Get("saml2_sign_request").(bool)) - } - - if _, ok := d.GetOk("saml2_requested_nameid_format"); ok { - stmt.SetString(`SAML2_REQUESTED_NAMEID_FORMAT`, d.Get("saml2_requested_nameid_format").(string)) - } - - if _, ok := d.GetOk("saml2_post_logout_redirect_url"); ok { - stmt.SetString(`SAML2_POST_LOGOUT_REDIRECT_URL`, d.Get("saml2_post_logout_redirect_url").(string)) - } - - if _, ok := d.GetOk("saml2_force_authn"); ok { - stmt.SetBool(`SAML2_FORCE_AUTHN`, d.Get("saml2_force_authn").(bool)) - } - - if _, ok := d.GetOk("saml2_snowflake_issuer_url"); ok { - stmt.SetString(`SAML2_SNOWFLAKE_ISSUER_URL`, d.Get("saml2_snowflake_issuer_url").(string)) - } - - if _, ok := d.GetOk("saml2_snowflake_acs_url"); ok { - stmt.SetString(`SAML2_SNOWFLAKE_ACS_URL`, d.Get("saml2_snowflake_acs_url").(string)) - } - - err := snowflake.Exec(db, stmt.Statement()) - if err != nil { - return fmt.Errorf("error creating security integration err = %w", err) - } - - d.SetId(name) - - return ReadSAMLIntegration(d, meta) -} - -// ReadSAMLIntegration implements schema.ReadFunc. 
-func ReadSAMLIntegration(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - db := client.GetConn().DB - id := d.Id() - - stmt := snowflake.NewSamlIntegrationBuilder(id).Show() - row := snowflake.QueryRow(db, stmt) - - // Some properties can come from the SHOW INTEGRATION call - - s, err := snowflake.ScanSamlIntegration(row) - if err != nil { - return fmt.Errorf("could not show security integration err = %w", err) - } - - // Note: category must be Security or something is broken - if c := s.Category.String; c != "SECURITY" { - return fmt.Errorf("expected %v to be an Security integration, got %v", id, c) - } - - // Note: type must be SAML2 or something is broken - if c := s.IntegrationType.String; c != "SAML2" { - return fmt.Errorf("expected %v to be a SAML2 integration type, got %v", id, c) - } - - if err := d.Set("name", s.Name.String); err != nil { - return err - } - - if err := d.Set("created_on", s.CreatedOn.String); err != nil { - return err - } - - if err := d.Set("enabled", s.Enabled.Bool); err != nil { - return err - } - - // Some properties come from the DESCRIBE INTEGRATION call - // We need to grab them in a loop - var k, pType string - var v, unused interface{} - stmt = snowflake.NewSamlIntegrationBuilder(id).Describe() - rows, err := db.Query(stmt) - if err != nil { - return fmt.Errorf("could not describe security integration err = %w", err) - } - defer rows.Close() - for rows.Next() { - if err := rows.Scan(&k, &pType, &v, &unused); err != nil { - return fmt.Errorf("unable to parse security integration rows err = %w", err) - } - switch k { - case "ENABLED": - // set using the SHOW INTEGRATION, ignoring here - case "SAML2_ISSUER": - if err := d.Set("saml2_issuer", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_issuer for security integration err = %w", err) - } - case "SAML2_SSO_URL": - if err := d.Set("saml2_sso_url", v.(string)); err != nil { - return fmt.Errorf("unable to set 
saml2_sso_url for security integration err = %w", err) - } - case "SAML2_PROVIDER": - if err := d.Set("saml2_provider", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_provider for security integration err = %w", err) - } - case "SAML2_X509_CERT": - if err := d.Set("saml2_x509_cert", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_x509_cert for security integration err = %w", err) - } - case "SAML2_SP_INITIATED_LOGIN_PAGE_LABEL": - if err := d.Set("saml2_sp_initiated_login_page_label", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_sp_initiated_login_page_label for security integration") - } - case "SAML2_ENABLE_SP_INITIATED": - var b bool - switch v2 := v.(type) { - case bool: - b = v2 - case string: - b, err = strconv.ParseBool(v.(string)) - if err != nil { - return fmt.Errorf("returned saml2_force_authn that is not boolean err = %w", err) - } - default: - return fmt.Errorf("returned saml2_force_authn that is not boolean") - } - if err := d.Set("saml2_enable_sp_initiated", b); err != nil { - return fmt.Errorf("unable to set saml2_enable_sp_initiated for security integration err = %w", err) - } - case "SAML2_SNOWFLAKE_X509_CERT": - if err := d.Set("saml2_snowflake_x509_cert", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_snowflake_x509_cert for security integration err = %w", err) - } - case "SAML2_SIGN_REQUEST": - var b bool - switch v2 := v.(type) { - case bool: - b = v2 - case string: - b, err = strconv.ParseBool(v.(string)) - if err != nil { - return fmt.Errorf("returned saml2_force_authn that is not boolean err = %w", err) - } - default: - return fmt.Errorf("returned saml2_force_authn that is not boolean err = %w", err) - } - if err := d.Set("saml2_sign_request", b); err != nil { - return fmt.Errorf("unable to set saml2_sign_request for security integration err = %w", err) - } - case "SAML2_REQUESTED_NAMEID_FORMAT": - if err := d.Set("saml2_requested_nameid_format", v.(string)); 
err != nil { - return fmt.Errorf("unable to set saml2_requested_nameid_format for security integration err = %w", err) - } - case "SAML2_POST_LOGOUT_REDIRECT_URL": - if err := d.Set("saml2_post_logout_redirect_url", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_post_logout_redirect_url for security integration err = %w", err) - } - case "SAML2_FORCE_AUTHN": - var b bool - switch v2 := v.(type) { - case bool: - b = v2 - case string: - b, err = strconv.ParseBool(v.(string)) - if err != nil { - return fmt.Errorf("returned saml2_force_authn that is not boolean err = %w", err) - } - default: - return fmt.Errorf("returned saml2_force_authn that is not boolean err = %w", err) - } - if err := d.Set("saml2_force_authn", b); err != nil { - return fmt.Errorf("unable to set saml2_force_authn for security integration err = %w", err) - } - case "SAML2_SNOWFLAKE_ISSUER_URL": - if err := d.Set("saml2_snowflake_issuer_url", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_snowflake_issuer_url for security integration err = %w", err) - } - case "SAML2_SNOWFLAKE_ACS_URL": - if err := d.Set("saml2_snowflake_acs_url", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_snowflake_acs_url for security integration err = %w", err) - } - case "SAML2_SNOWFLAKE_METADATA": - if err := d.Set("saml2_snowflake_metadata", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_snowflake_metadata for security integration err = %w", err) - } - case "SAML2_DIGEST_METHODS_USED": - if err := d.Set("saml2_digest_methods_used", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_digest_methods_used for security integration err = %w", err) - } - case "SAML2_SIGNATURE_METHODS_USED": - if err := d.Set("saml2_signature_methods_used", v.(string)); err != nil { - return fmt.Errorf("unable to set saml2_signature_methods_used for security integration err = %w", err) - } - case "COMMENT": - // COMMENT cannot be set according to 
snowflake docs, so ignoring - default: - log.Printf("[WARN] unexpected security integration property %v returned from Snowflake", k) - } - } - - return err -} - -// UpdateSAMLIntegration implements schema.UpdateFunc. -func UpdateSAMLIntegration(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - db := client.GetConn().DB - id := d.Id() - - stmt := snowflake.NewSamlIntegrationBuilder(id).Alter() - - var runSetStatement bool - - if d.HasChange("enabled") { - runSetStatement = true - stmt.SetBool(`ENABLED`, d.Get("enabled").(bool)) - } - - if d.HasChange("saml2_issuer") { - runSetStatement = true - stmt.SetString(`SAML2_ISSUER`, d.Get("saml2_issuer").(string)) - } - - if d.HasChange("saml2_sso_url") { - runSetStatement = true - stmt.SetString(`saml2_sso_url`, d.Get("saml2_sso_url").(string)) - } - - if d.HasChange("saml2_provider") { - runSetStatement = true - stmt.SetString(`saml2_provider`, d.Get("saml2_provider").(string)) - } - - if d.HasChange("saml2_x509_cert") { - runSetStatement = true - stmt.SetString(`saml2_x509_cert`, d.Get("saml2_x509_cert").(string)) - } - - if d.HasChange("saml2_sp_initiated_login_page_label") { - runSetStatement = true - stmt.SetString(`saml2_sp_initiated_login_page_label`, d.Get("saml2_sp_initiated_login_page_label").(string)) - } - - if d.HasChange("saml2_enable_sp_initiated") { - runSetStatement = true - stmt.SetBool(`saml2_enable_sp_initiated`, d.Get("saml2_enable_sp_initiated").(bool)) - } - - if d.HasChange("saml2_snowflake_x509_cert") { - runSetStatement = true - stmt.SetString(`saml2_snowflake_x509_cert`, d.Get("saml2_snowflake_x509_cert").(string)) - } - - if d.HasChange("saml2_sign_request") { - runSetStatement = true - stmt.SetBool(`saml2_sign_request`, d.Get("saml2_sign_request").(bool)) - } - - if d.HasChange("saml2_requested_nameid_format") { - runSetStatement = true - stmt.SetString(`saml2_requested_nameid_format`, d.Get("saml2_requested_nameid_format").(string)) - } - - if 
d.HasChange("saml2_post_logout_redirect_url") { - runSetStatement = true - stmt.SetString(`saml2_post_logout_redirect_url`, d.Get("saml2_post_logout_redirect_url").(string)) - } - - if d.HasChange("saml2_force_authn") { - runSetStatement = true - stmt.SetBool(`saml2_force_authn`, d.Get("saml2_force_authn").(bool)) - } - - if d.HasChange("saml2_snowflake_issuer_url") { - runSetStatement = true - stmt.SetString(`saml2_snowflake_issuer_url`, d.Get("saml2_snowflake_issuer_url").(string)) - } - - if d.HasChange("saml2_snowflake_acs_url") { - runSetStatement = true - stmt.SetString(`saml2_snowflake_acs_url`, d.Get("saml2_snowflake_acs_url").(string)) - } - - if runSetStatement { - if err := snowflake.Exec(db, stmt.Statement()); err != nil { - return fmt.Errorf("error updating security integration err = %w", err) - } - } - - return ReadSAMLIntegration(d, meta) -} - -// DeleteSAMLIntegration implements schema.DeleteFunc. -func DeleteSAMLIntegration(d *schema.ResourceData, meta interface{}) error { - return DeleteResource("", snowflake.NewSamlIntegrationBuilder)(d, meta) -} diff --git a/pkg/resources/saml_integration_acceptance_test.go b/pkg/resources/saml_integration_acceptance_test.go deleted file mode 100644 index 8d01727ba8..0000000000 --- a/pkg/resources/saml_integration_acceptance_test.go +++ /dev/null @@ -1,60 +0,0 @@ -package resources_test - -import ( - "fmt" - "testing" - - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs" - "github.com/hashicorp/terraform-plugin-testing/helper/resource" - "github.com/hashicorp/terraform-plugin-testing/tfversion" -) - -func TestAcc_SamlIntegration(t *testing.T) { - // TODO [SNOW-926148]: unskip - testenvs.SkipTestIfSet(t, testenvs.SkipSamlIntegrationTest, "because was skipped earlier") - - samlIntName := acc.TestClient().Ids.Alpha() - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: 
acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - Config: samlIntegrationConfig(samlIntName), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_saml_integration.test_saml_int", "name", samlIntName), - resource.TestCheckResourceAttr("snowflake_saml_integration.test_saml_int", "saml2_issuer", "test_issuer"), - resource.TestCheckResourceAttr("snowflake_saml_integration.test_saml_int", "saml2_sso_url", "https://samltest.id/saml/sp"), - resource.TestCheckResourceAttr("snowflake_saml_integration.test_saml_int", "saml2_provider", "CUSTOM"), - resource.TestCheckResourceAttr("snowflake_saml_integration.test_saml_int", "saml2_x509_cert", "MIIERTCCAq2gAwIBAgIJAKmtzjCD1+tqMA0GCSqGSIb3DQEBCwUAMDUxMzAxBgNVBAMTKmlwLTE3Mi0zMS0yOC02NC51cy13ZXN0LTIuY29tcHV0ZS5pbnRlcm5hbDAeFw0xODA4MTgyMzI0MjNaFw0yODA4MTUyMzI0MjNaMDUxMzAxBgNVBAMTKmlwLTE3Mi0zMS0yOC02NC51cy13ZXN0LTIuY29tcHV0ZS5pbnRlcm5hbDCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALhUlY3SkIOze+l8y6dBzM6p7B8OykJWlwizszU16Lih8D7KLhNJfahoVxbPxB3YFM/81PJLOeK2krvJ5zY6CJyQY3sPQAkZKI7I8qq9lmZ2g4QPqybNstXS6YUXJNUt/ixbbK/N97+LKTiSutbD1J7AoFnouMuLjlhN5VRZ43jez4xLSHVZaYuUFKn01Y9oLKbj46LQnZnJCAGpTgPqEQJr6GpVGw43bKyUpGoaPrdDRgRgtPMUWgFDkgcI3QiV1lsKfBs1t1E2UA7ACFnlJZpEuBtwgivzo3VeitiSaF3Jxh25EY5/vABpcgQQRz3RH2l8MMKdRsxb8VT3yh2S+CX55s+cN67LiCPr6f2u+KS1iKfB9mWN6o2S4lcmo82HIBbsuXJV0oA1HrGMyyc4Y9nng/I8iuAp8or1JrWRHQ+8NzO85DWK0rtvtLPxkvw0HK32glyuOP/9F05Z7+tiVIgn67buC0EdoUm1RSpibqmB1ST2PikslOlVbJuy4Ah93wIDAQABo1gwVjA1BgNVHREELjAsgippcC0xNzItMzEtMjgtNjQudXMtd2VzdC0yLmNvbXB1dGUuaW50ZXJuYWwwHQYDVR0OBBYEFAdsTxYfulJ5yunYtgYJHC9IcevzMA0GCSqGSIb3DQEBCwUAA4IBgQB3J6i7KreiHL8NPMglfWLHk1PZOgvIEEpKL+GRebvcbyqgcuc3VVPylq70VvGqhJxp1q/mzLfraUiypzfWFGm9zfwIg0H5TqRZYEPTvgIhIICjaDWRwZBDJG8D5G/KoV60DlUG0crPBlIuCCr/SRa5ZoDQqvucTfr3Rx4Ha6koXF
SjoSXllR+jn4GnInhm/WH137a+v35PUcffNxfuehoGn6i4YeXF3cwJK4e35cOFW+dLbnaLk+Ty7HOGvpw86h979C6mJ9qEHYgq9rQyzlSPbLZGZSgVcIezunOaOsWm81BsXRNNJjzHGCqKf8RMhd8oZP55+2/SVRBwnkGyUNCuDPrJcymC95ZT2NW/KeWkz28HF2i31xQmecT2r3lQRSM8acvOXQsNEDCDvJvCzJT9c2AnsnO24r6arPXs/UWAxOI+MjclXPLkLD6uTHV+Oo8XZ7bOjegD5hL6/bKUWnNMurQNGrmi/jvqsCFLDKftl7ajuxKjtodnSuwhoY7NQy8="), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "created_on"), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "saml2_snowflake_x509_cert"), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "saml2_snowflake_acs_url"), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "saml2_snowflake_issuer_url"), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "saml2_snowflake_metadata"), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "saml2_digest_methods_used"), - resource.TestCheckResourceAttrSet("snowflake_saml_integration.test_saml_int", "saml2_signature_methods_used"), - ), - }, - }, - }) -} - -func samlIntegrationConfig(name string) string { - return fmt.Sprintf(` - resource "snowflake_saml_integration" "test_saml_int" { - name = "%s" - saml2_issuer = "test_issuer" - saml2_sso_url = "https://samltest.id/saml/sp" - saml2_provider = "CUSTOM" - saml2_x509_cert = 
"MIIERTCCAq2gAwIBAgIJAKmtzjCD1+tqMA0GCSqGSIb3DQEBCwUAMDUxMzAxBgNVBAMTKmlwLTE3Mi0zMS0yOC02NC51cy13ZXN0LTIuY29tcHV0ZS5pbnRlcm5hbDAeFw0xODA4MTgyMzI0MjNaFw0yODA4MTUyMzI0MjNaMDUxMzAxBgNVBAMTKmlwLTE3Mi0zMS0yOC02NC51cy13ZXN0LTIuY29tcHV0ZS5pbnRlcm5hbDCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALhUlY3SkIOze+l8y6dBzM6p7B8OykJWlwizszU16Lih8D7KLhNJfahoVxbPxB3YFM/81PJLOeK2krvJ5zY6CJyQY3sPQAkZKI7I8qq9lmZ2g4QPqybNstXS6YUXJNUt/ixbbK/N97+LKTiSutbD1J7AoFnouMuLjlhN5VRZ43jez4xLSHVZaYuUFKn01Y9oLKbj46LQnZnJCAGpTgPqEQJr6GpVGw43bKyUpGoaPrdDRgRgtPMUWgFDkgcI3QiV1lsKfBs1t1E2UA7ACFnlJZpEuBtwgivzo3VeitiSaF3Jxh25EY5/vABpcgQQRz3RH2l8MMKdRsxb8VT3yh2S+CX55s+cN67LiCPr6f2u+KS1iKfB9mWN6o2S4lcmo82HIBbsuXJV0oA1HrGMyyc4Y9nng/I8iuAp8or1JrWRHQ+8NzO85DWK0rtvtLPxkvw0HK32glyuOP/9F05Z7+tiVIgn67buC0EdoUm1RSpibqmB1ST2PikslOlVbJuy4Ah93wIDAQABo1gwVjA1BgNVHREELjAsgippcC0xNzItMzEtMjgtNjQudXMtd2VzdC0yLmNvbXB1dGUuaW50ZXJuYWwwHQYDVR0OBBYEFAdsTxYfulJ5yunYtgYJHC9IcevzMA0GCSqGSIb3DQEBCwUAA4IBgQB3J6i7KreiHL8NPMglfWLHk1PZOgvIEEpKL+GRebvcbyqgcuc3VVPylq70VvGqhJxp1q/mzLfraUiypzfWFGm9zfwIg0H5TqRZYEPTvgIhIICjaDWRwZBDJG8D5G/KoV60DlUG0crPBlIuCCr/SRa5ZoDQqvucTfr3Rx4Ha6koXFSjoSXllR+jn4GnInhm/WH137a+v35PUcffNxfuehoGn6i4YeXF3cwJK4e35cOFW+dLbnaLk+Ty7HOGvpw86h979C6mJ9qEHYgq9rQyzlSPbLZGZSgVcIezunOaOsWm81BsXRNNJjzHGCqKf8RMhd8oZP55+2/SVRBwnkGyUNCuDPrJcymC95ZT2NW/KeWkz28HF2i31xQmecT2r3lQRSM8acvOXQsNEDCDvJvCzJT9c2AnsnO24r6arPXs/UWAxOI+MjclXPLkLD6uTHV+Oo8XZ7bOjegD5hL6/bKUWnNMurQNGrmi/jvqsCFLDKftl7ajuxKjtodnSuwhoY7NQy8=" - enabled = false - } - `, name) -} diff --git a/pkg/resources/saml_integration_test.go b/pkg/resources/saml_integration_test.go deleted file mode 100644 index d326c12212..0000000000 --- a/pkg/resources/saml_integration_test.go +++ /dev/null @@ -1,108 +0,0 @@ -package resources_test - -import ( - "database/sql" - "testing" - - internalprovider "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - - sqlmock 
"github.com/DATA-DOG/go-sqlmock" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" - . "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/testhelpers/mock" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/stretchr/testify/require" -) - -func TestSAMLIntegration(t *testing.T) { - r := require.New(t) - err := resources.SAMLIntegration().InternalValidate(provider.Provider().Schema, true) - r.NoError(err) -} - -func TestSAMLIntegrationCreate(t *testing.T) { - r := require.New(t) - - in := map[string]interface{}{ - "name": "test_saml_integration", - "enabled": true, - "saml2_issuer": "test_issuer", - "saml2_sso_url": "https://testsamlissuer.com", - "saml2_provider": "CUSTOM", - "saml2_x509_cert": "MIICdummybase64certificate", - } - d := schema.TestResourceDataRaw(t, resources.SAMLIntegration().Schema, in) - r.NotNil(d) - - WithMockDb(t, func(db *sql.DB, mock sqlmock.Sqlmock) { - mock.ExpectExec( - `^CREATE SECURITY INTEGRATION "test_saml_integration" TYPE=SAML2 SAML2_ISSUER='test_issuer' SAML2_PROVIDER='CUSTOM' SAML2_SSO_URL='https://testsamlissuer.com' SAML2_X509_CERT='MIICdummybase64certificate' ENABLED=true$`, - ).WillReturnResult(sqlmock.NewResult(1, 1)) - expectReadSAMLIntegration(mock) - - err := resources.CreateSAMLIntegration(d, &internalprovider.Context{ - Client: sdk.NewClientFromDB(db), - }) - r.NoError(err) - }) -} - -func TestSAMLIntegrationRead(t *testing.T) { - r := require.New(t) - - d := samlIntegration(t, "test_saml_integration", map[string]interface{}{"name": "test_saml_integration"}) - - WithMockDb(t, func(db *sql.DB, mock sqlmock.Sqlmock) { - expectReadSAMLIntegration(mock) - - err := resources.ReadSAMLIntegration(d, &internalprovider.Context{ - Client: sdk.NewClientFromDB(db), - }) - r.NoError(err) - }) -} - -func TestSAMLIntegrationDelete(t *testing.T) { - r := require.New(t) - - d := samlIntegration(t, "drop_it", 
map[string]interface{}{"name": "drop_it"}) - - WithMockDb(t, func(db *sql.DB, mock sqlmock.Sqlmock) { - mock.ExpectExec(`DROP SECURITY INTEGRATION "drop_it"`).WillReturnResult(sqlmock.NewResult(1, 1)) - err := resources.DeleteSAMLIntegration(d, &internalprovider.Context{ - Client: sdk.NewClientFromDB(db), - }) - r.NoError(err) - }) -} - -func expectReadSAMLIntegration(mock sqlmock.Sqlmock) { - showRows := sqlmock.NewRows([]string{ - "name", "type", "category", "enabled", "created_on", - }, - ).AddRow("test_saml_integration", "SAML2", "SECURITY", true, "now") - mock.ExpectQuery(`^SHOW SECURITY INTEGRATIONS LIKE 'test_saml_integration'$`).WillReturnRows(showRows) - - descRows := sqlmock.NewRows([]string{ - "property", "property_type", "property_value", "property_default", - }).AddRow("SAML2_X509_CERT", "String", "MIICdummybase64certificate", nil). - AddRow("SAML2_PROVIDER", "String", "CUSTOM", nil). - AddRow("SAML2_ENABLE_SP_INITIATED", "Boolean", false, false). - AddRow("SAML2_SP_INITIATED_LOGIN_PAGE_LABEL", "String", "MyLabel", nil). - AddRow("SAML2_SSO_URL", "String", "https://testsamlissuer.com", nil). - AddRow("SAML2_ISSUER", "String", "test_issuer", nil). - AddRow("SAML2_SNOWFLAKE_X509_CERT", "String", "MIICdummybase64certificate", nil). - AddRow("SAML2_REQUESTED_NAMEID_FORMAT", "String", "urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress", "urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress"). - AddRow("SAML2_FORCE_AUTHN", "Boolean", false, false). - AddRow("SAML2_POST_LOGOUT_REDIRECT_URL", "String", "https://myredirecturl.com", nil). - AddRow("SAML2_SIGN_REQUEST", "Boolean", false, false). - AddRow("SAML2_SNOWFLAKE_ACS_URL", "String", "https://myinstance.my-region-1.snowflakecomputing.com/fed/login", nil). - AddRow("SAML2_SNOWFLAKE_ISSUER_URL", "String", "https://myinstance.my-region-1.snowflakecomputing.com", nil). - AddRow("SAML2_SNOWFLAKE_METADATA", "String", "", nil). 
- AddRow("SAML2_DIGEST_METHODS_USED", "http://www.w3.org/2001/04/xmlenc#sha256", "CUSTOM", nil). - AddRow("SAML2_SIGNATURE_METHODS_USED", "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256", "CUSTOM", nil). - AddRow("COMMENT", "String", "Some Comment", nil) - - mock.ExpectQuery(`DESCRIBE SECURITY INTEGRATION "test_saml_integration"$`).WillReturnRows(descRows) -} diff --git a/pkg/resources/session_parameter.go b/pkg/resources/session_parameter.go deleted file mode 100644 index 9c4d13e09d..0000000000 --- a/pkg/resources/session_parameter.go +++ /dev/null @@ -1,153 +0,0 @@ -package resources - -import ( - "context" - "fmt" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" -) - -var sessionParameterSchema = map[string]*schema.Schema{ - "key": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "Name of session parameter. Valid values are those in [session parameters](https://docs.snowflake.com/en/sql-reference/parameters.html#session-parameters).", - }, - "value": { - Type: schema.TypeString, - Required: true, - Description: "Value of session parameter, as a string. Constraints are the same as those for the parameters in Snowflake documentation.", - }, - "on_account": { - Type: schema.TypeBool, - Optional: true, - Default: false, - Description: "If true, the session parameter will be set on the account level.", - }, - "user": { - Type: schema.TypeString, - Optional: true, - Description: "The user to set the session parameter for. 
Required if on_account is false", - }, -} - -func SessionParameter() *schema.Resource { - return &schema.Resource{ - Create: CreateSessionParameter, - Read: ReadSessionParameter, - Update: UpdateSessionParameter, - Delete: DeleteSessionParameter, - - Schema: sessionParameterSchema, - Importer: &schema.ResourceImporter{ - StateContext: schema.ImportStatePassthroughContext, - }, - } -} - -// CreateSessionParameter implements schema.CreateFunc. -func CreateSessionParameter(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - key := d.Get("key").(string) - value := d.Get("value").(string) - ctx := context.Background() - onAccount := d.Get("on_account").(bool) - user := d.Get("user").(string) - parameter := sdk.SessionParameter(key) - - var err error - if onAccount { - err := client.Parameters.SetSessionParameterOnAccount(ctx, parameter, value) - if err != nil { - return err - } - } else { - if user == "" { - return fmt.Errorf("user is required if on_account is false") - } - userId := sdk.NewAccountObjectIdentifier(user) - err = client.Parameters.SetSessionParameterOnUser(ctx, userId, parameter, value) - if err != nil { - return fmt.Errorf("error creating session parameter err = %w", err) - } - } - - d.SetId(key) - - return ReadSessionParameter(d, meta) -} - -// ReadSessionParameter implements schema.ReadFunc. 
-func ReadSessionParameter(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - ctx := context.Background() - parameter := d.Id() - - onAccount := d.Get("on_account").(bool) - var err error - var p *sdk.Parameter - if onAccount { - p, err = client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameter(parameter)) - } else { - user := d.Get("user").(string) - userId := sdk.NewAccountObjectIdentifier(user) - p, err = client.Parameters.ShowUserParameter(ctx, sdk.UserParameter(parameter), userId) - } - if err != nil { - return fmt.Errorf("error reading session parameter err = %w", err) - } - err = d.Set("value", p.Value) - if err != nil { - return fmt.Errorf("error setting session parameter err = %w", err) - } - return nil -} - -// UpdateSessionParameter implements schema.UpdateFunc. -func UpdateSessionParameter(d *schema.ResourceData, meta interface{}) error { - return CreateSessionParameter(d, meta) -} - -// DeleteSessionParameter implements schema.DeleteFunc. 
-func DeleteSessionParameter(d *schema.ResourceData, meta interface{}) error {
-	client := meta.(*provider.Context).Client
-	key := d.Get("key").(string)
-	ctx := context.Background()
-
-	onAccount := d.Get("on_account").(bool)
-	parameter := sdk.SessionParameter(key)
-
-	if onAccount {
-		defaultParameter, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameter(key))
-		if err != nil {
-			return err
-		}
-		defaultValue := defaultParameter.Default
-		err = client.Parameters.SetSessionParameterOnAccount(ctx, parameter, defaultValue)
-		if err != nil {
-			return fmt.Errorf("error deleting session parameter err = %w", err)
-		}
-	} else {
-		user := d.Get("user").(string)
-		if user == "" {
-			return fmt.Errorf("user is required if on_account is false")
-		}
-		userId := sdk.NewAccountObjectIdentifier(user)
-		defaultParameter, err := client.Parameters.ShowUserParameter(ctx, sdk.UserParameter(key), userId)
-		if err != nil {
-			return err
-		}
-		defaultValue := defaultParameter.Default
-		err = client.Parameters.SetSessionParameterOnUser(ctx, userId, parameter, defaultValue)
-		if err != nil {
-			return fmt.Errorf("error deleting session parameter err = %w", err)
-		}
-	}
-
-	d.SetId("")
-	return nil
-}
diff --git a/pkg/resources/session_parameter_acceptance_test.go b/pkg/resources/session_parameter_acceptance_test.go
deleted file mode 100644
index 884fe6a95f..0000000000
--- a/pkg/resources/session_parameter_acceptance_test.go
+++ /dev/null
@@ -1,82 +0,0 @@
-package resources_test
-
-import (
-	"fmt"
-	"testing"
-
-	acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs"
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk"
-	"github.com/hashicorp/terraform-plugin-testing/helper/resource"
-	"github.com/hashicorp/terraform-plugin-testing/tfversion"
-)
-
-func TestAcc_SessionParameterWithUser(t *testing.T) {
-	_ = testenvs.GetOrSkipTest(t, 
testenvs.EnableAcceptance) - - user, userCleanup := acc.TestClient().User.CreateUser(t) - t.Cleanup(userCleanup) - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - Config: sessionParameterWithUser(user.ID(), "BINARY_OUTPUT_FORMAT", "BASE64"), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "key", "BINARY_OUTPUT_FORMAT"), - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "value", "BASE64"), - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "user", user.ID().Name()), - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "on_account", "false"), - ), - }, - }, - }) -} - -func TestAcc_SessionParameterOnAccount(t *testing.T) { - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - Config: sessionParameterOnAccount("AUTOCOMMIT", "false"), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "key", "AUTOCOMMIT"), - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "value", "false"), - resource.TestCheckResourceAttr("snowflake_session_parameter.p", "on_account", "true"), - ), - }, - }, - }) -} - -func sessionParameterWithUser(userId sdk.AccountObjectIdentifier, key, value string) string { - return fmt.Sprintf(` -resource "snowflake_session_parameter" "p" { - key = "%[2]s" - value = "%[3]s" - user = %[1]s -} -`, userId.FullyQualifiedName(), key, value) -} - -func 
sessionParameterOnAccount(key, value string) string { - s := ` -resource "snowflake_session_parameter" "p" { - key = "%s" - value = "%s" - on_account = true -} -` - return fmt.Sprintf(s, key, value) -} diff --git a/pkg/resources/stream.go b/pkg/resources/stream.go deleted file mode 100644 index 1957dfedc4..0000000000 --- a/pkg/resources/stream.go +++ /dev/null @@ -1,329 +0,0 @@ -package resources - -import ( - "context" - "fmt" - "log" - "strings" - - providerresources "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" -) - -var streamSchema = map[string]*schema.Schema{ - "name": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created.", - }, - "schema": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "The schema in which to create the stream.", - }, - "database": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "The database in which to create the stream.", - }, - "comment": { - Type: schema.TypeString, - Optional: true, - Description: "Specifies a comment for the stream.", - }, - "on_table": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - Description: "Specifies an identifier for the table the stream will monitor.", - ExactlyOneOf: []string{"on_table", "on_view", "on_stage"}, - DiffSuppressFunc: suppressIdentifierQuoting, - ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), - }, - "on_view": { - Type: schema.TypeString, - Optional: true, - ForceNew: 
true, - Description: "Specifies an identifier for the view the stream will monitor.", - ExactlyOneOf: []string{"on_table", "on_view", "on_stage"}, - DiffSuppressFunc: suppressIdentifierQuoting, - ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), - }, - "on_stage": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - Description: "Specifies an identifier for the stage the stream will monitor.", - ExactlyOneOf: []string{"on_table", "on_view", "on_stage"}, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // Suppress diff if the stage name is the same, even if database and schema are not specified - return strings.Trim(strings.Split(old, ".")[len(strings.Split(old, "."))-1], "\"") == strings.Trim(strings.Split(new, ".")[len(strings.Split(new, "."))-1], "\"") - }, - ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), - }, - "append_only": { - Type: schema.TypeBool, - Optional: true, - ForceNew: true, - Default: false, - Description: "Type of the stream that will be created.", - }, - "insert_only": { - Type: schema.TypeBool, - Optional: true, - ForceNew: true, - Default: false, - Description: "Create an insert only stream type.", - }, - "show_initial_rows": { - Type: schema.TypeBool, - Optional: true, - ForceNew: true, - Default: false, - Description: "Specifies whether to return all existing rows in the source table as row inserts the first time the stream is consumed.", - }, - "owner": { - Type: schema.TypeString, - Computed: true, - Description: "Name of the role that owns the stream.", - }, - FullyQualifiedNameAttributeName: schemas.FullyQualifiedNameSchema, -} - -func Stream() *schema.Resource { - return &schema.Resource{ - Create: CreateStream, - Read: ReadStream, - Update: UpdateStream, - Delete: DeleteStream, - DeprecationMessage: deprecatedResourceDescription( - string(providerresources.StreamOnDirectoryTable), - string(providerresources.StreamOnExternalTable), - 
string(providerresources.StreamOnTable), - string(providerresources.StreamOnView), - ), - - Schema: streamSchema, - Importer: &schema.ResourceImporter{ - StateContext: schema.ImportStatePassthroughContext, - }, - } -} - -// CreateStream implements schema.CreateFunc. -func CreateStream(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - databaseName := d.Get("database").(string) - schemaName := d.Get("schema").(string) - name := d.Get("name").(string) - appendOnly := d.Get("append_only").(bool) - insertOnly := d.Get("insert_only").(bool) - showInitialRows := d.Get("show_initial_rows").(bool) - id := sdk.NewSchemaObjectIdentifier(databaseName, schemaName, name) - - ctx := context.Background() - - onTable, onTableSet := d.GetOk("on_table") - onView, onViewSet := d.GetOk("on_view") - onStage, onStageSet := d.GetOk("on_stage") - - switch { - case onTableSet: - tableObjectIdentifier, err := helpers.DecodeSnowflakeParameterID(onTable.(string)) - if err != nil { - return err - } - tableId := tableObjectIdentifier.(sdk.SchemaObjectIdentifier) - - table, err := client.Tables.ShowByID(ctx, tableId) - if err != nil { - return err - } - - if table.IsExternal { - req := sdk.NewCreateOnExternalTableStreamRequest(id, tableId) - if insertOnly { - req.WithInsertOnly(true) - } - if v, ok := d.GetOk("comment"); ok { - req.WithComment(v.(string)) - } - err := client.Streams.CreateOnExternalTable(ctx, req) - if err != nil { - return fmt.Errorf("error creating stream %v err = %w", name, err) - } - } else { - req := sdk.NewCreateOnTableStreamRequest(id, tableId) - if appendOnly { - req.WithAppendOnly(true) - } - if showInitialRows { - req.WithShowInitialRows(true) - } - if v, ok := d.GetOk("comment"); ok { - req.WithComment(v.(string)) - } - err := client.Streams.CreateOnTable(ctx, req) - if err != nil { - return fmt.Errorf("error creating stream %v err = %w", name, err) - } - } - case onViewSet: - viewObjectIdentifier, err := 
helpers.DecodeSnowflakeParameterID(onView.(string))
-		if err != nil {
-			return err
-		}
-		viewId := viewObjectIdentifier.(sdk.SchemaObjectIdentifier)
-
-		_, err = client.Views.ShowByID(ctx, viewId)
-		if err != nil {
-			return err
-		}
-
-		req := sdk.NewCreateOnViewStreamRequest(id, viewId)
-		if appendOnly {
-			req.WithAppendOnly(true)
-		}
-		if showInitialRows {
-			req.WithShowInitialRows(true)
-		}
-		if v, ok := d.GetOk("comment"); ok {
-			req.WithComment(v.(string))
-		}
-		err = client.Streams.CreateOnView(ctx, req)
-		if err != nil {
-			return fmt.Errorf("error creating stream %v err = %w", name, err)
-		}
-	case onStageSet:
-		stageObjectIdentifier, err := helpers.DecodeSnowflakeParameterID(onStage.(string))
-		if err != nil {
-			return err
-		}
-		stageId := stageObjectIdentifier.(sdk.SchemaObjectIdentifier)
-		stageProperties, err := client.Stages.Describe(ctx, stageId)
-		if err != nil {
-			return err
-		}
-		if findStagePropertyValueByName(stageProperties, "ENABLE") != "true" {
-			return fmt.Errorf("directory must be enabled on stage")
-		}
-		req := sdk.NewCreateOnDirectoryTableStreamRequest(id, stageId)
-		if v, ok := d.GetOk("comment"); ok {
-			req.WithComment(v.(string))
-		}
-		err = client.Streams.CreateOnDirectoryTable(ctx, req)
-		if err != nil {
-			return fmt.Errorf("error creating stream %v err = %w", name, err)
-		}
-	}
-
-	d.SetId(helpers.EncodeSnowflakeID(id))
-
-	return ReadStream(d, meta)
-}
-
-// ReadStream implements schema.ReadFunc. 
-func ReadStream(d *schema.ResourceData, meta interface{}) error { - client := meta.(*provider.Context).Client - ctx := context.Background() - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.SchemaObjectIdentifier) - stream, err := client.Streams.ShowByID(ctx, id) - if err != nil { - log.Printf("[DEBUG] stream (%s) not found", d.Id()) - d.SetId("") - return nil - } - if err := d.Set(FullyQualifiedNameAttributeName, id.FullyQualifiedName()); err != nil { - return err - } - if err := d.Set("name", stream.Name); err != nil { - return err - } - if err := d.Set("database", stream.DatabaseName); err != nil { - return err - } - if err := d.Set("schema", stream.SchemaName); err != nil { - return err - } - switch *stream.SourceType { - case sdk.StreamSourceTypeStage: - if err := d.Set("on_stage", *stream.TableName); err != nil { - return err - } - case sdk.StreamSourceTypeView: - if err := d.Set("on_view", *stream.TableName); err != nil { - return err - } - default: - if err := d.Set("on_table", *stream.TableName); err != nil { - return err - } - } - if err := d.Set("append_only", *stream.Mode == "APPEND_ONLY"); err != nil { - return err - } - if err := d.Set("insert_only", *stream.Mode == "INSERT_ONLY"); err != nil { - return err - } - // TODO: SHOW STREAMS doesn't return that value right now (I'm not sure if it ever did), but probably we can assume - // the customers got 'false' every time and hardcode it (it's only on create thing, so it's not necessary - // to track its value after creation). - if err := d.Set("show_initial_rows", false); err != nil { - return err - } - if err := d.Set("comment", *stream.Comment); err != nil { - return err - } - if err := d.Set("owner", *stream.Owner); err != nil { - return err - } - return nil -} - -// UpdateStream implements schema.UpdateFunc. 
-func UpdateStream(d *schema.ResourceData, meta interface{}) error {
-	client := meta.(*provider.Context).Client
-	ctx := context.Background()
-	id := helpers.DecodeSnowflakeID(d.Id()).(sdk.SchemaObjectIdentifier)
-
-	if d.HasChange("comment") {
-		comment := d.Get("comment").(string)
-		if comment == "" {
-			err := client.Streams.Alter(ctx, sdk.NewAlterStreamRequest(id).WithUnsetComment(true))
-			if err != nil {
-				return fmt.Errorf("error unsetting stream comment on %v err = %w", d.Id(), err)
-			}
-		} else {
-			err := client.Streams.Alter(ctx, sdk.NewAlterStreamRequest(id).WithSetComment(comment))
-			if err != nil {
-				return fmt.Errorf("error setting stream comment on %v err = %w", d.Id(), err)
-			}
-		}
-	}
-
-	return ReadStream(d, meta)
-}
-
-// DeleteStream implements schema.DeleteFunc.
-func DeleteStream(d *schema.ResourceData, meta interface{}) error {
-	client := meta.(*provider.Context).Client
-	ctx := context.Background()
-	streamId := helpers.DecodeSnowflakeID(d.Id()).(sdk.SchemaObjectIdentifier)
-
-	err := client.Streams.Drop(ctx, sdk.NewDropStreamRequest(streamId))
-	if err != nil {
-		return fmt.Errorf("error deleting stream %v err = %w", d.Id(), err)
-	}
-
-	d.SetId("")
-
-	return nil
-}
diff --git a/pkg/resources/stream_acceptance_test.go b/pkg/resources/stream_acceptance_test.go
deleted file mode 100644
index 54e825c3e2..0000000000
--- a/pkg/resources/stream_acceptance_test.go
+++ /dev/null
@@ -1,510 +0,0 @@
-package resources_test
-
-import (
-	"fmt"
-	"regexp"
-	"testing"
-
-	acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs"
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources"
-	"github.com/hashicorp/terraform-plugin-testing/helper/resource"
-	"github.com/hashicorp/terraform-plugin-testing/plancheck"
-	"github.com/hashicorp/terraform-plugin-testing/tfversion"
-)
-
-func TestAcc_StreamCreateOnStageWithoutDirectoryEnabled(t *testing.T) {
-	accName 
:= acc.TestClient().Ids.Alpha() - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: acc.CheckDestroy(t, resources.Stream), - Steps: []resource.TestStep{ - { - Config: stageStreamConfig(accName, false), - ExpectError: regexp.MustCompile("directory must be enabled on stage"), - }, - }, - }) -} - -func TestAcc_StreamCreateOnStage(t *testing.T) { - accName := acc.TestClient().Ids.Alpha() - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: acc.CheckDestroy(t, resources.Stream), - Steps: []resource.TestStep{ - { - Config: stageStreamConfig(accName, true), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "append_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "insert_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "show_initial_rows", "false"), - ), - }, - }, - }) -} - -// proves issue https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2672 -func TestAcc_Stream_OnTable(t *testing.T) { - tableName := acc.TestClient().Ids.Alpha() - tableName2 := acc.TestClient().Ids.Alpha() - id := 
acc.TestClient().Ids.RandomSchemaObjectIdentifier() - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: acc.CheckDestroy(t, resources.Stream), - Steps: []resource.TestStep{ - { - Config: streamConfigOnTable(acc.TestDatabaseName, acc.TestSchemaName, tableName, id.Name()), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", id.Name()), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "fully_qualified_name", id.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", acc.TestDatabaseName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", acc.TestSchemaName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_table", fmt.Sprintf("\"%s\".\"%s\".%s", acc.TestDatabaseName, acc.TestSchemaName, tableName)), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - ), - ConfigPlanChecks: resource.ConfigPlanChecks{ - PostApplyPreRefresh: []plancheck.PlanCheck{plancheck.ExpectEmptyPlan()}, - }, - }, - { - Config: streamConfigOnTable(acc.TestDatabaseName, acc.TestSchemaName, tableName2, id.Name()), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", id.Name()), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "fully_qualified_name", id.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", acc.TestDatabaseName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", acc.TestSchemaName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_table", fmt.Sprintf("\"%s\".\"%s\".%s", 
acc.TestDatabaseName, acc.TestSchemaName, tableName2)), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - ), - ConfigPlanChecks: resource.ConfigPlanChecks{ - PostApplyPreRefresh: []plancheck.PlanCheck{plancheck.ExpectEmptyPlan()}, - }, - }, - }, - }) -} - -// proves issue https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2672 -func TestAcc_Stream_OnView(t *testing.T) { - // TODO(SNOW-1423486): Fix using warehouse in all tests and remove unsetting testenvs.ConfigureClientOnce - t.Setenv(string(testenvs.ConfigureClientOnce), "") - - tableName := acc.TestClient().Ids.Alpha() - viewName := acc.TestClient().Ids.Alpha() - name := acc.TestClient().Ids.Alpha() - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: acc.CheckDestroy(t, resources.Stream), - Steps: []resource.TestStep{ - { - Config: streamConfigOnView(acc.TestDatabaseName, acc.TestSchemaName, tableName, viewName, name), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", name), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", acc.TestDatabaseName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", acc.TestSchemaName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_view", fmt.Sprintf("\"%s\".\"%s\".%s", acc.TestDatabaseName, acc.TestSchemaName, viewName)), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - ), - ConfigPlanChecks: resource.ConfigPlanChecks{ - PostApplyPreRefresh: []plancheck.PlanCheck{plancheck.ExpectEmptyPlan()}, - }, - }, - }, - }) -} - -func TestAcc_Stream(t *testing.T) { - // Current error is User: is 
not authorized to perform: sts:AssumeRole on resource: duration 1.162414333s args {}] () - t.Skip("Skipping TestAcc_Stream") - - accName := acc.TestClient().Ids.Alpha() - accNameExternalTable := acc.TestClient().Ids.Alpha() - bucketURL := testenvs.GetOrSkipTest(t, testenvs.AwsExternalBucketUrl) - roleName := testenvs.GetOrSkipTest(t, testenvs.AwsExternalRoleArn) - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: acc.CheckDestroy(t, resources.Stream), - Steps: []resource.TestStep{ - { - Config: streamConfig(accName, false), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_table", fmt.Sprintf("%s.%s.%s", accName, accName, "STREAM_ON_TABLE")), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "append_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "insert_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "show_initial_rows", "false"), - ), - }, - { - Config: streamConfig(accName, true), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", 
"on_table", fmt.Sprintf("%s.%s.%s", accName, accName, "STREAM_ON_TABLE")), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "append_only", "true"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "insert_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "show_initial_rows", "false"), - ), - }, - { - Config: externalTableStreamConfig(accNameExternalTable, false, bucketURL, roleName), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accNameExternalTable), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accNameExternalTable), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accNameExternalTable), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_table", fmt.Sprintf("%s.%s.%s", accNameExternalTable, accNameExternalTable, "STREAM_ON_EXTERNAL_TABLE")), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "append_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "insert_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "show_initial_rows", "false"), - ), - }, - { - Config: externalTableStreamConfig(accNameExternalTable, true, bucketURL, roleName), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accNameExternalTable), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accNameExternalTable), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accNameExternalTable), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_table", fmt.Sprintf("%s.%s.%s", 
accNameExternalTable, accNameExternalTable, "STREAM_ON_EXTERNAL_TABLE")), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "append_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "insert_only", "true"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "show_initial_rows", "false"), - ), - }, - { - Config: viewStreamConfig(accName, false), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_view", fmt.Sprintf("%s.%s.%s", accName, accName, "STREAM_ON_VIEW")), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "append_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "insert_only", "false"), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "show_initial_rows", "false"), - ), - }, - { - Config: stageStreamConfig(accName, true), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "name", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "database", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "schema", accName), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "on_stage", fmt.Sprintf("%s.%s.%s", accName, accName, "STREAM_ON_STAGE")), - resource.TestCheckResourceAttr("snowflake_stream.test_stream", "comment", "Terraform acceptance test"), - ), - }, - { - ResourceName: "snowflake_stream.test_stream", - 
ImportState: true, - ImportStateVerify: true, - }, - }, - }) -} - -func streamConfigOnTable(databaseName string, schemaName string, tableName string, name string) string { - return fmt.Sprintf(` -resource "snowflake_table" "test_stream_on_table" { - database = "%[1]s" - schema = "%[2]s" - name = "%[3]s" - comment = "Terraform acceptance test" - change_tracking = true - - column { - name = "column1" - type = "VARIANT" - } - column { - name = "column2" - type = "VARCHAR" - } -} - -resource "snowflake_stream" "test_stream" { - database = "%[1]s" - schema = "%[2]s" - name = "%[4]s" - comment = "Terraform acceptance test" - on_table = "\"%[1]s\".\"%[2]s\".\"${snowflake_table.test_stream_on_table.name}\"" -} -`, databaseName, schemaName, tableName, name) -} - -func streamConfigOnView(databaseName string, schemaName string, tableName string, viewName string, name string) string { - return fmt.Sprintf(` -resource "snowflake_table" "test" { - database = "%[1]s" - schema = "%[2]s" - name = "%[3]s" - comment = "Terraform acceptance test" - change_tracking = true - - column { - name = "column1" - type = "VARIANT" - } - column { - name = "column2" - type = "VARCHAR" - } -} - -resource "snowflake_view" "test" { - database = "%[1]s" - schema = "%[2]s" - name = "%[4]s" - change_tracking = true - - statement = "select * from \"${snowflake_table.test.name}\"" - column { - column_name = "column1" - } - column { - column_name = "column2" - } -} - -resource "snowflake_stream" "test_stream" { - database = "%[1]s" - schema = "%[2]s" - name = "%[5]s" - comment = "Terraform acceptance test" - on_view = "\"%[1]s\".\"%[2]s\".\"${snowflake_view.test.name}\"" -} -`, databaseName, schemaName, tableName, viewName, name) -} - -func streamConfig(name string, appendOnly bool) string { - appendOnlyConfig := "" - if appendOnly { - appendOnlyConfig = "append_only = true" - } - - s := ` -resource "snowflake_database" "test_database" { - name = "%s" - comment = "Terraform acceptance test" -} - -resource 
"snowflake_schema" "test_schema" { - name = "%s" - database = snowflake_database.test_database.name - comment = "Terraform acceptance test" -} - -resource "snowflake_table" "test_stream_on_table" { - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - name = "STREAM_ON_TABLE" - comment = "Terraform acceptance test" - change_tracking = true - - column { - name = "column1" - type = "VARIANT" - } - column { - name = "column2" - type = "VARCHAR" - } -} - -resource "snowflake_stream" "test_stream" { - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - name = "%s" - comment = "Terraform acceptance test" - on_table = "${snowflake_database.test_database.name}.${snowflake_schema.test_schema.name}.${snowflake_table.test_stream_on_table.name}" - %s -} -` - return fmt.Sprintf(s, name, name, name, appendOnlyConfig) -} - -func externalTableStreamConfig(name string, insertOnly bool, bucketURL string, roleName string) string { - // Refer to external_table_acceptance_test.go for the original source on - // external table resources and dependents (modified slightly here). 
- insertOnlyConfig := "" - if insertOnly { - insertOnlyConfig = "insert_only = true" - } - - s := ` -resource "snowflake_database" "test" { - name = "%v" - comment = "Terraform acceptance test" -} -resource "snowflake_schema" "test" { - name = "%v" - database = snowflake_database.test.name - comment = "Terraform acceptance test" -} -resource "snowflake_stage" "test" { - name = "%v" - url = "%s" - database = snowflake_database.test.name - schema = snowflake_schema.test.name - comment = "Terraform acceptance test" - storage_integration = snowflake_storage_integration.external_table_stream_integration.name -} -resource "snowflake_storage_integration" "external_table_stream_integration" { - name = "%v" - storage_allowed_locations = ["%s"] - storage_provider = "S3" - storage_aws_role_arn = "%s" -} -resource "snowflake_external_table" "test_external_stream_table" { - database = snowflake_database.test.name - schema = snowflake_schema.test.name - name = "STREAM_ON_EXTERNAL_TABLE" - comment = "Terraform acceptance test" - column { - name = "column1" - type = "STRING" - as = "TO_VARCHAR(TO_TIMESTAMP_NTZ(value:unix_timestamp_property::NUMBER, 3), 'yyyy-mm-dd-hh')" - } - column { - name = "column2" - type = "TIMESTAMP_NTZ(9)" - as = "($1:\"CreatedDate\"::timestamp)" - } - file_format = "TYPE = CSV" - location = "@${snowflake_database.test.name}.${snowflake_schema.test.name}.${snowflake_stage.test.name}" -} -resource "snowflake_stream" "test_external_table_stream" { - database = snowflake_database.test.name - schema = snowflake_schema.test.name - name = "%s" - comment = "Terraform acceptance test" - on_table = "${snowflake_database.test.name}.${snowflake_schema.test.name}.${snowflake_external_table.test_external_stream_table.name}" - %s -} -` - - return fmt.Sprintf(s, name, name, name, bucketURL, name, bucketURL, roleName, name, insertOnlyConfig) -} - -func viewStreamConfig(name string, appendOnly bool) string { - appendOnlyConfig := "" - if appendOnly { - appendOnlyConfig = 
"append_only = true" - } - - s := ` -resource "snowflake_database" "test_database" { - name = "%s" - comment = "Terraform acceptance test" -} - -resource "snowflake_schema" "test_schema" { - name = "%s" - database = snowflake_database.test_database.name - comment = "Terraform acceptance test" -} - -resource "snowflake_table" "test_stream_on_view" { - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - name = "STREAM_ON_VIEW_TABLE" - comment = "Terraform acceptance test" - change_tracking = true - - column { - name = "column1" - type = "VARIANT" - } - column { - name = "column2" - type = "VARCHAR(16777216)" - } -} - -resource "snowflake_view" "test_stream_on_view" { - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - name = "STREAM_ON_VIEW" - - statement = "select * from ${snowflake_table.test_stream_on_view.name}" -} - -resource "snowflake_stream" "test_stream" { - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - name = "%s" - comment = "Terraform acceptance test" - on_view = "${snowflake_database.test_database.name}.${snowflake_schema.test_schema.name}.${snowflake_view.test_stream_on_view.name}" - %s -} -` - return fmt.Sprintf(s, name, name, name, appendOnlyConfig) -} - -func stageStreamConfig(name string, directory bool) string { - s := ` -resource "snowflake_database" "test_database" { - name = "%s" - comment = "Terraform acceptance test" -} - -resource "snowflake_schema" "test_schema" { - name = "%s" - database = snowflake_database.test_database.name - comment = "Terraform acceptance test" -} - -resource "snowflake_stage" "test_stage" { - name = "%s" - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - directory = "ENABLE = %t" -} - -resource "snowflake_stream" "test_stream" { - database = snowflake_database.test_database.name - schema = snowflake_schema.test_schema.name - name = 
"%s" - comment = "Terraform acceptance test" - on_stage = "${snowflake_database.test_database.name}.${snowflake_schema.test_schema.name}.${snowflake_stage.test_stage.name}" -} -` - return fmt.Sprintf(s, name, name, name, directory, name) -} diff --git a/pkg/resources/tag_masking_policy_association.go b/pkg/resources/tag_masking_policy_association.go deleted file mode 100644 index 84c4ed21bf..0000000000 --- a/pkg/resources/tag_masking_policy_association.go +++ /dev/null @@ -1,190 +0,0 @@ -package resources - -import ( - "context" - "database/sql" - "errors" - "fmt" - "log" - "strings" - - "github.com/hashicorp/terraform-plugin-sdk/v2/diag" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake" -) - -const ( - tagAttachmentPolicyIDDelimiter = "|" -) - -var mpAttachmentPolicySchema = map[string]*schema.Schema{ - "tag_id": { - Type: schema.TypeString, - Required: true, - Description: "Specifies the identifier for the tag. 
Note: format must follow: \"databaseName\".\"schemaName\".\"tagName\" or \"databaseName.schemaName.tagName\" or \"databaseName|schemaName.tagName\" (snowflake_tag.tag.id)", - ForceNew: true, - ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), - }, - "masking_policy_id": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "The resource id of the masking policy", - ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), - }, -} - -type attachmentID struct { - TagDatabaseName string - TagSchemaName string - TagName string - MaskingPolicyDatabaseName string - MaskingPolicySchemaName string - MaskingPolicyName string -} - -func (v *attachmentID) String() string { - return strings.Join([]string{ - v.TagDatabaseName, - v.TagSchemaName, - v.TagName, - v.MaskingPolicyDatabaseName, - v.MaskingPolicySchemaName, - v.MaskingPolicyName, - }, tagAttachmentPolicyIDDelimiter) -} - -func parseAttachmentID(id string) (*attachmentID, error) { - parts := strings.Split(id, tagAttachmentPolicyIDDelimiter) - if len(parts) != 6 { - return nil, fmt.Errorf("6 fields allowed") - } - return &attachmentID{ - TagDatabaseName: parts[0], - TagSchemaName: parts[1], - TagName: parts[2], - MaskingPolicyDatabaseName: parts[3], - MaskingPolicySchemaName: parts[4], - MaskingPolicyName: parts[5], - }, nil -} - -// Schema returns a pointer to the resource representing a schema. 
-func TagMaskingPolicyAssociation() *schema.Resource { - return &schema.Resource{ - CreateContext: TrackingCreateWrapper(resources.TagMaskingPolicyAssociation, CreateContextTagMaskingPolicyAssociation), - ReadContext: TrackingReadWrapper(resources.TagMaskingPolicyAssociation, ReadContextTagMaskingPolicyAssociation), - DeleteContext: TrackingDeleteWrapper(resources.TagMaskingPolicyAssociation, DeleteContextTagMaskingPolicyAssociation), - - Schema: mpAttachmentPolicySchema, - Importer: &schema.ResourceImporter{ - StateContext: schema.ImportStatePassthroughContext, - }, - Description: "Attach a masking policy to a tag. Requires a current warehouse to be set. Either with SNOWFLAKE_WAREHOUSE env variable or in current session. If no warehouse is provided, a temporary warehouse will be created.", - DeprecationMessage: deprecatedResourceDescription(string(resources.Tag)), - } -} - -func CreateContextTagMaskingPolicyAssociation(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { - client := meta.(*provider.Context).Client - - value := d.Get("tag_id").(string) - tagObjectIdentifier, err := helpers.DecodeSnowflakeParameterID(value) - if err != nil { - return diag.FromErr(err) - } - tagId := tagObjectIdentifier.(sdk.SchemaObjectIdentifier) - - value = d.Get("masking_policy_id").(string) - maskingPolicyObjectIdentifier, err := helpers.DecodeSnowflakeParameterID(value) - if err != nil { - return diag.FromErr(err) - } - maskingPolicyId := maskingPolicyObjectIdentifier.(sdk.SchemaObjectIdentifier) - - set := sdk.NewTagSetRequest().WithMaskingPolicies([]sdk.SchemaObjectIdentifier{maskingPolicyId}) - if err := client.Tags.Alter(ctx, sdk.NewAlterTagRequest(tagId).WithSet(set)); err != nil { - return diag.FromErr(err) - } - aid := attachmentID{ - TagDatabaseName: tagId.DatabaseName(), - TagSchemaName: tagId.SchemaName(), - TagName: tagId.Name(), - MaskingPolicyDatabaseName: maskingPolicyId.DatabaseName(), - MaskingPolicySchemaName: 
maskingPolicyId.SchemaName(), - MaskingPolicyName: maskingPolicyId.Name(), - } - fmt.Printf("attachment id: %s\n", aid.String()) - d.SetId(aid.String()) - return ReadContextTagMaskingPolicyAssociation(ctx, d, meta) -} - -func ReadContextTagMaskingPolicyAssociation(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { - diags := diag.Diagnostics{} - client := meta.(*provider.Context).Client - db := client.GetConn().DB - aid, err := parseAttachmentID(d.Id()) - if err != nil { - return diag.FromErr(err) - } - - // create temp warehouse to query the tag, and make sure to clean it up - warehouse, err := client.ContextFunctions.CurrentWarehouse(ctx) - if err != nil { - return diag.FromErr(err) - } - if warehouse == "" { - log.Printf("[DEBUG] no current warehouse set, creating a temporary warehouse") - randomWarehouseName := fmt.Sprintf("terraform-provider-snowflake-%v", helpers.RandomString()) - wid := sdk.NewAccountObjectIdentifier(randomWarehouseName) - if err := client.Warehouses.Create(ctx, wid, nil); err != nil { - return diag.FromErr(err) - } - defer func() { - if err := client.Warehouses.Drop(ctx, wid, nil); err != nil { - log.Printf("[WARN] error cleaning up temp warehouse %v", err) - } - }() - if err := client.Sessions.UseWarehouse(ctx, wid); err != nil { - return diag.FromErr(err) - } - } - // show attached masking policy - tid := sdk.NewSchemaObjectIdentifier(aid.TagDatabaseName, aid.TagSchemaName, aid.TagName) - mid := sdk.NewSchemaObjectIdentifier(aid.MaskingPolicyDatabaseName, aid.MaskingPolicySchemaName, aid.MaskingPolicyName) - builder := snowflake.NewTagBuilder(tid).WithMaskingPolicy(mid) - row := snowflake.QueryRow(db, builder.ShowAttachedPolicy()) - _, err = snowflake.ScanTagPolicy(row) - if errors.Is(err, sql.ErrNoRows) { - // If not found, mark resource to be removed from state file during apply or refresh - log.Printf("[DEBUG] attached policy (%s) not found", d.Id()) - d.SetId("") - return nil - } - if err != nil { - 
return diag.FromErr(err) - } - - return diags -} - -func DeleteContextTagMaskingPolicyAssociation(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { - client := meta.(*provider.Context).Client - aid, err := parseAttachmentID(d.Id()) - if err != nil { - return diag.FromErr(err) - } - tid := sdk.NewSchemaObjectIdentifier(aid.TagDatabaseName, aid.TagSchemaName, aid.TagName) - mid := sdk.NewSchemaObjectIdentifier(aid.MaskingPolicyDatabaseName, aid.MaskingPolicySchemaName, aid.MaskingPolicyName) - unset := sdk.NewTagUnsetRequest().WithMaskingPolicies([]sdk.SchemaObjectIdentifier{mid}) - if err := client.Tags.Alter(ctx, sdk.NewAlterTagRequest(tid).WithUnset(unset)); err != nil { - return diag.FromErr(err) - } - d.SetId("") - return nil -} diff --git a/pkg/resources/tag_masking_policy_association_acceptance_test.go b/pkg/resources/tag_masking_policy_association_acceptance_test.go deleted file mode 100644 index 2493681dd1..0000000000 --- a/pkg/resources/tag_masking_policy_association_acceptance_test.go +++ /dev/null @@ -1,121 +0,0 @@ -package resources_test - -import ( - "fmt" - "testing" - - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-testing/config" - "github.com/hashicorp/terraform-plugin-testing/helper/resource" - "github.com/hashicorp/terraform-plugin-testing/tfversion" -) - -func TestAcc_TagMaskingPolicyAssociationBasic(t *testing.T) { - _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) - acc.TestAccPreCheck(t) - tag, tagCleanup := acc.TestClient().Tag.CreateTag(t) - t.Cleanup(tagCleanup) - accName := acc.TestClient().Ids.Alpha() - - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - 
tfversion.RequireAbove(tfversion.Version1_5_0), - }, - PreCheck: func() { acc.TestAccPreCheck(t) }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - Config: tagAttachmentConfig(accName, acc.TestDatabaseName, acc.TestSchemaName, tag.ID()), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_tag_masking_policy_association.test", "masking_policy_id", fmt.Sprintf("%s.%s.%s", acc.TestDatabaseName, acc.TestSchemaName, accName)), - resource.TestCheckResourceAttr("snowflake_tag_masking_policy_association.test", "tag_id", tag.ID().FullyQualifiedName()), - ), - }, - }, - }) -} - -func TestAcc_TagMaskingPolicyAssociationsystem_functions_integration_testComplete(t *testing.T) { - name := acc.TestClient().Ids.Alpha() - resourceName := "snowflake_tag.test" - m := func() map[string]config.Variable { - return map[string]config.Variable{ - "name": config.StringVariable(name), - "database": config.StringVariable(acc.TestDatabaseName), - "schema": config.StringVariable(acc.TestSchemaName), - "comment": config.StringVariable("Terraform acceptance test"), - } - } - variableSet2 := m() - variableSet2["comment"] = config.StringVariable("Terraform acceptance test - updated") - resource.Test(t, resource.TestCase{ - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, - TerraformVersionChecks: []tfversion.TerraformVersionCheck{ - tfversion.RequireAbove(tfversion.Version1_5_0), - }, - CheckDestroy: nil, - Steps: []resource.TestStep{ - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_TagMaskingPolicyAssociation/basic"), - ConfigVariables: m(), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr(resourceName, "name", name), - resource.TestCheckResourceAttr(resourceName, "database", acc.TestDatabaseName), - resource.TestCheckResourceAttr(resourceName, "schema", acc.TestSchemaName), - resource.TestCheckResourceAttr(resourceName, "allowed_values.#", "2"), - 
resource.TestCheckResourceAttr(resourceName, "allowed_values.0", "alv1"), - resource.TestCheckResourceAttr(resourceName, "allowed_values.1", "alv2"), - resource.TestCheckResourceAttr(resourceName, "comment", "Terraform acceptance test"), - - resource.TestCheckResourceAttr("snowflake_tag_masking_policy_association.test", "masking_policy_id", fmt.Sprintf("%s.%s.%s", acc.TestDatabaseName, acc.TestSchemaName, name)), - resource.TestCheckResourceAttr("snowflake_tag_masking_policy_association.test", "tag_id", fmt.Sprintf("%s.%s.%s", acc.TestDatabaseName, acc.TestSchemaName, name)), - ), - }, - // test - change comment - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_TagMaskingPolicyAssociation/basic"), - ConfigVariables: variableSet2, - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr(resourceName, "name", name), - resource.TestCheckResourceAttr(resourceName, "database", acc.TestDatabaseName), - resource.TestCheckResourceAttr(resourceName, "schema", acc.TestSchemaName), - resource.TestCheckResourceAttr(resourceName, "comment", "Terraform acceptance test - updated"), - ), - }, - // test - import - { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_TagMaskingPolicyAssociation/basic"), - ConfigVariables: variableSet2, - ResourceName: resourceName, - ImportState: true, - ImportStateVerify: true, - }, - }, - }) -} - -func tagAttachmentConfig(n string, databaseName string, schemaName string, tagId sdk.SchemaObjectIdentifier) string { - return fmt.Sprintf(` -resource "snowflake_masking_policy" "test" { - name = "%[1]v" - database = "%[2]s" - schema = "%[3]s" - argument { - name = "val" - type = "VARCHAR" - } - body = "case when current_role() in ('ANALYST') then val else sha2(val, 512) end" - return_data_type = "VARCHAR(16777216)" - comment = "Terraform acceptance test" -} - -resource "snowflake_tag_masking_policy_association" "test" { - tag_id = "\"%s\".\"%s\".\"%s\"" - masking_policy_id = 
"${snowflake_masking_policy.test.database}.${snowflake_masking_policy.test.schema}.${snowflake_masking_policy.test.name}" -} -`, n, databaseName, schemaName, tagId.DatabaseName(), tagId.SchemaName(), tagId.Name()) -} diff --git a/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf b/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf deleted file mode 100644 index 180a0c22bd..0000000000 --- a/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf +++ /dev/null @@ -1,4 +0,0 @@ -resource "snowflake_database_old" "db" { - name = var.db - comment = "test comment" -} diff --git a/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/variables.tf b/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/variables.tf deleted file mode 100644 index 5ed7b249f5..0000000000 --- a/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/variables.tf +++ /dev/null @@ -1,3 +0,0 @@ -variable "db" { - type = string -} diff --git a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf deleted file mode 100644 index 2f9535a0f1..0000000000 --- a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf +++ /dev/null @@ -1,4 +0,0 @@ -resource "snowflake_database_old" "test" { - name = var.database - data_retention_time_in_days = var.database_data_retention_time -} diff --git a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/variables.tf b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/variables.tf deleted file mode 100644 index 32f9fb7140..0000000000 --- a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/variables.tf +++ /dev/null @@ -1,7 +0,0 @@ -variable "database" { - type = string -} - -variable 
"database_data_retention_time" { - type = number -} diff --git a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf deleted file mode 100644 index c3386f300a..0000000000 --- a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf +++ /dev/null @@ -1,3 +0,0 @@ -resource "snowflake_database_old" "test" { - name = var.database -} diff --git a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/variables.tf b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/variables.tf deleted file mode 100644 index bfdd9eeb3c..0000000000 --- a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/variables.tf +++ /dev/null @@ -1,3 +0,0 @@ -variable "database" { - type = string -} diff --git a/pkg/resources/testdata/TestAcc_TagMaskingPolicyAssociation/basic/test.tf b/pkg/resources/testdata/TestAcc_TagMaskingPolicyAssociation/basic/test.tf deleted file mode 100644 index e4fe8efa84..0000000000 --- a/pkg/resources/testdata/TestAcc_TagMaskingPolicyAssociation/basic/test.tf +++ /dev/null @@ -1,27 +0,0 @@ - -resource "snowflake_tag" "test" { - name = var.name - database = var.database - schema = var.schema - comment = var.comment - masking_policies = [snowflake_masking_policy.test.fully_qualified_name] - allowed_values = ["alv1", "alv2"] -} - -resource "snowflake_masking_policy" "test" { - name = var.name - database = var.database - schema = var.schema - argument { - name = "val" - type = "VARCHAR" - } - body = "case when current_role() in ('ANALYST') then val else sha2(val, 512) end" - return_data_type = "VARCHAR(16777216)" - comment = "Terraform acceptance test" -} - -resource "snowflake_tag_masking_policy_association" "test" { - tag_id = 
"${snowflake_tag.test.database}.${snowflake_tag.test.schema}.${snowflake_tag.test.name}" - masking_policy_id = "${snowflake_masking_policy.test.database}.${snowflake_masking_policy.test.schema}.${snowflake_masking_policy.test.name}" -} diff --git a/pkg/resources/testdata/TestAcc_TagMaskingPolicyAssociation/basic/variables.tf b/pkg/resources/testdata/TestAcc_TagMaskingPolicyAssociation/basic/variables.tf deleted file mode 100644 index 68541424ae..0000000000 --- a/pkg/resources/testdata/TestAcc_TagMaskingPolicyAssociation/basic/variables.tf +++ /dev/null @@ -1,15 +0,0 @@ -variable "database" { - type = string -} - -variable "schema" { - type = string -} - -variable "name" { - type = string -} - -variable "comment" { - type = string -} diff --git a/pkg/resources/unsafe_execute_acceptance_test.go b/pkg/resources/unsafe_execute_acceptance_test.go index 33a78be8b0..e20ad00dad 100644 --- a/pkg/resources/unsafe_execute_acceptance_test.go +++ b/pkg/resources/unsafe_execute_acceptance_test.go @@ -740,3 +740,21 @@ func verifyGrantExists(t *testing.T, roleId sdk.AccountObjectIdentifier, privile return nil } } + +// TODO [SNOW-1348121]: Move this to the file with check_destroy functions. 
+func testAccCheckDatabaseExistence(t *testing.T, id sdk.AccountObjectIdentifier, shouldExist bool) func(state *terraform.State) error { + t.Helper() + return func(state *terraform.State) error { + _, err := acc.TestClient().Database.Show(t, id) + if shouldExist { + if err != nil { + return fmt.Errorf("error while retrieving database %s, err = %w", id, err) + } + } else { + if err == nil { + return fmt.Errorf("database %v still exists", id) + } + } + return nil + } +} diff --git a/pkg/resources/user_acceptance_test.go b/pkg/resources/user_acceptance_test.go index 137dadcf2d..952fd2d98d 100644 --- a/pkg/resources/user_acceptance_test.go +++ b/pkg/resources/user_acceptance_test.go @@ -284,6 +284,8 @@ func TestAcc_User_BasicFlows(t *testing.T) { } // proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2481 has been fixed +// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2353 has been fixed +// tested on the user resource so that it does not interfere with other parallel tests on the same account func TestAcc_User_RemovedOutsideOfTerraform(t *testing.T) { userId := acc.TestClient().Ids.RandomAccountObjectIdentifier() diff --git a/pkg/resources/validators.go b/pkg/resources/validators.go index 071a73a33c..465d377a13 100644 --- a/pkg/resources/validators.go +++ b/pkg/resources/validators.go @@ -36,7 +36,7 @@ func IsValidIdentifier[T sdk.AccountObjectIdentifier | sdk.DatabaseObjectIdentif // IsValidAccountIdentifier is a validator that can be used for validating account identifiers passed in resources and data sources. // // Provider supported both account locators and organization name + account name pairs. -// The account locators are deprecated, so this function accepts only the new format. +// The account locators are not supported, so this function accepts only the new format.
func IsValidAccountIdentifier() schema.SchemaValidateDiagFunc { return func(value any, path cty.Path) diag.Diagnostics { if _, ok := value.(string); !ok { diff --git a/pkg/sdk/client_integration_test.go b/pkg/sdk/client_integration_test.go index 57522f9fae..9e2697a286 100644 --- a/pkg/sdk/client_integration_test.go +++ b/pkg/sdk/client_integration_test.go @@ -44,8 +44,9 @@ func TestClient_NewClient(t *testing.T) { require.NoError(t, err) require.NotNil(t, config) - account := config.Account - t.Setenv(snowflakeenvs.Account, account) + account := configAccountId(t, config) + t.Setenv(snowflakeenvs.OrganizationName, account.OrganizationName()) + t.Setenv(snowflakeenvs.AccountName, account.AccountName()) dir, err := os.UserHomeDir() require.NoError(t, err) diff --git a/pkg/sdk/config.go b/pkg/sdk/config.go index c4f5ca99b7..a49884a5b0 100644 --- a/pkg/sdk/config.go +++ b/pkg/sdk/config.go @@ -214,7 +214,6 @@ func GetConfigFileName() (string, error) { // TODO(SNOW-1787920): improve TOML parsing type ConfigDTO struct { - Account *string `toml:"account"` AccountName *string `toml:"accountname"` OrganizationName *string `toml:"organizationname"` User *string `toml:"user"` @@ -257,7 +256,6 @@ type ConfigDTO struct { func (c *ConfigDTO) DriverConfig() (gosnowflake.Config, error) { driverCfg := gosnowflake.Config{} - pointerAttributeSet(c.Account, &driverCfg.Account) if c.AccountName != nil && c.OrganizationName != nil { driverCfg.Account = fmt.Sprintf("%s-%s", *c.OrganizationName, *c.AccountName) } @@ -410,7 +408,6 @@ const ( AuthenticationTypeOauth AuthenticationType = "OAUTH" AuthenticationTypeExternalBrowser AuthenticationType = "EXTERNALBROWSER" AuthenticationTypeOkta AuthenticationType = "OKTA" - AuthenticationTypeJwtLegacy AuthenticationType = "JWT" AuthenticationTypeJwt AuthenticationType = "SNOWFLAKE_JWT" AuthenticationTypeTokenAccessor AuthenticationType = "TOKENACCESSOR" AuthenticationTypeUsernamePasswordMfa AuthenticationType = "USERNAMEPASSWORDMFA" @@
-421,7 +418,6 @@ var AllAuthenticationTypes = []AuthenticationType{ AuthenticationTypeOauth, AuthenticationTypeExternalBrowser, AuthenticationTypeOkta, - AuthenticationTypeJwtLegacy, AuthenticationTypeJwt, AuthenticationTypeTokenAccessor, AuthenticationTypeUsernamePasswordMfa, @@ -437,7 +433,7 @@ func ToAuthenticatorType(s string) (gosnowflake.AuthType, error) { return gosnowflake.AuthTypeExternalBrowser, nil case string(AuthenticationTypeOkta): return gosnowflake.AuthTypeOkta, nil - case string(AuthenticationTypeJwt), string(AuthenticationTypeJwtLegacy): + case string(AuthenticationTypeJwt): return gosnowflake.AuthTypeJwt, nil case string(AuthenticationTypeTokenAccessor): return gosnowflake.AuthTypeTokenAccessor, nil diff --git a/pkg/sdk/config_test.go b/pkg/sdk/config_test.go index 8f67022ba0..248556c742 100644 --- a/pkg/sdk/config_test.go +++ b/pkg/sdk/config_test.go @@ -20,13 +20,15 @@ import ( func TestLoadConfigFile(t *testing.T) { c := ` [default] - account='TEST_ACCOUNT' + accountname='TEST_ACCOUNT' + organizationname='TEST_ORG' user='TEST_USER' password='abcd1234' role='ACCOUNTADMIN' [securityadmin] - account='TEST_ACCOUNT' + accountname='TEST_ACCOUNT' + organizationname='TEST_ORG' user='TEST_USER' password='abcd1234' role='SECURITYADMIN' @@ -35,16 +37,46 @@ func TestLoadConfigFile(t *testing.T) { m, err := loadConfigFile(configPath) require.NoError(t, err) - assert.Equal(t, "TEST_ACCOUNT", *m["default"].Account) + assert.Equal(t, "TEST_ACCOUNT", *m["default"].AccountName) + assert.Equal(t, "TEST_ORG", *m["default"].OrganizationName) assert.Equal(t, "TEST_USER", *m["default"].User) assert.Equal(t, "abcd1234", *m["default"].Password) assert.Equal(t, "ACCOUNTADMIN", *m["default"].Role) - assert.Equal(t, "TEST_ACCOUNT", *m["securityadmin"].Account) + assert.Equal(t, "TEST_ACCOUNT", *m["securityadmin"].AccountName) + assert.Equal(t, "TEST_ORG", *m["securityadmin"].OrganizationName) assert.Equal(t, "TEST_USER", *m["securityadmin"].User) assert.Equal(t, 
"abcd1234", *m["securityadmin"].Password) assert.Equal(t, "SECURITYADMIN", *m["securityadmin"].Role) } +func TestLoadConfigFileWithUnknownFields(t *testing.T) { + c := ` + [default] + unknown='TEST_ACCOUNT' + accountname='TEST_ACCOUNT' + ` + configPath := testhelpers.TestFile(t, "config", []byte(c)) + + m, err := loadConfigFile(configPath) + require.NoError(t, err) + assert.Equal(t, map[string]ConfigDTO{ + "default": { + AccountName: Pointer("TEST_ACCOUNT"), + }, + }, m) +} + +func TestLoadConfigFileWithInvalidFieldValue(t *testing.T) { + c := ` + [default] + accountname=42 + ` + configPath := testhelpers.TestFile(t, "config", []byte(c)) + + _, err := loadConfigFile(configPath) + require.ErrorContains(t, err, "toml: cannot decode TOML integer into struct field sdk.ConfigDTO.AccountName of type *string") +} + func TestProfileConfig(t *testing.T) { unencryptedKey, encryptedKey := random.GenerateRSAPrivateKeyEncrypted(t, "password") @@ -71,7 +103,7 @@ func TestProfileConfig(t *testing.T) { jwtexpiretimeout=50 externalbrowsertimeout=60 maxretrycount=1 - authenticator='jwt' + authenticator='SNOWFLAKE_JWT' insecuremode=true ocspfailopen=true token='token' @@ -284,7 +316,6 @@ func Test_toAuthenticationType(t *testing.T) { {input: "OAUTH", want: gosnowflake.AuthTypeOAuth}, {input: "EXTERNALBROWSER", want: gosnowflake.AuthTypeExternalBrowser}, {input: "OKTA", want: gosnowflake.AuthTypeOkta}, - {input: "JWT", want: gosnowflake.AuthTypeJwt}, {input: "SNOWFLAKE_JWT", want: gosnowflake.AuthTypeJwt}, {input: "TOKENACCESSOR", want: gosnowflake.AuthTypeTokenAccessor}, {input: "USERNAMEPASSWORDMFA", want: gosnowflake.AuthTypeUsernamePasswordMFA}, diff --git a/pkg/sdk/helper_test.go b/pkg/sdk/helper_test.go index fd82cb3665..5f2c2e362c 100644 --- a/pkg/sdk/helper_test.go +++ b/pkg/sdk/helper_test.go @@ -1,9 +1,11 @@ package sdk import ( + "strings" "testing" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testprofiles" + "github.com/snowflakedb/gosnowflake" ) 
func defaultTestClient(t *testing.T) *Client {
@@ -46,3 +48,10 @@ func testClient(t *testing.T, profile string) *Client {
 	return client
 }
+
+func configAccountId(t *testing.T, cfg *gosnowflake.Config) AccountIdentifier {
+	t.Helper()
+	accountIdRaw := cfg.Account
+	parts := strings.SplitN(accountIdRaw, "-", 2)
+	return NewAccountIdentifier(parts[0], parts[1])
+}
diff --git a/pkg/snowflake/external_oauth_integration.go b/pkg/snowflake/external_oauth_integration.go
deleted file mode 100644
index 80a7c45786..0000000000
--- a/pkg/snowflake/external_oauth_integration.go
+++ /dev/null
@@ -1,165 +0,0 @@
-package snowflake
-
-import (
-	"database/sql"
-	"fmt"
-	"reflect"
-
-	"github.com/jmoiron/sqlx"
-)
-
-type ExternalOauthType string
-
-const (
-	Okta ExternalOauthType = "OKTA"
-	Azure ExternalOauthType = "AZURE"
-	PingFederate ExternalOauthType = "PING_FEDERATE"
-	Custom ExternalOauthType = "CUSTOM"
-)
-
-type SFUserMappingAttribute string
-
-const (
-	LoginName SFUserMappingAttribute = "LOGIN_NAME"
-	EmailAddress SFUserMappingAttribute = "EMAIL_ADDRESS"
-)
-
-type AnyRoleMode string
-
-const (
-	Disable AnyRoleMode = "DISABLE"
-	Enable AnyRoleMode = "ENABLE"
-	EnableForPrivilege AnyRoleMode = "ENABLE_FOR_PRIVILEGE"
-)
-
-type ExternalOauthIntegration3 struct {
-	TopLevelIdentifier
-
-	Type string `pos:"parameter" db:"type"`
-	TypeOk bool
-	Enabled bool `pos:"parameter" db:"enabled"`
-	EnabledOk bool
-	ExternalOauthType ExternalOauthType `pos:"parameter" db:"EXTERNAL_OAUTH_TYPE"`
-	ExternalOauthTypeOk bool
-	ExternalOauthIssuer string `pos:"parameter" db:"EXTERNAL_OAUTH_ISSUER"`
-	ExternalOauthIssuerOk bool
-	ExternalOauthTokenUserMappingClaim []string `pos:"parameter" db:"EXTERNAL_OAUTH_TOKEN_USER_MAPPING_CLAIM"`
-	ExternalOauthTokenUserMappingClaimOk bool
-	ExternalOauthSnowflakeUserMappingAttribute SFUserMappingAttribute `pos:"parameter" db:"EXTERNAL_OAUTH_SNOWFLAKE_USER_MAPPING_ATTRIBUTE"`
-	ExternalOauthSnowflakeUserMappingAttributeOk bool
-	ExternalOauthJwsKeysURL []string `pos:"parameter" db:"EXTERNAL_OAUTH_JWS_KEYS_URL"`
-	ExternalOauthJwsKeysURLOk bool
-	ExternalOauthBlockedRolesList []string `pos:"parameter" db:"EXTERNAL_OAUTH_BLOCKED_ROLES_LIST"`
-	ExternalOauthBlockedRolesListOk bool
-	ExternalOauthAllowedRolesList []string `pos:"parameter" db:"EXTERNAL_OAUTH_ALLOWED_ROLES_LIST"`
-	ExternalOauthAllowedRolesListOk bool
-	ExternalOauthRsaPublicKey string `pos:"parameter" db:"EXTERNAL_OAUTH_RSA_PUBLIC_KEY"`
-	ExternalOauthRsaPublicKeyOk bool
-	ExternalOauthRsaPublicKey2 string `pos:"parameter" db:"EXTERNAL_OAUTH_RSA_PUBLIC_KEY_2"`
-	ExternalOauthRsaPublicKey2Ok bool
-	ExternalOauthAudienceList []string `pos:"parameter" db:"EXTERNAL_OAUTH_AUDIENCE_LIST"`
-	ExternalOauthAudienceListOk bool
-	ExternalOauthAnyRoleMode AnyRoleMode `pos:"parameter" db:"EXTERNAL_OAUTH_ANY_ROLE_MODE"`
-	ExternalOauthAnyRoleModeOk bool
-	ExternalOauthScopeDelimiter string `pos:"parameter" db:"EXTERNAL_OAUTH_SCOPE_DELIMITER"`
-	ExternalOauthScopeDelimiterOk bool
-	ExternalOauthScopeMappingAttribute string `pos:"parameter" db:"EXTERNAL_OAUTH_SCOPE_MAPPING_ATTRIBUTE"`
-	ExternalOauthScopeMappingAttributeOk bool
-
-	Comment sql.NullString `pos:"parameter" db:"comment"`
-	CommentOk bool
-}
-
-type ExternalOauthIntegration3Manager struct {
-	BaseManager
-}
-
-func NewExternalOauthIntegration3Manager() (*ExternalOauthIntegration3Manager, error) {
-	sqlBuilder, err := newSQLBuilder(
-		"SECURITY INTEGRATION",
-		"SECURITY INTEGRATIONS",
-		reflect.TypeOf(ExternalOauthIntegration3CreateInput{}),
-		reflect.TypeOf(ExternalOauthIntegration3UpdateInput{}),
-		reflect.TypeOf(ExternalOauthIntegration3UpdateInput{}),
-		reflect.TypeOf(ExternalOauthIntegration3DeleteInput{}),
-		reflect.TypeOf(ExternalOauthIntegration3ReadOutput{}),
-	)
-	if err != nil {
-		return nil, err
-	}
-
-	return &ExternalOauthIntegration3Manager{
-		BaseManager: BaseManager{
-			sqlBuilder: *sqlBuilder,
-		},
-	}, nil
-}
-
-type ExternalOauthIntegration3CreateInput struct {
-	ExternalOauthIntegration3
-
-	OrReplace bool `pos:"beforeObjectType" value:"OR REPLACE"`
-	OrReplaceOk bool
-	IfNotExists bool `pos:"afterObjectType" value:"IF NOT EXISTS"`
-	IfNotExistsOk bool
-}
-
-func (m *ExternalOauthIntegration3Manager) Create(x *ExternalOauthIntegration3CreateInput) (string, error) {
-	return m.sqlBuilder.Create(x)
-}
-
-type (
-	ExternalOauthIntegration3ReadInput = TopLevelIdentifier
-	ExternalOauthIntegration3ReadOutput = ExternalOauthIntegration3
-)
-
-func (m *ExternalOauthIntegration3Manager) ReadDescribe(x *ExternalOauthIntegration3ReadInput) (string, error) {
-	return m.sqlBuilder.Describe(x)
-}
-
-func (m *ExternalOauthIntegration3Manager) ParseDescribe(rows *sql.Rows) (*ExternalOauthIntegration3ReadOutput, error) {
-	output := &ExternalOauthIntegration3ReadOutput{}
-	err := m.sqlBuilder.ParseDescribe(rows, output)
-	if err != nil {
-		return nil, err
-	}
-	return output, nil
-}
-
-func (m *ExternalOauthIntegration3Manager) ReadShow(x *ExternalOauthIntegration3ReadInput) (string, error) {
-	return m.sqlBuilder.ShowLike(x)
-}
-
-func (m *ExternalOauthIntegration3Manager) ParseShow(row *sqlx.Row) (*ExternalOauthIntegration3ReadOutput, error) {
-	result := &ExternalOauthIntegration3{}
-	if err := row.StructScan(result); err != nil {
-		return nil, fmt.Errorf("error scanning result: %w", err)
-	}
-	return result, nil
-}
-
-type ExternalOauthIntegration3UpdateInput struct {
-	ExternalOauthIntegration3
-
-	IfExists bool `pos:"afterObjectType" value:"IF EXISTS"`
-	IfExistsOk bool
-}
-
-func (m *ExternalOauthIntegration3Manager) Update(x *ExternalOauthIntegration3UpdateInput) (string, error) {
-	return m.sqlBuilder.Alter(x)
-}
-
-func (m *ExternalOauthIntegration3Manager) Unset(x *ExternalOauthIntegration3UpdateInput) (string, error) {
-	return m.sqlBuilder.Unset(x)
-}
-
-type ExternalOauthIntegration3DeleteInput struct {
-	TopLevelIdentifier
-
-	IfExists bool `pos:"afterObjectType" value:"IF EXISTS"`
-	IfExistsOk bool
-}
-
-func (m *ExternalOauthIntegration3Manager) Delete(x *ExternalOauthIntegration3DeleteInput) (string, error) {
-	return m.sqlBuilder.Drop(x)
-}
diff --git a/pkg/snowflake/external_oauth_integration_test.go b/pkg/snowflake/external_oauth_integration_test.go
deleted file mode 100644
index c14d28e907..0000000000
--- a/pkg/snowflake/external_oauth_integration_test.go
+++ /dev/null
@@ -1,124 +0,0 @@
-package snowflake_test
-
-import (
-	"testing"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake"
-	"github.com/stretchr/testify/require"
-)
-
-func TestCreateExternalOauthIntegration3(t *testing.T) {
-	r := require.New(t)
-
-	input := &snowflake.ExternalOauthIntegration3CreateInput{
-		ExternalOauthIntegration3: snowflake.ExternalOauthIntegration3{
-			TopLevelIdentifier: snowflake.TopLevelIdentifier{
-				Name: "azure",
-			},
-			Type: "EXTERNAL_OAUTH",
-			TypeOk: true,
-			ExternalOauthType: "AZURE",
-			ExternalOauthTypeOk: true,
-		},
-	}
-
-	mb, err := snowflake.NewExternalOauthIntegration3Manager()
-	r.Nil(err)
-	createStmt, err := mb.Create(input)
-	r.Nil(err)
-	r.Equal(`CREATE SECURITY INTEGRATION "azure" type = 'EXTERNAL_OAUTH' EXTERNAL_OAUTH_TYPE = 'AZURE';`, createStmt)
-}
-
-func TestAlterExternalOauthIntegration3(t *testing.T) {
-	r := require.New(t)
-
-	input := &snowflake.ExternalOauthIntegration3UpdateInput{
-		ExternalOauthIntegration3: snowflake.ExternalOauthIntegration3{
-			TopLevelIdentifier: snowflake.TopLevelIdentifier{
-				Name: "azure",
-			},
-			ExternalOauthIssuer: "someissuer",
-			ExternalOauthIssuerOk: true,
-			ExternalOauthBlockedRolesList: []string{"a", "b"},
-			ExternalOauthBlockedRolesListOk: true,
-		},
-
-		IfExists: true,
-		IfExistsOk: true,
-	}
-
-	mb, err := snowflake.NewExternalOauthIntegration3Manager()
-	r.Nil(err)
-	alterStmt, err := mb.Update(input)
-	r.Nil(err)
-	r.Equal(
-		`ALTER SECURITY INTEGRATION IF EXISTS "azure" SET EXTERNAL_OAUTH_ISSUER = 'someissuer' EXTERNAL_OAUTH_BLOCKED_ROLES_LIST = ('a', 'b');`,
-		alterStmt,
-	)
-}
-
-func TestUnsetExternalOauthIntegration3(t *testing.T) {
-	r := require.New(t)
-
-	input := &snowflake.ExternalOauthIntegration3UpdateInput{
-		ExternalOauthIntegration3: snowflake.ExternalOauthIntegration3{
-			TopLevelIdentifier: snowflake.TopLevelIdentifier{
-				Name: "azure",
-			},
-			ExternalOauthTokenUserMappingClaimOk: true,
-		},
-	}
-
-	mb, err := snowflake.NewExternalOauthIntegration3Manager()
-	r.Nil(err)
-	unsetStmt, err := mb.Unset(input)
-	r.Nil(err)
-	r.Equal(
-		`ALTER SECURITY INTEGRATION "azure" UNSET EXTERNAL_OAUTH_TOKEN_USER_MAPPING_CLAIM;`,
-		unsetStmt,
-	)
-}
-
-func TestDeleteExternalOauthIntegration3(t *testing.T) {
-	r := require.New(t)
-
-	input := &snowflake.ExternalOauthIntegration3DeleteInput{
-		TopLevelIdentifier: snowflake.TopLevelIdentifier{
-			Name: "azure",
-		},
-	}
-
-	mb, err := snowflake.NewExternalOauthIntegration3Manager()
-	r.Nil(err)
-	dropStmt, err := mb.Delete(input)
-	r.Nil(err)
-	r.Equal(`DROP SECURITY INTEGRATION "azure";`, dropStmt)
-}
-
-func TestReadDescribeExternalOauthIntegration3(t *testing.T) {
-	r := require.New(t)
-
-	input := &snowflake.ExternalOauthIntegration3ReadInput{
-		Name: "azure",
-	}
-
-	mb, err := snowflake.NewExternalOauthIntegration3Manager()
-	r.Nil(err)
-	describeStmt, err := mb.ReadDescribe(input)
-	r.Nil(err)
-	r.Equal(`DESCRIBE SECURITY INTEGRATION "azure";`, describeStmt)
-}
-
-func TestReadShowExternalOauthIntegration3(t *testing.T) {
-	r := require.New(t)
-
-	input := &snowflake.ExternalOauthIntegration3ReadInput{
-		Name: "azure",
-	}
-
-	mb, err := snowflake.NewExternalOauthIntegration3Manager()
-	r.Nil(err)
-	describeStmt, err := mb.ReadShow(input)
-	r.Nil(err)
-	r.Equal(`SHOW SECURITY INTEGRATIONS LIKE 'azure';`, describeStmt)
-}
diff --git a/pkg/snowflake/masking_policy.go b/pkg/snowflake/masking_policy.go
deleted file mode 100644
index 0037a6ac7a..0000000000
--- a/pkg/snowflake/masking_policy.go
+++ /dev/null
@@ -1,52 +0,0 @@
-package snowflake
-
-import (
-	"fmt"
-	"strings"
-)
-
-// MaskingPolicyBuilder abstracts the creation of SQL queries for a Snowflake Masking Policy.
-type MaskingPolicyBuilder struct {
-	name string
-	db string
-	schema string
-}
-
-// QualifiedName prepends the db and schema if set and escapes everything nicely.
-func (mpb *MaskingPolicyBuilder) QualifiedName() string {
-	var n strings.Builder
-
-	if mpb.db != "" && mpb.schema != "" {
-		n.WriteString(fmt.Sprintf(`"%v"."%v".`, mpb.db, mpb.schema))
-	}
-
-	if mpb.db != "" && mpb.schema == "" {
-		n.WriteString(fmt.Sprintf(`"%v"..`, mpb.db))
-	}
-
-	if mpb.db == "" && mpb.schema != "" {
-		n.WriteString(fmt.Sprintf(`"%v".`, mpb.schema))
-	}
-
-	n.WriteString(fmt.Sprintf(`"%v"`, mpb.name))
-
-	return n.String()
-}
-
-// MaskingPolicy returns a pointer to a Builder that abstracts the DDL operations for a masking policy.
-//
-// Supported DDL operations are:
-// - CREATE MASKING POLICY
-// - ALTER MASKING POLICY
-// - DROP MASKING POLICY
-// - SHOW MASKING POLICIES
-// - DESCRIBE MASKING POLICY
-//
-// [Snowflake Reference](https://docs.snowflake.com/en/user-guide/security-column-ddm.html)
-func MaskingPolicy(name, db, schema string) *MaskingPolicyBuilder {
-	return &MaskingPolicyBuilder{
-		name: name,
-		db: db,
-		schema: schema,
-	}
-}
diff --git a/pkg/snowflake/oauth_integration.go b/pkg/snowflake/oauth_integration.go
deleted file mode 100644
index 503500732b..0000000000
--- a/pkg/snowflake/oauth_integration.go
+++ /dev/null
@@ -1,69 +0,0 @@
-package snowflake
-
-import (
-	"database/sql"
-	"errors"
-	"fmt"
-	"log"
-
-	"github.com/jmoiron/sqlx"
-)
-
-// OAuthIntegration returns a pointer to a Builder that abstracts the DDL operations for an api integration.
-//
-// Supported DDL operations are:
-// - CREATE SECURITY INTEGRATION
-// - ALTER SECURITY INTEGRATION
-// - DROP INTEGRATION
-// - SHOW INTEGRATIONS
-// - DESCRIBE INTEGRATION
-//
-// [Snowflake Reference](https://docs.snowflake.com/en/sql-reference/ddl-user-security.html#security-integrations)
-func NewOAuthIntegrationBuilder(name string) *Builder {
-	return &Builder{
-		entityType: SecurityIntegrationType,
-		name: name,
-	}
-}
-
-type OauthIntegration struct {
-	Name sql.NullString `db:"name"`
-	Category sql.NullString `db:"category"`
-	IntegrationType sql.NullString `db:"type"`
-	Enabled sql.NullBool `db:"enabled"`
-	Comment sql.NullString `db:"comment"`
-	CreatedOn sql.NullString `db:"created_on"`
-}
-
-func ScanOAuthIntegration(row *sqlx.Row) (*OauthIntegration, error) {
-	r := &OauthIntegration{}
-	if err := row.StructScan(r); err != nil {
-		return nil, fmt.Errorf("error scanning struct err = %w", err)
-	}
-	return r, nil
-}
-
-func ListIntegrations(db *sql.DB) ([]OauthIntegration, error) {
-	stmt := "SHOW INTEGRATIONS"
-	rows, err := db.Query(stmt)
-	if err != nil {
-		return nil, err
-	}
-
-	defer rows.Close()
-
-	r := []OauthIntegration{}
-	if err := sqlx.StructScan(rows, &r); err != nil {
-		if errors.Is(err, sql.ErrNoRows) {
-			log.Println("[DEBUG] no integrations found")
-			return nil, nil
-		}
-		return r, fmt.Errorf("failed to scan row for %s err = %w", stmt, err)
-	}
-	return r, nil
-}
-
-func DropIntegration(db *sql.DB, name string) error {
-	stmt := NewOAuthIntegrationBuilder(name).Drop()
-	return Exec(db, stmt)
-}
diff --git a/pkg/snowflake/oauth_integration_test.go b/pkg/snowflake/oauth_integration_test.go
deleted file mode 100644
index 144c8a6740..0000000000
--- a/pkg/snowflake/oauth_integration_test.go
+++ /dev/null
@@ -1,29 +0,0 @@
-package snowflake_test
-
-import (
-	"testing"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake"
-	"github.com/stretchr/testify/require"
-)
-
-func TestOAuthIntegration(t *testing.T) {
-	r := require.New(t)
-	builder := snowflake.NewOAuthIntegrationBuilder("tableau_desktop")
-	r.NotNil(builder)
-
-	q := builder.Show()
-	r.Equal("SHOW SECURITY INTEGRATIONS LIKE 'tableau_desktop'", q)
-
-	q = builder.Describe()
-	r.Equal("DESCRIBE SECURITY INTEGRATION \"tableau_desktop\"", q)
-
-	c := builder.Create()
-	c.SetRaw(`TYPE=oauth`)
-	c.SetString(`oauth_client`, "tableau_desktop")
-	q = c.Statement()
-	r.Equal(`CREATE SECURITY INTEGRATION "tableau_desktop" TYPE=oauth OAUTH_CLIENT='tableau_desktop'`, q)
-
-	e := builder.Drop()
-	r.Equal(`DROP SECURITY INTEGRATION "tableau_desktop"`, e)
-}
diff --git a/pkg/snowflake/saml_integration.go b/pkg/snowflake/saml_integration.go
deleted file mode 100644
index a88d09ae74..0000000000
--- a/pkg/snowflake/saml_integration.go
+++ /dev/null
@@ -1,41 +0,0 @@
-package snowflake
-
-import (
-	"database/sql"
-	"fmt"
-
-	"github.com/jmoiron/sqlx"
-)
-
-// SamlIntegration returns a pointer to a Builder that abstracts the DDL operations for a SAML2 integration.
-//
-// Supported DDL operations are:
-// - CREATE SECURITY INTEGRATION
-// - ALTER SECURITY INTEGRATION
-// - DROP INTEGRATION
-// - SHOW INTEGRATIONS
-// - DESCRIBE INTEGRATION
-//
-// [Snowflake Reference](https://docs.snowflake.com/en/sql-reference/ddl-user-security.html#security-integrations)
-func NewSamlIntegrationBuilder(name string) *Builder {
-	return &Builder{
-		entityType: SecurityIntegrationType,
-		name: name,
-	}
-}
-
-type SamlIntegration struct {
-	Name sql.NullString `db:"name"`
-	Category sql.NullString `db:"category"`
-	IntegrationType sql.NullString `db:"type"`
-	CreatedOn sql.NullString `db:"created_on"`
-	Enabled sql.NullBool `db:"enabled"`
-}
-
-func ScanSamlIntegration(row *sqlx.Row) (*SamlIntegration, error) {
-	r := &SamlIntegration{}
-	if err := row.StructScan(r); err != nil {
-		return r, fmt.Errorf("error scanning struct err = %w", err)
-	}
-	return r, nil
-}
diff --git a/pkg/snowflake/saml_integration_test.go b/pkg/snowflake/saml_integration_test.go
deleted file mode 100644
index 27f064fb22..0000000000
--- a/pkg/snowflake/saml_integration_test.go
+++ /dev/null
@@ -1,43 +0,0 @@
-package snowflake_test
-
-import (
-	"testing"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake"
-	"github.com/stretchr/testify/require"
-)
-
-func TestSamlIntegration(t *testing.T) {
-	r := require.New(t)
-	builder := snowflake.NewSamlIntegrationBuilder("test_saml_integration")
-	r.NotNil(builder)
-
-	q := builder.Show()
-	r.Equal("SHOW SECURITY INTEGRATIONS LIKE 'test_saml_integration'", q)
-
-	q = builder.Describe()
-	r.Equal("DESCRIBE SECURITY INTEGRATION \"test_saml_integration\"", q)
-
-	c := builder.Create()
-	c.SetRaw(`TYPE=SAML2`)
-	c.SetString(`saml2_issuer`, "test_issuer")
-	c.SetString(`saml2_sso_url`, "https://testsamlissuer.com")
-	c.SetString(`saml2_provider`, "CUSTOM")
-	c.SetString(`saml2_x509_cert`, "MIICYzCCAcygAwIBAgIBADANBgkqhkiG9w0BAQUFADAuMQswCQYDVQQGEwJVUzEMMAoGA1UEChMDSUJNMREwDwYDVQQLEwhMb2NhbCBDQTAeFw05OTEyMjIwNTAwMDBaFw0wMDEyMjMwNDU5NTlaMC4xCzAJBgNVBAYTAlVTMQwwCgYDVQQKEwNJQk0xETAPBgNVBAsTCExvY2FsIENBMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQD2bZEo7xGaX2/0GHkrNFZvlxBou9v1Jmt/PDiTMPve8r9FeJAQ0QdvFST/0JPQYD20rH0bimdDLgNdNynmyRoS2S/IInfpmf69iyc2G0TPyRvmHIiOZbdCd+YBHQi1adkj17NDcWj6S14tVurFX73zx0sNoMS79q3tuXKrDsxeuwIDAQABo4GQMIGNMEsGCVUdDwGG+EIBDQQ+EzxHZW5lcmF0ZWQgYnkgdGhlIFNlY3VyZVdheSBTZWN1cml0eSBTZXJ2ZXIgZm9yIE9TLzM5MCAoUkFDRikwDgYDVR0PAQH/BAQDAgAGMA8GA1UdEwEB/wQFMAMBAf8wHQYDVR0OBBYEFJ3+ocRyCTJw067dLSwr/nalx6YMMA0GCSqGSIb3DQEBBQUAA4GBAMaQzt+zaj1GU77yzlr8iiMBXgdQrwsZZWJo5exnAucJAEYQZmOfyLiMD6oYq+ZnfvM0n8G/Y79q8nhwvuxpYOnRSAXFp6xSkrIOeZtJMY1h00LKp/JX3Ng1svZ2agE126JHsQ0bhzN5TKsYfbwfTwfjdWAGy6Vf1nYi/rO+ryMO")
-	c.SetBool(`enabled`, true)
-	q = c.Statement()
-	r.Equal(`CREATE SECURITY INTEGRATION "test_saml_integration" TYPE=SAML2 SAML2_ISSUER='test_issuer' SAML2_PROVIDER='CUSTOM' SAML2_SSO_URL='https://testsamlissuer.com' SAML2_X509_CERT='MIICYzCCAcygAwIBAgIBADANBgkqhkiG9w0BAQUFADAuMQswCQYDVQQGEwJVUzEMMAoGA1UEChMDSUJNMREwDwYDVQQLEwhMb2NhbCBDQTAeFw05OTEyMjIwNTAwMDBaFw0wMDEyMjMwNDU5NTlaMC4xCzAJBgNVBAYTAlVTMQwwCgYDVQQKEwNJQk0xETAPBgNVBAsTCExvY2FsIENBMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQD2bZEo7xGaX2/0GHkrNFZvlxBou9v1Jmt/PDiTMPve8r9FeJAQ0QdvFST/0JPQYD20rH0bimdDLgNdNynmyRoS2S/IInfpmf69iyc2G0TPyRvmHIiOZbdCd+YBHQi1adkj17NDcWj6S14tVurFX73zx0sNoMS79q3tuXKrDsxeuwIDAQABo4GQMIGNMEsGCVUdDwGG+EIBDQQ+EzxHZW5lcmF0ZWQgYnkgdGhlIFNlY3VyZVdheSBTZWN1cml0eSBTZXJ2ZXIgZm9yIE9TLzM5MCAoUkFDRikwDgYDVR0PAQH/BAQDAgAGMA8GA1UdEwEB/wQFMAMBAf8wHQYDVR0OBBYEFJ3+ocRyCTJw067dLSwr/nalx6YMMA0GCSqGSIb3DQEBBQUAA4GBAMaQzt+zaj1GU77yzlr8iiMBXgdQrwsZZWJo5exnAucJAEYQZmOfyLiMD6oYq+ZnfvM0n8G/Y79q8nhwvuxpYOnRSAXFp6xSkrIOeZtJMY1h00LKp/JX3Ng1svZ2agE126JHsQ0bhzN5TKsYfbwfTwfjdWAGy6Vf1nYi/rO+ryMO' ENABLED=true`, q)
-
-	d := builder.Alter()
-	d.SetRaw(`TYPE=SAML2`)
-	d.SetString(`saml2_issuer`, "test_issuer")
-	d.SetString(`saml2_sso_url`, "https://testsamlissuer.com")
-	d.SetString(`saml2_provider`, "CUSTOM")
-	d.SetString(`saml2_x509_cert`, "MIICYzCCAcygAwIBAgIBADANBgkqhkiG9w0BAQUFADAuMQswCQYDVQQGEwJVUzEMMAoGA1UEChMDSUJNMREwDwYDVQQLEwhMb2NhbCBDQTAeFw05OTEyMjIwNTAwMDBaFw0wMDEyMjMwNDU5NTlaMC4xCzAJBgNVBAYTAlVTMQwwCgYDVQQKEwNJQk0xETAPBgNVBAsTCExvY2FsIENBMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQD2bZEo7xGaX2/0GHkrNFZvlxBou9v1Jmt/PDiTMPve8r9FeJAQ0QdvFST/0JPQYD20rH0bimdDLgNdNynmyRoS2S/IInfpmf69iyc2G0TPyRvmHIiOZbdCd+YBHQi1adkj17NDcWj6S14tVurFX73zx0sNoMS79q3tuXKrDsxeuwIDAQABo4GQMIGNMEsGCVUdDwGG+EIBDQQ+EzxHZW5lcmF0ZWQgYnkgdGhlIFNlY3VyZVdheSBTZWN1cml0eSBTZXJ2ZXIgZm9yIE9TLzM5MCAoUkFDRikwDgYDVR0PAQH/BAQDAgAGMA8GA1UdEwEB/wQFMAMBAf8wHQYDVR0OBBYEFJ3+ocRyCTJw067dLSwr/nalx6YMMA0GCSqGSIb3DQEBBQUAA4GBAMaQzt+zaj1GU77yzlr8iiMBXgdQrwsZZWJo5exnAucJAEYQZmOfyLiMD6oYq+ZnfvM0n8G/Y79q8nhwvuxpYOnRSAXFp6xSkrIOeZtJMY1h00LKp/JX3Ng1svZ2agE126JHsQ0bhzN5TKsYfbwfTwfjdWAGy6Vf1nYi/rO+ryMO")
-	d.SetBool(`enabled`, false)
-	q = d.Statement()
-	r.Equal(`ALTER SECURITY INTEGRATION "test_saml_integration" SET TYPE=SAML2 SAML2_ISSUER='test_issuer' SAML2_PROVIDER='CUSTOM' SAML2_SSO_URL='https://testsamlissuer.com' SAML2_X509_CERT='MIICYzCCAcygAwIBAgIBADANBgkqhkiG9w0BAQUFADAuMQswCQYDVQQGEwJVUzEMMAoGA1UEChMDSUJNMREwDwYDVQQLEwhMb2NhbCBDQTAeFw05OTEyMjIwNTAwMDBaFw0wMDEyMjMwNDU5NTlaMC4xCzAJBgNVBAYTAlVTMQwwCgYDVQQKEwNJQk0xETAPBgNVBAsTCExvY2FsIENBMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQD2bZEo7xGaX2/0GHkrNFZvlxBou9v1Jmt/PDiTMPve8r9FeJAQ0QdvFST/0JPQYD20rH0bimdDLgNdNynmyRoS2S/IInfpmf69iyc2G0TPyRvmHIiOZbdCd+YBHQi1adkj17NDcWj6S14tVurFX73zx0sNoMS79q3tuXKrDsxeuwIDAQABo4GQMIGNMEsGCVUdDwGG+EIBDQQ+EzxHZW5lcmF0ZWQgYnkgdGhlIFNlY3VyZVdheSBTZWN1cml0eSBTZXJ2ZXIgZm9yIE9TLzM5MCAoUkFDRikwDgYDVR0PAQH/BAQDAgAGMA8GA1UdEwEB/wQFMAMBAf8wHQYDVR0OBBYEFJ3+ocRyCTJw067dLSwr/nalx6YMMA0GCSqGSIb3DQEBBQUAA4GBAMaQzt+zaj1GU77yzlr8iiMBXgdQrwsZZWJo5exnAucJAEYQZmOfyLiMD6oYq+ZnfvM0n8G/Y79q8nhwvuxpYOnRSAXFp6xSkrIOeZtJMY1h00LKp/JX3Ng1svZ2agE126JHsQ0bhzN5TKsYfbwfTwfjdWAGy6Vf1nYi/rO+ryMO' ENABLED=false`, q)
-
-	e := builder.Drop()
-	r.Equal(`DROP SECURITY INTEGRATION "test_saml_integration"`, e)
-}
diff --git a/pkg/snowflake/scim_integration.go b/pkg/snowflake/scim_integration.go
deleted file mode 100644
index 2b91705363..0000000000
--- a/pkg/snowflake/scim_integration.go
+++ /dev/null
@@ -1,40 +0,0 @@
-package snowflake
-
-import (
-	"database/sql"
-	"fmt"
-
-	"github.com/jmoiron/sqlx"
-)
-
-// NewSCIMIntegrationBuilder returns a pointer to a Builder that abstracts the DDL operations for an api integration.
-//
-// Supported DDL operations are:
-// - CREATE SECURITY INTEGRATION
-// - ALTER SECURITY INTEGRATION
-// - DROP INTEGRATION
-// - SHOW INTEGRATIONS
-// - DESCRIBE INTEGRATION
-//
-// [Snowflake Reference](https://docs.snowflake.com/en/sql-reference/ddl-user-security.html#security-integrations)
-func NewSCIMIntegrationBuilder(name string) *Builder {
-	return &Builder{
-		entityType: SecurityIntegrationType,
-		name: name,
-	}
-}
-
-type SCIMIntegration struct {
-	Name sql.NullString `db:"name"`
-	Category sql.NullString `db:"category"`
-	IntegrationType sql.NullString `db:"type"`
-	CreatedOn sql.NullString `db:"created_on"`
-}
-
-func ScanScimIntegration(row *sqlx.Row) (*SCIMIntegration, error) {
-	r := &SCIMIntegration{}
-	if err := row.StructScan(r); err != nil {
-		return r, fmt.Errorf("error scanning struct err = %w", err)
-	}
-	return r, nil
-}
diff --git a/pkg/snowflake/scim_integration_test.go b/pkg/snowflake/scim_integration_test.go
deleted file mode 100644
index b9ee16d504..0000000000
--- a/pkg/snowflake/scim_integration_test.go
+++ /dev/null
@@ -1,38 +0,0 @@
-package snowflake_test
-
-import (
-	"testing"
-
-	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake"
-	"github.com/stretchr/testify/require"
-)
-
-func TestScimIntegration(t *testing.T) {
-	r := require.New(t)
-	builder := snowflake.NewSCIMIntegrationBuilder("aad_provisioning")
-	r.NotNil(builder)
-
-	q := builder.Show()
-	r.Equal("SHOW SECURITY INTEGRATIONS LIKE 'aad_provisioning'", q)
-
-	q = builder.Describe()
-	r.Equal("DESCRIBE SECURITY INTEGRATION \"aad_provisioning\"", q)
-
-	c := builder.Create()
-	c.SetRaw(`TYPE=scim`)
-	c.SetString(`scim_client`, "azure")
-	c.SetString(`run_as_role`, "AAD_PROVISIONER")
-	q = c.Statement()
-	r.Equal(`CREATE SECURITY INTEGRATION "aad_provisioning" TYPE=scim RUN_AS_ROLE='AAD_PROVISIONER' SCIM_CLIENT='azure'`, q)
-
-	d := builder.Alter()
-	d.SetRaw(`TYPE=scim`)
-	d.SetString(`scim_client`, "azure")
-	d.SetString(`run_as_role`, "AAD_PROVISIONER")
-	d.SetString(`network_policy`, "aad_policy")
-	q = d.Statement()
-	r.Equal(`ALTER SECURITY INTEGRATION "aad_provisioning" SET TYPE=scim NETWORK_POLICY='aad_policy' RUN_AS_ROLE='AAD_PROVISIONER' SCIM_CLIENT='azure'`, q)
-
-	e := builder.Drop()
-	r.Equal(`DROP SECURITY INTEGRATION "aad_provisioning"`, e)
-}
diff --git a/v1-preparations/LIST_OF_REMOVED_RESOURCES_FOR_V1.md b/v1-preparations/LIST_OF_REMOVED_RESOURCES_FOR_V1.md
index d34dc99216..645bb93568 100644
--- a/v1-preparations/LIST_OF_REMOVED_RESOURCES_FOR_V1.md
+++ b/v1-preparations/LIST_OF_REMOVED_RESOURCES_FOR_V1.md
@@ -1,14 +1,16 @@
-Deprecated resources that will be removed with the V1:
+Deprecated resources that are removed with the V1:
 
-* [snowflake_database_old](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/database_old)
-* [snowflake_tag_masking_policy_association](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/tag_masking_policy_association)
+* [snowflake_database_old](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/database_old)
+* [snowflake_role](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/role)
+* [snowflake_role](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/role) (datasource)
+* [snowflake_oauth_integration](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/oauth_integration)
+* [snowflake_saml_integration](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/saml_integration)
+* [snowflake_stream](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/stream)
+* [snowflake_tag_masking_policy_association](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/tag_masking_policy_association)
+* [snowflake_session_parameter](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/session_parameter)
+
+
+Deprecated resources that should be removed with the V1:
 * [snowflake_procedure](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/procedure)
 * [snowflake_function](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/function)
-* [snowflake_role](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/role)
-* [snowflake_role](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/data-sources/role) (datasource)
-* [snowflake_oauth_integration](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/oauth_integration)
-* [snowflake_saml_integration](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/saml_integration)
-* [snowflake_session_parameter](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/session_parameter)
 * [snowflake_unsafe_execute](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/unsafe_execute) - will be renamed
-* [snowflake_stream](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/stream)
-* [snowflake_tag_masking_policy_association](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/tag_masking_policy_association)
diff --git a/v1-preparations/REMAINING_GA_OBJECTS.MD b/v1-preparations/REMAINING_GA_OBJECTS.MD
index fae960f8d3..b0d0bc9190 100644
--- a/v1-preparations/REMAINING_GA_OBJECTS.MD
+++ b/v1-preparations/REMAINING_GA_OBJECTS.MD
@@ -6,13 +6,14 @@ Status is one of:
 - ✅ - done
 - ❌ - not started
 - 👨‍💻 - in progress
+- 🗑 - removed
 Known issues lists open issues touching the given object. Note that some of these issues may be already fixed in the newer provider versions. We will address these while working on the given object.
 
 | Object Type                 | Status | Known issues |
 |-----------------------------|:------:|--------------|
 | snowflake_object_parameter  | 👨‍💻    | [#2446](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2446), [#1848](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1848), [#1561](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1561), [#1457](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1457) |
-| snowflake_session_parameter | 👨‍💻    | [#1814](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1814), [#1783](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1783), [#1036](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1036) |
+| snowflake_session_parameter | 🗑     | [#1814](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1814), [#1783](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1783), [#1036](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1036) |
 | snowflake_account_parameter | 👨‍💻    | [#1679](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1679) |
 | API INTEGRATION             | ❌     | [#2772](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2772), [#1445](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1445) |
 | APPLICATION                 | ❌     | - |