diff --git a/MIGRATION_GUIDE.md b/MIGRATION_GUIDE.md index e85a99a66d..fde9268c6c 100644 --- a/MIGRATION_GUIDE.md +++ b/MIGRATION_GUIDE.md @@ -9,6 +9,17 @@ across different versions. ## v0.99.0 ➞ v0.100.0 +### snowflake_account resource changes + +Changes: +- `admin_user_type` is now supported. No action required during the migration. +- `grace_period_in_days` is now required. The field should be explicitly set in the following versions. +- Account renaming is now supported. +- `is_org_admin` is a settable field (previously it was a read-only field). Changing its value is also supported. +- `must_change_password` and `is_org_admin` types were changed from `bool` to bool-string (more on that [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/v1-preparations/CHANGES_BEFORE_V1.md#empty-values)). No action required during the migration. +- The underlying resource identifier was changed from `` to `.`. Migration will be done automatically. Note that this changes how the `snowflake_account` resource is imported. +- New `show_output` field was added (see [raw Snowflake output](./v1-preparations/CHANGES_BEFORE_V1.md#raw-snowflake-output)). + ### snowflake_accounts data source changes New filtering options: - `with_history` @@ -35,7 +46,7 @@ output "simple_output" { ### snowflake_tag_association resource changes #### *(behavior change)* new id format -In order to provide more functionality for tagging objects, we have changed the resource id from `"TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"` to `"TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"|TAG_VALUE|OBJECT_TYPE`. This allows to group tags associations per tag ID, tag value and object type in one resource. +To provide more functionality for tagging objects, we have changed the resource id from `"TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"` to `"TAG_DATABASE"."TAG_SCHEMA"."TAG_NAME"|TAG_VALUE|OBJECT_TYPE`. This allows grouping tag associations by tag ID, tag value, and object type in one resource. 
``` resource "snowflake_tag_association" "gold_warehouses" { object_identifiers = [snowflake_warehouse.w1.fully_qualified_name, snowflake_warehouse.w2.fully_qualified_name] @@ -299,6 +310,9 @@ Also, we added diff suppress function that prevents Terraform from showing diffe No change is required, the state will be migrated automatically. +#### *(breaking change)* Required warehouse +For this resource, the provider now uses [tag references](https://docs.snowflake.com/en/sql-reference/functions/tag_references) to get information about masking policies attached to tags. This function requires a warehouse in the connection. Please, make sure you have either set a `DEFAULT_WAREHOUSE` for the user, or specified a warehouse in the provider configuration. + ## v0.97.0 ➞ v0.98.0 ### *(new feature)* snowflake_connections datasource @@ -357,7 +371,7 @@ On our road to v1, we have decided to rework configuration to address the most c We have added new fields to match the ones in [the driver](https://pkg.go.dev/github.com/snowflakedb/gosnowflake#Config) and to simplify setting account name. Specifically: - `include_retry_reason`, `max_retry_count`, `driver_tracing`, `tmp_directory_path` and `disable_console_login` are the new fields that are supported in the driver - `disable_saml_url_check` will be added to the provider after upgrading the driver -- `account_name` and `organization_name` were added to improve handling account names. Read more in [docs](https://docs.snowflake.com/en/user-guide/admin-account-identifier#using-an-account-name-as-an-identifier). +- `account_name` and `organization_name` were added to improve the handling of account names. Execute `SELECT CURRENT_ORGANIZATION_NAME(), CURRENT_ACCOUNT_NAME();` to get the required values. Read more in [docs](https://docs.snowflake.com/en/user-guide/admin-account-identifier#using-an-account-name-as-an-identifier). 
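For reference, a minimal provider block using the new fields might look like the sketch below (the organization, account, and user values are placeholders; take the first two from the output of the `SELECT CURRENT_ORGANIZATION_NAME(), CURRENT_ACCOUNT_NAME();` query mentioned above):

```
provider "snowflake" {
  organization_name = "ORGANIZATION" # placeholder: result of CURRENT_ORGANIZATION_NAME()
  account_name      = "ACCOUNT"      # placeholder: result of CURRENT_ACCOUNT_NAME()
  user              = "example_user" # placeholder user name
}
```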
#### *(behavior change)* changed configuration of driver log level To be more consistent with other configuration options, we have decided to add `driver_tracing` to the configuration schema. This value can also be configured by `SNOWFLAKE_DRIVER_TRACING` environmental variable and by `drivertracing` field in the TOML file. The previous `SF_TF_GOSNOWFLAKE_LOG_LEVEL` environmental variable is not supported now, and was removed from the provider. @@ -378,6 +392,12 @@ provider "snowflake" { } ``` +This change may cause the connection host URL to change. If you get errors like +``` +Error: open snowflake connection: Post "https://ORGANIZATION-ACCOUNT.snowflakecomputing.com:443/session/v1/login-request?requestId=[guid]&request_guid=[guid]&roleName=myrole": EOF +``` +make sure that the host `ORGANIZATION-ACCOUNT.snowflakecomputing.com` is reachable from your network (i.e. not blocked by a firewall). + #### *(behavior change)* changed behavior of some fields For the fields that are not deprecated, we focused on improving validations and documentation. Also, we adjusted some fields to match our [driver's](https://github.com/snowflakedb/gosnowflake) defaults. Specifically: - Relaxed validations for enum fields like `protocol` and `authenticator`. Now, the case on such fields is ignored. @@ -809,7 +829,7 @@ Removed fields: The value of these field will be removed from the state automatically. #### *(breaking change)* Required warehouse -For this resource, the provider now uses [policy references](https://docs.snowflake.com/en/sql-reference/functions/policy_references) which requires a warehouse in the connection. Please, make sure you have either set a DEFAULT_WAREHOUSE for the user, or specified a warehouse in the provider configuration. +For this resource, the provider now uses [policy references](https://docs.snowflake.com/en/sql-reference/functions/policy_references) which requires a warehouse in the connection. 
Please, make sure you have either set a `DEFAULT_WAREHOUSE` for the user, or specified a warehouse in the provider configuration. ### Identifier changes diff --git a/docs/data-sources/connections.md b/docs/data-sources/connections.md index 5dff3c63ad..e794d01f9f 100644 --- a/docs/data-sources/connections.md +++ b/docs/data-sources/connections.md @@ -2,14 +2,14 @@ page_title: "snowflake_connections Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered connections. Filtering is aligned with the current possibilities for SHOW CONNECTIONS https://docs.snowflake.com/en/sql-reference/sql/show-connections query. The results of SHOW is encapsulated in one output collection connections. + Data source used to get details of filtered connections. Filtering is aligned with the current possibilities for SHOW CONNECTIONS https://docs.snowflake.com/en/sql-reference/sql/show-connections query. The results of SHOW are encapsulated in one output collection connections. --- !> **V1 release candidate** This data source is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it. # snowflake_connections (Data Source) -Datasource used to get details of filtered connections. Filtering is aligned with the current possibilities for [SHOW CONNECTIONS](https://docs.snowflake.com/en/sql-reference/sql/show-connections) query. The results of SHOW is encapsulated in one output collection `connections`. +Data source used to get details of filtered connections. 
Filtering is aligned with the current possibilities for [SHOW CONNECTIONS](https://docs.snowflake.com/en/sql-reference/sql/show-connections) query. The results of SHOW are encapsulated in one output collection `connections`. ## Example Usage @@ -39,6 +39,18 @@ data "snowflake_connections" "like_prefix" { output "like_prefix_output" { value = data.snowflake_connections.like_prefix.connections } + +# Ensure the number of connections is equal to exactly one element (with the use of check block) +check "connection_check" { + data "snowflake_connections" "assert_with_check_block" { + like = "connection-name" + } + + assert { + condition = length(data.snowflake_connections.assert_with_check_block.connections) == 1 + error_message = "connections filtered by '${data.snowflake_connections.assert_with_check_block.like}' returned ${length(data.snowflake_connections.assert_with_check_block.connections)} connections where one was expected" + } +} ``` diff --git a/docs/data-sources/database_roles.md b/docs/data-sources/database_roles.md index baeb080b28..ccdfd274b7 100644 --- a/docs/data-sources/database_roles.md +++ b/docs/data-sources/database_roles.md @@ -2,14 +2,14 @@ page_title: "snowflake_database_roles Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered database roles. Filtering is aligned with the current possibilities for SHOW DATABASE ROLES https://docs.snowflake.com/en/sql-reference/sql/show-database-roles query (like and limit are supported). The results of SHOW is encapsulated in show_output collection. + Data source used to get details of filtered database roles. Filtering is aligned with the current possibilities for SHOW DATABASE ROLES https://docs.snowflake.com/en/sql-reference/sql/show-database-roles query (like and limit are supported). The results of SHOW are encapsulated in show_output collection. 
--- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_database_roles (Data Source) -Datasource used to get details of filtered database roles. Filtering is aligned with the current possibilities for [SHOW DATABASE ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-database-roles) query (`like` and `limit` are supported). The results of SHOW is encapsulated in show_output collection. +Data source used to get details of filtered database roles. Filtering is aligned with the current possibilities for [SHOW DATABASE ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-database-roles) query (`like` and `limit` are supported). The results of SHOW are encapsulated in show_output collection. ## Example Usage @@ -60,7 +60,7 @@ data "snowflake_database_roles" "assert_with_postcondition" { # Ensure the number of database roles is equal to at exactly one element (with the use of check block) check "database_role_check" { - data "snowflake_resource_monitors" "assert_with_check_block" { + data "snowflake_database_roles" "assert_with_check_block" { in_database = "database-name" like = "database_role-name" } diff --git a/docs/data-sources/databases.md b/docs/data-sources/databases.md index a32b9f9da9..691ded55b1 100644 --- a/docs/data-sources/databases.md +++ b/docs/data-sources/databases.md @@ -2,14 +2,14 @@ page_title: "snowflake_databases Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered databases. 
Filtering is aligned with the current possibilities for SHOW DATABASES https://docs.snowflake.com/en/sql-reference/sql/show-databases query (like, starts_with, and limit are all supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. + Data source used to get details of filtered databases. Filtering is aligned with the current possibilities for SHOW DATABASES https://docs.snowflake.com/en/sql-reference/sql/show-databases query (like, starts_with, and limit are all supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. --- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_databases (Data Source) -Datasource used to get details of filtered databases. Filtering is aligned with the current possibilities for [SHOW DATABASES](https://docs.snowflake.com/en/sql-reference/sql/show-databases) query (`like`, `starts_with`, and `limit` are all supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. +Data source used to get details of filtered databases. Filtering is aligned with the current possibilities for [SHOW DATABASES](https://docs.snowflake.com/en/sql-reference/sql/show-databases) query (`like`, `starts_with`, and `limit` are all supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. 
## Example Usage diff --git a/docs/data-sources/masking_policies.md b/docs/data-sources/masking_policies.md index cc7e257c56..1facb000b9 100644 --- a/docs/data-sources/masking_policies.md +++ b/docs/data-sources/masking_policies.md @@ -2,14 +2,14 @@ page_title: "snowflake_masking_policies Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered masking policies. Filtering is aligned with the current possibilities for SHOW MASKING POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-masking-policies query. The results of SHOW and DESCRIBE are encapsulated in one output collection masking_policies. + Data source used to get details of filtered masking policies. Filtering is aligned with the current possibilities for SHOW MASKING POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-masking-policies query. The results of SHOW and DESCRIBE are encapsulated in one output collection masking_policies. --- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it. # snowflake_masking_policies (Data Source) -Datasource used to get details of filtered masking policies. Filtering is aligned with the current possibilities for [SHOW MASKING POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-masking-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `masking_policies`. +Data source used to get details of filtered masking policies. 
Filtering is aligned with the current possibilities for [SHOW MASKING POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-masking-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `masking_policies`. ## Example Usage diff --git a/docs/data-sources/network_policies.md b/docs/data-sources/network_policies.md index 9a930a231b..b46596fce8 100644 --- a/docs/data-sources/network_policies.md +++ b/docs/data-sources/network_policies.md @@ -2,14 +2,14 @@ page_title: "snowflake_network_policies Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered network policies. Filtering is aligned with the current possibilities for SHOW NETWORK POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-network-policies query (like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection. + Data source used to get details of filtered network policies. Filtering is aligned with the current possibilities for SHOW NETWORK POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-network-policies query (like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection. --- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_network_policies (Data Source) -Datasource used to get details of filtered network policies. 
Filtering is aligned with the current possibilities for [SHOW NETWORK POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-network-policies) query (`like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection. +Data source used to get details of filtered network policies. Filtering is aligned with the current possibilities for [SHOW NETWORK POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-network-policies) query (`like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection. ## Example Usage @@ -43,7 +43,7 @@ output "only_show_output" { # Ensure the number of network policies is equal to at least one element (with the use of postcondition) data "snowflake_network_policies" "assert_with_postcondition" { - starts_with = "network-policy-name" + like = "network-policy-name" lifecycle { postcondition { condition = length(self.network_policies) > 0 diff --git a/docs/data-sources/resource_monitors.md b/docs/data-sources/resource_monitors.md index f0da9f3394..6e44d1a023 100644 --- a/docs/data-sources/resource_monitors.md +++ b/docs/data-sources/resource_monitors.md @@ -2,14 +2,14 @@ page_title: "snowflake_resource_monitors Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered resource monitors. Filtering is aligned with the current possibilities for SHOW RESOURCE MONITORS https://docs.snowflake.com/en/sql-reference/sql/show-resource-monitors query (like is supported). The results of SHOW is encapsulated in show_output collection. + Data source used to get details of filtered resource monitors. Filtering is aligned with the current possibilities for SHOW RESOURCE MONITORS https://docs.snowflake.com/en/sql-reference/sql/show-resource-monitors query (like is supported). The results of SHOW are encapsulated in show_output collection. 
--- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it. # snowflake_resource_monitors (Data Source) -Datasource used to get details of filtered resource monitors. Filtering is aligned with the current possibilities for [SHOW RESOURCE MONITORS](https://docs.snowflake.com/en/sql-reference/sql/show-resource-monitors) query (`like` is supported). The results of SHOW is encapsulated in show_output collection. +Data source used to get details of filtered resource monitors. Filtering is aligned with the current possibilities for [SHOW RESOURCE MONITORS](https://docs.snowflake.com/en/sql-reference/sql/show-resource-monitors) query (`like` is supported). The results of SHOW are encapsulated in show_output collection. ## Example Usage diff --git a/docs/data-sources/roles.md b/docs/data-sources/roles.md index 8382bffa5b..0e5db6a28f 100644 --- a/docs/data-sources/roles.md +++ b/docs/data-sources/roles.md @@ -2,14 +2,17 @@ page_title: "snowflake_roles Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered roles. Filtering is aligned with the current possibilities for SHOW ROLES https://docs.snowflake.com/en/sql-reference/sql/show-roles query (like and in_class are all supported). + Data source used to get details of filtered roles. Filtering is aligned with the current possibilities for SHOW ROLES https://docs.snowflake.com/en/sql-reference/sql/show-roles query (like and in_class are all supported). 
The results of SHOW are encapsulated in one output collection. --- !> **V1 release candidate** This datasource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. + +-> **Note** Fields `STARTS WITH` and `LIMIT` are currently missing. They will be added in the future. + # snowflake_roles (Data Source) -Datasource used to get details of filtered roles. Filtering is aligned with the current possibilities for [SHOW ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-roles) query (`like` and `in_class` are all supported). The results of SHOW are encapsulated in one output collection. +Data source used to get details of filtered roles. Filtering is aligned with the current possibilities for [SHOW ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-roles) query (`like` and `in_class` are all supported). The results of SHOW are encapsulated in one output collection. ## Example Usage diff --git a/docs/data-sources/row_access_policies.md b/docs/data-sources/row_access_policies.md index b6ccb31cd8..1c7c7b6d28 100644 --- a/docs/data-sources/row_access_policies.md +++ b/docs/data-sources/row_access_policies.md @@ -2,14 +2,14 @@ page_title: "snowflake_row_access_policies Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered row access policies. Filtering is aligned with the current possibilities for SHOW ROW ACCESS POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-row-access-policies query. The results of SHOW and DESCRIBE are encapsulated in one output collection row_access_policies. 
+ Data source used to get details of filtered row access policies. Filtering is aligned with the current possibilities for SHOW ROW ACCESS POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-row-access-policies query. The results of SHOW and DESCRIBE are encapsulated in one output collection row_access_policies. --- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it. # snowflake_row_access_policies (Data Source) -Datasource used to get details of filtered row access policies. Filtering is aligned with the current possibilities for [SHOW ROW ACCESS POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-row-access-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `row_access_policies`. +Data source used to get details of filtered row access policies. Filtering is aligned with the current possibilities for [SHOW ROW ACCESS POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-row-access-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `row_access_policies`. ## Example Usage diff --git a/docs/data-sources/schemas.md b/docs/data-sources/schemas.md index 81cc107919..5787b9bd7b 100644 --- a/docs/data-sources/schemas.md +++ b/docs/data-sources/schemas.md @@ -2,14 +2,17 @@ page_title: "snowflake_schemas Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered schemas. 
Filtering is aligned with the current possibilities for SHOW SCHEMAS https://docs.snowflake.com/en/sql-reference/sql/show-schemas query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. + Data source used to get details of filtered schemas. Filtering is aligned with the current possibilities for SHOW SCHEMAS https://docs.snowflake.com/en/sql-reference/sql/show-schemas query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. --- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +-> **Note** Field `WITH PRIVILEGES` is currently missing. It will be added in the future. + # snowflake_schemas (Data Source) -Datasource used to get details of filtered schemas. Filtering is aligned with the current possibilities for [SHOW SCHEMAS](https://docs.snowflake.com/en/sql-reference/sql/show-schemas) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. +Data source used to get details of filtered schemas. Filtering is aligned with the current possibilities for [SHOW SCHEMAS](https://docs.snowflake.com/en/sql-reference/sql/show-schemas) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. 
## Example Usage diff --git a/docs/data-sources/secrets.md b/docs/data-sources/secrets.md index 7271fd6090..397cfa9e3e 100644 --- a/docs/data-sources/secrets.md +++ b/docs/data-sources/secrets.md @@ -2,14 +2,14 @@ page_title: "snowflake_secrets Data Source - terraform-provider-snowflake" subcategory: "" description: |- - Datasource used to get details of filtered secrets. Filtering is aligned with the current possibilities for SHOW SECRETS https://docs.snowflake.com/en/sql-reference/sql/show-secrets query. The results of SHOW and DESCRIBE are encapsulated in one output collection secrets. + Data source used to get details of filtered secrets. Filtering is aligned with the current possibilities for SHOW SECRETS https://docs.snowflake.com/en/sql-reference/sql/show-secrets query. The results of SHOW and DESCRIBE are encapsulated in one output collection secrets. --- !> **V1 release candidate** This data source is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it. # snowflake_secrets (Data Source) -Datasource used to get details of filtered secrets. Filtering is aligned with the current possibilities for [SHOW SECRETS](https://docs.snowflake.com/en/sql-reference/sql/show-secrets) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `secrets`. +Data source used to get details of filtered secrets. Filtering is aligned with the current possibilities for [SHOW SECRETS](https://docs.snowflake.com/en/sql-reference/sql/show-secrets) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `secrets`. 
 ## Example Usage

diff --git a/docs/data-sources/security_integrations.md b/docs/data-sources/security_integrations.md
index 4f0bc30c5b..833eb70663 100644
--- a/docs/data-sources/security_integrations.md
+++ b/docs/data-sources/security_integrations.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_security_integrations Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered security integrations. Filtering is aligned with the current possibilities for SHOW SECURITY INTEGRATIONS https://docs.snowflake.com/en/sql-reference/sql/show-integrations query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection security_integrations.
+  Data source used to get details of filtered security integrations. Filtering is aligned with the current possibilities for SHOW SECURITY INTEGRATIONS https://docs.snowflake.com/en/sql-reference/sql/show-integrations query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection security_integrations.
 ---

 !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.

 # snowflake_security_integrations (Data Source)

-Datasource used to get details of filtered security integrations. Filtering is aligned with the current possibilities for [SHOW SECURITY INTEGRATIONS](https://docs.snowflake.com/en/sql-reference/sql/show-integrations) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `security_integrations`.
+Data source used to get details of filtered security integrations. Filtering is aligned with the current possibilities for [SHOW SECURITY INTEGRATIONS](https://docs.snowflake.com/en/sql-reference/sql/show-integrations) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `security_integrations`.

 ## Example Usage

diff --git a/docs/data-sources/streamlits.md b/docs/data-sources/streamlits.md
index ef767a9082..3bb9c19549 100644
--- a/docs/data-sources/streamlits.md
+++ b/docs/data-sources/streamlits.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_streamlits Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered streamlits. Filtering is aligned with the current possibilities for SHOW STREAMLITS https://docs.snowflake.com/en/sql-reference/sql/show-streamlits query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection streamlits.
+  Data source used to get details of filtered streamlits. Filtering is aligned with the current possibilities for SHOW STREAMLITS https://docs.snowflake.com/en/sql-reference/sql/show-streamlits query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection streamlits.
 ---

 !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it.

 # snowflake_streamlits (Data Source)

-Datasource used to get details of filtered streamlits. Filtering is aligned with the current possibilities for [SHOW STREAMLITS](https://docs.snowflake.com/en/sql-reference/sql/show-streamlits) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `streamlits`.
+Data source used to get details of filtered streamlits. Filtering is aligned with the current possibilities for [SHOW STREAMLITS](https://docs.snowflake.com/en/sql-reference/sql/show-streamlits) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `streamlits`.

 ## Example Usage

diff --git a/docs/data-sources/streams.md b/docs/data-sources/streams.md
index 62ca70cb22..23acd3a192 100644
--- a/docs/data-sources/streams.md
+++ b/docs/data-sources/streams.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_streams Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered streams. Filtering is aligned with the current possibilities for SHOW STREAMS https://docs.snowflake.com/en/sql-reference/sql/show-streams query. The results of SHOW and DESCRIBE are encapsulated in one output collection streams.
+  Data source used to get details of filtered streams. Filtering is aligned with the current possibilities for SHOW STREAMS https://docs.snowflake.com/en/sql-reference/sql/show-streams query. The results of SHOW and DESCRIBE are encapsulated in one output collection streams.
 ---

 !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it.

 # snowflake_streams (Data Source)

-Datasource used to get details of filtered streams. Filtering is aligned with the current possibilities for [SHOW STREAMS](https://docs.snowflake.com/en/sql-reference/sql/show-streams) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `streams`.
+Data source used to get details of filtered streams. Filtering is aligned with the current possibilities for [SHOW STREAMS](https://docs.snowflake.com/en/sql-reference/sql/show-streams) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `streams`.

 ## Example Usage

diff --git a/docs/data-sources/tags.md b/docs/data-sources/tags.md
index bb8e360071..cde76cf652 100644
--- a/docs/data-sources/tags.md
+++ b/docs/data-sources/tags.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_tags Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered tags. Filtering is aligned with the current possibilities for SHOW TAGS https://docs.snowflake.com/en/sql-reference/sql/show-tags query. The results of SHOW are encapsulated in one output collection tags.
+  Data source used to get details of filtered tags. Filtering is aligned with the current possibilities for SHOW TAGS https://docs.snowflake.com/en/sql-reference/sql/show-tags query. The results of SHOW are encapsulated in one output collection tags.
 ---

 !> **V1 release candidate** This data source is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0980--v0990) to use it.

 # snowflake_tags (Data Source)

-Datasource used to get details of filtered tags. Filtering is aligned with the current possibilities for [SHOW TAGS](https://docs.snowflake.com/en/sql-reference/sql/show-tags) query. The results of SHOW are encapsulated in one output collection `tags`.
+Data source used to get details of filtered tags. Filtering is aligned with the current possibilities for [SHOW TAGS](https://docs.snowflake.com/en/sql-reference/sql/show-tags) query. The results of SHOW are encapsulated in one output collection `tags`.

 ## Example Usage

diff --git a/docs/data-sources/users.md b/docs/data-sources/users.md
index c5e501510f..4a068375c9 100644
--- a/docs/data-sources/users.md
+++ b/docs/data-sources/users.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_users Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered users. Filtering is aligned with the current possibilities for SHOW USERS https://docs.snowflake.com/en/sql-reference/sql/show-users query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. Important note is that when querying users you don't have permissions to, the querying options are limited. You won't get almost any field in show_output (only empty or default values), the DESCRIBE command cannot be called, so you have to set with_describe = false. Only parameters output is not affected by the lack of privileges.
+  Data source used to get details of filtered users. Filtering is aligned with the current possibilities for SHOW USERS https://docs.snowflake.com/en/sql-reference/sql/show-users query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. Important note is that when querying users you don't have permissions to, the querying options are limited. You won't get almost any field in show_output (only empty or default values), the DESCRIBE command cannot be called, so you have to set with_describe = false. Only parameters output is not affected by the lack of privileges.
 ---

 !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.

 # snowflake_users (Data Source)

-Datasource used to get details of filtered users. Filtering is aligned with the current possibilities for [SHOW USERS](https://docs.snowflake.com/en/sql-reference/sql/show-users) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. Important note is that when querying users you don't have permissions to, the querying options are limited. You won't get almost any field in `show_output` (only empty or default values), the DESCRIBE command cannot be called, so you have to set `with_describe = false`. Only `parameters` output is not affected by the lack of privileges.
+Data source used to get details of filtered users. Filtering is aligned with the current possibilities for [SHOW USERS](https://docs.snowflake.com/en/sql-reference/sql/show-users) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. Important note is that when querying users you don't have permissions to, the querying options are limited. You won't get almost any field in `show_output` (only empty or default values), the DESCRIBE command cannot be called, so you have to set `with_describe = false`. Only `parameters` output is not affected by the lack of privileges.
 ## Example Usage

diff --git a/docs/data-sources/views.md b/docs/data-sources/views.md
index d4eeb988fc..9425ccbcde 100644
--- a/docs/data-sources/views.md
+++ b/docs/data-sources/views.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_views Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered views. Filtering is aligned with the current possibilities for SHOW VIEWS https://docs.snowflake.com/en/sql-reference/sql/show-views query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection views.
+  Data source used to get details of filtered views. Filtering is aligned with the current possibilities for SHOW VIEWS https://docs.snowflake.com/en/sql-reference/sql/show-views query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection views.
 ---

 !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v094x--v0950) to use it.

 # snowflake_views (Data Source)

-Datasource used to get details of filtered views. Filtering is aligned with the current possibilities for [SHOW VIEWS](https://docs.snowflake.com/en/sql-reference/sql/show-views) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `views`.
+Data source used to get details of filtered views. Filtering is aligned with the current possibilities for [SHOW VIEWS](https://docs.snowflake.com/en/sql-reference/sql/show-views) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `views`.

 ## Example Usage

diff --git a/docs/data-sources/warehouses.md b/docs/data-sources/warehouses.md
index 99ce968572..2afcc6f502 100644
--- a/docs/data-sources/warehouses.md
+++ b/docs/data-sources/warehouses.md
@@ -2,14 +2,14 @@
 page_title: "snowflake_warehouses Data Source - terraform-provider-snowflake"
 subcategory: ""
 description: |-
-  Datasource used to get details of filtered warehouses. Filtering is aligned with the current possibilities for SHOW WAREHOUSES https://docs.snowflake.com/en/sql-reference/sql/show-warehouses query (only like is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.
+  Data source used to get details of filtered warehouses. Filtering is aligned with the current possibilities for SHOW WAREHOUSES https://docs.snowflake.com/en/sql-reference/sql/show-warehouses query (only like is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.
 ---

 !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.

 # snowflake_warehouses (Data Source)

-Datasource used to get details of filtered warehouses. Filtering is aligned with the current possibilities for [SHOW WAREHOUSES](https://docs.snowflake.com/en/sql-reference/sql/show-warehouses) query (only `like` is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.
+Data source used to get details of filtered warehouses. Filtering is aligned with the current possibilities for [SHOW WAREHOUSES](https://docs.snowflake.com/en/sql-reference/sql/show-warehouses) query (only `like` is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.

 ## Example Usage

diff --git a/docs/guides/unassigning_policies.md b/docs/guides/unassigning_policies.md
new file mode 100644
index 0000000000..de5de63e86
--- /dev/null
+++ b/docs/guides/unassigning_policies.md
@@ -0,0 +1,65 @@
+---
+page_title: "Unassigning policies"
+subcategory: ""
+description: |-
+
+---
+# Unassigning policies
+
+For some objects, like network policies, Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-network-policy#usage-notes) suggest that a network policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically.
+
+Before dropping the resource:
+- if the objects the policy is assigned to are managed in Terraform, follow the example below
+- if they are not managed in Terraform, list them with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...`
+
+## Example
+
+When you have a configuration like
+```terraform
+resource "snowflake_network_policy" "example" {
+  name = "network_policy_name"
+}
+
+resource "snowflake_oauth_integration_for_custom_clients" "example" {
+  name               = "integration"
+  oauth_client_type  = "CONFIDENTIAL"
+  oauth_redirect_uri = "https://example.com"
+  blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"]
+  network_policy     = snowflake_network_policy.example.fully_qualified_name
+}
+```
+
+and try removing the network policy, Terraform fails with
+```
+│ Error deleting network policy EXAMPLE, err = 001492 (42601): SQL compilation error:
+│ Cannot perform Drop operation on network policy EXAMPLE. The policy is attached to INTEGRATION with name EXAMPLE. Unset the network policy from INTEGRATION and try the
+│ Drop operation again.
+```
+
+To remove the policy correctly, first adjust the configuration to
+```terraform
+resource "snowflake_network_policy" "example" {
+  name = "network_policy_name"
+}
+
+resource "snowflake_oauth_integration_for_custom_clients" "example" {
+  name               = "integration"
+  oauth_client_type  = "CONFIDENTIAL"
+  oauth_redirect_uri = "https://example.com"
+  blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"]
+}
+```
+
+Note that the network policy has been unassigned. Now, run `terraform apply`. This should cause the policy to be unassigned. Then, adjust the configuration once again to
+```terraform
+resource "snowflake_oauth_integration_for_custom_clients" "example" {
+  name               = "integration"
+  oauth_client_type  = "CONFIDENTIAL"
+  oauth_redirect_uri = "https://example.com"
+  blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"]
+}
+```
+
+Now the network policy should be removed successfully.
+
+This behavior will be fixed in the provider in the future.
diff --git a/docs/index.md b/docs/index.md
index 518af87d7c..9a69fadbe9 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -9,7 +9,7 @@ description: Manage SnowflakeDB with Terraform.

 ~> **Note** Please check the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md) when changing the version of the provider.

--> **Note** the current roadmap is available in our GitHub repository: [ROADMAP.md](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md).
+-> **Note** The current roadmap is available in our GitHub repository: [ROADMAP.md](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md).

 This is a terraform provider plugin for managing [Snowflake](https://www.snowflake.com/) accounts. Coverage is focused on part of Snowflake related to access control.
@@ -161,7 +161,7 @@ To export the variables into your provider:

 ```shell
 export SNOWFLAKE_USER="..."
-export SNOWFLAKE_PRIVATE_KEY_PATH="~/.ssh/snowflake_key"
+export SNOWFLAKE_PRIVATE_KEY="~/.ssh/snowflake_key"
 ```

 ### Keypair Authentication Passphrase
@@ -183,7 +183,7 @@ To export the variables into your provider:

 ```shell
 export SNOWFLAKE_USER="..."
-export SNOWFLAKE_PRIVATE_KEY_PATH="~/.ssh/snowflake_key.p8"
+export SNOWFLAKE_PRIVATE_KEY="~/.ssh/snowflake_key.p8"
 export SNOWFLAKE_PRIVATE_KEY_PASSPHRASE="..."
 ```

@@ -193,7 +193,7 @@ If you have an OAuth access token, export these credentials as environment variables:

 ```shell
 export SNOWFLAKE_USER='...'
-export SNOWFLAKE_OAUTH_ACCESS_TOKEN='...'
+export SNOWFLAKE_TOKEN='...'
 ```

 Note that once this access token expires, you'll need to request a new one through an external application.
@@ -203,11 +203,11 @@ Note that once this access token expires, you'll need to request a new one throu

 If you have an OAuth Refresh token, export these credentials as environment variables:

 ```shell
-export SNOWFLAKE_OAUTH_REFRESH_TOKEN='...'
-export SNOWFLAKE_OAUTH_CLIENT_ID='...'
-export SNOWFLAKE_OAUTH_CLIENT_SECRET='...'
-export SNOWFLAKE_OAUTH_ENDPOINT='...'
-export SNOWFLAKE_OAUTH_REDIRECT_URL='https://localhost.com'
+export SNOWFLAKE_TOKEN_ACCESSOR_REFRESH_TOKEN='...'
+export SNOWFLAKE_TOKEN_ACCESSOR_CLIENT_ID='...'
+export SNOWFLAKE_TOKEN_ACCESSOR_CLIENT_SECRET='...'
+export SNOWFLAKE_TOKEN_ACCESSOR_TOKEN_ENDPOINT='...'
+export SNOWFLAKE_TOKEN_ACCESSOR_REDIRECT_URI='https://localhost.com'
 ```

 Note because access token have a short life; typically 10 minutes, by passing refresh token new access token will be generated.
@@ -242,7 +242,7 @@ provider "snowflake" {

 ```bash
 export SNOWFLAKE_USER="..."
-export SNOWFLAKE_PRIVATE_KEY_PATH="~/.ssh/snowflake_key"
+export SNOWFLAKE_PRIVATE_KEY="~/.ssh/snowflake_key"
 ```

 3. In a TOML file (default in ~/.snowflake/config). Notice the use of different profiles. The profile name needs to be specified in the Terraform configuration file in `profile` field. When this is not specified, `default` profile is loaded.
diff --git a/docs/resources/account.md b/docs/resources/account.md
index 4d3a8fea48..6597e1e855 100644
--- a/docs/resources/account.md
+++ b/docs/resources/account.md
@@ -5,34 +5,58 @@ description: |-
   The account resource allows you to create and manage Snowflake accounts.
 ---

+!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0990--v01000) to use it.
+
 # snowflake_account (Resource)

 The account resource allows you to create and manage Snowflake accounts.

-!> **Warning** This resource cannot be destroyed!!! The only way to delete accounts is to go through [Snowflake Support](https://docs.snowflake.com/en/user-guide/organizations-manage-accounts.html#deleting-an-account)
-
-~> **Note** ORGADMIN priviliges are required for this resource
+~> **Note** To use this resource, your account must have the privilege to use the ORGADMIN role.
 ## Example Usage

 ```terraform
-provider "snowflake" {
-  role  = "ORGADMIN"
-  alias = "orgadmin"
+## Minimal
+resource "snowflake_account" "minimal" {
+  name                 = "ACCOUNT_NAME"
+  admin_name           = "ADMIN_NAME"
+  admin_password       = "ADMIN_PASSWORD"
+  email                = "admin@email.com"
+  edition              = "STANDARD"
+  grace_period_in_days = 3
+}
+
+## Complete (with SERVICE user type)
+resource "snowflake_account" "complete" {
+  name                 = "ACCOUNT_NAME"
+  admin_name           = "ADMIN_NAME"
+  admin_rsa_public_key = ""
+  admin_user_type      = "SERVICE"
+  email                = "admin@email.com"
+  edition              = "STANDARD"
+  region_group         = "PUBLIC"
+  region               = "AWS_US_WEST_2"
+  comment              = "some comment"
+  is_org_admin         = "true"
+  grace_period_in_days = 3
 }

-resource "snowflake_account" "ac1" {
-  provider = snowflake.orgadmin
-  name = "SNOWFLAKE_TEST_ACCOUNT"
-  admin_name = "John Doe"
-  admin_password = "Abcd1234!"
-  email = "john.doe@snowflake.com"
-  first_name = "John"
-  last_name = "Doe"
-  must_change_password = true
+## Complete (with PERSON user type)
+resource "snowflake_account" "complete" {
+  name                 = "ACCOUNT_NAME"
+  admin_name           = "ADMIN_NAME"
+  admin_password       = "ADMIN_PASSWORD"
+  admin_user_type      = "PERSON"
+  first_name           = "first_name"
+  last_name            = "last_name"
+  email                = "admin@email.com"
+  must_change_password = "false"
   edition = "STANDARD"
-  comment = "Snowflake Test Account"
+  region_group         = "PUBLIC"
   region = "AWS_US_WEST_2"
+  comment              = "some comment"
+  is_org_admin         = "true"
+  grace_period_in_days = 3
 }
 ```

 -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources).
@@ -43,33 +67,70 @@ resource "snowflake_account" "ac1" {

 ### Required

-- `admin_name` (String) Login name of the initial administrative user of the account. A new user is created in the new account with this name and password and granted the ACCOUNTADMIN role in the account. A login name can be any string consisting of letters, numbers, and underscores. Login names are always case-insensitive.
-- `edition` (String) [Snowflake Edition](https://docs.snowflake.com/en/user-guide/intro-editions.html) of the account. Valid values are: STANDARD | ENTERPRISE | BUSINESS_CRITICAL
-- `email` (String, Sensitive) Email address of the initial administrative user of the account. This email address is used to send any notifications about the account.
-- `name` (String) Specifies the identifier (i.e. name) for the account; must be unique within an organization, regardless of which Snowflake Region the account is in. In addition, the identifier must start with an alphabetic character and cannot contain spaces or special characters except for underscores (_). Note that if the account name includes underscores, features that do not accept account names with underscores (e.g. Okta SSO or SCIM) can reference a version of the account name that substitutes hyphens (-) for the underscores.
+- `admin_name` (String, Sensitive) Login name of the initial administrative user of the account. A new user is created in the new account with this name and password and granted the ACCOUNTADMIN role in the account. A login name can be any string consisting of letters, numbers, and underscores. Login names are always case-insensitive. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `edition` (String) Snowflake Edition of the account. See more about Snowflake Editions in the [official documentation](https://docs.snowflake.com/en/user-guide/intro-editions). Valid options are: `STANDARD` | `ENTERPRISE` | `BUSINESS_CRITICAL`
+- `email` (String, Sensitive) Email address of the initial administrative user of the account. This email address is used to send any notifications about the account. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `grace_period_in_days` (Number) Specifies the number of days during which the account can be restored (“undropped”). The minimum is 3 days and the maximum is 90 days.
+- `name` (String) Specifies the identifier (i.e. name) for the account. It must be unique within an organization, regardless of which Snowflake Region the account is in and must start with an alphabetic character and cannot contain spaces or special characters except for underscores (_). Note that if the account name includes underscores, features that do not accept account names with underscores (e.g. Okta SSO or SCIM) can reference a version of the account name that substitutes hyphens (-) for the underscores.

 ### Optional

-- `admin_password` (String, Sensitive) Password for the initial administrative user of the account. Optional if the `ADMIN_RSA_PUBLIC_KEY` parameter is specified. For more information about passwords in Snowflake, see [Snowflake-provided Password Policy](https://docs.snowflake.com/en/sql-reference/sql/create-account.html#:~:text=Snowflake%2Dprovided%20Password%20Policy).
-- `admin_rsa_public_key` (String, Sensitive) Assigns a public key to the initial administrative user of the account in order to implement [key pair authentication](https://docs.snowflake.com/en/sql-reference/sql/create-account.html#:~:text=key%20pair%20authentication) for the user. Optional if the `ADMIN_PASSWORD` parameter is specified.
+- `admin_password` (String, Sensitive) Password for the initial administrative user of the account. Either admin_password or admin_rsa_public_key has to be specified. This field cannot be used whenever admin_user_type is set to SERVICE. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `admin_rsa_public_key` (String) Assigns a public key to the initial administrative user of the account. Either admin_password or admin_rsa_public_key has to be specified. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `admin_user_type` (String) Used for setting the type of the first user that is assigned the ACCOUNTADMIN role during account creation. Valid options are: `PERSON` | `SERVICE` | `LEGACY_SERVICE` External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
 - `comment` (String) Specifies a comment for the account.
-- `first_name` (String, Sensitive) First name of the initial administrative user of the account
-- `grace_period_in_days` (Number) Specifies the number of days to wait before dropping the account. The default is 3 days.
-- `last_name` (String, Sensitive) Last name of the initial administrative user of the account
-- `must_change_password` (Boolean) Specifies whether the new user created to administer the account is forced to change their password upon first login into the account.
-- `region` (String) ID of the Snowflake Region where the account is created. If no value is provided, Snowflake creates the account in the same Snowflake Region as the current account (i.e. the account in which the CREATE ACCOUNT statement is executed.)
-- `region_group` (String) ID of the Snowflake Region where the account is created. If no value is provided, Snowflake creates the account in the same Snowflake Region as the current account (i.e. the account in which the CREATE ACCOUNT statement is executed.)
+- `first_name` (String, Sensitive) First name of the initial administrative user of the account. This field cannot be used whenever admin_user_type is set to SERVICE. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `is_org_admin` (String) Sets an account property that determines whether the ORGADMIN role is enabled in the account. Only an organization administrator (i.e. user with the ORGADMIN role) can set the property.
+- `last_name` (String, Sensitive) Last name of the initial administrative user of the account. This field cannot be used whenever admin_user_type is set to SERVICE. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `must_change_password` (String) Specifies whether the new user created to administer the account is forced to change their password upon first login into the account. This field cannot be used whenever admin_user_type is set to SERVICE. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `region` (String) [Snowflake Region ID](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#label-snowflake-region-ids) of the region where the account is created. If no value is provided, Snowflake creates the account in the same Snowflake Region as the current account (i.e. the account in which the CREATE ACCOUNT statement is executed.)
+- `region_group` (String) ID of the region group where the account is created. To retrieve the region group ID for existing accounts in your organization, execute the [SHOW REGIONS](https://docs.snowflake.com/en/sql-reference/sql/show-regions) command. For information about when you might need to specify region group, see [Region groups](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#label-region-groups).
### Read-Only - `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). - `id` (String) The ID of this resource. -- `is_org_admin` (Boolean) Indicates whether the ORGADMIN role is enabled in an account. If TRUE, the role is enabled. +- `show_output` (List of Object) Outputs the result of `SHOW ACCOUNTS` for the given account. (see [below for nested schema](#nestedatt--show_output)) + + +### Nested Schema for `show_output` + +Read-Only: + +- `account_locator` (String) +- `account_locator_url` (String) +- `account_name` (String) +- `account_old_url_last_used` (String) +- `account_old_url_saved_on` (String) +- `account_url` (String) +- `comment` (String) +- `consumption_billing_entity_name` (String) +- `created_on` (String) +- `dropped_on` (String) +- `edition` (String) +- `is_events_account` (Boolean) +- `is_org_admin` (Boolean) +- `is_organization_account` (Boolean) +- `managed_accounts` (Number) +- `marketplace_consumer_billing_entity_name` (String) +- `marketplace_provider_billing_entity_name` (String) +- `moved_on` (String) +- `moved_to_organization` (String) +- `old_account_url` (String) +- `organization_name` (String) +- `organization_old_url` (String) +- `organization_old_url_last_used` (String) +- `organization_old_url_saved_on` (String) +- `organization_url_expiration_on` (String) +- `region_group` (String) +- `restored_on` (String) +- `scheduled_deletion_time` (String) +- `snowflake_region` (String) ## Import Import is supported using the following syntax: ```shell -terraform import snowflake_account.account +terraform import snowflake_account.example '"".""' ``` diff --git a/docs/resources/account_role.md b/docs/resources/account_role.md index e5c9a7068f..bc409840f6 100644 --- a/docs/resources/account_role.md +++ b/docs/resources/account_role.md @@ -33,7 +33,7 @@ resource "snowflake_account_role" "complete" { ### Required -- `name` 
(String) Identifier for the role; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Identifier for the role; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -66,5 +66,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_account_role.example "name" +terraform import snowflake_account_role.example '""' ``` diff --git a/docs/resources/api_authentication_integration_with_authorization_code_grant.md b/docs/resources/api_authentication_integration_with_authorization_code_grant.md index 7ff4240139..683d691d23 100644 --- a/docs/resources/api_authentication_integration_with_authorization_code_grant.md +++ b/docs/resources/api_authentication_integration_with_authorization_code_grant.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. 
In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # snowflake_api_authentication_integration_with_authorization_code_grant (Resource) Resource used to manage api authentication security integration objects with authorization code grant. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-api-auth). @@ -45,9 +47,9 @@ resource "snowflake_api_authentication_integration_with_authorization_code_grant ### Required - `enabled` (Boolean) Specifies whether this security integration is enabled or disabled. -- `name` (String) Specifies the identifier (i.e. name) for the integration. This value must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Specifies the identifier (i.e. name) for the integration. This value must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `oauth_client_id` (String) Specifies the client ID for the OAuth application in the external service. -- `oauth_client_secret` (String) Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. The connector uses this to request an access token from the ServiceNow instance. +- `oauth_client_secret` (String) Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. 
The connector uses this to request an access token from the ServiceNow instance. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". ### Optional @@ -234,5 +236,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_api_authentication_integration_with_authorization_code_grant.example "name" +terraform import snowflake_api_authentication_integration_with_authorization_code_grant.example '""' ``` diff --git a/docs/resources/api_authentication_integration_with_client_credentials.md b/docs/resources/api_authentication_integration_with_client_credentials.md index 098bdf6ce8..539e6b51cb 100644 --- a/docs/resources/api_authentication_integration_with_client_credentials.md +++ b/docs/resources/api_authentication_integration_with_client_credentials.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # snowflake_api_authentication_integration_with_client_credentials (Resource) Resource used to manage api authentication security integration objects with client credentials. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-api-auth). 
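The resource introduced by the hunk above mirrors the authorization-code-grant variant; a minimal configuration could look like the following sketch. All values here are illustrative placeholders (the integration name, client ID, and variable name are assumptions, not taken from this diff):

```terraform
# Minimal sketch of the client-credentials security integration.
# Every value below is a placeholder; adapt it to your OAuth provider.
resource "snowflake_api_authentication_integration_with_client_credentials" "example" {
  enabled         = true
  name            = "EXAMPLE_CLIENT_CREDENTIALS_INTEGRATION"
  oauth_client_id = "example-client-id"

  # Keep the secret out of the configuration file; note that external
  # changes to this field are not detected by the provider.
  oauth_client_secret = var.oauth_client_secret
}
```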
@@ -43,9 +45,9 @@ resource "snowflake_api_authentication_integration_with_client_credentials" "tes ### Required - `enabled` (Boolean) Specifies whether this security integration is enabled or disabled. -- `name` (String) Specifies the identifier (i.e. name) for the integration. This value must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Specifies the identifier (i.e. name) for the integration. This value must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `oauth_client_id` (String) Specifies the client ID for the OAuth application in the external service. -- `oauth_client_secret` (String) Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. The connector uses this to request an access token from the ServiceNow instance. +- `oauth_client_secret` (String) Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. The connector uses this to request an access token from the ServiceNow instance. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". 
### Optional @@ -231,5 +233,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_api_authentication_integration_with_client_credentials.example "name" +terraform import snowflake_api_authentication_integration_with_client_credentials.example '""' ``` diff --git a/docs/resources/api_authentication_integration_with_jwt_bearer.md b/docs/resources/api_authentication_integration_with_jwt_bearer.md index c4cdee9bdf..623a15d70e 100644 --- a/docs/resources/api_authentication_integration_with_jwt_bearer.md +++ b/docs/resources/api_authentication_integration_with_jwt_bearer.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # snowflake_api_authentication_integration_with_jwt_bearer (Resource) Resource used to manage api authentication security integration objects with jwt bearer. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-api-auth). @@ -46,10 +48,10 @@ resource "snowflake_api_authentication_integration_with_jwt_bearer" "test" { ### Required - `enabled` (Boolean) Specifies whether this security integration is enabled or disabled. -- `name` (String) Specifies the identifier (i.e. name) for the integration. 
This value must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Specifies the identifier (i.e. name) for the integration. This value must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `oauth_assertion_issuer` (String) - `oauth_client_id` (String) Specifies the client ID for the OAuth application in the external service. -- `oauth_client_secret` (String) Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. The connector uses this to request an access token from the ServiceNow instance. +- `oauth_client_secret` (String) Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. The connector uses this to request an access token from the ServiceNow instance. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". 
### Optional @@ -235,5 +237,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_api_authentication_integration_with_jwt_bearer.example "name" +terraform import snowflake_api_authentication_integration_with_jwt_bearer.example '""' ``` diff --git a/docs/resources/authentication_policy.md b/docs/resources/authentication_policy.md index 926acdd4fb..bd78a8eef8 100644 --- a/docs/resources/authentication_policy.md +++ b/docs/resources/authentication_policy.md @@ -5,8 +5,7 @@ description: |- Resource used to manage authentication policy objects. For more information, check authentication policy documentation https://docs.snowflake.com/en/sql-reference/sql/create-authentication-policy. --- -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-authentication-policy#usage-notes), an authentication policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-authentication-policy#usage-notes), an authentication policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. 
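The note above asks you to unassign an authentication policy before dropping it. One way to make that ordering automatic is to manage the assignment as its own resource, as in this sketch (the attachment resource and its arguments are assumptions based on the linked guide, not taken from this diff):

```terraform
resource "snowflake_authentication_policy" "example" {
  database = "EXAMPLE_DB"
  schema   = "EXAMPLE_SCHEMA"
  name     = "EXAMPLE_POLICY"
}

# Managing the assignment separately means Terraform destroys the
# attachment before the policy itself, so the DROP can succeed.
resource "snowflake_account_authentication_policy_attachment" "example" {
  authentication_policy = snowflake_authentication_policy.example.fully_qualified_name
}
```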
# snowflake_authentication_policy (Resource) @@ -44,9 +43,9 @@ resource "snowflake_authentication_policy" "complete" { ### Required -- `database` (String) The database in which to create the authentication policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the authentication policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the authentication policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the authentication policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the authentication policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. 
+- `schema` (String) The schema in which to create the authentication policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional diff --git a/docs/resources/database.md b/docs/resources/database.md index 998d9ca232..e5d16f93e4 100644 --- a/docs/resources/database.md +++ b/docs/resources/database.md @@ -7,6 +7,11 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on database type. In this case, remove the database of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + +!> **Note** A database cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098507 (2BP01): Cannot drop database DATABASE as it includes network rule - policy associations. +`. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # snowflake_database (Resource) Represents a standard database. 
If replication configuration is specified, the database is promoted to serve as a primary database for replication. @@ -26,10 +31,9 @@ resource "snowflake_database" "primary" { comment = "my standard database" data_retention_time_in_days = 10 - data_retention_time_in_days_save = 10 max_data_extension_time_in_days = 20 - external_volume = "" - catalog = "" + external_volume = snowflake_external_volume.example.fully_qualified_name + catalog = snowflake_catalog.example.fully_qualified_name replace_invalid_characters = false default_ddl_collation = "en_US" storage_serialization_policy = "COMPATIBLE" @@ -56,11 +60,11 @@ resource "snowflake_database" "primary" { locals { replication_configs = [ { - account_identifier = "." + account_identifier = "\"\".\"\"" with_failover = true }, { - account_identifier = "." + account_identifier = "\"\".\"\"" with_failover = true }, ] @@ -68,10 +72,13 @@ locals { resource "snowflake_database" "primary" { name = "database_name" - for_each = local.replication_configs + for_each = { for rc in local.replication_configs : rc.account_identifier => rc } replication { - enable_to_account = each.value + enable_to_account { + account_identifier = each.value.account_identifier + with_failover = each.value.with_failover + } ignore_edition_check = true } } @@ -84,7 +91,7 @@ resource "snowflake_database" "primary" { ### Required -- `name` (String) Specifies the identifier for the database; must be unique for your account. As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '..') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Specifies the identifier for the database; must be unique for your account. As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '..') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -130,7 +137,7 @@ Optional: Required: -- `account_identifier` (String) Specifies account identifier for which replication should be enabled. The account identifiers should be in the form of `"<organization_name>"."<account_name>"`. +- `account_identifier` (String) Specifies account identifier for which replication should be enabled. The account identifiers should be in the form of `"<organization_name>"."<account_name>"`. For more information about this resource, see [docs](./account).
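Given the account identifier format described above, a replication block can be written as in this sketch. `MYORG` and `MYACCOUNT` are placeholder organization and account names; the real values can be obtained with `SELECT CURRENT_ORGANIZATION_NAME(), CURRENT_ACCOUNT_NAME();` on the secondary account:

```terraform
resource "snowflake_database" "primary" {
  name = "database_name"

  replication {
    enable_to_account {
      # The identifier must keep its inner double quotes, hence the escaping.
      account_identifier = "\"MYORG\".\"MYACCOUNT\""
      with_failover      = true
    }
    ignore_edition_check = true
  }
}
```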
Optional: @@ -141,5 +148,5 @@ Optional: Import is supported using the following syntax: ```shell -terraform import snowflake_database.example 'database_name' +terraform import snowflake_database.example '"<database_name>"' ``` diff --git a/docs/resources/database_role.md b/docs/resources/database_role.md index f0097ccb89..706ceba760 100644 --- a/docs/resources/database_role.md +++ b/docs/resources/database_role.md @@ -2,14 +2,14 @@ page_title: "snowflake_database_role Resource - terraform-provider-snowflake" subcategory: "" description: |- - + Resource used to manage database roles. For more information, check database roles documentation https://docs.snowflake.com/en/sql-reference/sql/create-database-role. --- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. # snowflake_database_role (Resource) - +Resource used to manage database roles. For more information, check [database roles documentation](https://docs.snowflake.com/en/sql-reference/sql/create-database-role). ## Example Usage @@ -32,8 +32,8 @@ resource "snowflake_database_role" "test_database_role" { ### Required -- `database` (String) The database in which to create the database role. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the database role.
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the database role. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the database role. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional diff --git a/docs/resources/external_oauth_integration.md b/docs/resources/external_oauth_integration.md index 2d6473cfd9..37550af92e 100644 --- a/docs/resources/external_oauth_integration.md +++ b/docs/resources/external_oauth_integration.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. 
In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # snowflake_external_oauth_integration (Resource) Resource used to manage external oauth security integration objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-oauth-external). @@ -27,7 +29,7 @@ resource "snowflake_external_oauth_integration" "test" { resource "snowflake_external_oauth_integration" "test" { comment = "comment" enabled = true - external_oauth_allowed_roles_list = ["user1"] + external_oauth_allowed_roles_list = [snowflake_role.one.fully_qualified_name] external_oauth_any_role_mode = "ENABLE" external_oauth_audience_list = ["https://example.com"] external_oauth_issuer = "issuer" @@ -45,7 +47,7 @@ resource "snowflake_external_oauth_integration" "test" { enabled = true external_oauth_any_role_mode = "ENABLE" external_oauth_audience_list = ["https://example.com"] - external_oauth_blocked_roles_list = ["user1"] + external_oauth_blocked_roles_list = [snowflake_role.one.fully_qualified_name] external_oauth_issuer = "issuer" external_oauth_rsa_public_key = file("key.pem") external_oauth_rsa_public_key_2 = file("key2.pem") @@ -70,15 +72,15 @@ resource "snowflake_external_oauth_integration" "test" { - `external_oauth_snowflake_user_mapping_attribute` (String) Indicates which Snowflake user record attribute should be used to map the access token to a Snowflake user record. Valid values are (case-insensitive): `LOGIN_NAME` | `EMAIL_ADDRESS`. - `external_oauth_token_user_mapping_claim` (Set of String) Specifies the access token claim or claims that can be used to map the access token to a Snowflake user record. If removed from the config, the resource is recreated. 
- `external_oauth_type` (String) Specifies the OAuth 2.0 authorization server to be Okta, Microsoft Azure AD, Ping Identity PingFederate, or a Custom OAuth 2.0 authorization server. Valid values are (case-insensitive): `OKTA` | `AZURE` | `PING_FEDERATE` | `CUSTOM`. -- `name` (String) Specifies the name of the External Oath integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Specifies the name of the External OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional - `comment` (String) Specifies a comment for the OAuth integration. -- `external_oauth_allowed_roles_list` (Set of String) Specifies the list of roles that the client can set as the primary role. +- `external_oauth_allowed_roles_list` (Set of String) Specifies the list of roles that the client can set as the primary role. For more information about this resource, see [docs](./account_role). - `external_oauth_any_role_mode` (String) Specifies whether the OAuth client or user can use a role that is not defined in the OAuth access token. Valid values are (case-insensitive): `DISABLE` | `ENABLE` | `ENABLE_FOR_PRIVILEGE`.
- `external_oauth_audience_list` (Set of String) Specifies additional values that can be used for the access token's audience validation on top of using the Customer's Snowflake Account URL -- `external_oauth_blocked_roles_list` (Set of String) Specifies the list of roles that a client cannot set as the primary role. By default, this list includes the ACCOUNTADMIN, ORGADMIN and SECURITYADMIN roles. To remove these privileged roles from the list, use the ALTER ACCOUNT command to set the EXTERNAL_OAUTH_ADD_PRIVILEGED_ROLES_TO_BLOCKED_LIST account parameter to FALSE. +- `external_oauth_blocked_roles_list` (Set of String) Specifies the list of roles that a client cannot set as the primary role. By default, this list includes the ACCOUNTADMIN, ORGADMIN and SECURITYADMIN roles. To remove these privileged roles from the list, use the ALTER ACCOUNT command to set the EXTERNAL_OAUTH_ADD_PRIVILEGED_ROLES_TO_BLOCKED_LIST account parameter to FALSE. For more information about this resource, see [docs](./account_role). - `external_oauth_jws_keys_url` (Set of String) Specifies the endpoint or a list of endpoints from which to download public keys or certificates to validate an External OAuth access token. The maximum number of URLs that can be specified in the list is 3. If removed from the config, the resource is recreated. - `external_oauth_rsa_public_key` (String) Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. If removed from the config, the resource is recreated. - `external_oauth_rsa_public_key_2` (String) Specifies a second RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. Used for key rotation. If removed from the config, the resource is recreated. 
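Since `external_oauth_rsa_public_key_2` above exists specifically for key rotation, a rotation-friendly configuration can be sketched as follows. The name, issuer, claim, and key file paths are placeholders, not taken from this diff:

```terraform
resource "snowflake_external_oauth_integration" "example" {
  name                                            = "EXAMPLE_EXTERNAL_OAUTH"
  enabled                                         = true
  external_oauth_type                             = "CUSTOM"
  external_oauth_issuer                           = "example-issuer"
  external_oauth_snowflake_user_mapping_attribute = "LOGIN_NAME"
  external_oauth_token_user_mapping_claim         = ["upn"]

  # Current key plus the incoming key. Once tokens signed with the old
  # key have expired, promote key2.pem to the first slot and remove the
  # second; remember that removing a key field recreates the resource.
  external_oauth_rsa_public_key   = file("key.pem")
  external_oauth_rsa_public_key_2 = file("key2.pem")
}
```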
@@ -293,5 +295,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_external_oauth_integration.example "name" +terraform import snowflake_external_oauth_integration.example '""' ``` diff --git a/docs/resources/external_volume.md b/docs/resources/external_volume.md index 249d2a60f4..943f610e93 100644 --- a/docs/resources/external_volume.md +++ b/docs/resources/external_volume.md @@ -16,7 +16,7 @@ Resource used to manage external volume objects. For more information, check [ex ### Required -- `name` (String) Identifier for the external volume; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Identifier for the external volume; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `storage_location` (Block List, Min: 1) List of named cloud storage locations in different regions and, optionally, cloud platforms. Minimum 1 required. The order of the list is important as it impacts the active storage location, and updates will be triggered if it changes. Note that not all parameter combinations are valid as they depend on the given storage_provider. Consult [the docs](https://docs.snowflake.com/en/sql-reference/sql/create-external-volume#cloud-provider-parameters-cloudproviderparams) for more details on this. (see [below for nested schema](#nestedblock--storage_location)) ### Optional @@ -37,7 +37,7 @@ Resource used to manage external volume objects. 
For more information, check [ex Required: - `storage_base_url` (String) Specifies the base URL for your cloud storage location. -- `storage_location_name` (String) Name of the storage location. Must be unique for the external volume. Do not use the name `terraform_provider_sentinel_storage_location` - this is reserved for the provider for performing update operations. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `storage_location_name` (String) Name of the storage location. Must be unique for the external volume. Do not use the name `terraform_provider_sentinel_storage_location` - this is reserved for the provider for performing update operations. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `storage_provider` (String) Specifies the cloud storage provider that stores your data files. Valid values are (case-insensitive): `GCS` | `AZURE` | `S3` | `S3GOV`. Optional: diff --git a/docs/resources/function_java.md b/docs/resources/function_java.md new file mode 100644 index 0000000000..eb8062232b --- /dev/null +++ b/docs/resources/function_java.md @@ -0,0 +1,105 @@ +--- +page_title: "snowflake_function_java Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + Resource used to manage java function objects. For more information, check function documentation https://docs.snowflake.com/en/sql-reference/sql/create-function. +--- + +# snowflake_function_java (Resource) + +Resource used to manage java function objects. 
For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function). + + + + +## Schema + +### Required + +- `database` (String) The database in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be Java source code. For more information, see [Introduction to Java UDFs](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-introduction). +- `handler` (String) The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`. If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class. +- `name` (String) The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `return_type` (String) Specifies the results returned by the UDF, which determines the UDF type. 
Use `` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s). For the details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). +- `schema` (String) The schema in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. + +### Optional + +- `arguments` (Block List) List of the arguments for the function. Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages) for more details. (see [below for nested schema](#nestedblock--arguments)) +- `comment` (String) Specifies a comment for the function. +- `enable_console_output` (Boolean) Enable stdout/stderr fast path logging for anonymous stored procedures. This is a public parameter (similar to LOG_LEVEL). For more information, check [ENABLE_CONSOLE_OUTPUT docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-console-output). +- `external_access_integrations` (Set of String) The names of [external access integrations](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) needed in order for this function’s handler code to access external networks. An external access integration specifies [network rules](https://docs.snowflake.com/en/sql-reference/sql/create-network-rule) and [secrets](https://docs.snowflake.com/en/sql-reference/sql/create-secret) that specify external locations and credentials (if any) allowed for use by handler code when making requests of an external network, such as an external REST API.
+- `imports` (Set of String) The location (stage), path, and name of the file(s) to import. A file can be a JAR file or another type of file. If the file is a JAR file, it can contain one or more .class files and zero or more resource files. JNI (Java Native Interface) is not supported. Snowflake prohibits loading libraries that contain native code (as opposed to Java bytecode). Java UDFs can also read non-JAR files. For an example, see [Reading a file specified statically in IMPORTS](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-cookbook.html#label-reading-file-from-java-udf-imports). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#java). +- `is_secure` (String) Specifies that the function is secure. By design, Snowflake's `SHOW FUNCTIONS` command does not provide information about secure functions (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)), which is essential to manage/import functions with Terraform. Use the role owning the function while managing secure functions. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. +- `log_level` (String) LOG_LEVEL to use when filtering events. For more information, check [LOG_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#log-level). +- `metric_level` (String) METRIC_LEVEL value to control whether to emit metrics to Event Table. For more information, check [METRIC_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#metric-level). +- `null_input_behavior` (String) Specifies the behavior of the function when called with null inputs. Valid values are (case-insensitive): `CALLED ON NULL INPUT` | `RETURNS NULL ON NULL INPUT`.
+- `packages` (Set of String) The name and version number of Snowflake system packages required as dependencies. The value should be of the form `package_name:version_number`, where `package_name` is `snowflake_domain:package`. +- `return_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`. +- `runtime_version` (String) Specifies the Java JDK runtime version to use. The supported versions of Java are 11.x and 17.x. If RUNTIME_VERSION is not set, Java JDK 11 is used. +- `secrets` (Block Set) Assigns the names of secrets to variables so that you can use the variables to reference the secrets when retrieving information from secrets in handler code. Secrets you specify here must be allowed by the [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) specified as a value of this CREATE FUNCTION command’s EXTERNAL_ACCESS_INTEGRATIONS parameter. (see [below for nested schema](#nestedblock--secrets)) +- `target_path` (String) The location to which Snowflake should write the compiled code (JAR file) after compiling the source code specified in `function_definition`. +- `trace_level` (String) Trace level value to use when generating/filtering trace events. For more information, check [TRACE_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#trace-level). + +### Read-Only + +- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). +- `function_language` (String) Specifies the language of the function. Used to detect external changes. +- `id` (String) The ID of this resource.
+- `parameters` (List of Object) Outputs the result of `SHOW PARAMETERS IN FUNCTION` for the given function. (see [below for nested schema](#nestedatt--parameters)) +- `show_output` (List of Object) Outputs the result of `SHOW FUNCTION` for the given function. (see [below for nested schema](#nestedatt--show_output)) + + +### Nested Schema for `arguments` + +Required: + +- `arg_data_type` (String) The argument type. +- `arg_name` (String) The argument name. + + + +### Nested Schema for `secrets` + +Required: + +- `secret_id` (String) Fully qualified name of the allowed secret. You will receive an error if you specify a SECRETS value whose secret isn’t also included in an integration specified by the EXTERNAL_ACCESS_INTEGRATIONS parameter. +- `secret_variable_name` (String) The variable that will be used in handler code when retrieving information from the secret. + + + +### Nested Schema for `parameters` + +Read-Only: + +- `enable_console_output` (Boolean) +- `log_level` (String) +- `metric_level` (String) +- `trace_level` (String) + + + +### Nested Schema for `show_output` + +Read-Only: + +- `arguments_raw` (String) +- `catalog_name` (String) +- `created_on` (String) +- `description` (String) +- `external_access_integrations` (String) +- `is_aggregate` (Boolean) +- `is_ansi` (Boolean) +- `is_builtin` (Boolean) +- `is_data_metric` (Boolean) +- `is_external_function` (Boolean) +- `is_memoizable` (Boolean) +- `is_secure` (Boolean) +- `is_table_function` (Boolean) +- `language` (String) +- `max_num_arguments` (Number) +- `min_num_arguments` (Number) +- `name` (String) +- `schema_name` (String) +- `secrets` (String) +- `valid_for_clustering` (Boolean) diff --git a/docs/resources/function_javascript.md b/docs/resources/function_javascript.md new file mode 100644 index 0000000000..7925c74c59 --- /dev/null +++ b/docs/resources/function_javascript.md @@ -0,0 +1,89 @@ +--- +page_title: "snowflake_function_javascript Resource - terraform-provider-snowflake" +subcategory: "" 
+description: |- + Resource used to manage javascript function objects. For more information, check function documentation https://docs.snowflake.com/en/sql-reference/sql/create-function. +--- + +# snowflake_function_javascript (Resource) + +Resource used to manage javascript function objects. For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function). + + + + +## Schema + +### Required + +- `database` (String) The database in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be JavaScript source code. For more information, see [Introduction to JavaScript UDFs](https://docs.snowflake.com/en/developer-guide/udf/javascript/udf-javascript-introduction). +- `name` (String) The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `return_type` (String) Specifies the results returned by the UDF, which determines the UDF type. 
Use `` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s). For the details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). +- `schema` (String) The schema in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. + +### Optional + +- `arguments` (Block List) List of the arguments for the function. Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages) for more details. (see [below for nested schema](#nestedblock--arguments)) +- `comment` (String) Specifies a comment for the function. +- `enable_console_output` (Boolean) Enable stdout/stderr fast path logging for anonymous stored procedures. This is a public parameter (similar to LOG_LEVEL). For more information, check [ENABLE_CONSOLE_OUTPUT docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-console-output). +- `is_secure` (String) Specifies that the function is secure. By design, Snowflake's `SHOW FUNCTIONS` command does not provide information about secure functions (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)), which is essential to manage/import functions with Terraform. Use the role owning the function while managing secure functions. Available options are: "true" or "false".
When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. +- `log_level` (String) LOG_LEVEL to use when filtering events. For more information, check [LOG_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#log-level). +- `metric_level` (String) METRIC_LEVEL value to control whether to emit metrics to Event Table. For more information, check [METRIC_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#metric-level). +- `null_input_behavior` (String) Specifies the behavior of the function when called with null inputs. Valid values are (case-insensitive): `CALLED ON NULL INPUT` | `RETURNS NULL ON NULL INPUT`. +- `return_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`. +- `trace_level` (String) Trace level value to use when generating/filtering trace events. For more information, check [TRACE_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#trace-level). + +### Read-Only + +- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). +- `function_language` (String) Specifies the language of the function. Used to detect external changes. +- `id` (String) The ID of this resource. +- `parameters` (List of Object) Outputs the result of `SHOW PARAMETERS IN FUNCTION` for the given function. (see [below for nested schema](#nestedatt--parameters)) +- `show_output` (List of Object) Outputs the result of `SHOW FUNCTION` for the given function. (see [below for nested schema](#nestedatt--show_output)) + + +### Nested Schema for `arguments` + +Required: + +- `arg_data_type` (String) The argument type. +- `arg_name` (String) The argument name.
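The `arguments` blocks map positionally onto the UDF signature. A minimal sketch of an in-line JavaScript function using one argument (database, schema, and object names below are illustrative, not defaults):

```
resource "snowflake_function_javascript" "echo" {
  # Illustrative identifiers - replace with your own.
  database = "MY_DB"
  schema   = "MY_SCHEMA"
  name     = "ECHO_VARCHAR"

  arguments {
    arg_name      = "message"
    arg_data_type = "VARCHAR"
  }

  return_type = "VARCHAR"

  # JavaScript UDF handler code refers to argument names in uppercase.
  function_definition = <<-EOT
    return MESSAGE;
  EOT
}
```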
+ + + +### Nested Schema for `parameters` + +Read-Only: + +- `enable_console_output` (Boolean) +- `log_level` (String) +- `metric_level` (String) +- `trace_level` (String) + + + +### Nested Schema for `show_output` + +Read-Only: + +- `arguments_raw` (String) +- `catalog_name` (String) +- `created_on` (String) +- `description` (String) +- `external_access_integrations` (String) +- `is_aggregate` (Boolean) +- `is_ansi` (Boolean) +- `is_builtin` (Boolean) +- `is_data_metric` (Boolean) +- `is_external_function` (Boolean) +- `is_memoizable` (Boolean) +- `is_secure` (Boolean) +- `is_table_function` (Boolean) +- `language` (String) +- `max_num_arguments` (Number) +- `min_num_arguments` (Number) +- `name` (String) +- `schema_name` (String) +- `secrets` (String) +- `valid_for_clustering` (Boolean) diff --git a/docs/resources/function_python.md b/docs/resources/function_python.md new file mode 100644 index 0000000000..814e7a38de --- /dev/null +++ b/docs/resources/function_python.md @@ -0,0 +1,105 @@ +--- +page_title: "snowflake_function_python Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + Resource used to manage python function objects. For more information, check function documentation https://docs.snowflake.com/en/sql-reference/sql/create-function. +--- + +# snowflake_function_python (Resource) + +Resource used to manage python function objects. For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function). + + + + +## Schema + +### Required + +- `database` (String) The database in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. 
+- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be Python source code. For more information, see [Introduction to Python UDFs](https://docs.snowflake.com/en/developer-guide/udf/python/udf-python-introduction). +- `handler` (String) The name of the handler function or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a function name. If the handler code is in-line with the CREATE FUNCTION statement, you can use the function name alone. When the handler code is referenced at a stage, this value should be qualified with the module name, as in the following form: `my_module.my_function`. If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class. +- `name` (String) The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `return_type` (String) Specifies the results returned by the UDF, which determines the UDF type. Use `` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s). For the details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages).
+- `runtime_version` (String) Specifies the Python version to use. The supported versions of Python are: 3.9, 3.10, and 3.11. +- `schema` (String) The schema in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. + +### Optional + +- `arguments` (Block List) List of the arguments for the function. Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages) for more details. (see [below for nested schema](#nestedblock--arguments)) +- `comment` (String) Specifies a comment for the function. +- `enable_console_output` (Boolean) Enable stdout/stderr fast path logging for anonymous stored procedures. This is a public parameter (similar to LOG_LEVEL). For more information, check [ENABLE_CONSOLE_OUTPUT docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-console-output). +- `external_access_integrations` (Set of String) The names of [external access integrations](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) needed in order for this function’s handler code to access external networks. An external access integration specifies [network rules](https://docs.snowflake.com/en/sql-reference/sql/create-network-rule) and [secrets](https://docs.snowflake.com/en/sql-reference/sql/create-secret) that specify external locations and credentials (if any) allowed for use by handler code when making requests of an external network, such as an external REST API. +- `imports` (Set of String) The location (stage), path, and name of the file(s) to import. A file can be a `.py` file or another type of file. Python UDFs can also read non-Python files, such as text files.
For an example, see [Reading a file](https://docs.snowflake.com/en/developer-guide/udf/python/udf-python-examples.html#label-udf-python-read-files). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#python). +- `is_aggregate` (String) Specifies that the function is an aggregate function. For more information about user-defined aggregate functions, see [Python user-defined aggregate functions](https://docs.snowflake.com/en/developer-guide/udf/python/udf-python-aggregate-functions). Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. +- `is_secure` (String) Specifies that the function is secure. By design, Snowflake's `SHOW FUNCTIONS` command does not provide information about secure functions (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)), which is essential to manage/import functions with Terraform. Use the role owning the function while managing secure functions. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. +- `log_level` (String) LOG_LEVEL to use when filtering events. For more information, check [LOG_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#log-level). +- `metric_level` (String) METRIC_LEVEL value to control whether to emit metrics to Event Table. For more information, check [METRIC_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#metric-level). +- `null_input_behavior` (String) Specifies the behavior of the function when called with null inputs.
Valid values are (case-insensitive): `CALLED ON NULL INPUT` | `RETURNS NULL ON NULL INPUT`. +- `packages` (Set of String) The name and version number of packages required as dependencies. The value should be of the form `package_name==version_number`. +- `return_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`. +- `secrets` (Block Set) Assigns the names of secrets to variables so that you can use the variables to reference the secrets when retrieving information from secrets in handler code. Secrets you specify here must be allowed by the [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) specified as a value of this CREATE FUNCTION command’s EXTERNAL_ACCESS_INTEGRATIONS parameter. (see [below for nested schema](#nestedblock--secrets)) +- `trace_level` (String) Trace level value to use when generating/filtering trace events. For more information, check [TRACE_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#trace-level). + +### Read-Only + +- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). +- `function_language` (String) Specifies the language of the function. Used to detect external changes. +- `id` (String) The ID of this resource. +- `parameters` (List of Object) Outputs the result of `SHOW PARAMETERS IN FUNCTION` for the given function. (see [below for nested schema](#nestedatt--parameters)) +- `show_output` (List of Object) Outputs the result of `SHOW FUNCTION` for the given function. (see [below for nested schema](#nestedatt--show_output)) + + +### Nested Schema for `arguments` + +Required: + +- `arg_data_type` (String) The argument type. +- `arg_name` (String) The argument name.
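As a sketch, an in-line Python handler wired to a single `arguments` block (identifiers below are illustrative; `handler` must match the function name defined in `function_definition`):

```
resource "snowflake_function_python" "add_one" {
  # Illustrative identifiers - replace with your own.
  database        = "MY_DB"
  schema          = "MY_SCHEMA"
  name            = "ADD_ONE"
  runtime_version = "3.11"

  # For an in-line handler, this is the Python function name.
  handler = "add_one"

  arguments {
    arg_name      = "i"
    arg_data_type = "NUMBER"
  }

  return_type = "NUMBER"

  function_definition = <<-EOT
    def add_one(i):
        return i + 1
  EOT
}
```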
+ + + +### Nested Schema for `secrets` + +Required: + +- `secret_id` (String) Fully qualified name of the allowed secret. You will receive an error if you specify a SECRETS value whose secret isn’t also included in an integration specified by the EXTERNAL_ACCESS_INTEGRATIONS parameter. +- `secret_variable_name` (String) The variable that will be used in handler code when retrieving information from the secret. + + + +### Nested Schema for `parameters` + +Read-Only: + +- `enable_console_output` (Boolean) +- `log_level` (String) +- `metric_level` (String) +- `trace_level` (String) + + + +### Nested Schema for `show_output` + +Read-Only: + +- `arguments_raw` (String) +- `catalog_name` (String) +- `created_on` (String) +- `description` (String) +- `external_access_integrations` (String) +- `is_aggregate` (Boolean) +- `is_ansi` (Boolean) +- `is_builtin` (Boolean) +- `is_data_metric` (Boolean) +- `is_external_function` (Boolean) +- `is_memoizable` (Boolean) +- `is_secure` (Boolean) +- `is_table_function` (Boolean) +- `language` (String) +- `max_num_arguments` (Number) +- `min_num_arguments` (Number) +- `name` (String) +- `schema_name` (String) +- `secrets` (String) +- `valid_for_clustering` (Boolean) diff --git a/docs/resources/function_scala.md b/docs/resources/function_scala.md new file mode 100644 index 0000000000..5fffff6762 --- /dev/null +++ b/docs/resources/function_scala.md @@ -0,0 +1,105 @@ +--- +page_title: "snowflake_function_scala Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + Resource used to manage scala function objects. For more information, check function documentation https://docs.snowflake.com/en/sql-reference/sql/create-function. +--- + +# snowflake_function_scala (Resource) + +Resource used to manage scala function objects. For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function). 
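A minimal in-line Scala UDF sketch (identifiers below are illustrative; `handler` points at the class and method defined in `function_definition`):

```
resource "snowflake_function_scala" "echo" {
  # Illustrative identifiers - replace with your own.
  database        = "MY_DB"
  schema          = "MY_SCHEMA"
  name            = "ECHO_VARCHAR"
  runtime_version = "2.12"
  handler         = "Echo.echoVarchar"

  arguments {
    arg_name      = "x"
    arg_data_type = "VARCHAR"
  }

  return_type = "VARCHAR"

  function_definition = <<-EOT
    class Echo {
      def echoVarchar(x: String): String = x
    }
  EOT
}
```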
+ + + + +## Schema + +### Required + +- `database` (String) The database in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be Scala source code. For more information, see [Introduction to Scala UDFs](https://docs.snowflake.com/en/developer-guide/udf/scala/udf-scala-introduction). +- `handler` (String) The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`. +- `name` (String) The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `return_type` (String) Specifies the results returned by the UDF, which determines the UDF type. Use `` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s).
For the details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). +- `runtime_version` (String) Specifies the Scala runtime version to use. The supported version of Scala is 2.12. +- `schema` (String) The schema in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. + +### Optional + +- `arguments` (Block List) List of the arguments for the function. Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages) for more details. (see [below for nested schema](#nestedblock--arguments)) +- `comment` (String) Specifies a comment for the function. +- `enable_console_output` (Boolean) Enable stdout/stderr fast path logging for anonymous stored procedures. This is a public parameter (similar to LOG_LEVEL). For more information, check [ENABLE_CONSOLE_OUTPUT docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-console-output). +- `external_access_integrations` (Set of String) The names of [external access integrations](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) needed in order for this function’s handler code to access external networks. An external access integration specifies [network rules](https://docs.snowflake.com/en/sql-reference/sql/create-network-rule) and [secrets](https://docs.snowflake.com/en/sql-reference/sql/create-secret) that specify external locations and credentials (if any) allowed for use by handler code when making requests of an external network, such as an external REST API. +- `imports` (Set of String) The location (stage), path, and name of the file(s) to import, such as a JAR or other kind of file.
The JAR file might contain handler dependency libraries. It can contain one or more .class files and zero or more resource files. JNI (Java Native Interface) is not supported. Snowflake prohibits loading libraries that contain native code (as opposed to Java bytecode). A non-JAR file might be a file read by handler code. For an example, see [Reading a file specified statically in IMPORTS](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-cookbook.html#label-reading-file-from-java-udf-imports). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#scala). +- `is_secure` (String) Specifies that the function is secure. By design, Snowflake's `SHOW FUNCTIONS` command does not provide information about secure functions (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)), which is essential to manage/import functions with Terraform. Use the role owning the function while managing secure functions. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. +- `log_level` (String) LOG_LEVEL to use when filtering events. For more information, check [LOG_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#log-level). +- `metric_level` (String) METRIC_LEVEL value to control whether to emit metrics to Event Table. For more information, check [METRIC_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#metric-level). +- `null_input_behavior` (String) Specifies the behavior of the function when called with null inputs. Valid values are (case-insensitive): `CALLED ON NULL INPUT` | `RETURNS NULL ON NULL INPUT`.
+- `packages` (Set of String) The name and version number of Snowflake system packages required as dependencies. The value should be of the form `package_name:version_number`, where `package_name` is `snowflake_domain:package`. +- `return_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`. +- `secrets` (Block Set) Assigns the names of secrets to variables so that you can use the variables to reference the secrets when retrieving information from secrets in handler code. Secrets you specify here must be allowed by the [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) specified as a value of this CREATE FUNCTION command’s EXTERNAL_ACCESS_INTEGRATIONS parameter. (see [below for nested schema](#nestedblock--secrets)) +- `target_path` (String) The location to which Snowflake should write the compiled code (JAR file) after compiling the source code specified in `function_definition`. +- `trace_level` (String) Trace level value to use when generating/filtering trace events. For more information, check [TRACE_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#trace-level). + +### Read-Only + +- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). +- `function_language` (String) Specifies the language of the function. Used to detect external changes. +- `id` (String) The ID of this resource. +- `parameters` (List of Object) Outputs the result of `SHOW PARAMETERS IN FUNCTION` for the given function. (see [below for nested schema](#nestedatt--parameters)) +- `show_output` (List of Object) Outputs the result of `SHOW FUNCTION` for the given function.
(see [below for nested schema](#nestedatt--show_output)) + + +### Nested Schema for `arguments` + +Required: + +- `arg_data_type` (String) The argument type. +- `arg_name` (String) The argument name. + + + +### Nested Schema for `secrets` + +Required: + +- `secret_id` (String) Fully qualified name of the allowed secret. You will receive an error if you specify a SECRETS value whose secret isn’t also included in an integration specified by the EXTERNAL_ACCESS_INTEGRATIONS parameter. +- `secret_variable_name` (String) The variable that will be used in handler code when retrieving information from the secret. + + + +### Nested Schema for `parameters` + +Read-Only: + +- `enable_console_output` (Boolean) +- `log_level` (String) +- `metric_level` (String) +- `trace_level` (String) + + + +### Nested Schema for `show_output` + +Read-Only: + +- `arguments_raw` (String) +- `catalog_name` (String) +- `created_on` (String) +- `description` (String) +- `external_access_integrations` (String) +- `is_aggregate` (Boolean) +- `is_ansi` (Boolean) +- `is_builtin` (Boolean) +- `is_data_metric` (Boolean) +- `is_external_function` (Boolean) +- `is_memoizable` (Boolean) +- `is_secure` (Boolean) +- `is_table_function` (Boolean) +- `language` (String) +- `max_num_arguments` (Number) +- `min_num_arguments` (Number) +- `name` (String) +- `schema_name` (String) +- `secrets` (String) +- `valid_for_clustering` (Boolean) diff --git a/docs/resources/function_sql.md b/docs/resources/function_sql.md new file mode 100644 index 0000000000..e37e57514d --- /dev/null +++ b/docs/resources/function_sql.md @@ -0,0 +1,89 @@ +--- +page_title: "snowflake_function_sql Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + Resource used to manage sql function objects. For more information, check function documentation https://docs.snowflake.com/en/sql-reference/sql/create-function. +--- + +# snowflake_function_sql (Resource) + +Resource used to manage sql function objects. 
For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function). + + + + +## Schema + +### Required + +- `database` (String) The database in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `function_definition` (String) Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be SQL source code. For more information, see [Introduction to SQL UDFs](https://docs.snowflake.com/en/developer-guide/udf/sql/udf-sql-introduction). +- `name` (String) The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `return_type` (String) Specifies the results returned by the UDF, which determines the UDF type. Use `<result_data_type>` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s). For details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages). 
+- `schema` (String) The schema in which to create the function. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. + +### Optional + +- `arguments` (Block List) List of the arguments for the function. Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages) for more details. (see [below for nested schema](#nestedblock--arguments)) +- `comment` (String) Specifies a comment for the function. +- `enable_console_output` (Boolean) Enable stdout/stderr fast path logging for anonymous stored procedures. This is a public parameter (similar to LOG_LEVEL). For more information, check [ENABLE_CONSOLE_OUTPUT docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-console-output). +- `is_secure` (String) Specifies that the function is secure. By design, Snowflake's `SHOW FUNCTIONS` command does not provide information about secure functions (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)), which is essential to manage/import functions with Terraform. Use the role owning the function while managing secure functions. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. +- `log_level` (String) LOG_LEVEL to use when filtering events. For more information, check [LOG_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#log-level). 
+- `metric_level` (String) METRIC_LEVEL value to control whether to emit metrics to the Event Table. For more information, check [METRIC_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#metric-level). +- `null_input_behavior` (String) Specifies the behavior of the function when called with null inputs. Valid values are (case-insensitive): `CALLED ON NULL INPUT` | `RETURNS NULL ON NULL INPUT`. +- `return_behavior` (String) Specifies the behavior of the function when returning results. Valid values are (case-insensitive): `VOLATILE` | `IMMUTABLE`. +- `trace_level` (String) Trace level value to use when generating/filtering trace events. For more information, check [TRACE_LEVEL docs](https://docs.snowflake.com/en/sql-reference/parameters#trace-level). + +### Read-Only + +- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution). +- `function_language` (String) Specifies the language of the function. Used to detect external changes. +- `id` (String) The ID of this resource. +- `parameters` (List of Object) Outputs the result of `SHOW PARAMETERS IN FUNCTION` for the given function. (see [below for nested schema](#nestedatt--parameters)) +- `show_output` (List of Object) Outputs the result of `SHOW FUNCTION` for the given function. (see [below for nested schema](#nestedatt--show_output)) + + +### Nested Schema for `arguments` + +Required: + +- `arg_data_type` (String) The argument type. +- `arg_name` (String) The argument name. 
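+The required fields and the `arguments` block described above can be combined into a minimal configuration sketch. All identifiers below (`MY_DB`, `PUBLIC`, `CIRCLE_AREA`) are illustrative placeholders, not values from this provider's examples:

```terraform
# Hypothetical minimal snowflake_function_sql configuration; identifiers are placeholders.
resource "snowflake_function_sql" "circle_area" {
  database = "MY_DB"  # assumes this database already exists
  schema   = "PUBLIC" # assumes this schema already exists
  name     = "CIRCLE_AREA"

  arguments {
    arg_name      = "radius"
    arg_data_type = "FLOAT"
  }

  return_type = "FLOAT"

  # Wrapping $$ signs are added by the provider automatically; do not include them.
  function_definition = <<-EOT
    pi() * radius * radius
  EOT
}
```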
+ + + +### Nested Schema for `parameters` + +Read-Only: + +- `enable_console_output` (Boolean) +- `log_level` (String) +- `metric_level` (String) +- `trace_level` (String) + + + +### Nested Schema for `show_output` + +Read-Only: + +- `arguments_raw` (String) +- `catalog_name` (String) +- `created_on` (String) +- `description` (String) +- `external_access_integrations` (String) +- `is_aggregate` (Boolean) +- `is_ansi` (Boolean) +- `is_builtin` (Boolean) +- `is_data_metric` (Boolean) +- `is_external_function` (Boolean) +- `is_memoizable` (Boolean) +- `is_secure` (Boolean) +- `is_table_function` (Boolean) +- `language` (String) +- `max_num_arguments` (Number) +- `min_num_arguments` (Number) +- `name` (String) +- `schema_name` (String) +- `secrets` (String) +- `valid_for_clustering` (Boolean) diff --git a/docs/resources/grant_account_role.md b/docs/resources/grant_account_role.md index b01aad31af..c98a6b08f7 100644 --- a/docs/resources/grant_account_role.md +++ b/docs/resources/grant_account_role.md @@ -19,11 +19,11 @@ description: |- ################################## resource "snowflake_account_role" "role" { - name = var.role_name + name = "ROLE" } resource "snowflake_account_role" "parent_role" { - name = var.parent_role_name + name = "PARENT_ROLE" } resource "snowflake_grant_account_role" "g" { @@ -37,11 +37,11 @@ resource "snowflake_grant_account_role" "g" { ################################## resource "snowflake_account_role" "role" { - name = var.role_name + name = "ROLE" } resource "snowflake_user" "user" { - name = var.user_name + name = "USER" } resource "snowflake_grant_account_role" "g" { @@ -58,12 +58,12 @@ resource "snowflake_grant_account_role" "g" { ### Required -- `role_name` (String) The fully qualified name of the role which will be granted to the user or parent role. +- `role_name` (String) The fully qualified name of the role which will be granted to the user or parent role. For more information about this resource, see [docs](./account_role). 
### Optional -- `parent_role_name` (String) The fully qualified name of the parent role which will create a parent-child relationship between the roles. -- `user_name` (String) The fully qualified name of the user on which specified role will be granted. +- `parent_role_name` (String) The fully qualified name of the parent role which will create a parent-child relationship between the roles. For more information about this resource, see [docs](./account_role). +- `user_name` (String) The fully qualified name of the user on which specified role will be granted. For more information about this resource, see [docs](./user). ### Read-Only diff --git a/docs/resources/grant_application_role.md b/docs/resources/grant_application_role.md index 9578e2b830..989c27e7c2 100644 --- a/docs/resources/grant_application_role.md +++ b/docs/resources/grant_application_role.md @@ -55,7 +55,7 @@ resource "snowflake_grant_application_role" "g" { ### Optional - `application_name` (String) The fully qualified name of the application on which application role will be granted. -- `parent_account_role_name` (String) The fully qualified name of the account role on which application role will be granted. +- `parent_account_role_name` (String) The fully qualified name of the account role on which application role will be granted. For more information about this resource, see [docs](./account_role). ### Read-Only diff --git a/docs/resources/grant_database_role.md b/docs/resources/grant_database_role.md index 0071eaf5a0..9ac1dcf848 100644 --- a/docs/resources/grant_database_role.md +++ b/docs/resources/grant_database_role.md @@ -69,13 +69,13 @@ resource "snowflake_grant_database_role" "g" { ### Required -- `database_role_name` (String) The fully qualified name of the database role which will be granted to share or parent role. +- `database_role_name` (String) The fully qualified name of the database role which will be granted to share or parent role. 
For more information about this resource, see [docs](./database_role). ### Optional -- `parent_database_role_name` (String) The fully qualified name of the parent database role which will create a parent-child relationship between the roles. -- `parent_role_name` (String) The fully qualified name of the parent account role which will create a parent-child relationship between the roles. -- `share_name` (String) The fully qualified name of the share on which privileges will be granted. +- `parent_database_role_name` (String) The fully qualified name of the parent database role which will create a parent-child relationship between the roles. For more information about this resource, see [docs](./database_role). +- `parent_role_name` (String) The fully qualified name of the parent account role which will create a parent-child relationship between the roles. For more information about this resource, see [docs](./account_role). +- `share_name` (String) The fully qualified name of the share on which privileges will be granted. For more information about this resource, see [docs](./share). ### Read-Only diff --git a/docs/resources/grant_ownership.md b/docs/resources/grant_ownership.md index 03eb48c244..9536f727b4 100644 --- a/docs/resources/grant_ownership.md +++ b/docs/resources/grant_ownership.md @@ -252,8 +252,8 @@ To set the `AUTO_REFRESH` property back to `TRUE` (after you transfer ownership) ### Optional -- `account_role_name` (String) The fully qualified name of the account role to which privileges will be granted. -- `database_role_name` (String) The fully qualified name of the database role to which privileges will be granted. +- `account_role_name` (String) The fully qualified name of the account role to which privileges will be granted. For more information about this resource, see [docs](./account_role). +- `database_role_name` (String) The fully qualified name of the database role to which privileges will be granted. 
For more information about this resource, see [docs](./database_role). - `outbound_privileges` (String) Specifies whether to remove or transfer all existing outbound privileges on the object when ownership is transferred to a new role. Available options are: REVOKE for removing existing privileges and COPY to transfer them with ownership. For more information head over to [Snowflake documentation](https://docs.snowflake.com/en/sql-reference/sql/grant-ownership#optional-parameters). ### Read-Only @@ -279,8 +279,8 @@ Required: Optional: -- `in_database` (String) The fully qualified name of the database. -- `in_schema` (String) The fully qualified name of the schema. +- `in_database` (String) The fully qualified name of the database. For more information about this resource, see [docs](./database). +- `in_schema` (String) The fully qualified name of the schema. For more information about this resource, see [docs](./schema). @@ -292,8 +292,8 @@ Required: Optional: -- `in_database` (String) The fully qualified name of the database. -- `in_schema` (String) The fully qualified name of the schema. +- `in_database` (String) The fully qualified name of the database. For more information about this resource, see [docs](./database). +- `in_schema` (String) The fully qualified name of the schema. For more information about this resource, see [docs](./schema). ## Import diff --git a/docs/resources/grant_privileges_to_account_role.md b/docs/resources/grant_privileges_to_account_role.md index a314c5dd16..dfc4dee7f0 100644 --- a/docs/resources/grant_privileges_to_account_role.md +++ b/docs/resources/grant_privileges_to_account_role.md @@ -264,7 +264,7 @@ resource "snowflake_grant_privileges_to_account_role" "example" { ### Required -- `account_role_name` (String) The fully qualified name of the account role to which privileges will be granted. +- `account_role_name` (String) The fully qualified name of the account role to which privileges will be granted. 
For more information about this resource, see [docs](./account_role). ### Optional @@ -275,7 +275,7 @@ resource "snowflake_grant_privileges_to_account_role" "example" { - `on_account_object` (Block List, Max: 1) Specifies the account object on which privileges will be granted (see [below for nested schema](#nestedblock--on_account_object)) - `on_schema` (Block List, Max: 1) Specifies the schema on which privileges will be granted. (see [below for nested schema](#nestedblock--on_schema)) - `on_schema_object` (Block List, Max: 1) Specifies the schema object on which privileges will be granted. (see [below for nested schema](#nestedblock--on_schema_object)) -- `privileges` (Set of String) The privileges to grant on the account role. +- `privileges` (Set of String) The privileges to grant on the account role. This field is case-sensitive; use only upper-case privileges. - `with_grant_option` (Boolean) Specifies whether the grantee can grant the privileges to other users. ### Read-Only diff --git a/docs/resources/grant_privileges_to_database_role.md b/docs/resources/grant_privileges_to_database_role.md index 81f34a561a..9ad169db1b 100644 --- a/docs/resources/grant_privileges_to_database_role.md +++ b/docs/resources/grant_privileges_to_database_role.md @@ -182,14 +182,14 @@ resource "snowflake_grant_privileges_to_database_role" "example" { ### Required -- `database_role_name` (String) The fully qualified name of the database role to which privileges will be granted. +- `database_role_name` (String) The fully qualified name of the database role to which privileges will be granted. For more information about this resource, see [docs](./database_role). ### Optional - `all_privileges` (Boolean) Grant all privileges on the database role. - `always_apply` (Boolean) If true, the resource will always produce a “plan” and on “apply” it will re-grant defined privileges. 
It is supposed to be used only in “grant privileges on all X’s in database / schema Y” or “grant all privileges to X” scenarios to make sure that every new object in a given database / schema is granted by the account role and every new privilege is granted to the database role. Important note: this flag is not compliant with the Terraform assumptions of the config being eventually convergent (producing an empty plan). - `always_apply_trigger` (String) This is a helper field and should not be set. Its main purpose is to help to achieve the functionality described by the always_apply field. -- `on_database` (String) The fully qualified name of the database on which privileges will be granted. +- `on_database` (String) The fully qualified name of the database on which privileges will be granted. For more information about this resource, see [docs](./database). - `on_schema` (Block List, Max: 1) Specifies the schema on which privileges will be granted. (see [below for nested schema](#nestedblock--on_schema)) - `on_schema_object` (Block List, Max: 1) Specifies the schema object on which privileges will be granted. (see [below for nested schema](#nestedblock--on_schema_object)) - `privileges` (Set of String) The privileges to grant on the database role. diff --git a/docs/resources/grant_privileges_to_share.md b/docs/resources/grant_privileges_to_share.md index cbfb9e14fb..f22c2cb496 100644 --- a/docs/resources/grant_privileges_to_share.md +++ b/docs/resources/grant_privileges_to_share.md @@ -106,17 +106,17 @@ resource "snowflake_grant_privileges_to_share" "example" { ### Required - `privileges` (Set of String) The privileges to grant on the share. See available list of privileges: https://docs.snowflake.com/en/sql-reference/sql/grant-privilege-share#syntax -- `to_share` (String) The fully qualified name of the share on which privileges will be granted. +- `to_share` (String) The fully qualified name of the share on which privileges will be granted. 
For more information about this resource, see [docs](./share). ### Optional - `on_all_tables_in_schema` (String) The fully qualified identifier for the schema for which the specified privilege will be granted for all tables. -- `on_database` (String) The fully qualified name of the database on which privileges will be granted. +- `on_database` (String) The fully qualified name of the database on which privileges will be granted. For more information about this resource, see [docs](./database). - `on_function` (String) The fully qualified name of the function on which privileges will be granted. -- `on_schema` (String) The fully qualified name of the schema on which privileges will be granted. -- `on_table` (String) The fully qualified name of the table on which privileges will be granted. -- `on_tag` (String) The fully qualified name of the tag on which privileges will be granted. -- `on_view` (String) The fully qualified name of the view on which privileges will be granted. +- `on_schema` (String) The fully qualified name of the schema on which privileges will be granted. For more information about this resource, see [docs](./schema). +- `on_table` (String) The fully qualified name of the table on which privileges will be granted. For more information about this resource, see [docs](./table). +- `on_tag` (String) The fully qualified name of the tag on which privileges will be granted. For more information about this resource, see [docs](./tag). +- `on_view` (String) The fully qualified name of the view on which privileges will be granted. For more information about this resource, see [docs](./view). ### Read-Only diff --git a/docs/resources/legacy_service_user.md b/docs/resources/legacy_service_user.md index 57ef504370..35847df2a0 100644 --- a/docs/resources/legacy_service_user.md +++ b/docs/resources/legacy_service_user.md @@ -123,7 +123,7 @@ resource "snowflake_legacy_service_user" "u" { ### Required -- `name` (String) Name of the user. 
Note that if you do not supply login_name this will be used as login_name. Check the [docs](https://docs.snowflake.net/manuals/sql-reference/sql/create-user.html#required-parameters). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Name of the user. Note that if you do not supply login_name this will be used as login_name. Check the [docs](https://docs.snowflake.net/manuals/sql-reference/sql/create-user.html#required-parameters). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -144,9 +144,9 @@ resource "snowflake_legacy_service_user" "u" { - `date_output_format` (String) Specifies the display format for the DATE data type. For more information, see [Date and time input and output formats](https://docs.snowflake.com/en/sql-reference/date-time-input-output). For more information, check [DATE_OUTPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#date-output-format). - `days_to_expiry` (Number) Specifies the number of days after which the user status is set to `Expired` and the user is no longer allowed to log in. This is useful for defining temporary users (i.e. users who should only have access to Snowflake for a limited time period). In general, you should not set this property for [account administrators](https://docs.snowflake.com/en/user-guide/security-access-control-considerations.html#label-accountadmin-users) (i.e. users with the `ACCOUNTADMIN` role) because Snowflake locks them out when they become `Expired`. 
External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". - `default_namespace` (String) Specifies the namespace (database only or database and schema) that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the namespace exists. -- `default_role` (String) Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists. +- `default_role` (String) Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists. For more information about this resource, see [docs](./account_role). - `default_secondary_roles_option` (String) Specifies the secondary roles that are active for the user’s session upon login. Valid values are (case-insensitive): `DEFAULT` | `NONE` | `ALL`. More information can be found in [doc](https://docs.snowflake.com/en/sql-reference/sql/create-user#optional-object-properties-objectproperties). -- `default_warehouse` (String) Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists. +- `default_warehouse` (String) Specifies the virtual warehouse that is active by default for the user’s session upon login. 
Note that the CREATE USER operation does not verify that the warehouse exists. For more information about this resource, see [docs](./warehouse). - `disabled` (String) Specifies whether the user is disabled, which prevents logging in and aborts all the currently-running queries for the user. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. - `display_name` (String) Name displayed for the user in the Snowflake web interface. - `email` (String, Sensitive) Email address for the user. @@ -169,7 +169,7 @@ resource "snowflake_legacy_service_user" "u" { - `network_policy` (String) Specifies the network policy to enforce for your account. Network policies enable restricting access to your account based on users’ IP address. For more details, see [Controlling network traffic with network policies](https://docs.snowflake.com/en/user-guide/network-policies). Any existing network policy (created using [CREATE NETWORK POLICY](https://docs.snowflake.com/en/sql-reference/sql/create-network-policy)). For more information, check [NETWORK_POLICY docs](https://docs.snowflake.com/en/sql-reference/parameters#network-policy). - `noorder_sequence_as_default` (Boolean) Specifies whether the ORDER or NOORDER property is set by default when you create a new sequence or add a new table column. The ORDER and NOORDER properties determine whether or not the values are generated for the sequence or auto-incremented column in [increasing or decreasing order](https://docs.snowflake.com/en/user-guide/querying-sequences.html#label-querying-sequences-increasing-values). For more information, check [NOORDER_SEQUENCE_AS_DEFAULT docs](https://docs.snowflake.com/en/sql-reference/parameters#noorder-sequence-as-default). - `odbc_treat_decimal_as_int` (Boolean) Specifies how ODBC processes columns that have a scale of zero (0). 
For more information, check [ODBC_TREAT_DECIMAL_AS_INT docs](https://docs.snowflake.com/en/sql-reference/parameters#odbc-treat-decimal-as-int). -- `password` (String, Sensitive) Password for the user. **WARNING:** this will put the password in the terraform state file. Use carefully. +- `password` (String, Sensitive) Password for the user. **WARNING:** this will put the password in the terraform state file. Use carefully. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". - `prevent_unload_to_internal_stages` (Boolean) Specifies whether to prevent data unload operations to internal (Snowflake) stages using [COPY INTO ](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location) statements. For more information, check [PREVENT_UNLOAD_TO_INTERNAL_STAGES docs](https://docs.snowflake.com/en/sql-reference/parameters#prevent-unload-to-internal-stages). - `query_tag` (String) Optional string that can be used to tag queries and other SQL statements executed within a session. The tags are displayed in the output of the [QUERY_HISTORY, QUERY_HISTORY_BY_*](https://docs.snowflake.com/en/sql-reference/functions/query_history) functions. For more information, check [QUERY_TAG docs](https://docs.snowflake.com/en/sql-reference/parameters#query-tag). - `quoted_identifiers_ignore_case` (Boolean) Specifies whether letters in double-quoted object identifiers are stored and resolved as uppercase letters. By default, Snowflake preserves the case of alphabetic characters when storing and resolving double-quoted identifiers (see [Identifier resolution](https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html#label-identifier-casing)). You can use this parameter in situations in which [third-party applications always use double quotes around identifiers](https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html#label-identifier-casing-parameter). 
For more information, check [QUOTED_IDENTIFIERS_IGNORE_CASE docs](https://docs.snowflake.com/en/sql-reference/parameters#quoted-identifiers-ignore-case). diff --git a/docs/resources/masking_policy.md b/docs/resources/masking_policy.md index cb34591a7f..4adb212f90 100644 --- a/docs/resources/masking_policy.md +++ b/docs/resources/masking_policy.md @@ -7,8 +7,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it. -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-masking-policy#usage-notes), a masking policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-masking-policy#usage-notes), a masking policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. 
# snowflake_masking_policy (Resource) @@ -89,10 +88,10 @@ EOF - `argument` (Block List, Min: 1) List of the arguments for the masking policy. The first column and its data type always indicate the column data type values to mask or tokenize in the subsequent policy conditions. Note that you can not specify a virtual column as the first column argument in a conditional masking policy. (see [below for nested schema](#nestedblock--argument)) - `body` (String) Specifies the SQL expression that transforms the data. To mitigate permadiff on this field, the provider replaces blank characters with a space. This can lead to false positives in cases where a change in case or run of whitespace is semantically significant. -- `database` (String) The database in which to create the masking policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the masking policy; must be unique for the database and schema in which the masking policy is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the masking policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. 
+- `name` (String) Specifies the identifier for the masking policy; must be unique for the database and schema in which the masking policy is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 - `return_data_type` (String) The return data type must match the input data type of the first column that is specified as an input column. For more information about data types, check [Snowflake docs](https://docs.snowflake.com/en/sql-reference/intro-summary-data-types).
-- `schema` (String) The schema in which to create the masking policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `schema` (String) The schema in which to create the masking policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 
 ### Optional
diff --git a/docs/resources/network_policy.md b/docs/resources/network_policy.md
index f6eda6e0d8..b77d2776f1 100644
--- a/docs/resources/network_policy.md
+++ b/docs/resources/network_policy.md
@@ -7,8 +7,9 @@ description: |-
 !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.
 
-> [!WARNING]
-> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-network-policy#usage-notes), a network policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible.
+!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-network-policy#usage-notes), a network policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details.
+
+!> **Note** Due to technical limitations in the Terraform SDK, changes in `allowed_network_rule_list` and `blocked_network_rule_list` do not cause a diff for `show_output` and `describe_output`.
 # snowflake_network_policy (Resource)
@@ -25,8 +26,8 @@ resource "snowflake_network_policy" "basic" {
 ## Complete (with every optional set)
 resource "snowflake_network_policy" "complete" {
   name                      = "network_policy_name"
-  allowed_network_rule_list = [""]
-  blocked_network_rule_list = [""]
+  allowed_network_rule_list = [snowflake_network_rule.one.fully_qualified_name]
+  blocked_network_rule_list = [snowflake_network_rule.two.fully_qualified_name]
   allowed_ip_list           = ["192.168.1.0/24"]
   blocked_ip_list           = ["192.168.1.99"]
   comment                   = "my network policy"
@@ -40,14 +41,14 @@ resource "snowflake_network_policy" "complete" {
 ### Required
 
-- `name` (String) Specifies the identifier for the network policy; must be unique for the account in which the network policy is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `name` (String) Specifies the identifier for the network policy; must be unique for the account in which the network policy is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 
 ### Optional
 
 - `allowed_ip_list` (Set of String) Specifies one or more IPv4 addresses (CIDR notation) that are allowed access to your Snowflake account.
-- `allowed_network_rule_list` (Set of String) Specifies a list of fully qualified network rules that contain the network identifiers that are allowed access to Snowflake.
+- `allowed_network_rule_list` (Set of String) Specifies a list of fully qualified network rules that contain the network identifiers that are allowed access to Snowflake. For more information about this resource, see [docs](./network_rule).
 - `blocked_ip_list` (Set of String) Specifies one or more IPv4 addresses (CIDR notation) that are denied access to your Snowflake account. **Do not** add `0.0.0.0/0` to `blocked_ip_list`, in order to block all IP addresses except a select list, you only need to add IP addresses to `allowed_ip_list`.
-- `blocked_network_rule_list` (Set of String) Specifies a list of fully qualified network rules that contain the network identifiers that are denied access to Snowflake.
+- `blocked_network_rule_list` (Set of String) Specifies a list of fully qualified network rules that contain the network identifiers that are denied access to Snowflake. For more information about this resource, see [docs](./network_rule).
 - `comment` (String) Specifies a comment for the network policy.
 
 ### Read-Only
@@ -86,5 +87,5 @@ Read-Only:
 Import is supported using the following syntax:
 
 ```shell
-terraform import snowflake_network_policy.example "name"
+terraform import snowflake_network_policy.example '""'
 ```
diff --git a/docs/resources/network_rule.md b/docs/resources/network_rule.md
index a9a723dd4e..64052fbcfb 100644
--- a/docs/resources/network_rule.md
+++ b/docs/resources/network_rule.md
@@ -5,6 +5,8 @@ description: |-
 ---
 
+!> **Note** A network rule cannot be dropped successfully if it is currently assigned to a network policy. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details.
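When both the rule and the policy live in the same configuration, referencing the rule's `fully_qualified_name` from the policy gives Terraform the dependency ordering it needs; a minimal sketch with hypothetical names and values:

```terraform
resource "snowflake_network_rule" "example" {
  name       = "EXAMPLE_RULE"
  database   = "EXAMPLE_DB"
  schema     = "EXAMPLE_SCHEMA"
  type       = "IPV4"
  mode       = "INGRESS"
  value_list = ["192.168.0.100/24"]
}

resource "snowflake_network_policy" "example" {
  name = "EXAMPLE_POLICY"
  # The reference creates an implicit dependency, so Terraform updates or
  # destroys the policy (removing the assignment) before dropping the rule.
  allowed_network_rule_list = [snowflake_network_rule.example.fully_qualified_name]
}
```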
+
 # snowflake_network_rule (Resource)
@@ -22,7 +24,6 @@ resource "snowflake_network_rule" "rule" {
   value_list = ["192.168.0.100/24", "29.254.123.20"]
 }
 ```
-
 -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources).
diff --git a/docs/resources/oauth_integration_for_custom_clients.md b/docs/resources/oauth_integration_for_custom_clients.md
index 2ed8b0521e..8a5182a45d 100644
--- a/docs/resources/oauth_integration_for_custom_clients.md
+++ b/docs/resources/oauth_integration_for_custom_clients.md
@@ -7,6 +7,10 @@ description: |-
 !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.
 
+!> **Note** Setting a network policy with lowercase letters does not work correctly in Snowflake (see [issue](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3229)). As a workaround, set the network policy with uppercase letters only, or use `unsafe_execute` with the network policy ID wrapped in `'`.
+
+!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of the wrong type manually with `terraform destroy` and recreate the resource. This will be addressed in the future.
+
 # snowflake_oauth_integration_for_custom_clients (Resource)
 
 Resource used to manage oauth security integration for custom clients objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-oauth-snowflake).
 
@@ -16,7 +20,7 @@ Resource used to manage oauth security integration for custom clients objects. F
 ```terraform
 # basic resource
 resource "snowflake_oauth_integration_for_custom_clients" "basic" {
-  name               = "saml_integration"
+  name               = "integration"
   oauth_client_type  = "CONFIDENTIAL"
   oauth_redirect_uri = "https://example.com"
   blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"]
 }
@@ -24,18 +28,18 @@ resource "snowflake_oauth_integration_for_custom_clients" "basic" {
 # resource with all fields set
 resource "snowflake_oauth_integration_for_custom_clients" "complete" {
-  name                             = "saml_integration"
+  name                             = "integration"
   oauth_client_type                = "CONFIDENTIAL"
   oauth_redirect_uri               = "https://example.com"
   enabled                          = "true"
   oauth_allow_non_tls_redirect_uri = "true"
   oauth_enforce_pkce               = "true"
   oauth_use_secondary_roles        = "NONE"
-  pre_authorized_roles_list        = ["role_id1", "role_id2"]
-  blocked_roles_list               = ["ACCOUNTADMIN", "SECURITYADMIN", "role_id1", "role_id2"]
+  pre_authorized_roles_list        = [snowflake_role.one.fully_qualified_name, snowflake_role.two.fully_qualified_name]
+  blocked_roles_list               = ["ACCOUNTADMIN", "SECURITYADMIN", snowflake_role.three.fully_qualified_name, snowflake_role.four.fully_qualified_name]
   oauth_issue_refresh_tokens       = "true"
   oauth_refresh_token_validity     = 87600
-  network_policy                   = "network_policy_id"
+  network_policy                   = snowflake_network_policy.example.fully_qualified_name
   oauth_client_rsa_public_key      = file("rsa.pub")
   oauth_client_rsa_public_key_2    = file("rsa2.pub")
   comment                          = "my oauth integration"
@@ -49,8 +53,8 @@ resource "snowflake_oauth_integration_for_custom_clients" "complete" {
 ### Required
 
-- `blocked_roles_list` (Set of String) A set of Snowflake roles that a user cannot explicitly consent to using after authenticating.
-- `name` (String) Specifies the name of the OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `blocked_roles_list` (Set of String) A set of Snowflake roles that a user cannot explicitly consent to using after authenticating. For more information about this resource, see [docs](./account_role).
+- `name` (String) Specifies the name of the OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 - `oauth_client_type` (String) Specifies the type of client being registered. Snowflake supports both confidential and public clients. Valid options are: `PUBLIC` | `CONFIDENTIAL`.
 - `oauth_redirect_uri` (String) Specifies the client URI. After a user is authenticated, the web browser is redirected to this URI.
 
@@ -58,15 +62,15 @@ resource "snowflake_oauth_integration_for_custom_clients" "complete" {
 - `comment` (String) Specifies a comment for the OAuth integration.
 - `enabled` (String) Specifies whether this OAuth integration is enabled or disabled. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
-- `network_policy` (String) Specifies an existing network policy. This network policy controls network traffic that is attempting to exchange an authorization code for an access or refresh token or to use a refresh token to obtain a new access token.
+- `network_policy` (String) Specifies an existing network policy. This network policy controls network traffic that is attempting to exchange an authorization code for an access or refresh token or to use a refresh token to obtain a new access token. For more information about this resource, see [docs](./network_policy).
 - `oauth_allow_non_tls_redirect_uri` (String) If true, allows setting oauth_redirect_uri to a URI not protected by TLS. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
-- `oauth_client_rsa_public_key` (String) Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource using `terraform taint`.
-- `oauth_client_rsa_public_key_2` (String) Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource using `terraform taint`.
+- `oauth_client_rsa_public_key` (String) Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
+- `oauth_client_rsa_public_key_2` (String) Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
 - `oauth_enforce_pkce` (String) Boolean that specifies whether Proof Key for Code Exchange (PKCE) should be required for the integration. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
 - `oauth_issue_refresh_tokens` (String) Specifies whether to allow the client to exchange a refresh token for an access token when the current access token has expired. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
 - `oauth_refresh_token_validity` (Number) Specifies how long refresh tokens should be valid (in seconds). OAUTH_ISSUE_REFRESH_TOKENS must be set to TRUE.
 - `oauth_use_secondary_roles` (String) Specifies whether default secondary roles set in the user properties are activated by default in the session being opened. Valid options are: `IMPLICIT` | `NONE`.
-- `pre_authorized_roles_list` (Set of String) A set of Snowflake roles that a user does not need to explicitly consent to using after authenticating.
+- `pre_authorized_roles_list` (Set of String) A set of Snowflake roles that a user does not need to explicitly consent to using after authenticating. For more information about this resource, see [docs](./account_role).
 ### Read-Only
@@ -327,5 +331,5 @@ Read-Only:
 Import is supported using the following syntax:
 
 ```shell
-terraform import snowflake_oauth_integration_for_custom_clients.example "name"
+terraform import snowflake_oauth_integration_for_custom_clients.example '""'
 ```
diff --git a/docs/resources/oauth_integration_for_partner_applications.md b/docs/resources/oauth_integration_for_partner_applications.md
index 0d9f1c139e..48311dbca2 100644
--- a/docs/resources/oauth_integration_for_partner_applications.md
+++ b/docs/resources/oauth_integration_for_partner_applications.md
@@ -7,6 +7,8 @@ description: |-
 !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.
 
+!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of the wrong type manually with `terraform destroy` and recreate the resource. This will be addressed in the future.
+
 # snowflake_oauth_integration_for_partner_applications (Resource)
 
 Resource used to manage oauth security integration for partner applications objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-oauth-snowflake).
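The quoted-identifier import format above, spelled out with a hypothetical integration name:

```shell
# The outer single quotes keep the shell from stripping the inner double quotes,
# which are part of the Snowflake identifier.
terraform import snowflake_oauth_integration_for_custom_clients.example '"MY_INTEGRATION"'
```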
@@ -30,7 +32,7 @@ resource "snowflake_oauth_integration_for_partner_applications" "test" {
   oauth_issue_refresh_tokens   = "true"
   oauth_refresh_token_validity = 3600
   oauth_use_secondary_roles    = "IMPLICIT"
-  blocked_roles_list           = ["ACCOUNTADMIN", "SECURITYADMIN", "role_id1", "role_id2"]
+  blocked_roles_list           = ["ACCOUNTADMIN", "SECURITYADMIN", snowflake_role.one.fully_qualified_name, snowflake_role.two.fully_qualified_name]
   comment                      = "example oauth integration for partner applications"
 }
 ```
@@ -42,8 +44,8 @@ resource "snowflake_oauth_integration_for_partner_applications" "test" {
 ### Required
 
-- `blocked_roles_list` (Set of String) A set of Snowflake roles that a user cannot explicitly consent to using after authenticating.
-- `name` (String) Specifies the name of the OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `blocked_roles_list` (Set of String) A set of Snowflake roles that a user cannot explicitly consent to using after authenticating. For more information about this resource, see [docs](./account_role).
+- `name` (String) Specifies the name of the OAuth integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 - `oauth_client` (String) Creates an OAuth interface between Snowflake and a partner application. Valid options are: `LOOKER` | `TABLEAU_DESKTOP` | `TABLEAU_SERVER`.
 
 ### Optional
diff --git a/docs/resources/password_policy.md b/docs/resources/password_policy.md
index c88f79b2f5..e3131cb1a3 100644
--- a/docs/resources/password_policy.md
+++ b/docs/resources/password_policy.md
@@ -5,8 +5,7 @@ description: |-
   A password policy specifies the requirements that must be met to create and reset a password to authenticate to Snowflake.
 ---
 
-> [!WARNING]
-> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-password-policy#usage-notes), a password policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible.
+!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-password-policy#usage-notes), a password policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details.
 
 # snowflake_password_policy (Resource)
diff --git a/docs/resources/primary_connection.md b/docs/resources/primary_connection.md
index 27d726a362..20009789d2 100644
--- a/docs/resources/primary_connection.md
+++ b/docs/resources/primary_connection.md
@@ -24,14 +24,14 @@ resource "snowflake_primary_connection" "complete" {
   name    = "connection_name"
   comment = "my complete connection"
   enable_failover_to_accounts = [
-    "."
+    "\"\".\"\""
   ]
 }
 ```
 -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](../docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources).
 
--> **Note** To demote `snowflake_primary_connection` to [`snowflake_secondary_connection`](./secondary_connection), resources need to be migrated manually. For guidance on removing and importing resources into the state check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state, then recreate it in manually using:
+-> **Note** To demote `snowflake_primary_connection` to [`snowflake_secondary_connection`](./secondary_connection), resources need to be migrated manually. For guidance on removing and importing resources into the state check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state with [terraform state rm](https://developer.hashicorp.com/terraform/cli/commands/state/rm), then recreate it manually using:
 ```
 CREATE CONNECTION AS REPLICA OF ..;
 ```
@@ -43,12 +43,12 @@ and then import it as the `snowflake_secondary_connection`.
 
 ### Required
 
-- `name` (String) String that specifies the identifier (i.e. name) for the connection. Must start with an alphabetic character and may only contain letters, decimal digits (0-9), and underscores (_). For a primary connection, the name must be unique across connection names and account names in the organization. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `name` (String) String that specifies the identifier (i.e. name) for the connection. Must start with an alphabetic character and may only contain letters, decimal digits (0-9), and underscores (_). For a primary connection, the name must be unique across connection names and account names in the organization. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 
 ### Optional
 
 - `comment` (String) Specifies a comment for the connection.
-- `enable_failover_to_accounts` (List of String) Enables failover for given connection to provided accounts. Specifies a list of accounts in your organization where a secondary connection for this primary connection can be promoted to serve as the primary connection. Include your organization name for each account in the list.
+- `enable_failover_to_accounts` (List of String) Enables failover for given connection to provided accounts. Specifies a list of accounts in your organization where a secondary connection for this primary connection can be promoted to serve as the primary connection. Include your organization name for each account in the list. For more information about this resource, see [docs](./account).
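Spelled out with placeholder organization and account names, each `enable_failover_to_accounts` entry is the escaped, quoted organization/account pair:

```terraform
resource "snowflake_primary_connection" "complete" {
  name    = "connection_name"
  comment = "my complete connection"
  # MY_ORG and MY_ACCOUNT are placeholders; run
  # SELECT CURRENT_ORGANIZATION_NAME(), CURRENT_ACCOUNT_NAME(); to find yours.
  enable_failover_to_accounts = [
    "\"MY_ORG\".\"MY_ACCOUNT\""
  ]
}
```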
 ### Read-Only
@@ -80,5 +80,5 @@ Read-Only:
 Import is supported using the following syntax:
 
 ```shell
-terraform import snowflake_primary_connection.example 'connection_name'
+terraform import snowflake_primary_connection.example '""'
 ```
diff --git a/docs/resources/resource_monitor.md b/docs/resources/resource_monitor.md
index c5eb401268..a674411c2f 100644
--- a/docs/resources/resource_monitor.md
+++ b/docs/resources/resource_monitor.md
@@ -31,7 +31,7 @@ resource "snowflake_resource_monitor" "minimal_working" {
   name            = "resource-monitor-name"
   credit_quota    = 100
   suspend_trigger = 100
-  notify_users    = ["USERONE", "USERTWO"]
+  notify_users    = [snowflake_user.one.fully_qualified_name, snowflake_user.two.fully_qualified_name]
 }
 
 resource "snowflake_resource_monitor" "complete" {
@@ -46,7 +46,7 @@ resource "snowflake_resource_monitor" "complete" {
   suspend_trigger           = 50
   suspend_immediate_trigger = 90
 
-  notify_users = ["USERONE", "USERTWO"]
+  notify_users = [snowflake_user.one.fully_qualified_name, snowflake_user.two.fully_qualified_name]
 }
 ```
 -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources).
@@ -57,7 +57,7 @@ resource "snowflake_resource_monitor" "complete" {
 ### Required
 
-- `name` (String) Identifier for the resource monitor; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `name` (String) Identifier for the resource monitor; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 
 ### Optional
 
@@ -65,7 +65,7 @@ resource "snowflake_resource_monitor" "complete" {
 - `end_timestamp` (String) The date and time when the resource monitor suspends the assigned warehouses.
 - `frequency` (String) The frequency interval at which the credit usage resets to 0. Valid values are (case-insensitive): `MONTHLY` | `DAILY` | `WEEKLY` | `YEARLY` | `NEVER`. If you set a `frequency` for a resource monitor, you must also set `start_timestamp`. If you specify `NEVER` for the frequency, the credit usage for the warehouse does not reset. After removing this field from the config, the previously set value will be preserved on the Snowflake side, not the default value. That's due to Snowflake limitation and the lack of unset functionality for this parameter.
 - `notify_triggers` (Set of Number) Specifies a list of percentages of the credit quota. After reaching any of the values the users passed in the notify_users field will be notified (to receive the notification they should have notifications enabled). Values over 100 are supported.
-- `notify_users` (Set of String) Specifies the list of users (their identifiers) to receive email notifications on resource monitors.
+- `notify_users` (Set of String) Specifies the list of users (their identifiers) to receive email notifications on resource monitors. For more information about this resource, see [docs](./user).
 - `start_timestamp` (String) The date and time when the resource monitor starts monitoring credit usage for the assigned warehouses. If you set a `start_timestamp` for a resource monitor, you must also set `frequency`. After removing this field from the config, the previously set value will be preserved on the Snowflake side, not the default value. That's due to Snowflake limitation and the lack of unset functionality for this parameter.
 - `suspend_immediate_trigger` (Number) Represents a numeric value specified as a percentage of the credit quota. Values over 100 are supported. After reaching this value, all assigned warehouses immediately cancel any currently running queries or statements. In addition, this action sends a notification to all users who have enabled notifications for themselves.
 - `suspend_trigger` (Number) Represents a numeric value specified as a percentage of the credit quota. Values over 100 are supported. After reaching this value, all assigned warehouses while allowing currently running queries to complete will be suspended. No new queries can be executed by the warehouses until the credit quota for the resource monitor is increased. In addition, this action sends a notification to all users who have enabled notifications for themselves.
@@ -100,6 +100,5 @@ Read-Only:
 Import is supported using the following syntax:
 
 ```shell
-# format is the resource monitor name
-terraform import snowflake_resource_monitor.example 'resourceMonitorName'
+terraform import snowflake_resource_monitor.example '""'
 ```
diff --git a/docs/resources/role.md b/docs/resources/role.md
index cf79b2cdf3..6dba422fba 100644
--- a/docs/resources/role.md
+++ b/docs/resources/role.md
@@ -34,7 +34,7 @@ resource "snowflake_role" "complete" {
 ### Required
 
-- `name` (String) Identifier for the role; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `name` (String) Identifier for the role; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 
 ### Optional
diff --git a/docs/resources/row_access_policy.md b/docs/resources/row_access_policy.md
index 0023f99391..5f243e1b1c 100644
--- a/docs/resources/row_access_policy.md
+++ b/docs/resources/row_access_policy.md
@@ -7,8 +7,7 @@ description: |-
 !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it.
 
-> [!WARNING]
-> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-row-access-policy#usage-notes), a row access policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible.
+!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-row-access-policy#usage-notes), a row access policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details.
 
 # snowflake_row_access_policy (Resource)
@@ -17,6 +16,7 @@ Resource used to manage row access policy objects. For more information, check [
 ## Example Usage
 
 ```terraform
+# resource with all fields set
 resource "snowflake_row_access_policy" "example_row_access_policy" {
   name     = "EXAMPLE_ROW_ACCESS_POLICY"
   database = "EXAMPLE_DB"
@@ -47,9 +47,9 @@ resource "snowflake_row_access_policy" "example_row_access_policy" {
 - `argument` (Block List, Min: 1) List of the arguments for the row access policy. A signature specifies a set of attributes that must be considered to determine whether the row is accessible. The attribute values come from the database object (e.g. table or view) to be protected by the row access policy. If any argument name or type is changed, the resource is recreated. (see [below for nested schema](#nestedblock--argument))
 - `body` (String) Specifies the SQL expression. The expression can be any boolean-valued SQL expression. To mitigate permadiff on this field, the provider replaces blank characters with a space. This can lead to false positives in cases where a change in case or run of whitespace is semantically significant.
-- `database` (String) The database in which to create the row access policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
-- `name` (String) Specifies the identifier for the row access policy; must be unique for the database and schema in which the row access policy is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
-- `schema` (String) The schema in which to create the row access policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `database` (String) The database in which to create the row access policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
+- `name` (String) Specifies the identifier for the row access policy; must be unique for the database and schema in which the row access policy is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
+- `schema` (String) The schema in which to create the row access policy. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
### Optional diff --git a/docs/resources/saml2_integration.md b/docs/resources/saml2_integration.md index 38e45dc4d8..fa55be5e87 100644 --- a/docs/resources/saml2_integration.md +++ b/docs/resources/saml2_integration.md @@ -2,14 +2,16 @@ page_title: "snowflake_saml2_integration Resource - terraform-provider-snowflake" subcategory: "" description: |- - Resource used to manage saml2 security integration objects. For more information, check security integrations documentation https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-saml2. + Resource used to manage SAML2 security integration objects. For more information, check security integrations documentation https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-saml2. --- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of the wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # snowflake_saml2_integration (Resource) -Resource used to manage saml2 security integration objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-saml2). +Resource used to manage SAML2 security integration objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-saml2).
## Example Usage @@ -53,7 +55,7 @@ resource "snowflake_saml2_integration" "test" { ### Required -- `name` (String) Specifies the name of the SAML2 integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Specifies the name of the SAML2 integration. This name follows the rules for Object Identifiers. The name should be unique among security integrations in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `saml2_issuer` (String) The string containing the IdP EntityID / Issuer. - `saml2_provider` (String) The string describing the IdP. Valid options are: `OKTA` | `ADFS` | `CUSTOM`. - `saml2_sso_url` (String) The string containing the IdP SSO URL, where the user should be redirected by Snowflake (the Service Provider) with a SAML AuthnRequest message. @@ -333,5 +335,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_saml2_integration.example "name" +terraform import snowflake_saml2_integration.example '""' ``` diff --git a/docs/resources/schema.md b/docs/resources/schema.md index 622d864fef..3c919e9d74 100644 --- a/docs/resources/schema.md +++ b/docs/resources/schema.md @@ -7,6 +7,12 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. 
We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +-> **Note** Field `CLASSIFICATION_ROLE` is currently missing. It will be added in the future. + +!> **Note** A schema cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098508 (2BP01): Cannot drop schema SCHEMA as it includes network rule - policy associations. +`. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # snowflake_schema (Resource) Resource used to manage schema objects. For more information, check [schema documentation](https://docs.snowflake.com/en/sql-reference/sql/create-schema). @@ -56,8 +62,8 @@ resource "snowflake_schema" "schema" { ### Required -- `database` (String) The database in which to create the schema. -- `name` (String) Specifies the identifier for the schema; must be unique for the database in which the schema is created. When the name is `PUBLIC`, during creation the provider checks if this schema has already been created and, in such case, `ALTER` is used to match the desired state. +- `database` (String) The database in which to create the schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. 
+- `name` (String) Specifies the identifier for the schema; must be unique for the database in which the schema is created. When the name is `PUBLIC`, during creation the provider checks if this schema has already been created and, in such a case, `ALTER` is used to match the desired state. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -350,6 +356,5 @@ Read-Only: Import is supported using the following syntax: ```shell -# format is . terraform import snowflake_schema.example '"".""' ``` diff --git a/docs/resources/scim_integration.md b/docs/resources/scim_integration.md index c3e8ee35a9..ae782bcbe0 100644 --- a/docs/resources/scim_integration.md +++ b/docs/resources/scim_integration.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of the wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # snowflake_scim_integration (Resource) Resource used to manage scim security integration objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-scim).
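The import commands above switch from a plain name to an identifier wrapped in double quotes inside single quotes. As a hedged illustration of what such a command looks like in practice, assuming a SCIM integration created with the hypothetical name `TEST_SCIM_INTEGRATION`:

```shell
# Hypothetical example: the account-level identifier is double-quoted,
# and the whole argument is single-quoted so the shell preserves the quotes
terraform import snowflake_scim_integration.example '"TEST_SCIM_INTEGRATION"'
```

The same quoting pattern applies to the other account-level resources whose import syntax changed in this release.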
@@ -20,14 +22,16 @@ resource "snowflake_scim_integration" "test" { enabled = true scim_client = "GENERIC" sync_password = true + run_as_role = "GENERIC_SCIM_PROVISIONER" } + # resource with all fields set resource "snowflake_scim_integration" "test" { name = "test" enabled = true scim_client = "GENERIC" sync_password = true - network_policy = "network_policy_test" + network_policy = snowflake_network_policy.example.fully_qualified_name run_as_role = "GENERIC_SCIM_PROVISIONER" comment = "foo" } @@ -41,14 +45,14 @@ resource "snowflake_scim_integration" "test" { ### Required - `enabled` (Boolean) Specify whether the security integration is enabled. -- `name` (String) String that specifies the identifier (i.e. name) for the integration; must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) String that specifies the identifier (i.e. name) for the integration; must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `run_as_role` (String) Specify the SCIM role in Snowflake that owns any users and roles that are imported from the identity provider into Snowflake using SCIM. Provider assumes that the specified role is already provided. Valid options are: `OKTA_PROVISIONER` | `AAD_PROVISIONER` | `GENERIC_SCIM_PROVISIONER`. - `scim_client` (String) Specifies the client type for the scim integration. Valid options are: `OKTA` | `AZURE` | `GENERIC`. ### Optional - `comment` (String) Specifies a comment for the integration. 
-- `network_policy` (String) Specifies an existing network policy that controls SCIM network traffic. +- `network_policy` (String) Specifies an existing network policy that controls SCIM network traffic. For more information about this resource, see [docs](./network_policy). - `sync_password` (String) Specifies whether to enable or disable the synchronization of a user password from an Okta SCIM client as part of the API request to Snowflake. This property is not supported for Azure SCIM. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. ### Read-Only @@ -142,5 +146,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_scim_integration.example "name" +terraform import snowflake_scim_integration.example '""' ``` diff --git a/docs/resources/secondary_connection.md b/docs/resources/secondary_connection.md index 416e593392..566c4e507b 100644 --- a/docs/resources/secondary_connection.md +++ b/docs/resources/secondary_connection.md @@ -17,20 +17,20 @@ Resource used to manage secondary (replicated) connections. To manage primary co ## Minimal resource "snowflake_secondary_connection" "basic" { name = "connection_name" - as_replica_of = ".." + as_replica_of = "\"\".\"\".\"\"" } ## Complete (with every optional set) resource "snowflake_secondary_connection" "complete" { name = "connection_name" - as_replica_of = ".." + as_replica_of = "\"\".\"\".\"\"" comment = "my complete secondary connection" } ``` -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](../guides/identifiers#new-computed-fully-qualified-name-field-in-resources). --> **Note** To promote `snowflake_secondary_connection` to [`snowflake_primary_connection`](./primary_connection), resources need to be migrated manually. 
For guidance on removing and importing resources into the state check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state, then promote it manually using: +-> **Note** To promote `snowflake_secondary_connection` to [`snowflake_primary_connection`](./primary_connection), resources need to be migrated manually. For guidance on removing and importing resources into the state check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state with [terraform state rm](https://developer.hashicorp.com/terraform/cli/commands/state/rm), then promote it manually using: ``` ALTER CONNECTION PRIMARY; ``` @@ -42,8 +42,8 @@ and then import it as the `snowflake_primary_connection`. ### Required -- `as_replica_of` (String) Specifies the identifier for a primary connection from which to create a replica (i.e. a secondary connection). -- `name` (String) String that specifies the identifier (i.e. name) for the connection. Must start with an alphabetic character and may only contain letters, decimal digits (0-9), and underscores (_). For a secondary connection, the name must match the name of its primary connection. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `as_replica_of` (String) Specifies the identifier for a primary connection from which to create a replica (i.e. a secondary connection). For more information about this resource, see [docs](./primary_connection). +- `name` (String) String that specifies the identifier (i.e. name) for the connection. 
Must start with an alphabetic character and may only contain letters, decimal digits (0-9), and underscores (_). For a secondary connection, the name must match the name of its primary connection. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -79,5 +79,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_secondary_connection.example 'secondary_connection_name' +terraform import snowflake_secondary_connection.example '""' ``` diff --git a/docs/resources/secondary_database.md b/docs/resources/secondary_database.md index 95fbaf8815..652f875871 100644 --- a/docs/resources/secondary_database.md +++ b/docs/resources/secondary_database.md @@ -8,6 +8,11 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on database type. In this case, remove the database of the wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + +!> **Note** A database cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098507 (2BP01): Cannot drop database DATABASE as it includes network rule - policy associations. +`.
Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # snowflake_secondary_database (Resource) ~> **Note** The snowflake_secondary_database resource doesn't refresh itself, as the best practice is to use tasks scheduled for a certain interval. Check out the examples to see how to set up the refresh task. For SQL-based replication guide, see the [official documentation](https://docs.snowflake.com/en/user-guide/db-replication-config#replicating-a-database-to-another-account). @@ -92,8 +97,8 @@ resource "snowflake_task" "refresh_secondary_database" { ### Required -- `as_replica_of` (String) A fully qualified path to a database to create a replica from. A fully qualified path follows the format of `""."".""`. -- `name` (String) Specifies the identifier for the database; must be unique for your account. As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '..') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `as_replica_of` (String) A fully qualified path to a database to create a replica from. 
A fully qualified path follows the format of `""."".""`. For more information about this resource, see [docs](./database). +- `name` (String) Specifies the identifier for the database; must be unique for your account. As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '..') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. 
### Optional @@ -126,5 +131,5 @@ resource "snowflake_task" "refresh_secondary_database" { Import is supported using the following syntax: ```shell -terraform import snowflake_secondary_database.example 'secondary_database_name' +terraform import snowflake_secondary_database.example '""' ``` diff --git a/docs/resources/secret_with_authorization_code_grant.md b/docs/resources/secret_with_authorization_code_grant.md index 3217db19dc..5f4ea693fd 100644 --- a/docs/resources/secret_with_authorization_code_grant.md +++ b/docs/resources/secret_with_authorization_code_grant.md @@ -19,7 +19,7 @@ resource "snowflake_secret_with_authorization_code_grant" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_authorization_code_grant.example.fully_qualified_name oauth_refresh_token = "EXAMPLE_TOKEN" oauth_refresh_token_expiry_time = "2025-01-02 15:04:01" } @@ -29,7 +29,7 @@ resource "snowflake_secret_with_authorization_code_grant" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_authorization_code_grant.example.fully_qualified_name oauth_refresh_token = "EXAMPLE_TOKEN" oauth_refresh_token_expiry_time = "2025-01-02 15:04:01" comment = "EXAMPLE_COMMENT" @@ -43,12 +43,12 @@ resource "snowflake_secret_with_authorization_code_grant" "test" { ### Required -- `api_authentication` (String) Specifies the name value of the Snowflake security integration that connects Snowflake to an external service. 
-- `database` (String) The database in which to create the secret Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) String that specifies the identifier (i.e. name) for the secret, must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `api_authentication` (String) Specifies the name value of the Snowflake security integration that connects Snowflake to an external service. For more information about this resource, see [docs](./api_authentication_integration_with_authorization_code_grant). +- `database` (String) The database in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) String that specifies the identifier (i.e. name) for the secret; must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `oauth_refresh_token` (String, Sensitive) Specifies the token as a string that is used to obtain a new access token from the OAuth authorization server when the access token expires.
External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". - `oauth_refresh_token_expiry_time` (String) Specifies the timestamp as a string when the OAuth refresh token expires. Accepted string formats: YYYY-MM-DD, YYYY-MM-DD HH:MI, YYYY-MM-DD HH:MI:SS, YYYY-MM-DD HH:MI -- `schema` (String) The schema in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `schema` (String) The schema in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional diff --git a/docs/resources/secret_with_basic_authentication.md b/docs/resources/secret_with_basic_authentication.md index 66d99164b3..e6b0d88c1b 100644 --- a/docs/resources/secret_with_basic_authentication.md +++ b/docs/resources/secret_with_basic_authentication.md @@ -41,10 +41,10 @@ resource "snowflake_secret_with_basic_authentication" "test" { ### Required -- `database` (String) The database in which to create the secret Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) String that specifies the identifier (i.e. name) for the secret, must be unique in your schema. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) String that specifies the identifier (i.e. name) for the secret; must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `password` (String, Sensitive) Specifies the password value to store in the secret. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". -- `schema` (String) The schema in which to create the secret.
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `username` (String, Sensitive) Specifies the username value to store in the secret. ### Optional diff --git a/docs/resources/secret_with_client_credentials.md b/docs/resources/secret_with_client_credentials.md index 0e5ad14903..6dd8757a59 100644 --- a/docs/resources/secret_with_client_credentials.md +++ b/docs/resources/secret_with_client_credentials.md @@ -19,7 +19,7 @@ resource "snowflake_secret_with_client_credentials" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_client_credentials.example.fully_qualified_name oauth_scopes = ["useraccount", "testscope"] } @@ -28,7 +28,7 @@ resource "snowflake_secret_with_client_credentials" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_client_credentials.example.fully_qualified_name oauth_scopes = ["useraccount", "testscope"] comment = "EXAMPLE_COMMENT" } @@ -41,11 +41,11 @@ resource "snowflake_secret_with_client_credentials" "test" { ### Required -- `api_authentication` (String) Specifies the name value of the Snowflake security integration that connects Snowflake to an external service. 
-- `database` (String) The database in which to create the secret Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) String that specifies the identifier (i.e. name) for the secret, must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `api_authentication` (String) Specifies the name value of the Snowflake security integration that connects Snowflake to an external service. For more information about this resource, see [docs](./api_authentication_integration_with_client_credentials). +- `database` (String) The database in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) String that specifies the identifier (i.e. name) for the secret; must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `oauth_scopes` (Set of String) Specifies a list of scopes to use when making a request from the OAuth server by a role with USAGE on the integration during the OAuth client credentials flow.
-- `schema` (String) The schema in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `schema` (String) The schema in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional diff --git a/docs/resources/secret_with_generic_string.md b/docs/resources/secret_with_generic_string.md index 408e71592d..4c0b426ab9 100644 --- a/docs/resources/secret_with_generic_string.md +++ b/docs/resources/secret_with_generic_string.md @@ -39,9 +39,9 @@ resource "snowflake_secret_with_generic_string" "test" { ### Required -- `database` (String) The database in which to create the secret Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) String that specifies the identifier (i.e. name) for the secret, must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the secret. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) String that specifies the identifier (i.e. name) for the secret, must be unique in your schema. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the secret. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `secret_string` (String, Sensitive) Specifies the string to store in the secret. The string can be an API token or a string of sensitive value that can be used in the handler code of a UDF or stored procedure. For details, see [Creating and using an external access integration](https://docs.snowflake.com/en/developer-guide/external-network-access/creating-using-external-network-access). You should not use this property to store any kind of OAuth token; use one of the other secret types for your OAuth use cases. External changes for this field won't be detected.
In case you want to apply external changes, you can re-create the resource manually using "terraform taint". ### Optional diff --git a/docs/resources/service_user.md b/docs/resources/service_user.md index eba0597df1..7daa93d14b 100644 --- a/docs/resources/service_user.md +++ b/docs/resources/service_user.md @@ -120,7 +120,7 @@ resource "snowflake_service_user" "u" { ### Required -- `name` (String) Name of the user. Note that if you do not supply login_name this will be used as login_name. Check the [docs](https://docs.snowflake.net/manuals/sql-reference/sql/create-user.html#required-parameters). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Name of the user. Note that if you do not supply login_name this will be used as login_name. Check the [docs](https://docs.snowflake.net/manuals/sql-reference/sql/create-user.html#required-parameters). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -141,9 +141,9 @@ resource "snowflake_service_user" "u" { - `date_output_format` (String) Specifies the display format for the DATE data type. For more information, see [Date and time input and output formats](https://docs.snowflake.com/en/sql-reference/date-time-input-output). For more information, check [DATE_OUTPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#date-output-format). - `days_to_expiry` (Number) Specifies the number of days after which the user status is set to `Expired` and the user is no longer allowed to log in. 
This is useful for defining temporary users (i.e. users who should only have access to Snowflake for a limited time period). In general, you should not set this property for [account administrators](https://docs.snowflake.com/en/user-guide/security-access-control-considerations.html#label-accountadmin-users) (i.e. users with the `ACCOUNTADMIN` role) because Snowflake locks them out when they become `Expired`. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". - `default_namespace` (String) Specifies the namespace (database only or database and schema) that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the namespace exists. -- `default_role` (String) Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists. +- `default_role` (String) Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists. For more information about this resource, see [docs](./account_role). - `default_secondary_roles_option` (String) Specifies the secondary roles that are active for the user’s session upon login. Valid values are (case-insensitive): `DEFAULT` | `NONE` | `ALL`. 
More information can be found in [doc](https://docs.snowflake.com/en/sql-reference/sql/create-user#optional-object-properties-objectproperties). -- `default_warehouse` (String) Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists. +- `default_warehouse` (String) Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists. For more information about this resource, see [docs](./warehouse). - `disabled` (String) Specifies whether the user is disabled, which prevents logging in and aborts all the currently-running queries for the user. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. - `display_name` (String) Name displayed for the user in the Snowflake web interface. - `email` (String, Sensitive) Email address for the user. diff --git a/docs/resources/shared_database.md b/docs/resources/shared_database.md index 574ff4f28c..daf52b60bd 100644 --- a/docs/resources/shared_database.md +++ b/docs/resources/shared_database.md @@ -7,6 +7,11 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on database type. 
In this case, remove the database of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + +!> **Note** A database cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098507 (2BP01): Cannot drop database DATABASE as it includes network rule - policy associations. +`. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + + # snowflake_shared_database (Resource) A shared database creates a database from a share provided by another Snowflake account. For more information about shares, see [Introduction to Secure Data Sharing](https://docs.snowflake.com/en/user-guide/data-sharing-intro). @@ -75,8 +80,8 @@ resource "snowflake_shared_database" "test" { ### Required -- `from_share` (String) A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `"<organization_name>"."<account_name>"."<share_name>"`. -- `name` (String) Specifies the identifier for the database; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `from_share` (String) A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `"<organization_name>"."<account_name>"."<share_name>"`. For more information about this resource, see [docs](./share). +- `name` (String) Specifies the identifier for the database; must be unique for your account.
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -106,5 +111,5 @@ resource "snowflake_shared_database" "test" { Import is supported using the following syntax: ```shell -terraform import snowflake_shared_database.example 'shared_database_name' +terraform import snowflake_shared_database.example '"<database_name>"' ``` diff --git a/docs/resources/stream.md b/docs/resources/stream.md index 1a36f91eb3..2cf478e3ba 100644 --- a/docs/resources/stream.md +++ b/docs/resources/stream.md @@ -7,7 +7,7 @@ description: |- # snowflake_stream (Resource) -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_stream_on_directory_table` | `snowflake_stream_on_external_table` | `snowflake_stream_on_table` | `snowflake_stream_on_view` +~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_stream_on_directory_table` | `snowflake_stream_on_external_table` | `snowflake_stream_on_table` | `snowflake_stream_on_view`. ## Example Usage diff --git a/docs/resources/stream_on_directory_table.md b/docs/resources/stream_on_directory_table.md index 1913610345..4f1ebf9772 100644 --- a/docs/resources/stream_on_directory_table.md +++ b/docs/resources/stream_on_directory_table.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release.
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it. +~> **Note about copy_grants** Fields like `stage`, and `stale` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. + # snowflake_stream_on_directory_table (Resource) Resource used to manage streams on directory tables. For more information, check [stream documentation](https://docs.snowflake.com/en/sql-reference/sql/create-stream). @@ -14,21 +16,13 @@ Resource used to manage streams on directory tables. For more information, check ## Example Usage ```terraform -resource "snowflake_stage" "example_stage" { - name = "EXAMPLE_STAGE" - url = "s3://com.example.bucket/prefix" - database = "EXAMPLE_DB" - schema = "EXAMPLE_SCHEMA" - credentials = "AWS_KEY_ID='${var.example_aws_key_id}' AWS_SECRET_KEY='${var.example_aws_secret_key}'" -} - # basic resource resource "snowflake_stream_on_directory_table" "stream" { name = "stream" schema = "schema" database = "database" - stage = snowflake_stage.stage.fully_qualified_name + stage = snowflake_stage.example.fully_qualified_name } @@ -39,11 +33,7 @@ resource "snowflake_stream_on_directory_table" "stream" { database = "database" copy_grants = true - stage = snowflake_stage.stage.fully_qualified_name - - at { - statement = "8e5d0ca9-005e-44e6-b858-a8f5b37c5726" - } + stage = snowflake_stage.example.fully_qualified_name comment = "A stream." } @@ -56,15 +46,15 @@ resource "snowflake_stream_on_directory_table" "stream" { ### Required -- `database` (String) The database in which to create the stream. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `stage` (String) Specifies an identifier for the stage the stream will monitor. Due to Snowflake limitations, the provider can not read the stage's database and schema. For stages, Snowflake returns only partially qualified name instead of fully qualified name. Please use stages located in the same schema as the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the stream. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `stage` (String) Specifies an identifier for the stage the stream will monitor. Due to Snowflake limitations, the provider can not read the stage's database and schema. For stages, Snowflake returns only a partially qualified name instead of a fully qualified name. Please use stages located in the same schema as the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. For more information about this resource, see [docs](./stage). ### Optional - `comment` (String) Specifies a comment for the stream. -- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause.
That is sometimes used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new stream. +- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. This is used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new object with Terraform. ### Read-Only diff --git a/docs/resources/stream_on_external_table.md b/docs/resources/stream_on_external_table.md index a01c1a93bc..48ca9d2e26 100644 --- a/docs/resources/stream_on_external_table.md +++ b/docs/resources/stream_on_external_table.md @@ -7,7 +7,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0960--v0970) to use it. -!> **Note about copy_grants** Fields like `external_table`, `insert_only`, `at`, `before` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. +~> **Note about copy_grants** Fields like `external_table`, `insert_only`, `at`, `before` and `stale` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource.
ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. # snowflake_stream_on_external_table (Resource) @@ -16,32 +16,13 @@ Resource used to manage streams on external tables. For more information, check ## Example Usage ```terraform -resource "snowflake_external_table" "external_table" { - database = "db" - schema = "schema" - name = "external_table" - comment = "External table" - file_format = "TYPE = CSV FIELD_DELIMITER = '|'" - location = "@stage/directory/" - - column { - name = "id" - type = "int" - } - - column { - name = "data" - type = "text" - } -} - # basic resource resource "snowflake_stream_on_external_table" "stream" { name = "stream" schema = "schema" database = "database" - external_table = snowflake_external_table.external_table.fully_qualified_name + external_table = snowflake_external_table.example.fully_qualified_name } @@ -52,7 +33,7 @@ resource "snowflake_stream_on_external_table" "stream" { database = "database" copy_grants = true - external_table = snowflake_external_table.external_table.fully_qualified_name + external_table = snowflake_external_table.example.fully_qualified_name insert_only = "true" at { @@ -70,17 +51,17 @@ resource "snowflake_stream_on_external_table" "stream" { ### Required -- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `external_table` (String) Specifies an identifier for the external table the stream will monitor. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `external_table` (String) Specifies an identifier for the external table the stream will monitor. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. For more information about this resource, see [docs](./external_table). 
+- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional - `at` (Block List, Max: 1) This field specifies that the request is inclusive of any changes made by a statement or transaction with a timestamp equal to the specified parameter. Due to Snowflake limitations, the provider does not detect external changes on this field. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--at)) - `before` (Block List, Max: 1) This field specifies that the request refers to a point immediately preceding the specified parameter. This point in time is just before the statement, identified by its query ID, is completed. Due to Snowflake limitations, the provider does not detect external changes on this field. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--before)) - `comment` (String) Specifies a comment for the stream. 
-- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. That is sometimes used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new stream. +- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. This is used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new object with Terraform. - `insert_only` (String) Specifies whether this is an insert-only stream. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. ### Read-Only diff --git a/docs/resources/stream_on_table.md b/docs/resources/stream_on_table.md index 244f5b97a5..67361e3aaa 100644 --- a/docs/resources/stream_on_table.md +++ b/docs/resources/stream_on_table.md @@ -7,7 +7,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0960--v0970) to use it. -!> **Note about copy_grants** Fields like `table`, `append_only`, `at`, `before`, `show_initial_rows` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`.
Beware that even though a change is marked as update, the resource is recreated. +~> **Note about copy_grants** Fields like `table`, `append_only`, `at`, `before`, `show_initial_rows` and `stale` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. # snowflake_stream_on_table (Resource) @@ -16,18 +16,15 @@ Resource used to manage streams on tables. For more information, check [stream d ## Example Usage ```terraform -resource "snowflake_table" "table" { - database = "database" +# basic resource +resource "snowflake_stream_on_table" "stream" { + name = "stream" schema = "schema" - name = "name" + database = "database" - column { - type = "NUMBER(38,0)" - name = "id" - } + table = snowflake_table.example.fully_qualified_name } - # resource with more fields set resource "snowflake_stream_on_table" "stream" { name = "stream" @@ -35,7 +32,7 @@ resource "snowflake_stream_on_table" "stream" { database = "database" copy_grants = true - table = snowflake_table.table.fully_qualified_name + table = snowflake_table.example.fully_qualified_name append_only = "true" show_initial_rows = "true" @@ -54,10 +51,10 @@ resource "snowflake_stream_on_table" "stream" { ### Required -- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `table` (String) Specifies an identifier for the table the stream will monitor. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the stream. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `table` (String) Specifies an identifier for the table the stream will monitor. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. For more information about this resource, see [docs](./table). ### Optional @@ -65,7 +62,7 @@ resource "snowflake_stream_on_table" "stream" { - `at` (Block List, Max: 1) This field specifies that the request is inclusive of any changes made by a statement or transaction with a timestamp equal to the specified parameter. Due to Snowflake limitations, the provider does not detect external changes on this field. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--at)) - `before` (Block List, Max: 1) This field specifies that the request refers to a point immediately preceding the specified parameter. This point in time is just before the statement, identified by its query ID, is completed. Due to Snowflake limitations, the provider does not detect external changes on this field. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--before)) - `comment` (String) Specifies a comment for the stream. 
-- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. That is sometimes used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new stream. +- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. This is used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new object with Terraform. - `show_initial_rows` (String) Specifies whether to return all existing rows in the source table as row inserts the first time the stream is consumed. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". ### Read-Only diff --git a/docs/resources/stream_on_view.md index ea8c406eb4..4a9ae5607b 100644 --- a/docs/resources/stream_on_view.md +++ b/docs/resources/stream_on_view.md @@ -7,6 +7,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it. 
+~> **Note about copy_grants** Fields like `view`, `append_only`, `at`, `before`, `show_initial_rows` and `stale` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. + # snowflake_stream_on_view (Resource) Resource used to manage streams on views. For more information, check [stream documentation](https://docs.snowflake.com/en/sql-reference/sql/create-stream). @@ -14,22 +16,13 @@ Resource used to manage streams on views. For more information, check [stream do ## Example Usage ```terraform -resource "snowflake_view" "view" { - database = "database" - schema = "schema" - name = "view" - statement = <<-SQL - select * from foo; -SQL -} - # basic resource resource "snowflake_stream_on_view" "stream" { name = "stream" schema = "schema" database = "database" - view = snowflake_view.view.fully_qualified_name + view = snowflake_view.example.fully_qualified_name } # resource with additional fields @@ -39,7 +32,7 @@ resource "snowflake_stream_on_view" "stream" { database = "database" copy_grants = true - view = snowflake_view.view.fully_qualified_name + view = snowflake_view.example.fully_qualified_name append_only = "true" show_initial_rows = "true" @@ -58,10 +51,10 @@ resource "snowflake_stream_on_view" "stream" { ### Required -- `database` (String) The database in which to create the stream. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `view` (String) Specifies an identifier for the view the stream will monitor. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `view` (String) Specifies an identifier for the view the stream will monitor. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. For more information about this resource, see [docs](./view). ### Optional @@ -69,7 +62,7 @@ resource "snowflake_stream_on_view" "stream" { - `at` (Block List, Max: 1) This field specifies that the request is inclusive of any changes made by a statement or transaction with a timestamp equal to the specified parameter. Due to Snowflake limitations, the provider does not detect external changes on this field. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--at)) - `before` (Block List, Max: 1) This field specifies that the request refers to a point immediately preceding the specified parameter. This point in time is just before the statement, identified by its query ID, is completed. Due to Snowflake limitations, the provider does not detect external changes on this field. 
External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--before)) - `comment` (String) Specifies a comment for the stream. -- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. That is sometimes used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new stream. +- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. This is used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new object with Terraform. - `show_initial_rows` (String) Specifies whether to return all existing rows in the source table as row inserts the first time the stream is consumed. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". ### Read-Only diff --git a/docs/resources/streamlit.md index 25b6d5a28d..2087870c40 100644 --- a/docs/resources/streamlit.md +++ b/docs/resources/streamlit.md @@ -7,6 +7,12 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. 
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +!> **Note** Setting a query warehouse with lowercase letters does not work correctly in Snowflake. As a workaround, set the query warehouse with uppercase letters only, or use unsafe_execute with query warehouse ID wrapped in `'`. + + +-> **Note** Field `IMPORTS` is currently missing. It will be added in the future. + # snowflake_streamlit (Resource) Resource used to manage streamlits objects. For more information, check [streamlit documentation](https://docs.snowflake.com/en/sql-reference/commands-streamlit). @@ -19,18 +25,19 @@ resource "snowflake_streamlit" "streamlit" { database = "database" schema = "schema" name = "streamlit" - stage = "streamlit_db.streamlit_schema.streamlit_stage" + stage = snowflake_stage.example.fully_qualified_name main_file = "/streamlit_main.py" } + # resource with all fields set resource "snowflake_streamlit" "streamlit" { database = "database" schema = "schema" name = "streamlit" - stage = "streamlit_db.streamlit_schema.streamlit_stage" + stage = snowflake_stage.example.fully_qualified_name directory_location = "src" main_file = "streamlit_main.py" - query_warehouse = "warehouse" + query_warehouse = snowflake_warehouse.example.fully_qualified_name external_access_integrations = ["integration_id"] title = "title" comment = "comment" @@ -44,18 +51,18 @@ resource "snowflake_streamlit" "streamlit" { ### Required -- `database` (String) The database in which to create the streamlit -- `main_file` (String) Specifies the filename of the Streamlit Python application. This filename is relative to the value of `root_location` -- `name` (String) String that specifies the identifier (i.e. name) for the streamlit; must be unique in your account. -- `schema` (String) The schema in which to create the streamlit. -- `stage` (String) The stage in which streamlit files are located. 
+- `database` (String) The database in which to create the streamlit. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `main_file` (String) Specifies the filename of the Streamlit Python application. This filename is relative to the value of `directory_location`. +- `name` (String) String that specifies the identifier (i.e. name) for the streamlit; must be unique in your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the streamlit. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `stage` (String) The stage in which streamlit files are located. For more information about this resource, see [docs](./stage). ### Optional - `comment` (String) Specifies a comment for the streamlit. - `directory_location` (String) Specifies the full path to the named stage containing the Streamlit Python files, media files, and the environment.yml file. - `external_access_integrations` (Set of String) External access integrations connected to the Streamlit. -- `query_warehouse` (String) Specifies the warehouse where SQL queries issued by the Streamlit application are run. +- `query_warehouse` (String) Specifies the warehouse where SQL queries issued by the Streamlit application are run. 
Due to Snowflake limitations warehouse identifier can consist of only upper-cased letters. For more information about this resource, see [docs](./warehouse). - `title` (String) Specifies a title for the Streamlit app to display in Snowsight. ### Read-Only @@ -104,6 +111,5 @@ Read-Only: Import is supported using the following syntax: ```shell -# format is .. terraform import snowflake_schema.example '""."".""' ``` diff --git a/docs/resources/tag.md index 0d8c2ba31c..92a4c51caf 100644 --- a/docs/resources/tag.md +++ b/docs/resources/tag.md @@ -2,14 +2,16 @@ page_title: "snowflake_tag Resource - terraform-provider-snowflake" subcategory: "" description: |- - Resource used to manage tags. For more information, check tag documentation https://docs.snowflake.com/en/sql-reference/sql/create-tag. + Resource used to manage tags. For more information, check tag documentation https://docs.snowflake.com/en/sql-reference/sql/create-tag. For assigning tags to Snowflake objects, see tag_association resource https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/tag_association. --- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0980--v0990) to use it. +~> **Required warehouse** For this resource, the provider now uses [tag references](https://docs.snowflake.com/en/sql-reference/functions/tag_references) to get information about masking policies attached to tags. This function requires a warehouse in the connection. 
Please, make sure you have either set a `DEFAULT_WAREHOUSE` for the user, or specified a warehouse in the provider configuration. + # snowflake_tag (Resource) -Resource used to manage tags. For more information, check [tag documentation](https://docs.snowflake.com/en/sql-reference/sql/create-tag). +Resource used to manage tags. For more information, check [tag documentation](https://docs.snowflake.com/en/sql-reference/sql/create-tag). For assigning tags to Snowflake objects, see [tag_association resource](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/tag_association). ## Example Usage @@ -28,7 +30,7 @@ resource "snowflake_tag" "tag" { schema = "schema" comment = "comment" allowed_values = ["finance", "engineering", ""] - masking_policies = [snowfalke_masking_policy.masking_policy.fully_qualified_name] + masking_policies = [snowflake_masking_policy.example.fully_qualified_name] } ``` -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). @@ -39,15 +41,15 @@ resource "snowflake_tag" "tag" { ### Required -- `database` (String) The database in which to create the tag. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the tag; must be unique for the database in which the tag is created. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the tag. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the tag. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the tag; must be unique for the database in which the tag is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the tag. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional - `allowed_values` (Set of String) Set of allowed values for the tag. - `comment` (String) Specifies a comment for the tag. -- `masking_policies` (Set of String) Set of masking policies for the tag. 
A tag can support one masking policy for each data type. If masking policies are assigned to the tag, before dropping the tag, the provider automatically unassigns them. +- `masking_policies` (Set of String) Set of masking policies for the tag. A tag can support one masking policy for each data type. If masking policies are assigned to the tag, before dropping the tag, the provider automatically unassigns them. For more information about this resource, see [docs](./masking_policy). ### Read-Only diff --git a/docs/resources/tag_masking_policy_association.md b/docs/resources/tag_masking_policy_association.md index 36d6c8943a..ed23cfb3dc 100644 --- a/docs/resources/tag_masking_policy_association.md +++ b/docs/resources/tag_masking_policy_association.md @@ -7,7 +7,7 @@ description: |- # snowflake_tag_masking_policy_association (Resource) -~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_tag` +~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_tag`. Attach a masking policy to a tag. Requires a current warehouse to be set. Either with SNOWFLAKE_WAREHOUSE env variable or in current session. If no warehouse is provided, a temporary warehouse will be created. 
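Since the reworked tag resource issues [tag references](https://docs.snowflake.com/en/sql-reference/functions/tag_references) queries, a warehouse must be available in the connection. A minimal provider-level sketch (all values are placeholders, not real account details); setting a `DEFAULT_WAREHOUSE` on the user works as well:

```terraform
# Sketch only; organization_name, account_name, user, and warehouse values are placeholders.
provider "snowflake" {
  organization_name = "ORGANIZATION_NAME"
  account_name      = "ACCOUNT_NAME"
  user              = "TERRAFORM_USER"

  # Ensures session queries such as tag_references have a warehouse available,
  # as an alternative to setting DEFAULT_WAREHOUSE on the user.
  warehouse = "COMPUTE_WH"
}
```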
diff --git a/docs/resources/task.md b/docs/resources/task.md index 8343f80763..8f9a2c6034 100644 --- a/docs/resources/task.md +++ b/docs/resources/task.md @@ -69,13 +69,13 @@ resource "snowflake_task" "test" { database = "database" schema = "schema" name = "task" - warehouse = "warehouse" + warehouse = snowflake_warehouse.example.fully_qualified_name started = true sql_statement = "select 1" config = "{\"key\":\"value\"}" allow_overlapping_execution = true - error_integration = "" + error_integration = snowflake_notification_integration.example.fully_qualified_name when = "SYSTEM$STREAM_HAS_DATA('')" comment = "complete task" @@ -151,16 +151,16 @@ resource "snowflake_task" "test" { ### Required -- `database` (String) The database in which to create the task. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `name` (String) Specifies the identifier for the task; must be unique for the database and schema in which the task is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` -- `schema` (String) The schema in which to create the task. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `database` (String) The database in which to create the task. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `name` (String) Specifies the identifier for the task; must be unique for the database and schema in which the task is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. +- `schema` (String) The schema in which to create the task. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `sql_statement` (String) Any single SQL statement, or a call to a stored procedure, executed when the task runs. - `started` (Boolean) Specifies if the task should be started or suspended. ### Optional - `abort_detached_query` (Boolean) Specifies the action that Snowflake performs for in-progress queries if connectivity is lost due to abrupt termination of a session (e.g. network outage, browser termination, service interruption). For more information, check [ABORT_DETACHED_QUERY docs](https://docs.snowflake.com/en/sql-reference/parameters#abort-detached-query). -- `after` (Set of String) Specifies one or more predecessor tasks for the current task. Use this option to [create a DAG](https://docs.snowflake.com/en/user-guide/tasks-graphs.html#label-task-dag) of tasks or add this task to an existing DAG. A DAG is a series of tasks that starts with a scheduled root task and is linked together by dependencies. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `after` (Set of String) Specifies one or more predecessor tasks for the current task. Use this option to [create a DAG](https://docs.snowflake.com/en/user-guide/tasks-graphs.html#label-task-dag) of tasks or add this task to an existing DAG. A DAG is a series of tasks that starts with a scheduled root task and is linked together by dependencies. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `allow_overlapping_execution` (String) By default, Snowflake ensures that only one instance of a particular DAG is allowed to run at a time, setting the parameter value to TRUE permits DAG runs to overlap. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. - `autocommit` (Boolean) Specifies whether autocommit is enabled for the session. Autocommit determines whether a DML statement, when executed without an active transaction, is automatically committed after the statement successfully completes. For more information, see [Transactions](https://docs.snowflake.com/en/sql-reference/transactions). For more information, check [AUTOCOMMIT docs](https://docs.snowflake.com/en/sql-reference/parameters#autocommit). - `binary_input_format` (String) The format of VARCHAR values passed as input to VARCHAR-to-BINARY conversion functions. 
For more information, see [Binary input and output](https://docs.snowflake.com/en/sql-reference/binary-input-output). For more information, check [BINARY_INPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#binary-input-format). @@ -178,10 +178,10 @@ resource "snowflake_task" "test" { - `date_input_format` (String) Specifies the input format for the DATE data type. For more information, see [Date and time input and output formats](https://docs.snowflake.com/en/sql-reference/date-time-input-output). For more information, check [DATE_INPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#date-input-format). - `date_output_format` (String) Specifies the display format for the DATE data type. For more information, see [Date and time input and output formats](https://docs.snowflake.com/en/sql-reference/date-time-input-output). For more information, check [DATE_OUTPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#date-output-format). - `enable_unload_physical_type_optimization` (Boolean) Specifies whether to set the schema for unloaded Parquet files based on the logical column data types (i.e. the types in the unload SQL query or source table) or on the unloaded column values (i.e. the smallest data types and precision that support the values in the output columns of the unload SQL statement or source table). For more information, check [ENABLE_UNLOAD_PHYSICAL_TYPE_OPTIMIZATION docs](https://docs.snowflake.com/en/sql-reference/parameters#enable-unload-physical-type-optimization). -- `error_integration` (String) Specifies the name of the notification integration used for error notifications. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `error_integration` (String) Specifies the name of the notification integration used for error notifications. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. For more information about this resource, see [docs](./notification_integration). - `error_on_nondeterministic_merge` (Boolean) Specifies whether to return an error when the [MERGE](https://docs.snowflake.com/en/sql-reference/sql/merge) command is used to update or delete a target row that joins multiple source rows and the system cannot determine the action to perform on the target row. For more information, check [ERROR_ON_NONDETERMINISTIC_MERGE docs](https://docs.snowflake.com/en/sql-reference/parameters#error-on-nondeterministic-merge). - `error_on_nondeterministic_update` (Boolean) Specifies whether to return an error when the [UPDATE](https://docs.snowflake.com/en/sql-reference/sql/update) command is used to update a target row that joins multiple source rows and the system cannot determine the action to perform on the target row. For more information, check [ERROR_ON_NONDETERMINISTIC_UPDATE docs](https://docs.snowflake.com/en/sql-reference/parameters#error-on-nondeterministic-update). -- `finalize` (String) Specifies the name of a root task that the finalizer task is associated with. Finalizer tasks run after all other tasks in the task graph run to completion. 
You can define the SQL of a finalizer task to handle notifications and the release and cleanup of resources that a task graph uses. For more information, see [Release and cleanup of task graphs](https://docs.snowflake.com/en/user-guide/tasks-graphs.html#label-finalizer-task). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `finalize` (String) Specifies the name of a root task that the finalizer task is associated with. Finalizer tasks run after all other tasks in the task graph run to completion. You can define the SQL of a finalizer task to handle notifications and the release and cleanup of resources that a task graph uses. For more information, see [Release and cleanup of task graphs](https://docs.snowflake.com/en/user-guide/tasks-graphs.html#label-finalizer-task). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. - `geography_output_format` (String) Display format for [GEOGRAPHY values](https://docs.snowflake.com/en/sql-reference/data-types-geospatial.html#label-data-types-geography). For more information, check [GEOGRAPHY_OUTPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#geography-output-format). - `geometry_output_format` (String) Display format for [GEOMETRY values](https://docs.snowflake.com/en/sql-reference/data-types-geospatial.html#label-data-types-geometry). For more information, check [GEOMETRY_OUTPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#geometry-output-format). 
 - `jdbc_treat_timestamp_ntz_as_utc` (Boolean) Specifies how JDBC processes TIMESTAMP_NTZ values. For more information, check [JDBC_TREAT_TIMESTAMP_NTZ_AS_UTC docs](https://docs.snowflake.com/en/sql-reference/parameters#jdbc-treat-timestamp-ntz-as-utc).
@@ -219,10 +219,10 @@ resource "snowflake_task" "test" {
 - `two_digit_century_start` (Number) Specifies the “century start” year for 2-digit years (i.e. the earliest year such dates can represent). This parameter prevents ambiguous dates when importing or converting data with the `YY` date format component (i.e. years represented as 2 digits). For more information, check [TWO_DIGIT_CENTURY_START docs](https://docs.snowflake.com/en/sql-reference/parameters#two-digit-century-start).
 - `unsupported_ddl_action` (String) Determines if an unsupported (i.e. non-default) value specified for a constraint property returns an error. For more information, check [UNSUPPORTED_DDL_ACTION docs](https://docs.snowflake.com/en/sql-reference/parameters#unsupported-ddl-action).
 - `use_cached_result` (Boolean) Specifies whether to reuse persisted query results, if available, when a matching query is submitted. For more information, check [USE_CACHED_RESULT docs](https://docs.snowflake.com/en/sql-reference/parameters#use-cached-result).
-- `user_task_managed_initial_warehouse_size` (String) Specifies the size of the compute resources to provision for the first run of the task, before a task history is available for Snowflake to determine an ideal size. Once a task has successfully completed a few runs, Snowflake ignores this parameter setting. Valid values are (case-insensitive): %s. (Conflicts with warehouse) For more information, check [USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE docs](https://docs.snowflake.com/en/sql-reference/parameters#user-task-managed-initial-warehouse-size).
+- `user_task_managed_initial_warehouse_size` (String) Specifies the size of the compute resources to provision for the first run of the task, before a task history is available for Snowflake to determine an ideal size. Once a task has successfully completed a few runs, Snowflake ignores this parameter setting. Valid values are (case-insensitive): %s. (Conflicts with warehouse). For more information about warehouses, see [docs](./warehouse). For more information, check [USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE docs](https://docs.snowflake.com/en/sql-reference/parameters#user-task-managed-initial-warehouse-size).
 - `user_task_minimum_trigger_interval_in_seconds` (Number) Minimum amount of time between Triggered Task executions in seconds For more information, check [USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS docs](https://docs.snowflake.com/en/sql-reference/parameters#user-task-minimum-trigger-interval-in-seconds).
 - `user_task_timeout_ms` (Number) Specifies the time limit on a single run of the task before it times out (in milliseconds). For more information, check [USER_TASK_TIMEOUT_MS docs](https://docs.snowflake.com/en/sql-reference/parameters#user-task-timeout-ms).
-- `warehouse` (String) The warehouse the task will use. Omit this parameter to use Snowflake-managed compute resources for runs of this task. Due to Snowflake limitations warehouse identifier can consist of only upper-cased letters. (Conflicts with user_task_managed_initial_warehouse_size)
+- `warehouse` (String) The warehouse the task will use. Omit this parameter to use Snowflake-managed compute resources for runs of this task. Due to Snowflake limitations warehouse identifier can consist of only upper-cased letters. (Conflicts with user_task_managed_initial_warehouse_size) For more information about this resource, see [docs](./warehouse).
 - `week_of_year_policy` (Number) Specifies how the weeks in a given year are computed. `0`: The semantics used are equivalent to the ISO semantics, in which a week belongs to a given year if at least 4 days of that week are in that year. `1`: January 1 is included in the first week of the year and December 31 is included in the last week of the year. For more information, check [WEEK_OF_YEAR_POLICY docs](https://docs.snowflake.com/en/sql-reference/parameters#week-of-year-policy).
 - `week_start` (Number) Specifies the first day of the week (used by week-related date functions). `0`: Legacy Snowflake behavior is used (i.e. ISO-like semantics). `1` (Monday) to `7` (Sunday): All the week-related functions use weeks that start on the specified day of the week. For more information, check [WEEK_START docs](https://docs.snowflake.com/en/sql-reference/parameters#week-start).
 - `when` (String) Specifies a Boolean SQL expression; multiple conditions joined with AND/OR are supported. When a task is triggered (based on its SCHEDULE or AFTER setting), it validates the conditions of the expression to determine whether to execute. If the conditions of the expression are not met, then the task skips the current run. Any tasks that identify this task as a predecessor also don’t run.
@@ -1046,6 +1046,5 @@ Read-Only:
 Import is supported using the following syntax:
 
 ```shell
-# format is database name | schema name | task name
-terraform import snowflake_task.example 'dbName|schemaName|taskName'
+terraform import snowflake_task.example '"<database_name>"."<schema_name>"."<task_name>"'
 ```
diff --git a/docs/resources/user.md b/docs/resources/user.md
index 0f542d36b9..b1bb599467 100644
--- a/docs/resources/user.md
+++ b/docs/resources/user.md
@@ -40,9 +40,9 @@ resource "snowflake_user" "user" {
   display_name = "Snowflake User display name"
   email        = "user@snowflake.example"
 
-  default_warehouse              = "warehouse"
+  default_warehouse              = snowflake_warehouse.example.fully_qualified_name
   default_secondary_roles_option = "ALL"
-  default_role                   = "role1"
+  default_role                   = snowflake_role.example.fully_qualified_name
   default_namespace              = "some.namespace"
 
   mins_to_unlock = 9
@@ -128,7 +128,7 @@ resource "snowflake_user" "u" {
 
 ### Required
 
-- `name` (String) Name of the user. Note that if you do not supply login_name this will be used as login_name. Check the [docs](https://docs.snowflake.net/manuals/sql-reference/sql/create-user.html#required-parameters). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
+- `name` (String) Name of the user. Note that if you do not supply login_name this will be used as login_name. Check the [docs](https://docs.snowflake.net/manuals/sql-reference/sql/create-user.html#required-parameters). Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
 
 ### Optional
 
@@ -149,9 +149,9 @@ resource "snowflake_user" "u" {
 - `date_output_format` (String) Specifies the display format for the DATE data type. For more information, see [Date and time input and output formats](https://docs.snowflake.com/en/sql-reference/date-time-input-output). For more information, check [DATE_OUTPUT_FORMAT docs](https://docs.snowflake.com/en/sql-reference/parameters#date-output-format).
 - `days_to_expiry` (Number) Specifies the number of days after which the user status is set to `Expired` and the user is no longer allowed to log in. This is useful for defining temporary users (i.e. users who should only have access to Snowflake for a limited time period). In general, you should not set this property for [account administrators](https://docs.snowflake.com/en/user-guide/security-access-control-considerations.html#label-accountadmin-users) (i.e. users with the `ACCOUNTADMIN` role) because Snowflake locks them out when they become `Expired`. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
 - `default_namespace` (String) Specifies the namespace (database only or database and schema) that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the namespace exists.
-- `default_role` (String) Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists.
+- `default_role` (String) Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists. For more information about this resource, see [docs](./account_role).
 - `default_secondary_roles_option` (String) Specifies the secondary roles that are active for the user’s session upon login. Valid values are (case-insensitive): `DEFAULT` | `NONE` | `ALL`. More information can be found in [doc](https://docs.snowflake.com/en/sql-reference/sql/create-user#optional-object-properties-objectproperties).
-- `default_warehouse` (String) Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists.
+- `default_warehouse` (String) Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists. For more information about this resource, see [docs](./warehouse).
 - `disable_mfa` (String) Allows enabling or disabling [multi-factor authentication](https://docs.snowflake.com/en/user-guide/security-mfa). Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
 - `disabled` (String) Specifies whether the user is disabled, which prevents logging in and aborts all the currently-running queries for the user. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
 - `display_name` (String) Name displayed for the user in the Snowflake web interface.
@@ -179,7 +179,7 @@ resource "snowflake_user" "u" {
 - `network_policy` (String) Specifies the network policy to enforce for your account. Network policies enable restricting access to your account based on users’ IP address. For more details, see [Controlling network traffic with network policies](https://docs.snowflake.com/en/user-guide/network-policies). Any existing network policy (created using [CREATE NETWORK POLICY](https://docs.snowflake.com/en/sql-reference/sql/create-network-policy)). For more information, check [NETWORK_POLICY docs](https://docs.snowflake.com/en/sql-reference/parameters#network-policy).
 - `noorder_sequence_as_default` (Boolean) Specifies whether the ORDER or NOORDER property is set by default when you create a new sequence or add a new table column. The ORDER and NOORDER properties determine whether or not the values are generated for the sequence or auto-incremented column in [increasing or decreasing order](https://docs.snowflake.com/en/user-guide/querying-sequences.html#label-querying-sequences-increasing-values). For more information, check [NOORDER_SEQUENCE_AS_DEFAULT docs](https://docs.snowflake.com/en/sql-reference/parameters#noorder-sequence-as-default).
 - `odbc_treat_decimal_as_int` (Boolean) Specifies how ODBC processes columns that have a scale of zero (0). For more information, check [ODBC_TREAT_DECIMAL_AS_INT docs](https://docs.snowflake.com/en/sql-reference/parameters#odbc-treat-decimal-as-int).
-- `password` (String, Sensitive) Password for the user. **WARNING:** this will put the password in the terraform state file. Use carefully.
+- `password` (String, Sensitive) Password for the user. **WARNING:** this will put the password in the terraform state file. Use carefully. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".
 - `prevent_unload_to_internal_stages` (Boolean) Specifies whether to prevent data unload operations to internal (Snowflake) stages using [COPY INTO ](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location) statements. For more information, check [PREVENT_UNLOAD_TO_INTERNAL_STAGES docs](https://docs.snowflake.com/en/sql-reference/parameters#prevent-unload-to-internal-stages).
 - `query_tag` (String) Optional string that can be used to tag queries and other SQL statements executed within a session. The tags are displayed in the output of the [QUERY_HISTORY, QUERY_HISTORY_BY_*](https://docs.snowflake.com/en/sql-reference/functions/query_history) functions. For more information, check [QUERY_TAG docs](https://docs.snowflake.com/en/sql-reference/parameters#query-tag).
 - `quoted_identifiers_ignore_case` (Boolean) Specifies whether letters in double-quoted object identifiers are stored and resolved as uppercase letters. By default, Snowflake preserves the case of alphabetic characters when storing and resolving double-quoted identifiers (see [Identifier resolution](https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html#label-identifier-casing)). You can use this parameter in situations in which [third-party applications always use double quotes around identifiers](https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html#label-identifier-casing-parameter). For more information, check [QUOTED_IDENTIFIERS_IGNORE_CASE docs](https://docs.snowflake.com/en/sql-reference/parameters#quoted-identifiers-ignore-case).
diff --git a/docs/resources/view.md b/docs/resources/view.md
index 4295f33051..9a4efafe6d 100644
--- a/docs/resources/view.md
+++ b/docs/resources/view.md
@@ -7,10 +7,10 @@ description: |-
 
 !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v094x--v0950) to use it.
 
-!> **Note about copy_grants** Fields like `is_recursive`, `is_temporary`, `copy_grants` and `statement` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-view)), and a change means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated.
-
 !> Due to Snowflake limitations, to properly compute diff on `statement` field, the provider parses a `text` field which contains the whole CREATE query used to create the resource. We recommend not using special characters, especially `(`, `,`, `)` in any of the fields, if possible.
+~> **Note about copy_grants** Fields like `is_recursive`, `is_temporary`, `copy_grants` and `statement` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-view)), and a change on these fields means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated.
+
+~> **Required warehouse** For this resource, the provider uses [policy references](https://docs.snowflake.com/en/sql-reference/functions/policy_references) which requires a warehouse in the connection. Please, make sure you have either set a DEFAULT_WAREHOUSE for the user, or specified a warehouse in the provider configuration.
 
 # snowflake_view (Resource)
 
@@ -59,12 +59,12 @@ resource "snowflake_view" "test" {
       policy_name = "projection_policy"
     }
     masking_policy {
-      policy_name = "masking_policy"
+      policy_name = snowflake_masking_policy.example.fully_qualified_name
      using       = ["address"]
     }
   }
   row_access_policy {
-    policy_name = "row_access_policy"
+    policy_name = snowflake_row_access_policy.example.fully_qualified_name
    on          = ["id"]
   }
   aggregation_policy {
@@ -72,8 +72,9 @@
     entity_key = ["id"]
   }
   data_metric_function {
-    function_name = "data_metric_function"
-    on            = ["id"]
+    function_name   = "data_metric_function"
+    on              = ["id"]
+    schedule_status = "STARTED"
   }
   data_metric_schedule {
     using_cron = "15 * * * * UTC"
@@ -91,10 +92,10 @@ SQL
 
 ### Required
 
-- `database` (String) The database in which to create the view. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
-- `name` (String) Specifies the identifier for the view; must be unique for the schema in which the view is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
-- `schema` (String) The schema in which to create the view. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`
-- `statement` (String) Specifies the query used to create the view.
+- `database` (String) The database in which to create the view. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
+- `name` (String) Specifies the identifier for the view; must be unique for the schema in which the view is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
+- `schema` (String) The schema in which to create the view. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
+- `statement` (String) Specifies the query used to create the view. To mitigate permadiff on this field, the provider replaces blank characters with a space. This can lead to false positives in cases where a change in case or run of whitespace is semantically significant.
 
 ### Optional
 
@@ -102,7 +103,7 @@ SQL
 - `change_tracking` (String) Specifies to enable or disable change tracking on the table. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
 - `column` (Block List) If you want to change the name of a column or add a comment to a column in the new view, include a column list that specifies the column names and (if needed) comments about the columns. You do not need to specify the data types of the columns. If this field is not specified, columns are inferred from the `statement` field by Snowflake. (see [below for nested schema](#nestedblock--column))
 - `comment` (String) Specifies a comment for the view.
-- `copy_grants` (Boolean) Retains the access permissions from the original view when a new view is created using the OR REPLACE clause.
+- `copy_grants` (Boolean) Retains the access permissions from the original view when a view is recreated using the OR REPLACE clause. This is used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect during creating a new object with Terraform.
 - `data_metric_function` (Block Set) Data metric functions used for the view. (see [below for nested schema](#nestedblock--data_metric_function))
 - `data_metric_schedule` (Block List, Max: 1) Specifies the schedule to run the data metric functions periodically. (see [below for nested schema](#nestedblock--data_metric_schedule))
 - `is_recursive` (String) Specifies that the view can refer to itself using recursive syntax without necessarily using a CTE (common table expression). Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
@@ -147,7 +148,7 @@ Optional:
 
 Required:
 
-- `policy_name` (String) Specifies the masking policy to set on a column.
+- `policy_name` (String) Specifies the masking policy to set on a column. For more information about this resource, see [docs](./masking_policy).
 
 Optional:
 
@@ -188,7 +189,7 @@ Optional:
 
 Required:
 
 - `on` (Set of String) Defines which columns are affected by the policy.
-- `policy_name` (String) Row access policy name.
+- `policy_name` (String) Row access policy name. For more information about this resource, see [docs](./row_access_policy).
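The `statement` note above says the provider collapses blank characters into single spaces before comparing queries. A purely illustrative shell sketch of why that can produce false positives; the `normalize` function is an assumption for demonstration, not the provider's actual (Go) implementation:

```shell
# Collapse every run of blank characters (spaces, tabs, newlines) into one space.
normalize() { printf '%s' "$1" | tr '\n\t' '  ' | sed 's/  */ /g'; }

# A pure reformat of the query compares equal after normalization...
[ "$(normalize 'SELECT id
FROM t')" = "$(normalize 'SELECT  id FROM t')" ] && echo "no diff detected"

# ...but whitespace inside a string literal is also collapsed, even though
# there it is semantically significant - hence the possible false positives.
[ "$(normalize "SELECT 'a  b'")" = "$(normalize "SELECT 'a b'")" ] && echo "false positive"
```

Both comparisons succeed, which is exactly the trade-off the docs warn about: whitespace-only edits are suppressed, including the ones that matter.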
diff --git a/docs/resources/warehouse.md b/docs/resources/warehouse.md index af0fbe914d..44f1ca5e65 100644 --- a/docs/resources/warehouse.md +++ b/docs/resources/warehouse.md @@ -7,6 +7,12 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. + +-> **Note** Field `RESOURCE_CONSTRAINT` is currently missing. It will be added in the future. + + +-> **Note** Assigning resource monitors to warehouses requires ACCOUNTADMIN role. To do this, either manage the warehouse resource with ACCOUNTADMIN role, or use [unsafe_execute](./unsafe_execute) instead. See [this issue](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3019) for more details. + # snowflake_warehouse (Resource) Resource used to manage warehouse objects. For more information, check [warehouse documentation](https://docs.snowflake.com/en/sql-reference/commands-warehouse). @@ -14,10 +20,30 @@ Resource used to manage warehouse objects. 
For more information, check [warehous ## Example Usage ```terraform +# Resource with required fields +resource "snowflake_warehouse" "warehouse" { + name = "WAREHOUSE" +} + +# Resource with all fields resource "snowflake_warehouse" "warehouse" { - name = "test" - comment = "foo" - warehouse_size = "small" + name = "WAREHOUSE" + warehouse_type = "SNOWPARK-OPTIMIZED" + warehouse_size = "MEDIUM" + max_cluster_count = 4 + min_cluster_count = 2 + scaling_policy = "ECONOMY" + auto_suspend = 1200 + auto_resume = false + initially_suspended = false + resource_monitor = snowflake_resource_monitor.monitor.fully_qualified_name + comment = "An example warehouse." + enable_query_acceleration = true + query_acceleration_max_scale_factor = 4 + + max_concurrency_level = 4 + statement_queued_timeout_in_seconds = 5 + statement_timeout_in_seconds = 86400 } ``` -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). @@ -28,7 +54,7 @@ resource "snowflake_warehouse" "warehouse" { ### Required -- `name` (String) Identifier for the virtual warehouse; must be unique for your account. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"` +- `name` (String) Identifier for the virtual warehouse; must be unique for your account. 
Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`. ### Optional @@ -41,7 +67,7 @@ resource "snowflake_warehouse" "warehouse" { - `max_concurrency_level` (Number) Object parameter that specifies the concurrency level for SQL statements (i.e. queries and DML) executed by a warehouse. - `min_cluster_count` (Number) Specifies the minimum number of server clusters for the warehouse (only applies to multi-cluster warehouses). - `query_acceleration_max_scale_factor` (Number) Specifies the maximum scale factor for leasing compute resources for query acceleration. The scale factor is used as a multiplier based on warehouse size. -- `resource_monitor` (String) Specifies the name of a resource monitor that is explicitly assigned to the warehouse. +- `resource_monitor` (String) Specifies the name of a resource monitor that is explicitly assigned to the warehouse. For more information about this resource, see [docs](./resource_monitor). - `scaling_policy` (String) Specifies the policy for automatically starting and shutting down clusters in a multi-cluster warehouse running in Auto-scale mode. Valid values are (case-insensitive): `STANDARD` | `ECONOMY`. - `statement_queued_timeout_in_seconds` (Number) Object parameter that specifies the time, in seconds, a SQL statement (query, DDL, DML, etc.) can be queued on a warehouse before it is canceled by the system. - `statement_timeout_in_seconds` (Number) Specifies the time, in seconds, after which a running SQL statement (query, DDL, DML, etc.) 
is canceled by the system @@ -139,5 +165,5 @@ Read-Only: Import is supported using the following syntax: ```shell -terraform import snowflake_warehouse.example warehouseName +terraform import snowflake_warehouse.example '""' ``` diff --git a/examples/data-sources/snowflake_connections/data-source.tf b/examples/data-sources/snowflake_connections/data-source.tf index b32fd92e5e..58e2b8ea5e 100644 --- a/examples/data-sources/snowflake_connections/data-source.tf +++ b/examples/data-sources/snowflake_connections/data-source.tf @@ -23,3 +23,15 @@ data "snowflake_connections" "like_prefix" { output "like_prefix_output" { value = data.snowflake_connections.like_prefix.connections } + +# Ensure the number of connections is equal to at exactly one element (with the use of check block) +check "connection_check" { + data "snowflake_connections" "assert_with_check_block" { + like = "connection-name" + } + + assert { + condition = length(data.snowflake_connections.assert_with_check_block.connections) == 1 + error_message = "connections filtered by '${data.snowflake_connections.assert_with_check_block.like}' returned ${length(data.snowflake_connections.assert_with_check_block.connections)} connections where one was expected" + } +} diff --git a/examples/data-sources/snowflake_database_roles/data-source.tf b/examples/data-sources/snowflake_database_roles/data-source.tf index ff07fdc68b..29621c29e9 100644 --- a/examples/data-sources/snowflake_database_roles/data-source.tf +++ b/examples/data-sources/snowflake_database_roles/data-source.tf @@ -44,7 +44,7 @@ data "snowflake_database_roles" "assert_with_postcondition" { # Ensure the number of database roles is equal to at exactly one element (with the use of check block) check "database_role_check" { - data "snowflake_resource_monitors" "assert_with_check_block" { + data "snowflake_database_roles" "assert_with_check_block" { in_database = "database-name" like = "database_role-name" } diff --git 
a/examples/data-sources/snowflake_network_policies/data-source.tf b/examples/data-sources/snowflake_network_policies/data-source.tf index 55b7ce844a..496ade396f 100644 --- a/examples/data-sources/snowflake_network_policies/data-source.tf +++ b/examples/data-sources/snowflake_network_policies/data-source.tf @@ -27,7 +27,7 @@ output "only_show_output" { # Ensure the number of network policies is equal to at least one element (with the use of postcondition) data "snowflake_network_policies" "assert_with_postcondition" { - starts_with = "network-policy-name" + like = "network-policy-name" lifecycle { postcondition { condition = length(self.network_policies) > 0 diff --git a/examples/resources/snowflake_account/import.sh b/examples/resources/snowflake_account/import.sh new file mode 100644 index 0000000000..8076279421 --- /dev/null +++ b/examples/resources/snowflake_account/import.sh @@ -0,0 +1 @@ +terraform import snowflake_account.example '"".""' diff --git a/examples/resources/snowflake_account/resource.tf b/examples/resources/snowflake_account/resource.tf index 3de2897d40..a9e61e2d3f 100644 --- a/examples/resources/snowflake_account/resource.tf +++ b/examples/resources/snowflake_account/resource.tf @@ -1,18 +1,42 @@ -provider "snowflake" { - role = "ORGADMIN" - alias = "orgadmin" +## Minimal +resource "snowflake_account" "minimal" { + name = "ACCOUNT_NAME" + admin_name = "ADMIN_NAME" + admin_password = "ADMIN_PASSWORD" + email = "admin@email.com" + edition = "STANDARD" + grace_period_in_days = 3 +} + +## Complete (with SERVICE user type) +resource "snowflake_account" "complete" { + name = "ACCOUNT_NAME" + admin_name = "ADMIN_NAME" + admin_rsa_public_key = "" + admin_user_type = "SERVICE" + email = "admin@email.com" + edition = "STANDARD" + region_group = "PUBLIC" + region = "AWS_US_WEST_2" + comment = "some comment" + is_org_admin = "true" + grace_period_in_days = 3 } -resource "snowflake_account" "ac1" { - provider = snowflake.orgadmin - name = 
"SNOWFLAKE_TEST_ACCOUNT" - admin_name = "John Doe" - admin_password = "Abcd1234!" - email = "john.doe@snowflake.com" - first_name = "John" - last_name = "Doe" - must_change_password = true +## Complete (with PERSON user type) +resource "snowflake_account" "complete" { + name = "ACCOUNT_NAME" + admin_name = "ADMIN_NAME" + admin_password = "ADMIN_PASSWORD" + admin_user_type = "PERSON" + first_name = "first_name" + last_name = "last_name" + email = "admin@email.com" + must_change_password = "false" edition = "STANDARD" - comment = "Snowflake Test Account" + region_group = "PUBLIC" region = "AWS_US_WEST_2" + comment = "some comment" + is_org_admin = "true" + grace_period_in_days = 3 } diff --git a/examples/resources/snowflake_account_role/import.sh b/examples/resources/snowflake_account_role/import.sh index d7d6ebddbe..28bc0caf54 100644 --- a/examples/resources/snowflake_account_role/import.sh +++ b/examples/resources/snowflake_account_role/import.sh @@ -1 +1 @@ -terraform import snowflake_account_role.example "name" +terraform import snowflake_account_role.example '""' diff --git a/examples/resources/snowflake_api_authentication_integration_with_authorization_code_grant/import.sh b/examples/resources/snowflake_api_authentication_integration_with_authorization_code_grant/import.sh index c641594f3b..d825e7f812 100644 --- a/examples/resources/snowflake_api_authentication_integration_with_authorization_code_grant/import.sh +++ b/examples/resources/snowflake_api_authentication_integration_with_authorization_code_grant/import.sh @@ -1 +1 @@ -terraform import snowflake_api_authentication_integration_with_authorization_code_grant.example "name" +terraform import snowflake_api_authentication_integration_with_authorization_code_grant.example '""' diff --git a/examples/resources/snowflake_api_authentication_integration_with_client_credentials/import.sh b/examples/resources/snowflake_api_authentication_integration_with_client_credentials/import.sh index d3454c9a27..60b80879ac 
100644 --- a/examples/resources/snowflake_api_authentication_integration_with_client_credentials/import.sh +++ b/examples/resources/snowflake_api_authentication_integration_with_client_credentials/import.sh @@ -1 +1 @@ -terraform import snowflake_api_authentication_integration_with_client_credentials.example "name" +terraform import snowflake_api_authentication_integration_with_client_credentials.example '"<name>"' diff --git a/examples/resources/snowflake_api_authentication_integration_with_jwt_bearer/import.sh b/examples/resources/snowflake_api_authentication_integration_with_jwt_bearer/import.sh index b1cb40660a..a3d29286ad 100644 --- a/examples/resources/snowflake_api_authentication_integration_with_jwt_bearer/import.sh +++ b/examples/resources/snowflake_api_authentication_integration_with_jwt_bearer/import.sh @@ -1 +1 @@ -terraform import snowflake_api_authentication_integration_with_jwt_bearer.example "name" +terraform import snowflake_api_authentication_integration_with_jwt_bearer.example '"<name>"' diff --git a/examples/resources/snowflake_database/import.sh b/examples/resources/snowflake_database/import.sh index 8a30774299..add2afbd03 100644 --- a/examples/resources/snowflake_database/import.sh +++ b/examples/resources/snowflake_database/import.sh @@ -1 +1 @@ -terraform import snowflake_database.example 'database_name' +terraform import snowflake_database.example '"<database_name>"' diff --git a/examples/resources/snowflake_database/resource.tf b/examples/resources/snowflake_database/resource.tf index 13c1833c6b..616e0b2c00 100644 --- a/examples/resources/snowflake_database/resource.tf +++ b/examples/resources/snowflake_database/resource.tf @@ -10,10 +10,9 @@ resource "snowflake_database" "primary" { comment = "my standard database" data_retention_time_in_days = 10 - data_retention_time_in_days_save = 10 max_data_extension_time_in_days = 20 - external_volume = "<external_volume_name>" - catalog = "<catalog_name>" + external_volume = snowflake_external_volume.example.fully_qualified_name + catalog =
snowflake_catalog.example.fully_qualified_name replace_invalid_characters = false default_ddl_collation = "en_US" storage_serialization_policy = "COMPATIBLE" @@ -40,11 +39,11 @@ resource "snowflake_database" "primary" { locals { replication_configs = [ { - account_identifier = "<organization_name>.<account_name_1>" + account_identifier = "\"<organization_name>\".\"<account_name_1>\"" with_failover = true }, { - account_identifier = "<organization_name>.<account_name_2>" + account_identifier = "\"<organization_name>\".\"<account_name_2>\"" with_failover = true }, ] @@ -52,10 +51,13 @@ locals { resource "snowflake_database" "primary" { name = "database_name" - for_each = local.replication_configs + for_each = { for rc in local.replication_configs : rc.account_identifier => rc } replication { - enable_to_account = each.value + enable_to_account { + account_identifier = each.value.account_identifier + with_failover = each.value.with_failover + } ignore_edition_check = true } } diff --git a/examples/resources/snowflake_external_oauth_integration/import.sh b/examples/resources/snowflake_external_oauth_integration/import.sh index 8029ac973e..d4ad4dd90d 100644 --- a/examples/resources/snowflake_external_oauth_integration/import.sh +++ b/examples/resources/snowflake_external_oauth_integration/import.sh @@ -1 +1 @@ -terraform import snowflake_external_oauth_integration.example "name" +terraform import snowflake_external_oauth_integration.example '"<name>"' diff --git a/examples/resources/snowflake_external_oauth_integration/resource.tf b/examples/resources/snowflake_external_oauth_integration/resource.tf index ef29249ace..017f0edf04 100644 --- a/examples/resources/snowflake_external_oauth_integration/resource.tf +++ b/examples/resources/snowflake_external_oauth_integration/resource.tf @@ -11,7 +11,7 @@ resource "snowflake_external_oauth_integration" "test" { resource "snowflake_external_oauth_integration" "test" { comment = "comment" enabled = true - external_oauth_allowed_roles_list = ["user1"] + external_oauth_allowed_roles_list = [snowflake_role.one.fully_qualified_name] external_oauth_any_role_mode = "ENABLE"
external_oauth_audience_list = ["https://example.com"] external_oauth_issuer = "issuer" @@ -29,7 +29,7 @@ resource "snowflake_external_oauth_integration" "test" { enabled = true external_oauth_any_role_mode = "ENABLE" external_oauth_audience_list = ["https://example.com"] - external_oauth_blocked_roles_list = ["user1"] + external_oauth_blocked_roles_list = [snowflake_role.one.fully_qualified_name] external_oauth_issuer = "issuer" external_oauth_rsa_public_key = file("key.pem") external_oauth_rsa_public_key_2 = file("key2.pem") diff --git a/examples/resources/snowflake_grant_account_role/resource.tf b/examples/resources/snowflake_grant_account_role/resource.tf index d3af9ba081..7558d13da7 100644 --- a/examples/resources/snowflake_grant_account_role/resource.tf +++ b/examples/resources/snowflake_grant_account_role/resource.tf @@ -3,11 +3,11 @@ ################################## resource "snowflake_account_role" "role" { - name = var.role_name + name = "ROLE" } resource "snowflake_account_role" "parent_role" { - name = var.parent_role_name + name = "PARENT_ROLE" } resource "snowflake_grant_account_role" "g" { @@ -21,11 +21,11 @@ resource "snowflake_grant_account_role" "g" { ################################## resource "snowflake_account_role" "role" { - name = var.role_name + name = "ROLE" } resource "snowflake_user" "user" { - name = var.user_name + name = "USER" } resource "snowflake_grant_account_role" "g" { diff --git a/examples/resources/snowflake_network_policy/import.sh b/examples/resources/snowflake_network_policy/import.sh index e9f2b372ac..9da953c5d9 100644 --- a/examples/resources/snowflake_network_policy/import.sh +++ b/examples/resources/snowflake_network_policy/import.sh @@ -1 +1 @@ -terraform import snowflake_network_policy.example "name" +terraform import snowflake_network_policy.example '"<name>"' diff --git a/examples/resources/snowflake_network_policy/resource.tf b/examples/resources/snowflake_network_policy/resource.tf index d6cfe4bb62..5ff38f82b7 100644
--- a/examples/resources/snowflake_network_policy/resource.tf +++ b/examples/resources/snowflake_network_policy/resource.tf @@ -6,9 +6,9 @@ resource "snowflake_network_policy" "basic" { ## Complete (with every optional set) resource "snowflake_network_policy" "complete" { name = "network_policy_name" - allowed_network_rule_list = ["<network_rule_name>"] - blocked_network_rule_list = ["<network_rule_name>"] + allowed_network_rule_list = [snowflake_network_rule.one.fully_qualified_name] + blocked_network_rule_list = [snowflake_network_rule.two.fully_qualified_name] allowed_ip_list = ["192.168.1.0/24"] blocked_ip_list = ["192.168.1.99"] comment = "my network policy" -} \ No newline at end of file +} diff --git a/examples/resources/snowflake_oauth_integration_for_custom_clients/import.sh b/examples/resources/snowflake_oauth_integration_for_custom_clients/import.sh index beeddc5d18..94acb1cc2d 100644 --- a/examples/resources/snowflake_oauth_integration_for_custom_clients/import.sh +++ b/examples/resources/snowflake_oauth_integration_for_custom_clients/import.sh @@ -1 +1 @@ -terraform import snowflake_oauth_integration_for_custom_clients.example "name" +terraform import snowflake_oauth_integration_for_custom_clients.example '"<name>"' diff --git a/examples/resources/snowflake_oauth_integration_for_custom_clients/resource.tf b/examples/resources/snowflake_oauth_integration_for_custom_clients/resource.tf index 77f64e69ba..c48c536a33 100644 --- a/examples/resources/snowflake_oauth_integration_for_custom_clients/resource.tf +++ b/examples/resources/snowflake_oauth_integration_for_custom_clients/resource.tf @@ -1,6 +1,6 @@ # basic resource resource "snowflake_oauth_integration_for_custom_clients" "basic" { - name = "saml_integration" + name = "integration" oauth_client_type = "CONFIDENTIAL" oauth_redirect_uri = "https://example.com" blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"] @@ -8,18 +8,18 @@ resource "snowflake_oauth_integration_for_custom_clients" "basic" { # resource with all fields set resource
"snowflake_oauth_integration_for_custom_clients" "complete" { - name = "saml_integration" + name = "integration" oauth_client_type = "CONFIDENTIAL" oauth_redirect_uri = "https://example.com" enabled = "true" oauth_allow_non_tls_redirect_uri = "true" oauth_enforce_pkce = "true" oauth_use_secondary_roles = "NONE" - pre_authorized_roles_list = ["role_id1", "role_id2"] - blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN", "role_id1", "role_id2"] + pre_authorized_roles_list = [snowflake_role.one.fully_qualified_name, snowflake_role.two.fully_qualified_name] + blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN", snowflake_role.three.fully_qualified_name, snowflake_role.four.fully_qualified_name] oauth_issue_refresh_tokens = "true" oauth_refresh_token_validity = 87600 - network_policy = "network_policy_id" + network_policy = snowflake_network_policy.example.fully_qualified_name oauth_client_rsa_public_key = file("rsa.pub") oauth_client_rsa_public_key_2 = file("rsa2.pub") comment = "my oauth integration" diff --git a/examples/resources/snowflake_oauth_integration_for_partner_applications/resource.tf b/examples/resources/snowflake_oauth_integration_for_partner_applications/resource.tf index f6a52145a2..1c8a7830c2 100644 --- a/examples/resources/snowflake_oauth_integration_for_partner_applications/resource.tf +++ b/examples/resources/snowflake_oauth_integration_for_partner_applications/resource.tf @@ -14,6 +14,6 @@ resource "snowflake_oauth_integration_for_partner_applications" "test" { oauth_issue_refresh_tokens = "true" oauth_refresh_token_validity = 3600 oauth_use_secondary_roles = "IMPLICIT" - blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN", "role_id1", "role_id2"] + blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN", snowflake_role.one.fully_qualified_name, snowflake_role.two.fully_qualified_name] comment = "example oauth integration for partner applications" } diff --git a/examples/resources/snowflake_primary_connection/import.sh 
b/examples/resources/snowflake_primary_connection/import.sh index 743bf79921..7813816225 100644 --- a/examples/resources/snowflake_primary_connection/import.sh +++ b/examples/resources/snowflake_primary_connection/import.sh @@ -1 +1 @@ -terraform import snowflake_primary_connection.example 'connection_name' +terraform import snowflake_primary_connection.example '"<connection_name>"' diff --git a/examples/resources/snowflake_primary_connection/resource.tf b/examples/resources/snowflake_primary_connection/resource.tf index b9fe410b72..ba7c7da919 100644 --- a/examples/resources/snowflake_primary_connection/resource.tf +++ b/examples/resources/snowflake_primary_connection/resource.tf @@ -8,6 +8,6 @@ resource "snowflake_primary_connection" "complete" { name = "connection_name" comment = "my complete connection" enable_failover_to_accounts = [ - "<organization_name>.<account_name>" + "\"<organization_name>\".\"<account_name>\"" ] } diff --git a/examples/resources/snowflake_resource_monitor/import.sh b/examples/resources/snowflake_resource_monitor/import.sh index 2fd060d6ed..c0cc9809c7 100644 --- a/examples/resources/snowflake_resource_monitor/import.sh +++ b/examples/resources/snowflake_resource_monitor/import.sh @@ -1,2 +1 @@ -# format is the resource monitor name -terraform import snowflake_resource_monitor.example 'resourceMonitorName' +terraform import snowflake_resource_monitor.example '"<resource_monitor_name>"' diff --git a/examples/resources/snowflake_resource_monitor/resource.tf b/examples/resources/snowflake_resource_monitor/resource.tf index 45273e869d..5d9a960ee8 100644 --- a/examples/resources/snowflake_resource_monitor/resource.tf +++ b/examples/resources/snowflake_resource_monitor/resource.tf @@ -9,7 +9,7 @@ resource "snowflake_resource_monitor" "minimal_working" { name = "resource-monitor-name" credit_quota = 100 suspend_trigger = 100 - notify_users = ["USERONE", "USERTWO"] + notify_users = [snowflake_user.one.fully_qualified_name, snowflake_user.two.fully_qualified_name] } resource "snowflake_resource_monitor" "complete" { @@ -24,5 +24,5 @@ resource
"snowflake_resource_monitor" "complete" { suspend_trigger = 50 suspend_immediate_trigger = 90 - notify_users = ["USERONE", "USERTWO"] + notify_users = [snowflake_user.one.fully_qualified_name, snowflake_user.two.fully_qualified_name] } diff --git a/examples/resources/snowflake_row_access_policy/resource.tf b/examples/resources/snowflake_row_access_policy/resource.tf index c4ff60b7be..c266158eb0 100644 --- a/examples/resources/snowflake_row_access_policy/resource.tf +++ b/examples/resources/snowflake_row_access_policy/resource.tf @@ -1,3 +1,4 @@ +# resource with all fields set resource "snowflake_row_access_policy" "example_row_access_policy" { name = "EXAMPLE_ROW_ACCESS_POLICY" database = "EXAMPLE_DB" diff --git a/examples/resources/snowflake_saml2_integration/import.sh b/examples/resources/snowflake_saml2_integration/import.sh index bf68b01c98..d6643bf352 100644 --- a/examples/resources/snowflake_saml2_integration/import.sh +++ b/examples/resources/snowflake_saml2_integration/import.sh @@ -1 +1 @@ -terraform import snowflake_saml2_integration.example "name" +terraform import snowflake_saml2_integration.example '"<name>"' diff --git a/examples/resources/snowflake_schema/import.sh b/examples/resources/snowflake_schema/import.sh index dea2bb90cf..fe3ede8b35 100644 --- a/examples/resources/snowflake_schema/import.sh +++ b/examples/resources/snowflake_schema/import.sh @@ -1,2 +1 @@ -# format is <database_name>.<schema_name>
terraform import snowflake_schema.example '"<database_name>"."<schema_name>"' diff --git a/examples/resources/snowflake_scim_integration/import.sh b/examples/resources/snowflake_scim_integration/import.sh index 365c14b973..467137f2f3 100644 --- a/examples/resources/snowflake_scim_integration/import.sh +++ b/examples/resources/snowflake_scim_integration/import.sh @@ -1 +1 @@ -terraform import snowflake_scim_integration.example "name" +terraform import snowflake_scim_integration.example '"<name>"' diff --git a/examples/resources/snowflake_scim_integration/resource.tf b/examples/resources/snowflake_scim_integration/resource.tf index 8e3417fae2..445860f22a 100644 --- a/examples/resources/snowflake_scim_integration/resource.tf +++ b/examples/resources/snowflake_scim_integration/resource.tf @@ -4,14 +4,16 @@ resource "snowflake_scim_integration" "test" { enabled = true scim_client = "GENERIC" sync_password = true + run_as_role = "GENERIC_SCIM_PROVISIONER" } + # resource with all fields set resource "snowflake_scim_integration" "test" { name = "test" enabled = true scim_client = "GENERIC" sync_password = true - network_policy = "network_policy_test" + network_policy = snowflake_network_policy.example.fully_qualified_name run_as_role = "GENERIC_SCIM_PROVISIONER" comment = "foo" } diff --git a/examples/resources/snowflake_secondary_connection/import.sh b/examples/resources/snowflake_secondary_connection/import.sh index 4de28135f7..b78ed2332d 100644 --- a/examples/resources/snowflake_secondary_connection/import.sh +++ b/examples/resources/snowflake_secondary_connection/import.sh @@ -1 +1 @@ -terraform import snowflake_secondary_connection.example 'secondary_connection_name' +terraform import snowflake_secondary_connection.example '"<secondary_connection_name>"' diff --git a/examples/resources/snowflake_secondary_connection/resource.tf b/examples/resources/snowflake_secondary_connection/resource.tf index 17d32c0820..66617dd596 100644 ---
b/examples/resources/snowflake_secondary_connection/resource.tf @@ -1,12 +1,12 @@ ## Minimal resource "snowflake_secondary_connection" "basic" { name = "connection_name" - as_replica_of = "<organization_name>.<account_name>.<connection_name>" + as_replica_of = "\"<organization_name>\".\"<account_name>\".\"<connection_name>\"" } ## Complete (with every optional set) resource "snowflake_secondary_connection" "complete" { name = "connection_name" - as_replica_of = "<organization_name>.<account_name>.<connection_name>" + as_replica_of = "\"<organization_name>\".\"<account_name>\".\"<connection_name>\"" comment = "my complete secondary connection" } diff --git a/examples/resources/snowflake_secondary_database/import.sh b/examples/resources/snowflake_secondary_database/import.sh index f183eac8ac..2896ef9be8 100644 --- a/examples/resources/snowflake_secondary_database/import.sh +++ b/examples/resources/snowflake_secondary_database/import.sh @@ -1 +1 @@ -terraform import snowflake_secondary_database.example 'secondary_database_name' +terraform import snowflake_secondary_database.example '"<secondary_database_name>"' diff --git a/examples/resources/snowflake_secret_with_authorization_code_grant/resource.tf b/examples/resources/snowflake_secret_with_authorization_code_grant/resource.tf index bb45a36e87..f6a499b028 100644 --- a/examples/resources/snowflake_secret_with_authorization_code_grant/resource.tf +++ b/examples/resources/snowflake_secret_with_authorization_code_grant/resource.tf @@ -3,7 +3,7 @@ resource "snowflake_secret_with_authorization_code_grant" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_authorization_code_grant.example.fully_qualified_name oauth_refresh_token = "EXAMPLE_TOKEN" oauth_refresh_token_expiry_time = "2025-01-02 15:04:01" } @@ -13,7 +13,7 @@ resource "snowflake_secret_with_authorization_code_grant" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication =
snowflake_api_authentication_integration_with_authorization_code_grant.example.fully_qualified_name oauth_refresh_token = "EXAMPLE_TOKEN" oauth_refresh_token_expiry_time = "2025-01-02 15:04:01" comment = "EXAMPLE_COMMENT" diff --git a/examples/resources/snowflake_secret_with_client_credentials/resource.tf b/examples/resources/snowflake_secret_with_client_credentials/resource.tf index baaf605e67..c62aecf252 100644 --- a/examples/resources/snowflake_secret_with_client_credentials/resource.tf +++ b/examples/resources/snowflake_secret_with_client_credentials/resource.tf @@ -3,7 +3,7 @@ resource "snowflake_secret_with_client_credentials" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_client_credentials.example.fully_qualified_name oauth_scopes = ["useraccount", "testscope"] } @@ -12,7 +12,7 @@ resource "snowflake_secret_with_client_credentials" "test" { name = "EXAMPLE_SECRET" database = "EXAMPLE_DB" schema = "EXAMPLE_SCHEMA" - api_authentication = "EXAMPLE_SECURITY_INTEGRATION_NAME" + api_authentication = snowflake_api_authentication_integration_with_client_credentials.example.fully_qualified_name oauth_scopes = ["useraccount", "testscope"] comment = "EXAMPLE_COMMENT" } diff --git a/examples/resources/snowflake_shared_database/import.sh b/examples/resources/snowflake_shared_database/import.sh index 6cf900566c..8c39bba7ee 100644 --- a/examples/resources/snowflake_shared_database/import.sh +++ b/examples/resources/snowflake_shared_database/import.sh @@ -1 +1 @@ -terraform import snowflake_shared_database.example 'shared_database_name' +terraform import snowflake_shared_database.example '"<shared_database_name>"' diff --git a/examples/resources/snowflake_stream_on_directory_table/resource.tf b/examples/resources/snowflake_stream_on_directory_table/resource.tf index ab85c22f29..70188045e0 ---
a/examples/resources/snowflake_stream_on_directory_table/resource.tf +++ b/examples/resources/snowflake_stream_on_directory_table/resource.tf @@ -1,18 +1,10 @@ -resource "snowflake_stage" "example_stage" { - name = "EXAMPLE_STAGE" - url = "s3://com.example.bucket/prefix" - database = "EXAMPLE_DB" - schema = "EXAMPLE_SCHEMA" - credentials = "AWS_KEY_ID='${var.example_aws_key_id}' AWS_SECRET_KEY='${var.example_aws_secret_key}'" -} - # basic resource resource "snowflake_stream_on_directory_table" "stream" { name = "stream" schema = "schema" database = "database" - stage = snowflake_stage.stage.fully_qualified_name + stage = snowflake_stage.example.fully_qualified_name } @@ -23,11 +15,7 @@ resource "snowflake_stream_on_directory_table" "stream" { database = "database" copy_grants = true - stage = snowflake_stage.stage.fully_qualified_name - - at { - statement = "8e5d0ca9-005e-44e6-b858-a8f5b37c5726" - } + stage = snowflake_stage.example.fully_qualified_name comment = "A stream." } diff --git a/examples/resources/snowflake_stream_on_external_table/resource.tf b/examples/resources/snowflake_stream_on_external_table/resource.tf index 964cb0f342..bdd0073b2a 100644 --- a/examples/resources/snowflake_stream_on_external_table/resource.tf +++ b/examples/resources/snowflake_stream_on_external_table/resource.tf @@ -1,29 +1,10 @@ -resource "snowflake_external_table" "external_table" { - database = "db" - schema = "schema" - name = "external_table" - comment = "External table" - file_format = "TYPE = CSV FIELD_DELIMITER = '|'" - location = "@stage/directory/" - - column { - name = "id" - type = "int" - } - - column { - name = "data" - type = "text" - } -} - # basic resource resource "snowflake_stream_on_external_table" "stream" { name = "stream" schema = "schema" database = "database" - external_table = snowflake_external_table.external_table.fully_qualified_name + external_table = snowflake_external_table.example.fully_qualified_name } @@ -34,7 +15,7 @@ resource 
"snowflake_stream_on_external_table" "stream" { database = "database" copy_grants = true - external_table = snowflake_external_table.external_table.fully_qualified_name + external_table = snowflake_external_table.example.fully_qualified_name insert_only = "true" at { diff --git a/examples/resources/snowflake_stream_on_table/resource.tf b/examples/resources/snowflake_stream_on_table/resource.tf index c3bf45a71e..52a80e0380 100644 --- a/examples/resources/snowflake_stream_on_table/resource.tf +++ b/examples/resources/snowflake_stream_on_table/resource.tf @@ -1,15 +1,12 @@ -resource "snowflake_table" "table" { - database = "database" +# basic resource +resource "snowflake_stream_on_table" "stream" { + name = "stream" schema = "schema" - name = "name" + database = "database" - column { - type = "NUMBER(38,0)" - name = "id" - } + table = snowflake_table.example.fully_qualified_name } - # resource with more fields set resource "snowflake_stream_on_table" "stream" { name = "stream" @@ -17,7 +14,7 @@ resource "snowflake_stream_on_table" "stream" { database = "database" copy_grants = true - table = snowflake_table.table.fully_qualified_name + table = snowflake_table.example.fully_qualified_name append_only = "true" show_initial_rows = "true" diff --git a/examples/resources/snowflake_stream_on_view/resource.tf b/examples/resources/snowflake_stream_on_view/resource.tf index 754c893418..e0a7304ed5 100644 --- a/examples/resources/snowflake_stream_on_view/resource.tf +++ b/examples/resources/snowflake_stream_on_view/resource.tf @@ -1,19 +1,10 @@ -resource "snowflake_view" "view" { - database = "database" - schema = "schema" - name = "view" - statement = <<-SQL - select * from foo; -SQL -} - # basic resource resource "snowflake_stream_on_view" "stream" { name = "stream" schema = "schema" database = "database" - view = snowflake_view.view.fully_qualified_name + view = snowflake_view.example.fully_qualified_name } # resource with additional fields @@ -23,7 +14,7 @@ resource 
"snowflake_stream_on_view" "stream" { database = "database" copy_grants = true - view = snowflake_view.view.fully_qualified_name + view = snowflake_view.example.fully_qualified_name append_only = "true" show_initial_rows = "true" diff --git a/examples/resources/snowflake_streamlit/import.sh b/examples/resources/snowflake_streamlit/import.sh index aadf0a2952..35d7591655 100644 --- a/examples/resources/snowflake_streamlit/import.sh +++ b/examples/resources/snowflake_streamlit/import.sh @@ -1,2 +1 @@ -# format is <database_name>.<schema_name>.<streamlit_name> terraform import snowflake_streamlit.example '"<database_name>"."<schema_name>"."<streamlit_name>"' diff --git a/examples/resources/snowflake_streamlit/resource.tf b/examples/resources/snowflake_streamlit/resource.tf index e84ed3a08f..a5eae82e6e 100644 --- a/examples/resources/snowflake_streamlit/resource.tf +++ b/examples/resources/snowflake_streamlit/resource.tf @@ -3,18 +3,19 @@ resource "snowflake_streamlit" "streamlit" { database = "database" schema = "schema" name = "streamlit" - stage = "streamlit_db.streamlit_schema.streamlit_stage" + stage = snowflake_stage.example.fully_qualified_name main_file = "/streamlit_main.py" } + # resource with all fields set resource "snowflake_streamlit" "streamlit" { database = "database" schema = "schema" name = "streamlit" - stage = "streamlit_db.streamlit_schema.streamlit_stage" + stage = snowflake_stage.example.fully_qualified_name directory_location = "src" main_file = "streamlit_main.py" - query_warehouse = "warehouse" + query_warehouse = snowflake_warehouse.example.fully_qualified_name external_access_integrations = ["integration_id"] title = "title" comment = "comment" diff --git a/examples/resources/snowflake_tag/resource.tf b/examples/resources/snowflake_tag/resource.tf index 9c99ab0503..ee284d78ea 100644 --- a/examples/resources/snowflake_tag/resource.tf +++ b/examples/resources/snowflake_tag/resource.tf @@ -12,5 +12,5 @@ resource "snowflake_tag" "tag" { schema = "schema" comment = "comment" allowed_values = ["finance", "engineering", ""] -
masking_policies = [snowfalke_masking_policy.masking_policy.fully_qualified_name] + masking_policies = [snowflake_masking_policy.example.fully_qualified_name] } diff --git a/examples/resources/snowflake_task/import.sh b/examples/resources/snowflake_task/import.sh index 18f4e0bda8..1ea62df133 100644 --- a/examples/resources/snowflake_task/import.sh +++ b/examples/resources/snowflake_task/import.sh @@ -1,2 +1 @@ -# format is database name | schema name | task name -terraform import snowflake_task.example 'dbName|schemaName|taskName' +terraform import snowflake_task.example '"<database_name>"."<schema_name>"."<task_name>"' diff --git a/examples/resources/snowflake_task/resource.tf b/examples/resources/snowflake_task/resource.tf index da18d05a81..839946a3b2 100644 --- a/examples/resources/snowflake_task/resource.tf +++ b/examples/resources/snowflake_task/resource.tf @@ -53,13 +53,13 @@ resource "snowflake_task" "test" { database = "database" schema = "schema" name = "task" - warehouse = "warehouse" + warehouse = snowflake_warehouse.example.fully_qualified_name started = true sql_statement = "select 1" config = "{\"key\":\"value\"}" allow_overlapping_execution = true - error_integration = "<notification_integration_name>" + error_integration = snowflake_notification_integration.example.fully_qualified_name when = "SYSTEM$STREAM_HAS_DATA('<stream_name>')" comment = "complete task" diff --git a/examples/resources/snowflake_user/resource.tf b/examples/resources/snowflake_user/resource.tf index 1892a4568f..70f8e4a0d5 100644 --- a/examples/resources/snowflake_user/resource.tf +++ b/examples/resources/snowflake_user/resource.tf @@ -16,9 +16,9 @@ resource "snowflake_user" "user" { display_name = "Snowflake User display name" email = "user@snowflake.example" - default_warehouse = "warehouse" + default_warehouse = snowflake_warehouse.example.fully_qualified_name default_secondary_roles_option = "ALL" - default_role = "role1" + default_role = snowflake_role.example.fully_qualified_name default_namespace = "some.namespace" mins_to_unlock = 9 diff --git
a/examples/resources/snowflake_view/resource.tf b/examples/resources/snowflake_view/resource.tf index b41c2c308d..f44106e8c5 100644 --- a/examples/resources/snowflake_view/resource.tf +++ b/examples/resources/snowflake_view/resource.tf @@ -37,12 +37,12 @@ resource "snowflake_view" "test" { policy_name = "projection_policy" } masking_policy { - policy_name = "masking_policy" + policy_name = snowflake_masking_policy.example.fully_qualified_name using = ["address"] } } row_access_policy { - policy_name = "row_access_policy" + policy_name = snowflake_row_access_policy.example.fully_qualified_name on = ["id"] } aggregation_policy { @@ -50,8 +50,9 @@ entity_key = ["id"] } data_metric_function { - function_name = "data_metric_function" - on = ["id"] + function_name = "data_metric_function" + on = ["id"] + schedule_status = "STARTED" } data_metric_schedule { using_cron = "15 * * * * UTC" diff --git a/examples/resources/snowflake_warehouse/import.sh b/examples/resources/snowflake_warehouse/import.sh index 6fe5aa5ab8..e9e01ef33b 100644 --- a/examples/resources/snowflake_warehouse/import.sh +++ b/examples/resources/snowflake_warehouse/import.sh @@ -1 +1 @@ -terraform import snowflake_warehouse.example warehouseName +terraform import snowflake_warehouse.example '"<warehouse_name>"' diff --git a/examples/resources/snowflake_warehouse/resource.tf b/examples/resources/snowflake_warehouse/resource.tf index 4172366508..1c08611115 100644 --- a/examples/resources/snowflake_warehouse/resource.tf +++ b/examples/resources/snowflake_warehouse/resource.tf @@ -1,5 +1,25 @@ +# Resource with required fields resource "snowflake_warehouse" "warehouse" { - name = "test" - comment = "foo" - warehouse_size = "small" + name = "WAREHOUSE" +} + +# Resource with all fields +resource "snowflake_warehouse" "complete" { + name = "WAREHOUSE" + warehouse_type = "SNOWPARK-OPTIMIZED" + warehouse_size = "MEDIUM" + max_cluster_count = 4 + min_cluster_count = 2 + scaling_policy = "ECONOMY"
+ auto_suspend = 1200 + auto_resume = false + initially_suspended = false + resource_monitor = snowflake_resource_monitor.monitor.fully_qualified_name + comment = "An example warehouse." + enable_query_acceleration = true + query_acceleration_max_scale_factor = 4 + + max_concurrency_level = 4 + statement_queued_timeout_in_seconds = 5 + statement_timeout_in_seconds = 86400 } diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/account_snowflake_gen.go b/pkg/acceptance/bettertestspoc/assert/objectassert/account_snowflake_gen.go index d394c1ae39..6ba8eceb11 100644 --- a/pkg/acceptance/bettertestspoc/assert/objectassert/account_snowflake_gen.go +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/account_snowflake_gen.go @@ -148,11 +148,11 @@ func (a *AccountAssert) HasAccountLocator(expected string) *AccountAssert { func (a *AccountAssert) HasAccountLocatorURL(expected string) *AccountAssert { a.AddAssertion(func(t *testing.T, o *sdk.Account) error { t.Helper() - if o.AccountLocatorURL == nil { + if o.AccountLocatorUrl == nil { return fmt.Errorf("expected account locator url to have value; got: nil") } - if *o.AccountLocatorURL != expected { - return fmt.Errorf("expected account locator url: %v; got: %v", expected, *o.AccountLocatorURL) + if *o.AccountLocatorUrl != expected { + return fmt.Errorf("expected account locator url: %v; got: %v", expected, *o.AccountLocatorUrl) } return nil }) diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/function_describe_snowflake_ext.go b/pkg/acceptance/bettertestspoc/assert/objectassert/function_describe_snowflake_ext.go new file mode 100644 index 0000000000..f540d487bd --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/function_describe_snowflake_ext.go @@ -0,0 +1,407 @@ +package objectassert + +import ( + "fmt" + "strings" + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +// TODO [SNOW-1501905]: this file should be fully regenerated when adding and option to assert the results of describe +type FunctionDetailsAssert struct { + *assert.SnowflakeObjectAssert[sdk.FunctionDetails, sdk.SchemaObjectIdentifierWithArguments] +} + +func FunctionDetails(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) *FunctionDetailsAssert { + t.Helper() + return &FunctionDetailsAssert{ + assert.NewSnowflakeObjectAssertWithProvider(sdk.ObjectType("FUNCTION_DETAILS"), id, acc.TestClient().Function.DescribeDetails), + } +} + +func (f *FunctionDetailsAssert) HasSignature(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Signature != expected { + return fmt.Errorf("expected signature: %v; got: %v", expected, o.Signature) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasReturns(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Returns != expected { + return fmt.Errorf("expected returns: %v; got: %v", expected, o.Returns) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasLanguage(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Language != expected { + return fmt.Errorf("expected language: %v; got: %v", expected, o.Language) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasBody(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Body == nil { + return fmt.Errorf("expected body to have value; got: nil") + } + if *o.Body != 
expected { + return fmt.Errorf("expected body: %v; got: %v", expected, *o.Body) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasNullHandling(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.NullHandling == nil { + return fmt.Errorf("expected null handling to have value; got: nil") + } + if *o.NullHandling != expected { + return fmt.Errorf("expected null handling: %v; got: %v", expected, *o.NullHandling) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasVolatility(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Volatility == nil { + return fmt.Errorf("expected volatility to have value; got: nil") + } + if *o.Volatility != expected { + return fmt.Errorf("expected volatility: %v; got: %v", expected, *o.Volatility) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasExternalAccessIntegrations(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasSecrets(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasImports(expected string) *FunctionDetailsAssert { + 
f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Imports == nil { + return fmt.Errorf("expected imports to have value; got: nil") + } + if *o.Imports != expected { + return fmt.Errorf("expected imports: %v; got: %v", expected, *o.Imports) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasHandler(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Handler == nil { + return fmt.Errorf("expected handler to have value; got: nil") + } + if *o.Handler != expected { + return fmt.Errorf("expected handler: %v; got: %v", expected, *o.Handler) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasRuntimeVersion(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.RuntimeVersion == nil { + return fmt.Errorf("expected runtime version to have value; got: nil") + } + if *o.RuntimeVersion != expected { + return fmt.Errorf("expected runtime version: %v; got: %v", expected, *o.RuntimeVersion) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasPackages(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Packages == nil { + return fmt.Errorf("expected packages to have value; got: nil") + } + if *o.Packages != expected { + return fmt.Errorf("expected packages: %v; got: %v", expected, *o.Packages) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasTargetPath(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.TargetPath == nil { + return fmt.Errorf("expected target path to have value; got: nil") + } + if *o.TargetPath != expected { + return fmt.Errorf("expected target path: %v; got: %v", expected, *o.TargetPath) + } + return nil + }) + return f +} + +func 
(f *FunctionDetailsAssert) HasInstalledPackages(expected string) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.InstalledPackages == nil { + return fmt.Errorf("expected installed packages to have value; got: nil") + } + if *o.InstalledPackages != expected { + return fmt.Errorf("expected installed packages: %v; got: %v", expected, *o.InstalledPackages) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasIsAggregate(expected bool) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.IsAggregate == nil { + return fmt.Errorf("expected is aggregate to have value; got: nil") + } + if *o.IsAggregate != expected { + return fmt.Errorf("expected is aggregate: %v; got: %v", expected, *o.IsAggregate) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasBodyNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Body != nil { + return fmt.Errorf("expected body to be nil, was %v", *o.Body) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasNullHandlingNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.NullHandling != nil { + return fmt.Errorf("expected null handling to be nil, was %v", *o.NullHandling) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasVolatilityNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Volatility != nil { + return fmt.Errorf("expected volatility to be nil, was %v", *o.Volatility) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasExternalAccessIntegrationsNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.ExternalAccessIntegrations != nil { + return 
fmt.Errorf("expected external access integrations to be nil, was %v", *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasSecretsNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Secrets != nil { + return fmt.Errorf("expected secrets to be nil, was %v", *o.Secrets) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasImportsNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Imports != nil { + return fmt.Errorf("expected imports to be nil, was %v", *o.Imports) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasHandlerNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Handler != nil { + return fmt.Errorf("expected handler to be nil, was %v", *o.Handler) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasRuntimeVersionNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.RuntimeVersion != nil { + return fmt.Errorf("expected runtime version to be nil, was %v", *o.RuntimeVersion) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasPackagesNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Packages != nil { + return fmt.Errorf("expected packages to be nil, was %v", *o.Packages) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasTargetPathNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.TargetPath != nil { + return fmt.Errorf("expected target path to be nil, was %v", *o.TargetPath) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasInstalledPackagesNil() *FunctionDetailsAssert { + 
f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.InstalledPackages != nil { + return fmt.Errorf("expected installed packages to be nil, was %v", *o.InstalledPackages) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasIsAggregateNil() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.IsAggregate != nil { + return fmt.Errorf("expected is aggregate to be nil, was %v", *o.IsAggregate) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasInstalledPackagesNotEmpty() *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.InstalledPackages == nil { + return fmt.Errorf("expected installed packages to not be nil") + } + if *o.InstalledPackages == "" { + return fmt.Errorf("expected installed packages to not be empty") + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasExactlyExternalAccessIntegrations(integrations ...sdk.AccountObjectIdentifier) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + joined := strings.Join(collections.Map(integrations, func(ex sdk.AccountObjectIdentifier) string { return ex.FullyQualifiedName() }), ",") + expected := fmt.Sprintf(`[%s]`, joined) + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *FunctionDetailsAssert) HasExactlySecrets(expectedSecrets map[string]sdk.SchemaObjectIdentifier) *FunctionDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.FunctionDetails) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } 
+ var parts []string + for k, v := range expectedSecrets { + parts = append(parts, fmt.Sprintf(`"%s":"\"%s\".\"%s\".%s"`, k, v.DatabaseName(), v.SchemaName(), v.Name())) + } + expected := fmt.Sprintf(`{%s}`, strings.Join(parts, ",")) + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/function_snowflake_ext.go b/pkg/acceptance/bettertestspoc/assert/objectassert/function_snowflake_ext.go new file mode 100644 index 0000000000..aa8d17a022 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/function_snowflake_ext.go @@ -0,0 +1,67 @@ +package objectassert + +import ( + "fmt" + "strings" + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +func (a *FunctionAssert) HasCreatedOnNotEmpty() *FunctionAssert { + a.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.CreatedOn == "" { + return fmt.Errorf("expected created_on to not be empty") + } + return nil + }) + return a +} + +func (a *FunctionAssert) HasExternalAccessIntegrationsNil() *FunctionAssert { + a.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.ExternalAccessIntegrations != nil { + return fmt.Errorf("expected external_access_integrations to be nil but was: %v", *o.ExternalAccessIntegrations) + } + return nil + }) + return a +} + +func (f *FunctionAssert) HasExactlyExternalAccessIntegrations(integrations ...sdk.AccountObjectIdentifier) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + joined := strings.Join(collections.Map(integrations, func(ex sdk.AccountObjectIdentifier) string { return 
ex.FullyQualifiedName() }), ",") + expected := fmt.Sprintf(`[%s]`, joined) + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasExactlySecrets(expectedSecrets map[string]sdk.SchemaObjectIdentifier) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } + var parts []string + for k, v := range expectedSecrets { + parts = append(parts, fmt.Sprintf(`"%s":"\"%s\".\"%s\".%s"`, k, v.DatabaseName(), v.SchemaName(), v.Name())) + } + expected := fmt.Sprintf(`{%s}`, strings.Join(parts, ",")) + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/function_snowflake_gen.go b/pkg/acceptance/bettertestspoc/assert/objectassert/function_snowflake_gen.go new file mode 100644 index 0000000000..8b6f674aa8 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/function_snowflake_gen.go @@ -0,0 +1,271 @@ +// Code generated by assertions generator; DO NOT EDIT. 
+ +package objectassert + +// imports modified manually +import ( + "fmt" + "slices" + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +type FunctionAssert struct { + *assert.SnowflakeObjectAssert[sdk.Function, sdk.SchemaObjectIdentifierWithArguments] +} + +func Function(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) *FunctionAssert { + t.Helper() + return &FunctionAssert{ + assert.NewSnowflakeObjectAssertWithProvider(sdk.ObjectTypeFunction, id, acc.TestClient().Function.Show), + } +} + +func FunctionFromObject(t *testing.T, function *sdk.Function) *FunctionAssert { + t.Helper() + return &FunctionAssert{ + assert.NewSnowflakeObjectAssertWithObject(sdk.ObjectTypeFunction, function.ID(), function), + } +} + +func (f *FunctionAssert) HasCreatedOn(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.CreatedOn != expected { + return fmt.Errorf("expected created on: %v; got: %v", expected, o.CreatedOn) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasName(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.Name != expected { + return fmt.Errorf("expected name: %v; got: %v", expected, o.Name) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasSchemaName(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.SchemaName != expected { + return fmt.Errorf("expected schema name: %v; got: %v", expected, o.SchemaName) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsBuiltin(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsBuiltin != expected { + return 
fmt.Errorf("expected is builtin: %v; got: %v", expected, o.IsBuiltin) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsAggregate(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsAggregate != expected { + return fmt.Errorf("expected is aggregate: %v; got: %v", expected, o.IsAggregate) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsAnsi(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsAnsi != expected { + return fmt.Errorf("expected is ansi: %v; got: %v", expected, o.IsAnsi) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasMinNumArguments(expected int) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.MinNumArguments != expected { + return fmt.Errorf("expected min num arguments: %v; got: %v", expected, o.MinNumArguments) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasMaxNumArguments(expected int) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.MaxNumArguments != expected { + return fmt.Errorf("expected max num arguments: %v; got: %v", expected, o.MaxNumArguments) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasArgumentsOld(expected []sdk.DataType) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + // edited manually + if !slices.Equal(o.ArgumentsOld, expected) { + return fmt.Errorf("expected arguments old: %v; got: %v", expected, o.ArgumentsOld) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasArgumentsRaw(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.ArgumentsRaw != expected { + return fmt.Errorf("expected arguments raw: %v; got: %v", expected, o.ArgumentsRaw) + } + return nil + }) + return f +} + +func (f 
*FunctionAssert) HasDescription(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.Description != expected { + return fmt.Errorf("expected description: %v; got: %v", expected, o.Description) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasCatalogName(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.CatalogName != expected { + return fmt.Errorf("expected catalog name: %v; got: %v", expected, o.CatalogName) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsTableFunction(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsTableFunction != expected { + return fmt.Errorf("expected is table function: %v; got: %v", expected, o.IsTableFunction) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasValidForClustering(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.ValidForClustering != expected { + return fmt.Errorf("expected valid for clustering: %v; got: %v", expected, o.ValidForClustering) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsSecure(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsSecure != expected { + return fmt.Errorf("expected is secure: %v; got: %v", expected, o.IsSecure) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasSecrets(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasExternalAccessIntegrations(expected string) 
*FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsExternalFunction(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsExternalFunction != expected { + return fmt.Errorf("expected is external function: %v; got: %v", expected, o.IsExternalFunction) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasLanguage(expected string) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.Language != expected { + return fmt.Errorf("expected language: %v; got: %v", expected, o.Language) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsMemoizable(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsMemoizable != expected { + return fmt.Errorf("expected is memoizable: %v; got: %v", expected, o.IsMemoizable) + } + return nil + }) + return f +} + +func (f *FunctionAssert) HasIsDataMetric(expected bool) *FunctionAssert { + f.AddAssertion(func(t *testing.T, o *sdk.Function) error { + t.Helper() + if o.IsDataMetric != expected { + return fmt.Errorf("expected is data metric: %v; got: %v", expected, o.IsDataMetric) + } + return nil + }) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/gen/sdk_object_def.go b/pkg/acceptance/bettertestspoc/assert/objectassert/gen/sdk_object_def.go index 636fb40c8e..06ab381d2d 100644 --- a/pkg/acceptance/bettertestspoc/assert/objectassert/gen/sdk_object_def.go +++ 
b/pkg/acceptance/bettertestspoc/assert/objectassert/gen/sdk_object_def.go @@ -92,6 +92,16 @@ var allStructs = []SdkObjectDef{ ObjectType: sdk.ObjectTypeAccount, ObjectStruct: sdk.Account{}, }, + { + IdType: "sdk.SchemaObjectIdentifierWithArguments", + ObjectType: sdk.ObjectTypeFunction, + ObjectStruct: sdk.Function{}, + }, + { + IdType: "sdk.SchemaObjectIdentifierWithArguments", + ObjectType: sdk.ObjectTypeProcedure, + ObjectStruct: sdk.Procedure{}, + }, } func GetSdkObjectDetails() []genhelpers.SdkObjectDetails { diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_describe_snowflake_ext.go b/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_describe_snowflake_ext.go new file mode 100644 index 0000000000..64011d14f9 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_describe_snowflake_ext.go @@ -0,0 +1,393 @@ +package objectassert + +import ( + "fmt" + "strings" + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +// TODO [SNOW-1501905]: this file should be fully regenerated when adding an option to assert the results of describe +type ProcedureDetailsAssert struct { + *assert.SnowflakeObjectAssert[sdk.ProcedureDetails, sdk.SchemaObjectIdentifierWithArguments] +} + +func ProcedureDetails(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) *ProcedureDetailsAssert { + t.Helper() + return &ProcedureDetailsAssert{ + assert.NewSnowflakeObjectAssertWithProvider(sdk.ObjectType("PROCEDURE_DETAILS"), id, acc.TestClient().Procedure.DescribeDetails), + } +} + +func (f *ProcedureDetailsAssert) HasSignature(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + 
t.Helper() + if o.Signature != expected { + return fmt.Errorf("expected signature: %v; got: %v", expected, o.Signature) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasReturns(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Returns != expected { + return fmt.Errorf("expected returns: %v; got: %v", expected, o.Returns) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasLanguage(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Language != expected { + return fmt.Errorf("expected language: %v; got: %v", expected, o.Language) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasBody(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Body == nil { + return fmt.Errorf("expected body to have value; got: nil") + } + if *o.Body != expected { + return fmt.Errorf("expected body: %v; got: %v", expected, *o.Body) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasNullHandling(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.NullHandling == nil { + return fmt.Errorf("expected null handling to have value; got: nil") + } + if *o.NullHandling != expected { + return fmt.Errorf("expected null handling: %v; got: %v", expected, *o.NullHandling) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasVolatility(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Volatility == nil { + return fmt.Errorf("expected volatility to have value; got: nil") + } + if *o.Volatility != expected { + return fmt.Errorf("expected volatility: %v; got: %v", expected, *o.Volatility) + } + 
return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasExternalAccessIntegrations(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasSecrets(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasImports(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Imports == nil { + return fmt.Errorf("expected imports to have value; got: nil") + } + if *o.Imports != expected { + return fmt.Errorf("expected imports: %v; got: %v", expected, *o.Imports) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasHandler(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Handler == nil { + return fmt.Errorf("expected handler to have value; got: nil") + } + if *o.Handler != expected { + return fmt.Errorf("expected handler: %v; got: %v", expected, *o.Handler) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasRuntimeVersion(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.RuntimeVersion == nil { + return 
fmt.Errorf("expected runtime version to have value; got: nil") + } + if *o.RuntimeVersion != expected { + return fmt.Errorf("expected runtime version: %v; got: %v", expected, *o.RuntimeVersion) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasPackages(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Packages == nil { + return fmt.Errorf("expected packages to have value; got: nil") + } + if *o.Packages != expected { + return fmt.Errorf("expected packages: %v; got: %v", expected, *o.Packages) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasTargetPath(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.TargetPath == nil { + return fmt.Errorf("expected target path to have value; got: nil") + } + if *o.TargetPath != expected { + return fmt.Errorf("expected target path: %v; got: %v", expected, *o.TargetPath) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasInstalledPackages(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.InstalledPackages == nil { + return fmt.Errorf("expected installed packages to have value; got: nil") + } + if *o.InstalledPackages != expected { + return fmt.Errorf("expected installed packages: %v; got: %v", expected, *o.InstalledPackages) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasExecuteAs(expected string) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.ExecuteAs != expected { + return fmt.Errorf("expected execute as: %v; got: %v", expected, o.ExecuteAs) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasBodyNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + 
t.Helper() + if o.Body != nil { + return fmt.Errorf("expected body to be nil, was %v", *o.Body) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasNullHandlingNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.NullHandling != nil { + return fmt.Errorf("expected null handling to be nil, was %v", *o.NullHandling) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasVolatilityNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Volatility != nil { + return fmt.Errorf("expected volatility to be nil, was %v", *o.Volatility) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasExternalAccessIntegrationsNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.ExternalAccessIntegrations != nil { + return fmt.Errorf("expected external access integrations to be nil, was %v", *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasSecretsNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Secrets != nil { + return fmt.Errorf("expected secrets to be nil, was %v", *o.Secrets) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasImportsNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Imports != nil { + return fmt.Errorf("expected imports to be nil, was %v", *o.Imports) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasHandlerNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Handler != nil { + return fmt.Errorf("expected handler to be nil, was %v", *o.Handler) + } + return nil + }) + return f +} + +func (f 
*ProcedureDetailsAssert) HasRuntimeVersionNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.RuntimeVersion != nil { + return fmt.Errorf("expected runtime version to be nil, was %v", *o.RuntimeVersion) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasPackagesNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Packages != nil { + return fmt.Errorf("expected packages to be nil, was %v", *o.Packages) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasTargetPathNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.TargetPath != nil { + return fmt.Errorf("expected target path to be nil, was %v", *o.TargetPath) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasInstalledPackagesNil() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.InstalledPackages != nil { + return fmt.Errorf("expected installed packages to be nil, was %v", *o.InstalledPackages) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasInstalledPackagesNotEmpty() *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.InstalledPackages == nil { + return fmt.Errorf("expected installed packages to not be nil") + } + if *o.InstalledPackages == "" { + return fmt.Errorf("expected installed packages to not be empty") + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasExactlyExternalAccessIntegrations(integrations ...sdk.AccountObjectIdentifier) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: 
nil") + } + joined := strings.Join(collections.Map(integrations, func(ex sdk.AccountObjectIdentifier) string { return ex.FullyQualifiedName() }), ",") + expected := fmt.Sprintf(`[%s]`, joined) + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return f +} + +func (f *ProcedureDetailsAssert) HasExactlySecrets(expectedSecrets map[string]sdk.SchemaObjectIdentifier) *ProcedureDetailsAssert { + f.AddAssertion(func(t *testing.T, o *sdk.ProcedureDetails) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } + var parts []string + for k, v := range expectedSecrets { + parts = append(parts, fmt.Sprintf(`"%s":"\"%s\".\"%s\".%s"`, k, v.DatabaseName(), v.SchemaName(), v.Name())) + } + expected := fmt.Sprintf(`{%s}`, strings.Join(parts, ",")) + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_snowflake_ext.go b/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_snowflake_ext.go new file mode 100644 index 0000000000..12d5a384cf --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_snowflake_ext.go @@ -0,0 +1,59 @@ +package objectassert + +import ( + "fmt" + "strings" + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +func (a *ProcedureAssert) HasCreatedOnNotEmpty() *ProcedureAssert { + a.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.CreatedOn == "" { + return fmt.Errorf("expected created_on to not be empty") + } + return nil + }) + return a +} + +func (a *ProcedureAssert) HasExternalAccessIntegrationsNil() *ProcedureAssert { + 
a.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.ExternalAccessIntegrations != nil { + return fmt.Errorf("expected external_access_integrations to be nil but was: %v", *o.ExternalAccessIntegrations) + } + return nil + }) + return a +} + +func (a *ProcedureAssert) HasSecretsNil() *ProcedureAssert { + a.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.Secrets != nil { + return fmt.Errorf("expected secrets to be nil but was: %v", *o.Secrets) + } + return nil + }) + return a +} + +func (a *ProcedureAssert) HasExactlyExternalAccessIntegrations(integrations ...sdk.AccountObjectIdentifier) *ProcedureAssert { + a.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + joined := strings.Join(collections.Map(integrations, func(ex sdk.AccountObjectIdentifier) string { return ex.FullyQualifiedName() }), ",") + expected := fmt.Sprintf(`[%s]`, joined) + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return a +} diff --git a/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_snowflake_gen.go b/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_snowflake_gen.go new file mode 100644 index 0000000000..ef1d4c83cf --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectassert/procedure_snowflake_gen.go @@ -0,0 +1,227 @@ +// Code generated by assertions generator; DO NOT EDIT. 
+ +package objectassert + +// imports edited manually +import ( + "fmt" + "slices" + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +type ProcedureAssert struct { + *assert.SnowflakeObjectAssert[sdk.Procedure, sdk.SchemaObjectIdentifierWithArguments] +} + +func Procedure(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) *ProcedureAssert { + t.Helper() + return &ProcedureAssert{ + assert.NewSnowflakeObjectAssertWithProvider(sdk.ObjectTypeProcedure, id, acc.TestClient().Procedure.Show), + } +} + +func ProcedureFromObject(t *testing.T, procedure *sdk.Procedure) *ProcedureAssert { + t.Helper() + return &ProcedureAssert{ + assert.NewSnowflakeObjectAssertWithObject(sdk.ObjectTypeProcedure, procedure.ID(), procedure), + } +} + +func (p *ProcedureAssert) HasCreatedOn(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.CreatedOn != expected { + return fmt.Errorf("expected created on: %v; got: %v", expected, o.CreatedOn) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasName(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.Name != expected { + return fmt.Errorf("expected name: %v; got: %v", expected, o.Name) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasSchemaName(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.SchemaName != expected { + return fmt.Errorf("expected schema name: %v; got: %v", expected, o.SchemaName) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasIsBuiltin(expected bool) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.IsBuiltin != 
expected { + return fmt.Errorf("expected is builtin: %v; got: %v", expected, o.IsBuiltin) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasIsAggregate(expected bool) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.IsAggregate != expected { + return fmt.Errorf("expected is aggregate: %v; got: %v", expected, o.IsAggregate) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasIsAnsi(expected bool) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.IsAnsi != expected { + return fmt.Errorf("expected is ansi: %v; got: %v", expected, o.IsAnsi) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasMinNumArguments(expected int) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.MinNumArguments != expected { + return fmt.Errorf("expected min num arguments: %v; got: %v", expected, o.MinNumArguments) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasMaxNumArguments(expected int) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.MaxNumArguments != expected { + return fmt.Errorf("expected max num arguments: %v; got: %v", expected, o.MaxNumArguments) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasArgumentsOld(expected []sdk.DataType) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + // edited manually + if !slices.Equal(o.ArgumentsOld, expected) { + return fmt.Errorf("expected arguments old: %v; got: %v", expected, o.ArgumentsOld) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasArgumentsRaw(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.ArgumentsRaw != expected { + return fmt.Errorf("expected arguments raw: %v; got: %v", expected, o.ArgumentsRaw) + } + return 
nil + }) + return p +} + +func (p *ProcedureAssert) HasDescription(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.Description != expected { + return fmt.Errorf("expected description: %v; got: %v", expected, o.Description) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasCatalogName(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.CatalogName != expected { + return fmt.Errorf("expected catalog name: %v; got: %v", expected, o.CatalogName) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasIsTableFunction(expected bool) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.IsTableFunction != expected { + return fmt.Errorf("expected is table function: %v; got: %v", expected, o.IsTableFunction) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasValidForClustering(expected bool) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.ValidForClustering != expected { + return fmt.Errorf("expected valid for clustering: %v; got: %v", expected, o.ValidForClustering) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasIsSecure(expected bool) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.IsSecure != expected { + return fmt.Errorf("expected is secure: %v; got: %v", expected, o.IsSecure) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) HasSecrets(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.Secrets == nil { + return fmt.Errorf("expected secrets to have value; got: nil") + } + if *o.Secrets != expected { + return fmt.Errorf("expected secrets: %v; got: %v", expected, *o.Secrets) + } + return nil + }) + return p +} + +func (p *ProcedureAssert) 
HasExternalAccessIntegrations(expected string) *ProcedureAssert { + p.AddAssertion(func(t *testing.T, o *sdk.Procedure) error { + t.Helper() + if o.ExternalAccessIntegrations == nil { + return fmt.Errorf("expected external access integrations to have value; got: nil") + } + if *o.ExternalAccessIntegrations != expected { + return fmt.Errorf("expected external access integrations: %v; got: %v", expected, *o.ExternalAccessIntegrations) + } + return nil + }) + return p +} diff --git a/pkg/acceptance/bettertestspoc/assert/objectparametersassert/function_parameters_snowflake_gen.go b/pkg/acceptance/bettertestspoc/assert/objectparametersassert/function_parameters_snowflake_gen.go new file mode 100644 index 0000000000..fac494a123 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectparametersassert/function_parameters_snowflake_gen.go @@ -0,0 +1,170 @@ +// Code generated by assertions generator; DO NOT EDIT. + +package objectparametersassert + +import ( + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +type FunctionParametersAssert struct { + *assert.SnowflakeParametersAssert[sdk.SchemaObjectIdentifierWithArguments] +} + +func FunctionParameters(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) *FunctionParametersAssert { + t.Helper() + return &FunctionParametersAssert{ + assert.NewSnowflakeParametersAssertWithProvider(id, sdk.ObjectTypeFunction, acc.TestClient().Parameter.ShowFunctionParameters), + } +} + +func FunctionParametersPrefetched(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments, parameters []*sdk.Parameter) *FunctionParametersAssert { + t.Helper() + return &FunctionParametersAssert{ + assert.NewSnowflakeParametersAssertWithParameters(id, sdk.ObjectTypeFunction, parameters), + } +} + +////////////////////////////// +// Generic 
parameter checks // +////////////////////////////// + +func (f *FunctionParametersAssert) HasBoolParameterValue(parameterName sdk.FunctionParameter, expected bool) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterBoolValueSet(parameterName, expected)) + return f +} + +func (f *FunctionParametersAssert) HasIntParameterValue(parameterName sdk.FunctionParameter, expected int) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterIntValueSet(parameterName, expected)) + return f +} + +func (f *FunctionParametersAssert) HasStringParameterValue(parameterName sdk.FunctionParameter, expected string) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterValueSet(parameterName, expected)) + return f +} + +func (f *FunctionParametersAssert) HasDefaultParameterValue(parameterName sdk.FunctionParameter) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterDefaultValueSet(parameterName)) + return f +} + +func (f *FunctionParametersAssert) HasDefaultParameterValueOnLevel(parameterName sdk.FunctionParameter, parameterType sdk.ParameterType) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterDefaultValueOnLevelSet(parameterName, parameterType)) + return f +} + +/////////////////////////////// +// Aggregated generic checks // +/////////////////////////////// + +// HasAllDefaults checks if all the parameters: +// - have a default value by comparing current value of the sdk.Parameter with its default +// - have an expected level +func (f *FunctionParametersAssert) HasAllDefaults() *FunctionParametersAssert { + return f. + HasDefaultParameterValueOnLevel(sdk.FunctionParameterEnableConsoleOutput, sdk.ParameterTypeSnowflakeDefault). + HasDefaultParameterValueOnLevel(sdk.FunctionParameterLogLevel, sdk.ParameterTypeSnowflakeDefault). + HasDefaultParameterValueOnLevel(sdk.FunctionParameterMetricLevel, sdk.ParameterTypeSnowflakeDefault). 
+ HasDefaultParameterValueOnLevel(sdk.FunctionParameterTraceLevel, sdk.ParameterTypeSnowflakeDefault) +} + +func (f *FunctionParametersAssert) HasAllDefaultsExplicit() *FunctionParametersAssert { + return f. + HasDefaultEnableConsoleOutputValueExplicit(). + HasDefaultLogLevelValueExplicit(). + HasDefaultMetricLevelValueExplicit(). + HasDefaultTraceLevelValueExplicit() +} + +//////////////////////////// +// Parameter value checks // +//////////////////////////// + +func (f *FunctionParametersAssert) HasEnableConsoleOutput(expected bool) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterBoolValueSet(sdk.FunctionParameterEnableConsoleOutput, expected)) + return f +} + +func (f *FunctionParametersAssert) HasLogLevel(expected sdk.LogLevel) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.FunctionParameterLogLevel, expected)) + return f +} + +func (f *FunctionParametersAssert) HasMetricLevel(expected sdk.MetricLevel) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.FunctionParameterMetricLevel, expected)) + return f +} + +func (f *FunctionParametersAssert) HasTraceLevel(expected sdk.TraceLevel) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.FunctionParameterTraceLevel, expected)) + return f +} + +//////////////////////////// +// Parameter level checks // +//////////////////////////// + +func (f *FunctionParametersAssert) HasEnableConsoleOutputLevel(expected sdk.ParameterType) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.FunctionParameterEnableConsoleOutput, expected)) + return f +} + +func (f *FunctionParametersAssert) HasLogLevelLevel(expected sdk.ParameterType) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.FunctionParameterLogLevel, expected)) + return f +} + +func (f *FunctionParametersAssert) HasMetricLevelLevel(expected 
sdk.ParameterType) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.FunctionParameterMetricLevel, expected)) + return f +} + +func (f *FunctionParametersAssert) HasTraceLevelLevel(expected sdk.ParameterType) *FunctionParametersAssert { + f.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.FunctionParameterTraceLevel, expected)) + return f +} + +//////////////////////////////////// +// Parameter default value checks // +//////////////////////////////////// + +func (f *FunctionParametersAssert) HasDefaultEnableConsoleOutputValue() *FunctionParametersAssert { + return f.HasDefaultParameterValue(sdk.FunctionParameterEnableConsoleOutput) +} + +func (f *FunctionParametersAssert) HasDefaultLogLevelValue() *FunctionParametersAssert { + return f.HasDefaultParameterValue(sdk.FunctionParameterLogLevel) +} + +func (f *FunctionParametersAssert) HasDefaultMetricLevelValue() *FunctionParametersAssert { + return f.HasDefaultParameterValue(sdk.FunctionParameterMetricLevel) +} + +func (f *FunctionParametersAssert) HasDefaultTraceLevelValue() *FunctionParametersAssert { + return f.HasDefaultParameterValue(sdk.FunctionParameterTraceLevel) +} + +///////////////////////////////////////////// +// Parameter explicit default value checks // +///////////////////////////////////////////// + +func (f *FunctionParametersAssert) HasDefaultEnableConsoleOutputValueExplicit() *FunctionParametersAssert { + return f.HasEnableConsoleOutput(false) +} + +func (f *FunctionParametersAssert) HasDefaultLogLevelValueExplicit() *FunctionParametersAssert { + return f.HasLogLevel(sdk.LogLevelOff) +} + +func (f *FunctionParametersAssert) HasDefaultMetricLevelValueExplicit() *FunctionParametersAssert { + return f.HasMetricLevel(sdk.MetricLevelNone) +} + +func (f *FunctionParametersAssert) HasDefaultTraceLevelValueExplicit() *FunctionParametersAssert { + return f.HasTraceLevel(sdk.TraceLevelOff) +} diff --git 
a/pkg/acceptance/bettertestspoc/assert/objectparametersassert/gen/object_parameters_def.go b/pkg/acceptance/bettertestspoc/assert/objectparametersassert/gen/object_parameters_def.go index 67a6b4e7a0..fd716a8993 100644 --- a/pkg/acceptance/bettertestspoc/assert/objectparametersassert/gen/object_parameters_def.go +++ b/pkg/acceptance/bettertestspoc/assert/objectparametersassert/gen/object_parameters_def.go @@ -205,4 +205,27 @@ var allObjectsParameters = []SnowflakeObjectParameters{ // TODO(SNOW-1348092 - next prs): Add parameters }, }, + { + Name: "Function", + IdType: "sdk.SchemaObjectIdentifierWithArguments", + Level: sdk.ParameterTypeFunction, + Parameters: []SnowflakeParameter{ + {ParameterName: string(sdk.FunctionParameterEnableConsoleOutput), ParameterType: "bool", DefaultValue: "false", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.FunctionParameterLogLevel), ParameterType: "sdk.LogLevel", DefaultValue: "sdk.LogLevelOff", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.FunctionParameterMetricLevel), ParameterType: "sdk.MetricLevel", DefaultValue: "sdk.MetricLevelNone", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.FunctionParameterTraceLevel), ParameterType: "sdk.TraceLevel", DefaultValue: "sdk.TraceLevelOff", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + }, + }, + { + Name: "Procedure", + IdType: "sdk.SchemaObjectIdentifierWithArguments", + Level: sdk.ParameterTypeProcedure, + Parameters: []SnowflakeParameter{ + {ParameterName: string(sdk.ProcedureParameterAutoEventLogging), ParameterType: "sdk.AutoEventLogging", DefaultValue: "sdk.AutoEventLoggingOff", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.ProcedureParameterEnableConsoleOutput), ParameterType: "bool", DefaultValue: "false", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.ProcedureParameterLogLevel), ParameterType: 
"sdk.LogLevel", DefaultValue: "sdk.LogLevelOff", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.ProcedureParameterMetricLevel), ParameterType: "sdk.MetricLevel", DefaultValue: "sdk.MetricLevelNone", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + {ParameterName: string(sdk.ProcedureParameterTraceLevel), ParameterType: "sdk.TraceLevel", DefaultValue: "sdk.TraceLevelOff", DefaultLevel: "sdk.ParameterTypeSnowflakeDefault"}, + }, + }, } diff --git a/pkg/acceptance/bettertestspoc/assert/objectparametersassert/procedure_parameters_snowflake_gen.go b/pkg/acceptance/bettertestspoc/assert/objectparametersassert/procedure_parameters_snowflake_gen.go new file mode 100644 index 0000000000..b425119010 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/objectparametersassert/procedure_parameters_snowflake_gen.go @@ -0,0 +1,190 @@ +// Code generated by assertions generator; DO NOT EDIT. + +package objectparametersassert + +import ( + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +type ProcedureParametersAssert struct { + *assert.SnowflakeParametersAssert[sdk.SchemaObjectIdentifierWithArguments] +} + +func ProcedureParameters(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) *ProcedureParametersAssert { + t.Helper() + return &ProcedureParametersAssert{ + assert.NewSnowflakeParametersAssertWithProvider(id, sdk.ObjectTypeProcedure, acc.TestClient().Parameter.ShowProcedureParameters), + } +} + +func ProcedureParametersPrefetched(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments, parameters []*sdk.Parameter) *ProcedureParametersAssert { + t.Helper() + return &ProcedureParametersAssert{ + assert.NewSnowflakeParametersAssertWithParameters(id, sdk.ObjectTypeProcedure, parameters), + } +} + 
+////////////////////////////// +// Generic parameter checks // +////////////////////////////// + +func (p *ProcedureParametersAssert) HasBoolParameterValue(parameterName sdk.ProcedureParameter, expected bool) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterBoolValueSet(parameterName, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasIntParameterValue(parameterName sdk.ProcedureParameter, expected int) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterIntValueSet(parameterName, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasStringParameterValue(parameterName sdk.ProcedureParameter, expected string) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterValueSet(parameterName, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasDefaultParameterValue(parameterName sdk.ProcedureParameter) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterDefaultValueSet(parameterName)) + return p +} + +func (p *ProcedureParametersAssert) HasDefaultParameterValueOnLevel(parameterName sdk.ProcedureParameter, parameterType sdk.ParameterType) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterDefaultValueOnLevelSet(parameterName, parameterType)) + return p +} + +/////////////////////////////// +// Aggregated generic checks // +/////////////////////////////// + +// HasAllDefaults checks if all the parameters: +// - have a default value by comparing current value of the sdk.Parameter with its default +// - have an expected level +func (p *ProcedureParametersAssert) HasAllDefaults() *ProcedureParametersAssert { + return p. + HasDefaultParameterValueOnLevel(sdk.ProcedureParameterAutoEventLogging, sdk.ParameterTypeSnowflakeDefault). + HasDefaultParameterValueOnLevel(sdk.ProcedureParameterEnableConsoleOutput, sdk.ParameterTypeSnowflakeDefault). + HasDefaultParameterValueOnLevel(sdk.ProcedureParameterLogLevel, sdk.ParameterTypeSnowflakeDefault). 
+ HasDefaultParameterValueOnLevel(sdk.ProcedureParameterMetricLevel, sdk.ParameterTypeSnowflakeDefault). + HasDefaultParameterValueOnLevel(sdk.ProcedureParameterTraceLevel, sdk.ParameterTypeSnowflakeDefault) +} + +func (p *ProcedureParametersAssert) HasAllDefaultsExplicit() *ProcedureParametersAssert { + return p. + HasDefaultAutoEventLoggingValueExplicit(). + HasDefaultEnableConsoleOutputValueExplicit(). + HasDefaultLogLevelValueExplicit(). + HasDefaultMetricLevelValueExplicit(). + HasDefaultTraceLevelValueExplicit() +} + +//////////////////////////// +// Parameter value checks // +//////////////////////////// + +func (p *ProcedureParametersAssert) HasAutoEventLogging(expected sdk.AutoEventLogging) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.ProcedureParameterAutoEventLogging, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasEnableConsoleOutput(expected bool) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterBoolValueSet(sdk.ProcedureParameterEnableConsoleOutput, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasLogLevel(expected sdk.LogLevel) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.ProcedureParameterLogLevel, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasMetricLevel(expected sdk.MetricLevel) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.ProcedureParameterMetricLevel, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasTraceLevel(expected sdk.TraceLevel) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterStringUnderlyingValueSet(sdk.ProcedureParameterTraceLevel, expected)) + return p +} + +//////////////////////////// +// Parameter level checks // +//////////////////////////// + +func (p *ProcedureParametersAssert) HasAutoEventLoggingLevel(expected sdk.ParameterType) 
*ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.ProcedureParameterAutoEventLogging, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasEnableConsoleOutputLevel(expected sdk.ParameterType) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.ProcedureParameterEnableConsoleOutput, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasLogLevelLevel(expected sdk.ParameterType) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.ProcedureParameterLogLevel, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasMetricLevelLevel(expected sdk.ParameterType) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.ProcedureParameterMetricLevel, expected)) + return p +} + +func (p *ProcedureParametersAssert) HasTraceLevelLevel(expected sdk.ParameterType) *ProcedureParametersAssert { + p.AddAssertion(assert.SnowflakeParameterLevelSet(sdk.ProcedureParameterTraceLevel, expected)) + return p +} + +//////////////////////////////////// +// Parameter default value checks // +//////////////////////////////////// + +func (p *ProcedureParametersAssert) HasDefaultAutoEventLoggingValue() *ProcedureParametersAssert { + return p.HasDefaultParameterValue(sdk.ProcedureParameterAutoEventLogging) +} + +func (p *ProcedureParametersAssert) HasDefaultEnableConsoleOutputValue() *ProcedureParametersAssert { + return p.HasDefaultParameterValue(sdk.ProcedureParameterEnableConsoleOutput) +} + +func (p *ProcedureParametersAssert) HasDefaultLogLevelValue() *ProcedureParametersAssert { + return p.HasDefaultParameterValue(sdk.ProcedureParameterLogLevel) +} + +func (p *ProcedureParametersAssert) HasDefaultMetricLevelValue() *ProcedureParametersAssert { + return p.HasDefaultParameterValue(sdk.ProcedureParameterMetricLevel) +} + +func (p *ProcedureParametersAssert) HasDefaultTraceLevelValue() *ProcedureParametersAssert { + return 
p.HasDefaultParameterValue(sdk.ProcedureParameterTraceLevel) +} + +///////////////////////////////////////////// +// Parameter explicit default value checks // +///////////////////////////////////////////// + +func (p *ProcedureParametersAssert) HasDefaultAutoEventLoggingValueExplicit() *ProcedureParametersAssert { + return p.HasAutoEventLogging(sdk.AutoEventLoggingOff) +} + +func (p *ProcedureParametersAssert) HasDefaultEnableConsoleOutputValueExplicit() *ProcedureParametersAssert { + return p.HasEnableConsoleOutput(false) +} + +func (p *ProcedureParametersAssert) HasDefaultLogLevelValueExplicit() *ProcedureParametersAssert { + return p.HasLogLevel(sdk.LogLevelOff) +} + +func (p *ProcedureParametersAssert) HasDefaultMetricLevelValueExplicit() *ProcedureParametersAssert { + return p.HasMetricLevel(sdk.MetricLevelNone) +} + +func (p *ProcedureParametersAssert) HasDefaultTraceLevelValueExplicit() *ProcedureParametersAssert { + return p.HasTraceLevel(sdk.TraceLevelOff) +} diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_ext.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_ext.go new file mode 100644 index 0000000000..daf6dd018a --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_ext.go @@ -0,0 +1,11 @@ +package resourceassert + +import ( + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +func (a *AccountResourceAssert) HasAdminUserType(expected sdk.UserType) *AccountResourceAssert { + a.AddAssertion(assert.ValueSet("admin_user_type", string(expected))) + return a +} diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_gen.go index c68f6424c1..5d6f9d2d0a 100644 --- a/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_gen.go +++ 
b/pkg/acceptance/bettertestspoc/assert/resourceassert/account_resource_gen.go
@@ -47,6 +47,11 @@ func (a *AccountResourceAssert) HasAdminRsaPublicKeyString(expected string) *Acc
 	return a
 }
 
+func (a *AccountResourceAssert) HasAdminUserTypeString(expected string) *AccountResourceAssert {
+	a.AddAssertion(assert.ValueSet("admin_user_type", expected))
+	return a
+}
+
 func (a *AccountResourceAssert) HasCommentString(expected string) *AccountResourceAssert {
 	a.AddAssertion(assert.ValueSet("comment", expected))
 	return a
@@ -126,6 +131,11 @@ func (a *AccountResourceAssert) HasNoAdminRsaPublicKey() *AccountResourceAssert
 	return a
 }
 
+func (a *AccountResourceAssert) HasNoAdminUserType() *AccountResourceAssert {
+	a.AddAssertion(assert.ValueNotSet("admin_user_type"))
+	return a
+}
+
 func (a *AccountResourceAssert) HasNoComment() *AccountResourceAssert {
 	a.AddAssertion(assert.ValueNotSet("comment"))
 	return a
diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/function_java_resource_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_java_resource_gen.go
new file mode 100644
index 0000000000..089d621565
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_java_resource_gen.go
@@ -0,0 +1,267 @@
+// Code generated by assertions generator; DO NOT EDIT.
+
+package resourceassert
+
+import (
+	"testing"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert"
+)
+
+type FunctionJavaResourceAssert struct {
+	*assert.ResourceAssert
+}
+
+func FunctionJavaResource(t *testing.T, name string) *FunctionJavaResourceAssert {
+	t.Helper()
+
+	return &FunctionJavaResourceAssert{
+		ResourceAssert: assert.NewResourceAssert(name, "resource"),
+	}
+}
+
+func ImportedFunctionJavaResource(t *testing.T, id string) *FunctionJavaResourceAssert {
+	t.Helper()
+
+	return &FunctionJavaResourceAssert{
+		ResourceAssert: assert.NewImportedResourceAssert(id, "imported resource"),
+	}
+}
+
+///////////////////////////////////
+// Attribute value string checks //
+///////////////////////////////////
+
+func (f *FunctionJavaResourceAssert) HasArgumentsString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("arguments", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasCommentString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("comment", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasDatabaseString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("database", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasEnableConsoleOutputString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("enable_console_output", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasExternalAccessIntegrationsString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("external_access_integrations", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasFullyQualifiedNameString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("fully_qualified_name", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasFunctionDefinitionString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_definition", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasFunctionLanguageString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_language", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasHandlerString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("handler", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasImportsString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("imports", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasIsSecureString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("is_secure", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasLogLevelString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("log_level", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasMetricLevelString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("metric_level", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNameString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("name", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNullInputBehaviorString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("null_input_behavior", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasPackagesString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("packages", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasReturnBehaviorString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_behavior", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasReturnTypeString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_type", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasRuntimeVersionString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("runtime_version", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasSchemaString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("schema", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasSecretsString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("secrets", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasTargetPathString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("target_path", expected))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasTraceLevelString(expected string) *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueSet("trace_level", expected))
+	return f
+}
+
+////////////////////////////
+// Attribute empty checks //
+////////////////////////////
+
+func (f *FunctionJavaResourceAssert) HasNoArguments() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("arguments"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoComment() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("comment"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoDatabase() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("database"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoEnableConsoleOutput() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("enable_console_output"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoExternalAccessIntegrations() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("external_access_integrations"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoFullyQualifiedName() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("fully_qualified_name"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoFunctionDefinition() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_definition"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoFunctionLanguage() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_language"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoHandler() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("handler"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoImports() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("imports"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoIsSecure() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("is_secure"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoLogLevel() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("log_level"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoMetricLevel() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("metric_level"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoName() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("name"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoNullInputBehavior() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("null_input_behavior"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoPackages() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("packages"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoReturnBehavior() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_behavior"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoReturnType() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_type"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoRuntimeVersion() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("runtime_version"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoSchema() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("schema"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoSecrets() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("secrets"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoTargetPath() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("target_path"))
+	return f
+}
+
+func (f *FunctionJavaResourceAssert) HasNoTraceLevel() *FunctionJavaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("trace_level"))
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/function_javascript_resource_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_javascript_resource_gen.go
new file mode 100644
index 0000000000..e633c26e0c
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_javascript_resource_gen.go
@@ -0,0 +1,197 @@
+// Code generated by assertions generator; DO NOT EDIT.
+
+package resourceassert
+
+import (
+	"testing"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert"
+)
+
+type FunctionJavascriptResourceAssert struct {
+	*assert.ResourceAssert
+}
+
+func FunctionJavascriptResource(t *testing.T, name string) *FunctionJavascriptResourceAssert {
+	t.Helper()
+
+	return &FunctionJavascriptResourceAssert{
+		ResourceAssert: assert.NewResourceAssert(name, "resource"),
+	}
+}
+
+func ImportedFunctionJavascriptResource(t *testing.T, id string) *FunctionJavascriptResourceAssert {
+	t.Helper()
+
+	return &FunctionJavascriptResourceAssert{
+		ResourceAssert: assert.NewImportedResourceAssert(id, "imported resource"),
+	}
+}
+
+///////////////////////////////////
+// Attribute value string checks //
+///////////////////////////////////
+
+func (f *FunctionJavascriptResourceAssert) HasArgumentsString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("arguments", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasCommentString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("comment", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasDatabaseString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("database", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasEnableConsoleOutputString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("enable_console_output", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasFullyQualifiedNameString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("fully_qualified_name", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasFunctionDefinitionString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_definition", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasFunctionLanguageString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_language", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasIsSecureString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("is_secure", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasLogLevelString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("log_level", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasMetricLevelString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("metric_level", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNameString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("name", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNullInputBehaviorString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("null_input_behavior", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasReturnBehaviorString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_behavior", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasReturnTypeString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_type", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasSchemaString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("schema", expected))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasTraceLevelString(expected string) *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueSet("trace_level", expected))
+	return f
+}
+
+////////////////////////////
+// Attribute empty checks //
+////////////////////////////
+
+func (f *FunctionJavascriptResourceAssert) HasNoArguments() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("arguments"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoComment() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("comment"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoDatabase() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("database"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoEnableConsoleOutput() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("enable_console_output"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoFullyQualifiedName() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("fully_qualified_name"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoFunctionDefinition() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_definition"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoFunctionLanguage() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_language"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoIsSecure() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("is_secure"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoLogLevel() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("log_level"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoMetricLevel() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("metric_level"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoName() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("name"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoNullInputBehavior() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("null_input_behavior"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoReturnBehavior() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_behavior"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoReturnType() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_type"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoSchema() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("schema"))
+	return f
+}
+
+func (f *FunctionJavascriptResourceAssert) HasNoTraceLevel() *FunctionJavascriptResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("trace_level"))
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/function_python_resource_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_python_resource_gen.go
new file mode 100644
index 0000000000..17a9849c99
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_python_resource_gen.go
@@ -0,0 +1,267 @@
+// Code generated by assertions generator; DO NOT EDIT.
+
+package resourceassert
+
+import (
+	"testing"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert"
+)
+
+type FunctionPythonResourceAssert struct {
+	*assert.ResourceAssert
+}
+
+func FunctionPythonResource(t *testing.T, name string) *FunctionPythonResourceAssert {
+	t.Helper()
+
+	return &FunctionPythonResourceAssert{
+		ResourceAssert: assert.NewResourceAssert(name, "resource"),
+	}
+}
+
+func ImportedFunctionPythonResource(t *testing.T, id string) *FunctionPythonResourceAssert {
+	t.Helper()
+
+	return &FunctionPythonResourceAssert{
+		ResourceAssert: assert.NewImportedResourceAssert(id, "imported resource"),
+	}
+}
+
+///////////////////////////////////
+// Attribute value string checks //
+///////////////////////////////////
+
+func (f *FunctionPythonResourceAssert) HasArgumentsString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("arguments", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasCommentString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("comment", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasDatabaseString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("database", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasEnableConsoleOutputString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("enable_console_output", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasExternalAccessIntegrationsString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("external_access_integrations", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasFullyQualifiedNameString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("fully_qualified_name", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasFunctionDefinitionString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_definition", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasFunctionLanguageString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_language", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasHandlerString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("handler", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasImportsString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("imports", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasIsAggregateString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("is_aggregate", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasIsSecureString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("is_secure", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasLogLevelString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("log_level", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasMetricLevelString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("metric_level", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNameString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("name", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNullInputBehaviorString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("null_input_behavior", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasPackagesString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("packages", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasReturnBehaviorString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_behavior", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasReturnTypeString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_type", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasRuntimeVersionString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("runtime_version", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasSchemaString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("schema", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasSecretsString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("secrets", expected))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasTraceLevelString(expected string) *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueSet("trace_level", expected))
+	return f
+}
+
+////////////////////////////
+// Attribute empty checks //
+////////////////////////////
+
+func (f *FunctionPythonResourceAssert) HasNoArguments() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("arguments"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoComment() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("comment"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoDatabase() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("database"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoEnableConsoleOutput() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("enable_console_output"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoExternalAccessIntegrations() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("external_access_integrations"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoFullyQualifiedName() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("fully_qualified_name"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoFunctionDefinition() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_definition"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoFunctionLanguage() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_language"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoHandler() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("handler"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoImports() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("imports"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoIsAggregate() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("is_aggregate"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoIsSecure() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("is_secure"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoLogLevel() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("log_level"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoMetricLevel() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("metric_level"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoName() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("name"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoNullInputBehavior() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("null_input_behavior"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoPackages() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("packages"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoReturnBehavior() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_behavior"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoReturnType() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_type"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoRuntimeVersion() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("runtime_version"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoSchema() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("schema"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoSecrets() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("secrets"))
+	return f
+}
+
+func (f *FunctionPythonResourceAssert) HasNoTraceLevel() *FunctionPythonResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("trace_level"))
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/function_scala_resource_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_scala_resource_gen.go
new file mode 100644
index 0000000000..be1ccc837a
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_scala_resource_gen.go
@@ -0,0 +1,267 @@
+// Code generated by assertions generator; DO NOT EDIT.
+
+package resourceassert
+
+import (
+	"testing"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert"
+)
+
+type FunctionScalaResourceAssert struct {
+	*assert.ResourceAssert
+}
+
+func FunctionScalaResource(t *testing.T, name string) *FunctionScalaResourceAssert {
+	t.Helper()
+
+	return &FunctionScalaResourceAssert{
+		ResourceAssert: assert.NewResourceAssert(name, "resource"),
+	}
+}
+
+func ImportedFunctionScalaResource(t *testing.T, id string) *FunctionScalaResourceAssert {
+	t.Helper()
+
+	return &FunctionScalaResourceAssert{
+		ResourceAssert: assert.NewImportedResourceAssert(id, "imported resource"),
+	}
+}
+
+///////////////////////////////////
+// Attribute value string checks //
+///////////////////////////////////
+
+func (f *FunctionScalaResourceAssert) HasArgumentsString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("arguments", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasCommentString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("comment", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasDatabaseString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("database", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasEnableConsoleOutputString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("enable_console_output", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasExternalAccessIntegrationsString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("external_access_integrations", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasFullyQualifiedNameString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("fully_qualified_name", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasFunctionDefinitionString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_definition", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasFunctionLanguageString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_language", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasHandlerString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("handler", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasImportsString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("imports", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasIsSecureString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("is_secure", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasLogLevelString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("log_level", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasMetricLevelString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("metric_level", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNameString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("name", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNullInputBehaviorString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("null_input_behavior", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasPackagesString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("packages", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasReturnBehaviorString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_behavior", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasReturnTypeString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_type", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasRuntimeVersionString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("runtime_version", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasSchemaString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("schema", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasSecretsString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("secrets", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasTargetPathString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("target_path", expected))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasTraceLevelString(expected string) *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueSet("trace_level", expected))
+	return f
+}
+
+////////////////////////////
+// Attribute empty checks //
+////////////////////////////
+
+func (f *FunctionScalaResourceAssert) HasNoArguments() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("arguments"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoComment() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("comment"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoDatabase() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("database"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoEnableConsoleOutput() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("enable_console_output"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoExternalAccessIntegrations() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("external_access_integrations"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoFullyQualifiedName() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("fully_qualified_name"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoFunctionDefinition() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_definition"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoFunctionLanguage() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("function_language"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoHandler() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("handler"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoImports() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("imports"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoIsSecure() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("is_secure"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoLogLevel() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("log_level"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoMetricLevel() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("metric_level"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoName() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("name"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoNullInputBehavior() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("null_input_behavior"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoPackages() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("packages"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoReturnBehavior() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_behavior"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoReturnType() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("return_type"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoRuntimeVersion() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("runtime_version"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoSchema() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("schema"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoSecrets() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("secrets"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoTargetPath() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("target_path"))
+	return f
+}
+
+func (f *FunctionScalaResourceAssert) HasNoTraceLevel() *FunctionScalaResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("trace_level"))
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/function_sql_resource_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_sql_resource_gen.go
new file mode 100644
index 0000000000..142de640a5
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/function_sql_resource_gen.go
@@ -0,0 +1,197 @@
+// Code generated by assertions generator; DO NOT EDIT.
+
+package resourceassert
+
+import (
+	"testing"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert"
+)
+
+type FunctionSqlResourceAssert struct {
+	*assert.ResourceAssert
+}
+
+func FunctionSqlResource(t *testing.T, name string) *FunctionSqlResourceAssert {
+	t.Helper()
+
+	return &FunctionSqlResourceAssert{
+		ResourceAssert: assert.NewResourceAssert(name, "resource"),
+	}
+}
+
+func ImportedFunctionSqlResource(t *testing.T, id string) *FunctionSqlResourceAssert {
+	t.Helper()
+
+	return &FunctionSqlResourceAssert{
+		ResourceAssert: assert.NewImportedResourceAssert(id, "imported resource"),
+	}
+}
+
+///////////////////////////////////
+// Attribute value string checks //
+///////////////////////////////////
+
+func (f *FunctionSqlResourceAssert) HasArgumentsString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("arguments", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasCommentString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("comment", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasDatabaseString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("database", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasEnableConsoleOutputString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("enable_console_output", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasFullyQualifiedNameString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("fully_qualified_name", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasFunctionDefinitionString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_definition", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasFunctionLanguageString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("function_language", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasIsSecureString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("is_secure", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasLogLevelString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("log_level", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasMetricLevelString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("metric_level", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasNameString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("name", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasNullInputBehaviorString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("null_input_behavior", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasReturnBehaviorString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_behavior", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasReturnTypeString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("return_type", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasSchemaString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("schema", expected))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasTraceLevelString(expected string) *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueSet("trace_level", expected))
+	return f
+}
+
+////////////////////////////
+// Attribute empty checks //
+////////////////////////////
+
+func (f *FunctionSqlResourceAssert) HasNoArguments() *FunctionSqlResourceAssert {
+	f.AddAssertion(assert.ValueNotSet("arguments"))
+	return f
+}
+
+func (f *FunctionSqlResourceAssert) HasNoComment() *FunctionSqlResourceAssert {
+
f.AddAssertion(assert.ValueNotSet("comment")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoDatabase() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("database")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoEnableConsoleOutput() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("enable_console_output")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoFullyQualifiedName() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("fully_qualified_name")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoFunctionDefinition() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("function_definition")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoFunctionLanguage() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("function_language")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoIsSecure() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("is_secure")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoLogLevel() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("log_level")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoMetricLevel() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("metric_level")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoName() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("name")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoNullInputBehavior() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("null_input_behavior")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoReturnBehavior() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("return_behavior")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoReturnType() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("return_type")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoSchema() 
*FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("schema")) + return f +} + +func (f *FunctionSqlResourceAssert) HasNoTraceLevel() *FunctionSqlResourceAssert { + f.AddAssertion(assert.ValueNotSet("trace_level")) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/resourceassert/gen/resource_schema_def.go b/pkg/acceptance/bettertestspoc/assert/resourceassert/gen/resource_schema_def.go index 2c48ff623c..44fa5d5490 100644 --- a/pkg/acceptance/bettertestspoc/assert/resourceassert/gen/resource_schema_def.go +++ b/pkg/acceptance/bettertestspoc/assert/resourceassert/gen/resource_schema_def.go @@ -121,4 +121,24 @@ var allResourceSchemaDefs = []ResourceSchemaDef{ name: "Account", schema: resources.Account().Schema, }, + { + name: "FunctionJava", + schema: resources.FunctionJava().Schema, + }, + { + name: "FunctionJavascript", + schema: resources.FunctionJavascript().Schema, + }, + { + name: "FunctionPython", + schema: resources.FunctionPython().Schema, + }, + { + name: "FunctionScala", + schema: resources.FunctionScala().Schema, + }, + { + name: "FunctionSql", + schema: resources.FunctionSql().Schema, + }, } diff --git a/pkg/acceptance/bettertestspoc/assert/resourceparametersassert/function_resource_parameters_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceparametersassert/function_resource_parameters_gen.go new file mode 100644 index 0000000000..a7414b06e0 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/resourceparametersassert/function_resource_parameters_gen.go @@ -0,0 +1,82 @@ +// Code generated by assertions generator; DO NOT EDIT. 
+ +package resourceparametersassert + +import ( + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +type FunctionResourceParametersAssert struct { + *assert.ResourceAssert +} + +func FunctionResourceParameters(t *testing.T, name string) *FunctionResourceParametersAssert { + t.Helper() + + resourceParameterAssert := FunctionResourceParametersAssert{ + ResourceAssert: assert.NewResourceAssert(name, "parameters"), + } + resourceParameterAssert.AddAssertion(assert.ValueSet("parameters.#", "1")) + return &resourceParameterAssert +} + +func ImportedFunctionResourceParameters(t *testing.T, id string) *FunctionResourceParametersAssert { + t.Helper() + + resourceParameterAssert := FunctionResourceParametersAssert{ + ResourceAssert: assert.NewImportedResourceAssert(id, "imported parameters"), + } + resourceParameterAssert.AddAssertion(assert.ValueSet("parameters.#", "1")) + return &resourceParameterAssert +} + +//////////////////////////// +// Parameter value checks // +//////////////////////////// + +func (f *FunctionResourceParametersAssert) HasEnableConsoleOutput(expected bool) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterBoolValueSet(sdk.FunctionParameterEnableConsoleOutput, expected)) + return f +} + +func (f *FunctionResourceParametersAssert) HasLogLevel(expected sdk.LogLevel) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.FunctionParameterLogLevel, expected)) + return f +} + +func (f *FunctionResourceParametersAssert) HasMetricLevel(expected sdk.MetricLevel) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.FunctionParameterMetricLevel, expected)) + return f +} + +func (f *FunctionResourceParametersAssert) HasTraceLevel(expected sdk.TraceLevel) *FunctionResourceParametersAssert { + 
f.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.FunctionParameterTraceLevel, expected)) + return f +} + +//////////////////////////// +// Parameter level checks // +//////////////////////////// + +func (f *FunctionResourceParametersAssert) HasEnableConsoleOutputLevel(expected sdk.ParameterType) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterLevelSet(sdk.FunctionParameterEnableConsoleOutput, expected)) + return f +} + +func (f *FunctionResourceParametersAssert) HasLogLevelLevel(expected sdk.ParameterType) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterLevelSet(sdk.FunctionParameterLogLevel, expected)) + return f +} + +func (f *FunctionResourceParametersAssert) HasMetricLevelLevel(expected sdk.ParameterType) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterLevelSet(sdk.FunctionParameterMetricLevel, expected)) + return f +} + +func (f *FunctionResourceParametersAssert) HasTraceLevelLevel(expected sdk.ParameterType) *FunctionResourceParametersAssert { + f.AddAssertion(assert.ResourceParameterLevelSet(sdk.FunctionParameterTraceLevel, expected)) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/resourceparametersassert/procedure_resource_parameters_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceparametersassert/procedure_resource_parameters_gen.go new file mode 100644 index 0000000000..3c89493541 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/resourceparametersassert/procedure_resource_parameters_gen.go @@ -0,0 +1,92 @@ +// Code generated by assertions generator; DO NOT EDIT. 
+ +package resourceparametersassert + +import ( + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +type ProcedureResourceParametersAssert struct { + *assert.ResourceAssert +} + +func ProcedureResourceParameters(t *testing.T, name string) *ProcedureResourceParametersAssert { + t.Helper() + + resourceParameterAssert := ProcedureResourceParametersAssert{ + ResourceAssert: assert.NewResourceAssert(name, "parameters"), + } + resourceParameterAssert.AddAssertion(assert.ValueSet("parameters.#", "1")) + return &resourceParameterAssert +} + +func ImportedProcedureResourceParameters(t *testing.T, id string) *ProcedureResourceParametersAssert { + t.Helper() + + resourceParameterAssert := ProcedureResourceParametersAssert{ + ResourceAssert: assert.NewImportedResourceAssert(id, "imported parameters"), + } + resourceParameterAssert.AddAssertion(assert.ValueSet("parameters.#", "1")) + return &resourceParameterAssert +} + +//////////////////////////// +// Parameter value checks // +//////////////////////////// + +func (p *ProcedureResourceParametersAssert) HasAutoEventLogging(expected sdk.AutoEventLogging) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.ProcedureParameterAutoEventLogging, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasEnableConsoleOutput(expected bool) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterBoolValueSet(sdk.ProcedureParameterEnableConsoleOutput, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasLogLevel(expected sdk.LogLevel) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.ProcedureParameterLogLevel, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasMetricLevel(expected sdk.MetricLevel) 
*ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.ProcedureParameterMetricLevel, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasTraceLevel(expected sdk.TraceLevel) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterStringUnderlyingValueSet(sdk.ProcedureParameterTraceLevel, expected)) + return p +} + +//////////////////////////// +// Parameter level checks // +//////////////////////////// + +func (p *ProcedureResourceParametersAssert) HasAutoEventLoggingLevel(expected sdk.ParameterType) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterLevelSet(sdk.ProcedureParameterAutoEventLogging, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasEnableConsoleOutputLevel(expected sdk.ParameterType) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterLevelSet(sdk.ProcedureParameterEnableConsoleOutput, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasLogLevelLevel(expected sdk.ParameterType) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterLevelSet(sdk.ProcedureParameterLogLevel, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasMetricLevelLevel(expected sdk.ParameterType) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterLevelSet(sdk.ProcedureParameterMetricLevel, expected)) + return p +} + +func (p *ProcedureResourceParametersAssert) HasTraceLevelLevel(expected sdk.ParameterType) *ProcedureResourceParametersAssert { + p.AddAssertion(assert.ResourceParameterLevelSet(sdk.ProcedureParameterTraceLevel, expected)) + return p +} diff --git a/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert/function_show_output_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert/function_show_output_gen.go new file mode 100644 index 0000000000..dc8fd54d38 --- /dev/null +++ 
b/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert/function_show_output_gen.go @@ -0,0 +1,144 @@ +// Code generated by assertions generator; DO NOT EDIT. + +package resourceshowoutputassert + +// imports edited manually +import ( + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +// to ensure sdk package is used +var _ = sdk.Object{} + +type FunctionShowOutputAssert struct { + *assert.ResourceAssert +} + +func FunctionShowOutput(t *testing.T, name string) *FunctionShowOutputAssert { + t.Helper() + + f := FunctionShowOutputAssert{ + ResourceAssert: assert.NewResourceAssert(name, "show_output"), + } + f.AddAssertion(assert.ValueSet("show_output.#", "1")) + return &f +} + +func ImportedFunctionShowOutput(t *testing.T, id string) *FunctionShowOutputAssert { + t.Helper() + + f := FunctionShowOutputAssert{ + ResourceAssert: assert.NewImportedResourceAssert(id, "show_output"), + } + f.AddAssertion(assert.ValueSet("show_output.#", "1")) + return &f +} + +//////////////////////////// +// Attribute value checks // +//////////////////////////// + +func (f *FunctionShowOutputAssert) HasCreatedOn(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("created_on", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasName(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("name", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasSchemaName(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("schema_name", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsBuiltin(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_builtin", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsAggregate(expected bool) 
*FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_aggregate", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsAnsi(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_ansi", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasMinNumArguments(expected int) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputIntValueSet("min_num_arguments", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasMaxNumArguments(expected int) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputIntValueSet("max_num_arguments", expected)) + return f +} + +// HasArgumentsOld removed manually + +func (f *FunctionShowOutputAssert) HasArgumentsRaw(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("arguments_raw", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasDescription(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("description", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasCatalogName(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("catalog_name", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsTableFunction(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_table_function", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasValidForClustering(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("valid_for_clustering", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsSecure(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_secure", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasSecrets(expected string) *FunctionShowOutputAssert 
{ + f.AddAssertion(assert.ResourceShowOutputValueSet("secrets", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasExternalAccessIntegrations(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("external_access_integrations", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsExternalFunction(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_external_function", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasLanguage(expected string) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputValueSet("language", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsMemoizable(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_memoizable", expected)) + return f +} + +func (f *FunctionShowOutputAssert) HasIsDataMetric(expected bool) *FunctionShowOutputAssert { + f.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_data_metric", expected)) + return f +} diff --git a/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert/procedure_show_output_gen.go b/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert/procedure_show_output_gen.go new file mode 100644 index 0000000000..f9f6ed5831 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert/procedure_show_output_gen.go @@ -0,0 +1,124 @@ +// Code generated by assertions generator; DO NOT EDIT. 
+ +package resourceshowoutputassert + +// imports edited manually +import ( + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +// to ensure sdk package is used +var _ = sdk.Object{} + +type ProcedureShowOutputAssert struct { + *assert.ResourceAssert +} + +func ProcedureShowOutput(t *testing.T, name string) *ProcedureShowOutputAssert { + t.Helper() + + p := ProcedureShowOutputAssert{ + ResourceAssert: assert.NewResourceAssert(name, "show_output"), + } + p.AddAssertion(assert.ValueSet("show_output.#", "1")) + return &p +} + +func ImportedProcedureShowOutput(t *testing.T, id string) *ProcedureShowOutputAssert { + t.Helper() + + p := ProcedureShowOutputAssert{ + ResourceAssert: assert.NewImportedResourceAssert(id, "show_output"), + } + p.AddAssertion(assert.ValueSet("show_output.#", "1")) + return &p +} + +//////////////////////////// +// Attribute value checks // +//////////////////////////// + +func (p *ProcedureShowOutputAssert) HasCreatedOn(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("created_on", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasName(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("name", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasSchemaName(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("schema_name", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasIsBuiltin(expected bool) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_builtin", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasIsAggregate(expected bool) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_aggregate", expected)) + return p +} + +func (p 
*ProcedureShowOutputAssert) HasIsAnsi(expected bool) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_ansi", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasMinNumArguments(expected int) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputIntValueSet("min_num_arguments", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasMaxNumArguments(expected int) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputIntValueSet("max_num_arguments", expected)) + return p +} + +// HasArgumentsOld removed manually + +func (p *ProcedureShowOutputAssert) HasArgumentsRaw(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("arguments_raw", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasDescription(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("description", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasCatalogName(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("catalog_name", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasIsTableFunction(expected bool) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_table_function", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasValidForClustering(expected bool) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputBoolValueSet("valid_for_clustering", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasIsSecure(expected bool) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputBoolValueSet("is_secure", expected)) + return p +} + +func (p *ProcedureShowOutputAssert) HasSecrets(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("secrets", expected)) + return p +} + +func (p 
*ProcedureShowOutputAssert) HasExternalAccessIntegrations(expected string) *ProcedureShowOutputAssert { + p.AddAssertion(assert.ResourceShowOutputValueSet("external_access_integrations", expected)) + return p +} diff --git a/pkg/acceptance/bettertestspoc/config/model/account_model_ext.go b/pkg/acceptance/bettertestspoc/config/model/account_model_ext.go new file mode 100644 index 0000000000..4d81e2e589 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/config/model/account_model_ext.go @@ -0,0 +1,11 @@ +package model + +import ( + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + tfconfig "github.com/hashicorp/terraform-plugin-testing/config" +) + +func (a *AccountModel) WithAdminUserTypeEnum(adminUserType sdk.UserType) *AccountModel { + a.AdminUserType = tfconfig.StringVariable(string(adminUserType)) + return a +} diff --git a/pkg/acceptance/bettertestspoc/config/model/account_model_gen.go b/pkg/acceptance/bettertestspoc/config/model/account_model_gen.go index 87ddf58d4e..b1000b7931 100644 --- a/pkg/acceptance/bettertestspoc/config/model/account_model_gen.go +++ b/pkg/acceptance/bettertestspoc/config/model/account_model_gen.go @@ -13,6 +13,7 @@ type AccountModel struct { AdminName tfconfig.Variable `json:"admin_name,omitempty"` AdminPassword tfconfig.Variable `json:"admin_password,omitempty"` AdminRsaPublicKey tfconfig.Variable `json:"admin_rsa_public_key,omitempty"` + AdminUserType tfconfig.Variable `json:"admin_user_type,omitempty"` Comment tfconfig.Variable `json:"comment,omitempty"` Edition tfconfig.Variable `json:"edition,omitempty"` Email tfconfig.Variable `json:"email,omitempty"` @@ -38,12 +39,14 @@ func Account( adminName string, edition string, email string, + gracePeriodInDays int, name string, ) *AccountModel { a := &AccountModel{ResourceModelMeta: config.Meta(resourceName, resources.Account)} a.WithAdminName(adminName) a.WithEdition(edition) a.WithEmail(email) + a.WithGracePeriodInDays(gracePeriodInDays) a.WithName(name) return a } @@ 
-52,12 +55,14 @@ func AccountWithDefaultMeta( adminName string, edition string, email string, + gracePeriodInDays int, name string, ) *AccountModel { a := &AccountModel{ResourceModelMeta: config.DefaultMeta(resources.Account)} a.WithAdminName(adminName) a.WithEdition(edition) a.WithEmail(email) + a.WithGracePeriodInDays(gracePeriodInDays) a.WithName(name) return a } @@ -81,6 +86,11 @@ func (a *AccountModel) WithAdminRsaPublicKey(adminRsaPublicKey string) *AccountM return a } +func (a *AccountModel) WithAdminUserType(adminUserType string) *AccountModel { + a.AdminUserType = tfconfig.StringVariable(adminUserType) + return a +} + func (a *AccountModel) WithComment(comment string) *AccountModel { a.Comment = tfconfig.StringVariable(comment) return a @@ -111,8 +121,8 @@ func (a *AccountModel) WithGracePeriodInDays(gracePeriodInDays int) *AccountMode return a } -func (a *AccountModel) WithIsOrgAdmin(isOrgAdmin bool) *AccountModel { - a.IsOrgAdmin = tfconfig.BoolVariable(isOrgAdmin) +func (a *AccountModel) WithIsOrgAdmin(isOrgAdmin string) *AccountModel { + a.IsOrgAdmin = tfconfig.StringVariable(isOrgAdmin) return a } @@ -121,8 +131,8 @@ func (a *AccountModel) WithLastName(lastName string) *AccountModel { return a } -func (a *AccountModel) WithMustChangePassword(mustChangePassword bool) *AccountModel { - a.MustChangePassword = tfconfig.BoolVariable(mustChangePassword) +func (a *AccountModel) WithMustChangePassword(mustChangePassword string) *AccountModel { + a.MustChangePassword = tfconfig.StringVariable(mustChangePassword) return a } @@ -160,6 +170,11 @@ func (a *AccountModel) WithAdminRsaPublicKeyValue(value tfconfig.Variable) *Acco return a } +func (a *AccountModel) WithAdminUserTypeValue(value tfconfig.Variable) *AccountModel { + a.AdminUserType = value + return a +} + func (a *AccountModel) WithCommentValue(value tfconfig.Variable) *AccountModel { a.Comment = value return a diff --git a/pkg/acceptance/bettertestspoc/config/model/function_java_model_ext.go 
b/pkg/acceptance/bettertestspoc/config/model/function_java_model_ext.go new file mode 100644 index 0000000000..4bac27ada5 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/config/model/function_java_model_ext.go @@ -0,0 +1,16 @@ +package model + +import ( + "encoding/json" +) + +func (f *FunctionJavaModel) MarshalJSON() ([]byte, error) { + type Alias FunctionJavaModel + return json.Marshal(&struct { + *Alias + DependsOn []string `json:"depends_on,omitempty"` + }{ + Alias: (*Alias)(f), + DependsOn: f.DependsOn(), + }) +} diff --git a/pkg/acceptance/bettertestspoc/config/model/function_java_model_gen.go b/pkg/acceptance/bettertestspoc/config/model/function_java_model_gen.go new file mode 100644 index 0000000000..704f6b2bcf --- /dev/null +++ b/pkg/acceptance/bettertestspoc/config/model/function_java_model_gen.go @@ -0,0 +1,302 @@ +// Code generated by config model builder generator; DO NOT EDIT. + +package model + +import ( + tfconfig "github.com/hashicorp/terraform-plugin-testing/config" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" +) + +type FunctionJavaModel struct { + Arguments tfconfig.Variable `json:"arguments,omitempty"` + Comment tfconfig.Variable `json:"comment,omitempty"` + Database tfconfig.Variable `json:"database,omitempty"` + EnableConsoleOutput tfconfig.Variable `json:"enable_console_output,omitempty"` + ExternalAccessIntegrations tfconfig.Variable `json:"external_access_integrations,omitempty"` + FullyQualifiedName tfconfig.Variable `json:"fully_qualified_name,omitempty"` + FunctionDefinition tfconfig.Variable `json:"function_definition,omitempty"` + FunctionLanguage tfconfig.Variable `json:"function_language,omitempty"` + Handler tfconfig.Variable `json:"handler,omitempty"` + Imports tfconfig.Variable `json:"imports,omitempty"` + IsSecure tfconfig.Variable `json:"is_secure,omitempty"` + LogLevel tfconfig.Variable 
 `json:"log_level,omitempty"`
+	MetricLevel tfconfig.Variable `json:"metric_level,omitempty"`
+	Name tfconfig.Variable `json:"name,omitempty"`
+	NullInputBehavior tfconfig.Variable `json:"null_input_behavior,omitempty"`
+	Packages tfconfig.Variable `json:"packages,omitempty"`
+	ReturnBehavior tfconfig.Variable `json:"return_behavior,omitempty"`
+	ReturnType tfconfig.Variable `json:"return_type,omitempty"`
+	RuntimeVersion tfconfig.Variable `json:"runtime_version,omitempty"`
+	Schema tfconfig.Variable `json:"schema,omitempty"`
+	Secrets tfconfig.Variable `json:"secrets,omitempty"`
+	TargetPath tfconfig.Variable `json:"target_path,omitempty"`
+	TraceLevel tfconfig.Variable `json:"trace_level,omitempty"`
+
+	*config.ResourceModelMeta
+}
+
+/////////////////////////////////////////////////
+// Basic builders (resource name and required) //
+/////////////////////////////////////////////////
+
+func FunctionJava(
+	resourceName string,
+	database string,
+	functionDefinition string,
+	handler string,
+	name string,
+	returnType string,
+	schema string,
+) *FunctionJavaModel {
+	f := &FunctionJavaModel{ResourceModelMeta: config.Meta(resourceName, resources.FunctionJava)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithHandler(handler)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithSchema(schema)
+	return f
+}
+
+func FunctionJavaWithDefaultMeta(
+	database string,
+	functionDefinition string,
+	handler string,
+	name string,
+	returnType string,
+	schema string,
+) *FunctionJavaModel {
+	f := &FunctionJavaModel{ResourceModelMeta: config.DefaultMeta(resources.FunctionJava)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithHandler(handler)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithSchema(schema)
+	return f
+}
+
+/////////////////////////////////
+// below all the proper values //
+/////////////////////////////////
+
+// arguments attribute type is not yet supported, so WithArguments can't be generated
+
+func (f *FunctionJavaModel) WithComment(comment string) *FunctionJavaModel {
+	f.Comment = tfconfig.StringVariable(comment)
+	return f
+}
+
+func (f *FunctionJavaModel) WithDatabase(database string) *FunctionJavaModel {
+	f.Database = tfconfig.StringVariable(database)
+	return f
+}
+
+func (f *FunctionJavaModel) WithEnableConsoleOutput(enableConsoleOutput bool) *FunctionJavaModel {
+	f.EnableConsoleOutput = tfconfig.BoolVariable(enableConsoleOutput)
+	return f
+}
+
+// external_access_integrations attribute type is not yet supported, so WithExternalAccessIntegrations can't be generated
+
+func (f *FunctionJavaModel) WithFullyQualifiedName(fullyQualifiedName string) *FunctionJavaModel {
+	f.FullyQualifiedName = tfconfig.StringVariable(fullyQualifiedName)
+	return f
+}
+
+func (f *FunctionJavaModel) WithFunctionDefinition(functionDefinition string) *FunctionJavaModel {
+	f.FunctionDefinition = tfconfig.StringVariable(functionDefinition)
+	return f
+}
+
+func (f *FunctionJavaModel) WithFunctionLanguage(functionLanguage string) *FunctionJavaModel {
+	f.FunctionLanguage = tfconfig.StringVariable(functionLanguage)
+	return f
+}
+
+func (f *FunctionJavaModel) WithHandler(handler string) *FunctionJavaModel {
+	f.Handler = tfconfig.StringVariable(handler)
+	return f
+}
+
+// imports attribute type is not yet supported, so WithImports can't be generated
+
+func (f *FunctionJavaModel) WithIsSecure(isSecure string) *FunctionJavaModel {
+	f.IsSecure = tfconfig.StringVariable(isSecure)
+	return f
+}
+
+func (f *FunctionJavaModel) WithLogLevel(logLevel string) *FunctionJavaModel {
+	f.LogLevel = tfconfig.StringVariable(logLevel)
+	return f
+}
+
+func (f *FunctionJavaModel) WithMetricLevel(metricLevel string) *FunctionJavaModel {
+	f.MetricLevel = tfconfig.StringVariable(metricLevel)
+	return f
+}
+
+func (f *FunctionJavaModel) WithName(name string) *FunctionJavaModel {
+	f.Name = tfconfig.StringVariable(name)
+	return f
+}
+
+func (f *FunctionJavaModel) WithNullInputBehavior(nullInputBehavior string) *FunctionJavaModel {
+	f.NullInputBehavior = tfconfig.StringVariable(nullInputBehavior)
+	return f
+}
+
+// packages attribute type is not yet supported, so WithPackages can't be generated
+
+func (f *FunctionJavaModel) WithReturnBehavior(returnBehavior string) *FunctionJavaModel {
+	f.ReturnBehavior = tfconfig.StringVariable(returnBehavior)
+	return f
+}
+
+func (f *FunctionJavaModel) WithReturnType(returnType string) *FunctionJavaModel {
+	f.ReturnType = tfconfig.StringVariable(returnType)
+	return f
+}
+
+func (f *FunctionJavaModel) WithRuntimeVersion(runtimeVersion string) *FunctionJavaModel {
+	f.RuntimeVersion = tfconfig.StringVariable(runtimeVersion)
+	return f
+}
+
+func (f *FunctionJavaModel) WithSchema(schema string) *FunctionJavaModel {
+	f.Schema = tfconfig.StringVariable(schema)
+	return f
+}
+
+// secrets attribute type is not yet supported, so WithSecrets can't be generated
+
+func (f *FunctionJavaModel) WithTargetPath(targetPath string) *FunctionJavaModel {
+	f.TargetPath = tfconfig.StringVariable(targetPath)
+	return f
+}
+
+func (f *FunctionJavaModel) WithTraceLevel(traceLevel string) *FunctionJavaModel {
+	f.TraceLevel = tfconfig.StringVariable(traceLevel)
+	return f
+}
+
+//////////////////////////////////////////
+// below it's possible to set any value //
+//////////////////////////////////////////
+
+func (f *FunctionJavaModel) WithArgumentsValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Arguments = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithCommentValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Comment = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithDatabaseValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Database = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithEnableConsoleOutputValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.EnableConsoleOutput = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithExternalAccessIntegrationsValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.ExternalAccessIntegrations = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithFullyQualifiedNameValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.FullyQualifiedName = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithFunctionDefinitionValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.FunctionDefinition = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithFunctionLanguageValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.FunctionLanguage = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithHandlerValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Handler = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithImportsValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Imports = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithIsSecureValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.IsSecure = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithLogLevelValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.LogLevel = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithMetricLevelValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.MetricLevel = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithNameValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Name = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithNullInputBehaviorValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.NullInputBehavior = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithPackagesValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Packages = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithReturnBehaviorValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.ReturnBehavior = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithReturnTypeValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.ReturnType = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithRuntimeVersionValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.RuntimeVersion = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithSchemaValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Schema = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithSecretsValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.Secrets = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithTargetPathValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.TargetPath = value
+	return f
+}
+
+func (f *FunctionJavaModel) WithTraceLevelValue(value tfconfig.Variable) *FunctionJavaModel {
+	f.TraceLevel = value
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_javascript_model_ext.go b/pkg/acceptance/bettertestspoc/config/model/function_javascript_model_ext.go
new file mode 100644
index 0000000000..3fa63b5701
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/config/model/function_javascript_model_ext.go
@@ -0,0 +1,16 @@
+package model
+
+import (
+	"encoding/json"
+)
+
+func (f *FunctionJavascriptModel) MarshalJSON() ([]byte, error) {
+	type Alias FunctionJavascriptModel
+	return json.Marshal(&struct {
+		*Alias
+		DependsOn []string `json:"depends_on,omitempty"`
+	}{
+		Alias:     (*Alias)(f),
+		DependsOn: f.DependsOn(),
+	})
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_javascript_model_gen.go b/pkg/acceptance/bettertestspoc/config/model/function_javascript_model_gen.go
new file mode 100644
index 0000000000..5d8ad68aec
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/config/model/function_javascript_model_gen.go
@@ -0,0 +1,233 @@
+// Code generated by config model builder generator; DO NOT EDIT.
+
+package model
+
+import (
+	tfconfig "github.com/hashicorp/terraform-plugin-testing/config"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config"
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources"
+)
+
+type FunctionJavascriptModel struct {
+	Arguments tfconfig.Variable `json:"arguments,omitempty"`
+	Comment tfconfig.Variable `json:"comment,omitempty"`
+	Database tfconfig.Variable `json:"database,omitempty"`
+	EnableConsoleOutput tfconfig.Variable `json:"enable_console_output,omitempty"`
+	FullyQualifiedName tfconfig.Variable `json:"fully_qualified_name,omitempty"`
+	FunctionDefinition tfconfig.Variable `json:"function_definition,omitempty"`
+	FunctionLanguage tfconfig.Variable `json:"function_language,omitempty"`
+	IsSecure tfconfig.Variable `json:"is_secure,omitempty"`
+	LogLevel tfconfig.Variable `json:"log_level,omitempty"`
+	MetricLevel tfconfig.Variable `json:"metric_level,omitempty"`
+	Name tfconfig.Variable `json:"name,omitempty"`
+	NullInputBehavior tfconfig.Variable `json:"null_input_behavior,omitempty"`
+	ReturnBehavior tfconfig.Variable `json:"return_behavior,omitempty"`
+	ReturnType tfconfig.Variable `json:"return_type,omitempty"`
+	Schema tfconfig.Variable `json:"schema,omitempty"`
+	TraceLevel tfconfig.Variable `json:"trace_level,omitempty"`
+
+	*config.ResourceModelMeta
+}
+
+/////////////////////////////////////////////////
+// Basic builders (resource name and required) //
+/////////////////////////////////////////////////
+
+func FunctionJavascript(
+	resourceName string,
+	database string,
+	functionDefinition string,
+	name string,
+	returnType string,
+	schema string,
+) *FunctionJavascriptModel {
+	f := &FunctionJavascriptModel{ResourceModelMeta: config.Meta(resourceName, resources.FunctionJavascript)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithSchema(schema)
+	return f
+}
+
+func FunctionJavascriptWithDefaultMeta(
+	database string,
+	functionDefinition string,
+	name string,
+	returnType string,
+	schema string,
+) *FunctionJavascriptModel {
+	f := &FunctionJavascriptModel{ResourceModelMeta: config.DefaultMeta(resources.FunctionJavascript)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithSchema(schema)
+	return f
+}
+
+/////////////////////////////////
+// below all the proper values //
+/////////////////////////////////
+
+// arguments attribute type is not yet supported, so WithArguments can't be generated
+
+func (f *FunctionJavascriptModel) WithComment(comment string) *FunctionJavascriptModel {
+	f.Comment = tfconfig.StringVariable(comment)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithDatabase(database string) *FunctionJavascriptModel {
+	f.Database = tfconfig.StringVariable(database)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithEnableConsoleOutput(enableConsoleOutput bool) *FunctionJavascriptModel {
+	f.EnableConsoleOutput = tfconfig.BoolVariable(enableConsoleOutput)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithFullyQualifiedName(fullyQualifiedName string) *FunctionJavascriptModel {
+	f.FullyQualifiedName = tfconfig.StringVariable(fullyQualifiedName)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithFunctionDefinition(functionDefinition string) *FunctionJavascriptModel {
+	f.FunctionDefinition = tfconfig.StringVariable(functionDefinition)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithFunctionLanguage(functionLanguage string) *FunctionJavascriptModel {
+	f.FunctionLanguage = tfconfig.StringVariable(functionLanguage)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithIsSecure(isSecure string) *FunctionJavascriptModel {
+	f.IsSecure = tfconfig.StringVariable(isSecure)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithLogLevel(logLevel string) *FunctionJavascriptModel {
+	f.LogLevel = tfconfig.StringVariable(logLevel)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithMetricLevel(metricLevel string) *FunctionJavascriptModel {
+	f.MetricLevel = tfconfig.StringVariable(metricLevel)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithName(name string) *FunctionJavascriptModel {
+	f.Name = tfconfig.StringVariable(name)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithNullInputBehavior(nullInputBehavior string) *FunctionJavascriptModel {
+	f.NullInputBehavior = tfconfig.StringVariable(nullInputBehavior)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithReturnBehavior(returnBehavior string) *FunctionJavascriptModel {
+	f.ReturnBehavior = tfconfig.StringVariable(returnBehavior)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithReturnType(returnType string) *FunctionJavascriptModel {
+	f.ReturnType = tfconfig.StringVariable(returnType)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithSchema(schema string) *FunctionJavascriptModel {
+	f.Schema = tfconfig.StringVariable(schema)
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithTraceLevel(traceLevel string) *FunctionJavascriptModel {
+	f.TraceLevel = tfconfig.StringVariable(traceLevel)
+	return f
+}
+
+//////////////////////////////////////////
+// below it's possible to set any value //
+//////////////////////////////////////////
+
+func (f *FunctionJavascriptModel) WithArgumentsValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.Arguments = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithCommentValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.Comment = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithDatabaseValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.Database = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithEnableConsoleOutputValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.EnableConsoleOutput = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithFullyQualifiedNameValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.FullyQualifiedName = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithFunctionDefinitionValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.FunctionDefinition = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithFunctionLanguageValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.FunctionLanguage = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithIsSecureValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.IsSecure = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithLogLevelValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.LogLevel = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithMetricLevelValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.MetricLevel = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithNameValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.Name = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithNullInputBehaviorValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.NullInputBehavior = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithReturnBehaviorValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.ReturnBehavior = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithReturnTypeValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.ReturnType = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithSchemaValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.Schema = value
+	return f
+}
+
+func (f *FunctionJavascriptModel) WithTraceLevelValue(value tfconfig.Variable) *FunctionJavascriptModel {
+	f.TraceLevel = value
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_python_model_ext.go b/pkg/acceptance/bettertestspoc/config/model/function_python_model_ext.go
new file mode 100644
index 0000000000..8d7475e389
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/config/model/function_python_model_ext.go
@@ -0,0 +1,16 @@
+package model
+
+import (
+	"encoding/json"
+)
+
+func (f *FunctionPythonModel) MarshalJSON() ([]byte, error) {
+	type Alias FunctionPythonModel
+	return json.Marshal(&struct {
+		*Alias
+		DependsOn []string `json:"depends_on,omitempty"`
+	}{
+		Alias:     (*Alias)(f),
+		DependsOn: f.DependsOn(),
+	})
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_python_model_gen.go b/pkg/acceptance/bettertestspoc/config/model/function_python_model_gen.go
new file mode 100644
index 0000000000..9d0ffbd348
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/config/model/function_python_model_gen.go
@@ -0,0 +1,306 @@
+// Code generated by config model builder generator; DO NOT EDIT.
+
+package model
+
+import (
+	tfconfig "github.com/hashicorp/terraform-plugin-testing/config"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config"
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources"
+)
+
+type FunctionPythonModel struct {
+	Arguments tfconfig.Variable `json:"arguments,omitempty"`
+	Comment tfconfig.Variable `json:"comment,omitempty"`
+	Database tfconfig.Variable `json:"database,omitempty"`
+	EnableConsoleOutput tfconfig.Variable `json:"enable_console_output,omitempty"`
+	ExternalAccessIntegrations tfconfig.Variable `json:"external_access_integrations,omitempty"`
+	FullyQualifiedName tfconfig.Variable `json:"fully_qualified_name,omitempty"`
+	FunctionDefinition tfconfig.Variable `json:"function_definition,omitempty"`
+	FunctionLanguage tfconfig.Variable `json:"function_language,omitempty"`
+	Handler tfconfig.Variable `json:"handler,omitempty"`
+	Imports tfconfig.Variable `json:"imports,omitempty"`
+	IsAggregate tfconfig.Variable `json:"is_aggregate,omitempty"`
+	IsSecure tfconfig.Variable `json:"is_secure,omitempty"`
+	LogLevel tfconfig.Variable `json:"log_level,omitempty"`
+	MetricLevel tfconfig.Variable `json:"metric_level,omitempty"`
+	Name tfconfig.Variable `json:"name,omitempty"`
+	NullInputBehavior tfconfig.Variable `json:"null_input_behavior,omitempty"`
+	Packages tfconfig.Variable `json:"packages,omitempty"`
+	ReturnBehavior tfconfig.Variable `json:"return_behavior,omitempty"`
+	ReturnType tfconfig.Variable `json:"return_type,omitempty"`
+	RuntimeVersion tfconfig.Variable `json:"runtime_version,omitempty"`
+	Schema tfconfig.Variable `json:"schema,omitempty"`
+	Secrets tfconfig.Variable `json:"secrets,omitempty"`
+	TraceLevel tfconfig.Variable `json:"trace_level,omitempty"`
+
+	*config.ResourceModelMeta
+}
+
+/////////////////////////////////////////////////
+// Basic builders (resource name and required) //
+/////////////////////////////////////////////////
+
+func FunctionPython(
+	resourceName string,
+	database string,
+	functionDefinition string,
+	handler string,
+	name string,
+	returnType string,
+	runtimeVersion string,
+	schema string,
+) *FunctionPythonModel {
+	f := &FunctionPythonModel{ResourceModelMeta: config.Meta(resourceName, resources.FunctionPython)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithHandler(handler)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithRuntimeVersion(runtimeVersion)
+	f.WithSchema(schema)
+	return f
+}
+
+func FunctionPythonWithDefaultMeta(
+	database string,
+	functionDefinition string,
+	handler string,
+	name string,
+	returnType string,
+	runtimeVersion string,
+	schema string,
+) *FunctionPythonModel {
+	f := &FunctionPythonModel{ResourceModelMeta: config.DefaultMeta(resources.FunctionPython)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithHandler(handler)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithRuntimeVersion(runtimeVersion)
+	f.WithSchema(schema)
+	return f
+}
+
+/////////////////////////////////
+// below all the proper values //
+/////////////////////////////////
+
+// arguments attribute type is not yet supported, so WithArguments can't be generated
+
+func (f *FunctionPythonModel) WithComment(comment string) *FunctionPythonModel {
+	f.Comment = tfconfig.StringVariable(comment)
+	return f
+}
+
+func (f *FunctionPythonModel) WithDatabase(database string) *FunctionPythonModel {
+	f.Database = tfconfig.StringVariable(database)
+	return f
+}
+
+func (f *FunctionPythonModel) WithEnableConsoleOutput(enableConsoleOutput bool) *FunctionPythonModel {
+	f.EnableConsoleOutput = tfconfig.BoolVariable(enableConsoleOutput)
+	return f
+}
+
+// external_access_integrations attribute type is not yet supported, so WithExternalAccessIntegrations can't be generated
+
+func (f *FunctionPythonModel) WithFullyQualifiedName(fullyQualifiedName string) *FunctionPythonModel {
+	f.FullyQualifiedName = tfconfig.StringVariable(fullyQualifiedName)
+	return f
+}
+
+func (f *FunctionPythonModel) WithFunctionDefinition(functionDefinition string) *FunctionPythonModel {
+	f.FunctionDefinition = tfconfig.StringVariable(functionDefinition)
+	return f
+}
+
+func (f *FunctionPythonModel) WithFunctionLanguage(functionLanguage string) *FunctionPythonModel {
+	f.FunctionLanguage = tfconfig.StringVariable(functionLanguage)
+	return f
+}
+
+func (f *FunctionPythonModel) WithHandler(handler string) *FunctionPythonModel {
+	f.Handler = tfconfig.StringVariable(handler)
+	return f
+}
+
+// imports attribute type is not yet supported, so WithImports can't be generated
+
+func (f *FunctionPythonModel) WithIsAggregate(isAggregate string) *FunctionPythonModel {
+	f.IsAggregate = tfconfig.StringVariable(isAggregate)
+	return f
+}
+
+func (f *FunctionPythonModel) WithIsSecure(isSecure string) *FunctionPythonModel {
+	f.IsSecure = tfconfig.StringVariable(isSecure)
+	return f
+}
+
+func (f *FunctionPythonModel) WithLogLevel(logLevel string) *FunctionPythonModel {
+	f.LogLevel = tfconfig.StringVariable(logLevel)
+	return f
+}
+
+func (f *FunctionPythonModel) WithMetricLevel(metricLevel string) *FunctionPythonModel {
+	f.MetricLevel = tfconfig.StringVariable(metricLevel)
+	return f
+}
+
+func (f *FunctionPythonModel) WithName(name string) *FunctionPythonModel {
+	f.Name = tfconfig.StringVariable(name)
+	return f
+}
+
+func (f *FunctionPythonModel) WithNullInputBehavior(nullInputBehavior string) *FunctionPythonModel {
+	f.NullInputBehavior = tfconfig.StringVariable(nullInputBehavior)
+	return f
+}
+
+// packages attribute type is not yet supported, so WithPackages can't be generated
+
+func (f *FunctionPythonModel) WithReturnBehavior(returnBehavior string) *FunctionPythonModel {
+	f.ReturnBehavior = tfconfig.StringVariable(returnBehavior)
+	return f
+}
+
+func (f *FunctionPythonModel) WithReturnType(returnType string) *FunctionPythonModel {
+	f.ReturnType = tfconfig.StringVariable(returnType)
+	return f
+}
+
+func (f *FunctionPythonModel) WithRuntimeVersion(runtimeVersion string) *FunctionPythonModel {
+	f.RuntimeVersion = tfconfig.StringVariable(runtimeVersion)
+	return f
+}
+
+func (f *FunctionPythonModel) WithSchema(schema string) *FunctionPythonModel {
+	f.Schema = tfconfig.StringVariable(schema)
+	return f
+}
+
+// secrets attribute type is not yet supported, so WithSecrets can't be generated
+
+func (f *FunctionPythonModel) WithTraceLevel(traceLevel string) *FunctionPythonModel {
+	f.TraceLevel = tfconfig.StringVariable(traceLevel)
+	return f
+}
+
+//////////////////////////////////////////
+// below it's possible to set any value //
+//////////////////////////////////////////
+
+func (f *FunctionPythonModel) WithArgumentsValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Arguments = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithCommentValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Comment = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithDatabaseValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Database = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithEnableConsoleOutputValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.EnableConsoleOutput = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithExternalAccessIntegrationsValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.ExternalAccessIntegrations = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithFullyQualifiedNameValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.FullyQualifiedName = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithFunctionDefinitionValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.FunctionDefinition = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithFunctionLanguageValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.FunctionLanguage = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithHandlerValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Handler = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithImportsValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Imports = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithIsAggregateValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.IsAggregate = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithIsSecureValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.IsSecure = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithLogLevelValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.LogLevel = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithMetricLevelValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.MetricLevel = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithNameValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Name = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithNullInputBehaviorValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.NullInputBehavior = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithPackagesValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Packages = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithReturnBehaviorValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.ReturnBehavior = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithReturnTypeValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.ReturnType = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithRuntimeVersionValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.RuntimeVersion = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithSchemaValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Schema = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithSecretsValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.Secrets = value
+	return f
+}
+
+func (f *FunctionPythonModel) WithTraceLevelValue(value tfconfig.Variable) *FunctionPythonModel {
+	f.TraceLevel = value
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_scala_model_ext.go b/pkg/acceptance/bettertestspoc/config/model/function_scala_model_ext.go
new file mode 100644
index 0000000000..a5e43e53ca
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/config/model/function_scala_model_ext.go
@@ -0,0 +1,16 @@
+package model
+
+import (
+	"encoding/json"
+)
+
+func (f *FunctionScalaModel) MarshalJSON() ([]byte, error) {
+	type Alias FunctionScalaModel
+	return json.Marshal(&struct {
+		*Alias
+		DependsOn []string `json:"depends_on,omitempty"`
+	}{
+		Alias:     (*Alias)(f),
+		DependsOn: f.DependsOn(),
+	})
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_scala_model_gen.go b/pkg/acceptance/bettertestspoc/config/model/function_scala_model_gen.go
new file mode 100644
index 0000000000..017c397af3
--- /dev/null
+++ b/pkg/acceptance/bettertestspoc/config/model/function_scala_model_gen.go
@@ -0,0 +1,306 @@
+// Code generated by config model builder generator; DO NOT EDIT.
+
+package model
+
+import (
+	tfconfig "github.com/hashicorp/terraform-plugin-testing/config"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config"
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources"
+)
+
+type FunctionScalaModel struct {
+	Arguments tfconfig.Variable `json:"arguments,omitempty"`
+	Comment tfconfig.Variable `json:"comment,omitempty"`
+	Database tfconfig.Variable `json:"database,omitempty"`
+	EnableConsoleOutput tfconfig.Variable `json:"enable_console_output,omitempty"`
+	ExternalAccessIntegrations tfconfig.Variable `json:"external_access_integrations,omitempty"`
+	FullyQualifiedName tfconfig.Variable `json:"fully_qualified_name,omitempty"`
+	FunctionDefinition tfconfig.Variable `json:"function_definition,omitempty"`
+	FunctionLanguage tfconfig.Variable `json:"function_language,omitempty"`
+	Handler tfconfig.Variable `json:"handler,omitempty"`
+	Imports tfconfig.Variable `json:"imports,omitempty"`
+	IsSecure tfconfig.Variable `json:"is_secure,omitempty"`
+	LogLevel tfconfig.Variable `json:"log_level,omitempty"`
+	MetricLevel tfconfig.Variable `json:"metric_level,omitempty"`
+	Name tfconfig.Variable `json:"name,omitempty"`
+	NullInputBehavior tfconfig.Variable `json:"null_input_behavior,omitempty"`
+	Packages tfconfig.Variable `json:"packages,omitempty"`
+	ReturnBehavior tfconfig.Variable `json:"return_behavior,omitempty"`
+	ReturnType tfconfig.Variable `json:"return_type,omitempty"`
+	RuntimeVersion tfconfig.Variable `json:"runtime_version,omitempty"`
+	Schema tfconfig.Variable `json:"schema,omitempty"`
+	Secrets tfconfig.Variable `json:"secrets,omitempty"`
+	TargetPath tfconfig.Variable `json:"target_path,omitempty"`
+	TraceLevel tfconfig.Variable `json:"trace_level,omitempty"`
+
+	*config.ResourceModelMeta
+}
+
+/////////////////////////////////////////////////
+// Basic builders (resource name and required) //
+/////////////////////////////////////////////////
+
+func FunctionScala(
+	resourceName string,
+	database string,
+	functionDefinition string,
+	handler string,
+	name string,
+	returnType string,
+	runtimeVersion string,
+	schema string,
+) *FunctionScalaModel {
+	f := &FunctionScalaModel{ResourceModelMeta: config.Meta(resourceName, resources.FunctionScala)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithHandler(handler)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithRuntimeVersion(runtimeVersion)
+	f.WithSchema(schema)
+	return f
+}
+
+func FunctionScalaWithDefaultMeta(
+	database string,
+	functionDefinition string,
+	handler string,
+	name string,
+	returnType string,
+	runtimeVersion string,
+	schema string,
+) *FunctionScalaModel {
+	f := &FunctionScalaModel{ResourceModelMeta: config.DefaultMeta(resources.FunctionScala)}
+	f.WithDatabase(database)
+	f.WithFunctionDefinition(functionDefinition)
+	f.WithHandler(handler)
+	f.WithName(name)
+	f.WithReturnType(returnType)
+	f.WithRuntimeVersion(runtimeVersion)
+	f.WithSchema(schema)
+	return f
+}
+
+/////////////////////////////////
+// below all the proper values //
+/////////////////////////////////
+
+// arguments attribute type is not yet supported, so WithArguments can't be generated
+
+func (f *FunctionScalaModel) WithComment(comment string) *FunctionScalaModel {
+	f.Comment = tfconfig.StringVariable(comment)
+	return f
+}
+
+func (f *FunctionScalaModel) WithDatabase(database string) *FunctionScalaModel {
+	f.Database = tfconfig.StringVariable(database)
+	return f
+}
+
+func (f *FunctionScalaModel) WithEnableConsoleOutput(enableConsoleOutput bool) *FunctionScalaModel {
+	f.EnableConsoleOutput = tfconfig.BoolVariable(enableConsoleOutput)
+	return f
+}
+
+// external_access_integrations attribute type is not yet supported, so WithExternalAccessIntegrations can't be generated
+
+func (f *FunctionScalaModel) WithFullyQualifiedName(fullyQualifiedName string) *FunctionScalaModel {
+	f.FullyQualifiedName = tfconfig.StringVariable(fullyQualifiedName)
+	return f
+}
+
+func (f *FunctionScalaModel) WithFunctionDefinition(functionDefinition string) *FunctionScalaModel {
+	f.FunctionDefinition = tfconfig.StringVariable(functionDefinition)
+	return f
+}
+
+func (f *FunctionScalaModel) WithFunctionLanguage(functionLanguage string) *FunctionScalaModel {
+	f.FunctionLanguage = tfconfig.StringVariable(functionLanguage)
+	return f
+}
+
+func (f *FunctionScalaModel) WithHandler(handler string) *FunctionScalaModel {
+	f.Handler = tfconfig.StringVariable(handler)
+	return f
+}
+
+// imports attribute type is not yet supported, so WithImports can't be generated
+
+func (f *FunctionScalaModel) WithIsSecure(isSecure string) *FunctionScalaModel {
+	f.IsSecure = tfconfig.StringVariable(isSecure)
+	return f
+}
+
+func (f *FunctionScalaModel) WithLogLevel(logLevel string) *FunctionScalaModel {
+	f.LogLevel = tfconfig.StringVariable(logLevel)
+	return f
+}
+
+func (f *FunctionScalaModel) WithMetricLevel(metricLevel string) *FunctionScalaModel {
+	f.MetricLevel = tfconfig.StringVariable(metricLevel)
+	return f
+}
+
+func (f *FunctionScalaModel) WithName(name string) *FunctionScalaModel {
+	f.Name = tfconfig.StringVariable(name)
+	return f
+}
+
+func (f *FunctionScalaModel) WithNullInputBehavior(nullInputBehavior string) *FunctionScalaModel {
+	f.NullInputBehavior = tfconfig.StringVariable(nullInputBehavior)
+	return f
+}
+
+// packages attribute type is not yet supported, so WithPackages can't be generated
+
+func (f *FunctionScalaModel) WithReturnBehavior(returnBehavior string) *FunctionScalaModel {
+	f.ReturnBehavior = tfconfig.StringVariable(returnBehavior)
+	return f
+}
+
+func (f *FunctionScalaModel) WithReturnType(returnType string) *FunctionScalaModel {
+	f.ReturnType = tfconfig.StringVariable(returnType)
+	return f
+}
+
+func (f *FunctionScalaModel) WithRuntimeVersion(runtimeVersion string) *FunctionScalaModel {
+	f.RuntimeVersion = tfconfig.StringVariable(runtimeVersion)
+	return f
+}
+
+func (f *FunctionScalaModel) WithSchema(schema string) *FunctionScalaModel {
+	f.Schema = tfconfig.StringVariable(schema)
+	return f
+}
+
+// secrets attribute type is not yet supported, so WithSecrets can't be generated
+
+func (f *FunctionScalaModel) WithTargetPath(targetPath string) *FunctionScalaModel {
+	f.TargetPath = tfconfig.StringVariable(targetPath)
+	return f
+}
+
+func (f *FunctionScalaModel) WithTraceLevel(traceLevel string) *FunctionScalaModel {
+	f.TraceLevel = tfconfig.StringVariable(traceLevel)
+	return f
+}
+
+//////////////////////////////////////////
+// below it's possible to set any value //
+//////////////////////////////////////////
+
+func (f *FunctionScalaModel) WithArgumentsValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Arguments = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithCommentValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Comment = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithDatabaseValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Database = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithEnableConsoleOutputValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.EnableConsoleOutput = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithExternalAccessIntegrationsValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.ExternalAccessIntegrations = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithFullyQualifiedNameValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.FullyQualifiedName = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithFunctionDefinitionValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.FunctionDefinition = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithFunctionLanguageValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.FunctionLanguage = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithHandlerValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Handler = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithImportsValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Imports = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithIsSecureValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.IsSecure = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithLogLevelValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.LogLevel = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithMetricLevelValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.MetricLevel = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithNameValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Name = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithNullInputBehaviorValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.NullInputBehavior = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithPackagesValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Packages = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithReturnBehaviorValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.ReturnBehavior = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithReturnTypeValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.ReturnType = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithRuntimeVersionValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.RuntimeVersion = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithSchemaValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Schema = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithSecretsValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.Secrets = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithTargetPathValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.TargetPath = value
+	return f
+}
+
+func (f *FunctionScalaModel) WithTraceLevelValue(value tfconfig.Variable) *FunctionScalaModel {
+	f.TraceLevel = value
+	return f
+}
diff --git a/pkg/acceptance/bettertestspoc/config/model/function_sql_model_ext.go
b/pkg/acceptance/bettertestspoc/config/model/function_sql_model_ext.go new file mode 100644 index 0000000000..d4f775628d --- /dev/null +++ b/pkg/acceptance/bettertestspoc/config/model/function_sql_model_ext.go @@ -0,0 +1,16 @@ +package model + +import ( + "encoding/json" +) + +func (f *FunctionSqlModel) MarshalJSON() ([]byte, error) { + type Alias FunctionSqlModel + return json.Marshal(&struct { + *Alias + DependsOn []string `json:"depends_on,omitempty"` + }{ + Alias: (*Alias)(f), + DependsOn: f.DependsOn(), + }) +} diff --git a/pkg/acceptance/bettertestspoc/config/model/function_sql_model_gen.go b/pkg/acceptance/bettertestspoc/config/model/function_sql_model_gen.go new file mode 100644 index 0000000000..14cbbe9136 --- /dev/null +++ b/pkg/acceptance/bettertestspoc/config/model/function_sql_model_gen.go @@ -0,0 +1,233 @@ +// Code generated by config model builder generator; DO NOT EDIT. + +package model + +import ( + tfconfig "github.com/hashicorp/terraform-plugin-testing/config" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" +) + +type FunctionSqlModel struct { + Arguments tfconfig.Variable `json:"arguments,omitempty"` + Comment tfconfig.Variable `json:"comment,omitempty"` + Database tfconfig.Variable `json:"database,omitempty"` + EnableConsoleOutput tfconfig.Variable `json:"enable_console_output,omitempty"` + FullyQualifiedName tfconfig.Variable `json:"fully_qualified_name,omitempty"` + FunctionDefinition tfconfig.Variable `json:"function_definition,omitempty"` + FunctionLanguage tfconfig.Variable `json:"function_language,omitempty"` + IsSecure tfconfig.Variable `json:"is_secure,omitempty"` + LogLevel tfconfig.Variable `json:"log_level,omitempty"` + MetricLevel tfconfig.Variable `json:"metric_level,omitempty"` + Name tfconfig.Variable `json:"name,omitempty"` + NullInputBehavior tfconfig.Variable `json:"null_input_behavior,omitempty"` 
+ ReturnBehavior tfconfig.Variable `json:"return_behavior,omitempty"` + ReturnType tfconfig.Variable `json:"return_type,omitempty"` + Schema tfconfig.Variable `json:"schema,omitempty"` + TraceLevel tfconfig.Variable `json:"trace_level,omitempty"` + + *config.ResourceModelMeta +} + +///////////////////////////////////////////////// +// Basic builders (resource name and required) // +///////////////////////////////////////////////// + +func FunctionSql( + resourceName string, + database string, + functionDefinition string, + name string, + returnType string, + schema string, +) *FunctionSqlModel { + f := &FunctionSqlModel{ResourceModelMeta: config.Meta(resourceName, resources.FunctionSql)} + f.WithDatabase(database) + f.WithFunctionDefinition(functionDefinition) + f.WithName(name) + f.WithReturnType(returnType) + f.WithSchema(schema) + return f +} + +func FunctionSqlWithDefaultMeta( + database string, + functionDefinition string, + name string, + returnType string, + schema string, +) *FunctionSqlModel { + f := &FunctionSqlModel{ResourceModelMeta: config.DefaultMeta(resources.FunctionSql)} + f.WithDatabase(database) + f.WithFunctionDefinition(functionDefinition) + f.WithName(name) + f.WithReturnType(returnType) + f.WithSchema(schema) + return f +} + +///////////////////////////////// +// below all the proper values // +///////////////////////////////// + +// arguments attribute type is not yet supported, so WithArguments can't be generated + +func (f *FunctionSqlModel) WithComment(comment string) *FunctionSqlModel { + f.Comment = tfconfig.StringVariable(comment) + return f +} + +func (f *FunctionSqlModel) WithDatabase(database string) *FunctionSqlModel { + f.Database = tfconfig.StringVariable(database) + return f +} + +func (f *FunctionSqlModel) WithEnableConsoleOutput(enableConsoleOutput bool) *FunctionSqlModel { + f.EnableConsoleOutput = tfconfig.BoolVariable(enableConsoleOutput) + return f +} + +func (f *FunctionSqlModel) WithFullyQualifiedName(fullyQualifiedName 
string) *FunctionSqlModel { + f.FullyQualifiedName = tfconfig.StringVariable(fullyQualifiedName) + return f +} + +func (f *FunctionSqlModel) WithFunctionDefinition(functionDefinition string) *FunctionSqlModel { + f.FunctionDefinition = tfconfig.StringVariable(functionDefinition) + return f +} + +func (f *FunctionSqlModel) WithFunctionLanguage(functionLanguage string) *FunctionSqlModel { + f.FunctionLanguage = tfconfig.StringVariable(functionLanguage) + return f +} + +func (f *FunctionSqlModel) WithIsSecure(isSecure string) *FunctionSqlModel { + f.IsSecure = tfconfig.StringVariable(isSecure) + return f +} + +func (f *FunctionSqlModel) WithLogLevel(logLevel string) *FunctionSqlModel { + f.LogLevel = tfconfig.StringVariable(logLevel) + return f +} + +func (f *FunctionSqlModel) WithMetricLevel(metricLevel string) *FunctionSqlModel { + f.MetricLevel = tfconfig.StringVariable(metricLevel) + return f +} + +func (f *FunctionSqlModel) WithName(name string) *FunctionSqlModel { + f.Name = tfconfig.StringVariable(name) + return f +} + +func (f *FunctionSqlModel) WithNullInputBehavior(nullInputBehavior string) *FunctionSqlModel { + f.NullInputBehavior = tfconfig.StringVariable(nullInputBehavior) + return f +} + +func (f *FunctionSqlModel) WithReturnBehavior(returnBehavior string) *FunctionSqlModel { + f.ReturnBehavior = tfconfig.StringVariable(returnBehavior) + return f +} + +func (f *FunctionSqlModel) WithReturnType(returnType string) *FunctionSqlModel { + f.ReturnType = tfconfig.StringVariable(returnType) + return f +} + +func (f *FunctionSqlModel) WithSchema(schema string) *FunctionSqlModel { + f.Schema = tfconfig.StringVariable(schema) + return f +} + +func (f *FunctionSqlModel) WithTraceLevel(traceLevel string) *FunctionSqlModel { + f.TraceLevel = tfconfig.StringVariable(traceLevel) + return f +} + +////////////////////////////////////////// +// below it's possible to set any value // +////////////////////////////////////////// + +func (f *FunctionSqlModel) 
WithArgumentsValue(value tfconfig.Variable) *FunctionSqlModel { + f.Arguments = value + return f +} + +func (f *FunctionSqlModel) WithCommentValue(value tfconfig.Variable) *FunctionSqlModel { + f.Comment = value + return f +} + +func (f *FunctionSqlModel) WithDatabaseValue(value tfconfig.Variable) *FunctionSqlModel { + f.Database = value + return f +} + +func (f *FunctionSqlModel) WithEnableConsoleOutputValue(value tfconfig.Variable) *FunctionSqlModel { + f.EnableConsoleOutput = value + return f +} + +func (f *FunctionSqlModel) WithFullyQualifiedNameValue(value tfconfig.Variable) *FunctionSqlModel { + f.FullyQualifiedName = value + return f +} + +func (f *FunctionSqlModel) WithFunctionDefinitionValue(value tfconfig.Variable) *FunctionSqlModel { + f.FunctionDefinition = value + return f +} + +func (f *FunctionSqlModel) WithFunctionLanguageValue(value tfconfig.Variable) *FunctionSqlModel { + f.FunctionLanguage = value + return f +} + +func (f *FunctionSqlModel) WithIsSecureValue(value tfconfig.Variable) *FunctionSqlModel { + f.IsSecure = value + return f +} + +func (f *FunctionSqlModel) WithLogLevelValue(value tfconfig.Variable) *FunctionSqlModel { + f.LogLevel = value + return f +} + +func (f *FunctionSqlModel) WithMetricLevelValue(value tfconfig.Variable) *FunctionSqlModel { + f.MetricLevel = value + return f +} + +func (f *FunctionSqlModel) WithNameValue(value tfconfig.Variable) *FunctionSqlModel { + f.Name = value + return f +} + +func (f *FunctionSqlModel) WithNullInputBehaviorValue(value tfconfig.Variable) *FunctionSqlModel { + f.NullInputBehavior = value + return f +} + +func (f *FunctionSqlModel) WithReturnBehaviorValue(value tfconfig.Variable) *FunctionSqlModel { + f.ReturnBehavior = value + return f +} + +func (f *FunctionSqlModel) WithReturnTypeValue(value tfconfig.Variable) *FunctionSqlModel { + f.ReturnType = value + return f +} + +func (f *FunctionSqlModel) WithSchemaValue(value tfconfig.Variable) *FunctionSqlModel { + f.Schema = value + return f +} + 
+func (f *FunctionSqlModel) WithTraceLevelValue(value tfconfig.Variable) *FunctionSqlModel { + f.TraceLevel = value + return f +} diff --git a/pkg/acceptance/helpers/account_client.go b/pkg/acceptance/helpers/account_client.go index 96605949ab..b91d10579a 100644 --- a/pkg/acceptance/helpers/account_client.go +++ b/pkg/acceptance/helpers/account_client.go @@ -69,7 +69,7 @@ func (c *AccountClient) Create(t *testing.T) (*sdk.Account, func()) { func (c *AccountClient) CreateWithRequest(t *testing.T, id sdk.AccountObjectIdentifier, opts *sdk.CreateAccountOptions) (*sdk.Account, func()) { t.Helper() - err := c.client().Create(context.Background(), id, opts) + _, err := c.client().Create(context.Background(), id, opts) require.NoError(t, err) account, err := c.client().ShowByID(context.Background(), id) @@ -141,7 +141,7 @@ func (c *AccountClient) CreateAndLogIn(t *testing.T) (*sdk.Account, *sdk.Client, newClient, err := sdk.NewClient(&gosnowflake.Config{ Account: fmt.Sprintf("%s-%s", account.OrganizationName, account.AccountName), User: name, - Host: strings.TrimPrefix(*account.AccountLocatorURL, `https://`), + Host: strings.TrimPrefix(*account.AccountLocatorUrl, `https://`), Authenticator: gosnowflake.AuthTypeJwt, PrivateKey: privateKey, Role: snowflakeroles.Accountadmin.Name(), diff --git a/pkg/acceptance/helpers/external_access_integration_client.go b/pkg/acceptance/helpers/external_access_integration_client.go index 0e7496e85e..d5a6a59b91 100644 --- a/pkg/acceptance/helpers/external_access_integration_client.go +++ b/pkg/acceptance/helpers/external_access_integration_client.go @@ -26,22 +26,32 @@ func (c *ExternalAccessIntegrationClient) client() *sdk.Client { return c.context.client } -func (c *ExternalAccessIntegrationClient) CreateExternalAccessIntegration(t *testing.T, networkRuleId sdk.SchemaObjectIdentifier) (sdk.SchemaObjectIdentifier, func()) { +func (c *ExternalAccessIntegrationClient) CreateExternalAccessIntegration(t *testing.T, networkRuleId 
sdk.SchemaObjectIdentifier) (sdk.AccountObjectIdentifier, func()) { t.Helper() ctx := context.Background() - id := c.ids.RandomSchemaObjectIdentifier() - _, err := c.client().ExecForTests(ctx, fmt.Sprintf(`CREATE EXTERNAL ACCESS INTEGRATION %s ALLOWED_NETWORK_RULES = (%s) ENABLED = TRUE`, id.Name(), networkRuleId.Name())) + id := c.ids.RandomAccountObjectIdentifier() + _, err := c.client().ExecForTests(ctx, fmt.Sprintf(`CREATE EXTERNAL ACCESS INTEGRATION %s ALLOWED_NETWORK_RULES = (%s) ENABLED = TRUE`, id.FullyQualifiedName(), networkRuleId.FullyQualifiedName())) require.NoError(t, err) return id, c.DropExternalAccessIntegrationFunc(t, id) } -func (c *ExternalAccessIntegrationClient) DropExternalAccessIntegrationFunc(t *testing.T, id sdk.SchemaObjectIdentifier) func() { +func (c *ExternalAccessIntegrationClient) CreateExternalAccessIntegrationWithNetworkRuleAndSecret(t *testing.T, networkRuleId sdk.SchemaObjectIdentifier, secretId sdk.SchemaObjectIdentifier) (sdk.AccountObjectIdentifier, func()) { + t.Helper() + ctx := context.Background() + + id := c.ids.RandomAccountObjectIdentifier() + _, err := c.client().ExecForTests(ctx, fmt.Sprintf(`CREATE EXTERNAL ACCESS INTEGRATION %s ALLOWED_NETWORK_RULES = (%s) ALLOWED_AUTHENTICATION_SECRETS = (%s) ENABLED = TRUE`, id.FullyQualifiedName(), networkRuleId.FullyQualifiedName(), secretId.FullyQualifiedName())) + require.NoError(t, err) + return id, c.DropExternalAccessIntegrationFunc(t, id) +} + +func (c *ExternalAccessIntegrationClient) DropExternalAccessIntegrationFunc(t *testing.T, id sdk.AccountObjectIdentifier) func() { t.Helper() ctx := context.Background() return func() { - _, err := c.client().ExecForTests(ctx, fmt.Sprintf(`DROP EXTERNAL ACCESS INTEGRATION IF EXISTS %s`, id.Name())) + _, err := c.client().ExecForTests(ctx, fmt.Sprintf(`DROP EXTERNAL ACCESS INTEGRATION IF EXISTS %s`, id.FullyQualifiedName())) require.NoError(t, err) } } diff --git a/pkg/acceptance/helpers/function_client.go 
b/pkg/acceptance/helpers/function_client.go index 3e6fe5a294..4d9bf35aaa 100644 --- a/pkg/acceptance/helpers/function_client.go +++ b/pkg/acceptance/helpers/function_client.go @@ -2,9 +2,12 @@ package helpers import ( "context" + "fmt" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testdatatypes" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes" "github.com/stretchr/testify/require" ) @@ -33,7 +36,7 @@ func (c *FunctionClient) CreateWithIdentifier(t *testing.T, id sdk.SchemaObjectI t.Helper() return c.CreateWithRequest(t, id, - sdk.NewCreateForSQLFunctionRequest( + sdk.NewCreateForSQLFunctionRequestDefinitionWrapped( id.SchemaObjectId(), *sdk.NewFunctionReturnsRequest().WithResultDataType(*sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeInt)), "SELECT 1", @@ -41,12 +44,13 @@ func (c *FunctionClient) CreateWithIdentifier(t *testing.T, id sdk.SchemaObjectI ) } +// TODO [SNOW-1850370]: improve this helper (all other types creation) func (c *FunctionClient) CreateSecure(t *testing.T, arguments ...sdk.DataType) *sdk.Function { t.Helper() id := c.ids.RandomSchemaObjectIdentifierWithArguments(arguments...) 
return c.CreateWithRequest(t, id, - sdk.NewCreateForSQLFunctionRequest( + sdk.NewCreateForSQLFunctionRequestDefinitionWrapped( id.SchemaObjectId(), *sdk.NewFunctionReturnsRequest().WithResultDataType(*sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeInt)), "SELECT 1", @@ -54,6 +58,86 @@ func (c *FunctionClient) CreateSecure(t *testing.T, arguments ...sdk.DataType) * ) } +func (c *FunctionClient) CreateSql(t *testing.T) (*sdk.Function, func()) { + t.Helper() + dataType := testdatatypes.DataTypeFloat + id := c.ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + return c.CreateSqlWithIdentifierAndArgument(t, id.SchemaObjectId(), dataType) +} + +func (c *FunctionClient) CreateSqlWithIdentifierAndArgument(t *testing.T, id sdk.SchemaObjectIdentifier, dataType datatypes.DataType) (*sdk.Function, func()) { + t.Helper() + ctx := context.Background() + + idWithArgs := sdk.NewSchemaObjectIdentifierWithArgumentsInSchema(id.SchemaId(), id.Name(), sdk.LegacyDataTypeFrom(dataType)) + argName := "x" + definition := c.SampleSqlDefinition(t) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + request := sdk.NewCreateForSQLFunctionRequestDefinitionWrapped(id, *returns, definition). 
+ WithArguments([]sdk.FunctionArgumentRequest{*argument}) + + err := c.client().CreateForSQL(ctx, request) + require.NoError(t, err) + + function, err := c.client().ShowByID(ctx, idWithArgs) + require.NoError(t, err) + + return function, c.DropFunctionFunc(t, idWithArgs) +} + +func (c *FunctionClient) CreateSqlNoArgs(t *testing.T) (*sdk.Function, func()) { + t.Helper() + ctx := context.Background() + + dataType := testdatatypes.DataTypeFloat + id := c.ids.RandomSchemaObjectIdentifierWithArguments() + + definition := c.SampleSqlDefinition(t) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + request := sdk.NewCreateForSQLFunctionRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition) + + err := c.client().CreateForSQL(ctx, request) + require.NoError(t, err) + t.Cleanup(c.DropFunctionFunc(t, id)) + + function, err := c.client().ShowByID(ctx, id) + require.NoError(t, err) + + return function, c.DropFunctionFunc(t, id) +} + +func (c *FunctionClient) CreateJava(t *testing.T) (*sdk.Function, func()) { + t.Helper() + ctx := context.Background() + + className := "TestFunc" + funcName := "echoVarchar" + argName := "x" + dataType := testdatatypes.DataTypeVarchar_100 + + id := c.ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + handler := fmt.Sprintf("%s.%s", className, funcName) + definition := c.SampleJavaDefinition(t, className, funcName, argName) + + request := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, handler). + WithArguments([]sdk.FunctionArgumentRequest{*argument}). 
+ WithFunctionDefinitionWrapped(definition) + + err := c.client().CreateForJava(ctx, request) + require.NoError(t, err) + + function, err := c.client().ShowByID(ctx, id) + require.NoError(t, err) + + return function, c.DropFunctionFunc(t, id) +} + func (c *FunctionClient) CreateWithRequest(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments, req *sdk.CreateForSQLFunctionRequest) *sdk.Function { t.Helper() ctx := context.Background() @@ -81,3 +165,82 @@ func (c *FunctionClient) DropFunctionFunc(t *testing.T, id sdk.SchemaObjectIdent require.NoError(t, err) } } + +func (c *FunctionClient) Show(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) (*sdk.Function, error) { + t.Helper() + ctx := context.Background() + + return c.client().ShowByID(ctx, id) +} + +func (c *FunctionClient) DescribeDetails(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) (*sdk.FunctionDetails, error) { + t.Helper() + ctx := context.Background() + + return c.client().DescribeDetails(ctx, id) +} + +func (c *FunctionClient) SampleJavaDefinition(t *testing.T, className string, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf(` + class %[1]s { + public static String %[2]s(String %[3]s) { + return %[3]s; + } + } +`, className, funcName, argName) +} + +func (c *FunctionClient) SampleJavascriptDefinition(t *testing.T, argName string) string { + t.Helper() + + return fmt.Sprintf(` + if (%[1]s <= 0) { + return 1; + } else { + var result = 1; + for (var i = 2; i <= %[1]s; i++) { + result = result * i; + } + return result; + } +`, argName) +} + +func (c *FunctionClient) SamplePythonDefinition(t *testing.T, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf(` +def %[1]s(%[2]s): + result = "" + for a in range(5): + result += %[2]s + return result +`, funcName, argName) +} + +func (c *FunctionClient) SampleScalaDefinition(t *testing.T, className string, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf(` + 
class %[1]s { + def %[2]s(%[3]s : String): String = { + return %[3]s + } + } +`, className, funcName, argName) +} + +// TODO [SNOW-1850370]: use input argument like in other samples +func (c *FunctionClient) SampleSqlDefinition(t *testing.T) string { + t.Helper() + + return "3.141592654::FLOAT" +} + +func (c *FunctionClient) PythonIdentityDefinition(t *testing.T, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf("def %[1]s(%[2]s): %[2]s", funcName, argName) +} diff --git a/pkg/acceptance/helpers/function_setup_helpers.go b/pkg/acceptance/helpers/function_setup_helpers.go new file mode 100644 index 0000000000..8f0447e443 --- /dev/null +++ b/pkg/acceptance/helpers/function_setup_helpers.go @@ -0,0 +1,160 @@ +package helpers + +import ( + "context" + "fmt" + "path/filepath" + "strings" + "testing" + "time" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testdatatypes" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes" + "github.com/stretchr/testify/require" +) + +// TODO [SNOW-1827324]: add TestClient ref to each specific client, so that we enhance specific client and not the base one +func (c *TestClient) CreateSampleJavaFunctionAndJar(t *testing.T) *TmpFunction { + t.Helper() + ctx := context.Background() + + className := fmt.Sprintf("TestClassAbc%s", random.AlphaLowerN(3)) + funcName := fmt.Sprintf("echoVarchar%s", random.AlphaLowerN(3)) + argName := fmt.Sprintf("arg%s", random.AlphaLowerN(3)) + dataType := testdatatypes.DataTypeVarchar_100 + + id := c.Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + handler 
:= fmt.Sprintf("%s.%s", className, funcName) + definition := c.Function.SampleJavaDefinition(t, className, funcName, argName) + jarName := fmt.Sprintf("tf-%d-%s.jar", time.Now().Unix(), random.AlphaN(5)) + targetPath := fmt.Sprintf("@~/%s", jarName) + + request := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, handler). + WithArguments([]sdk.FunctionArgumentRequest{*argument}). + WithTargetPath(targetPath). + WithFunctionDefinitionWrapped(definition) + + err := c.context.client.Functions.CreateForJava(ctx, request) + require.NoError(t, err) + t.Cleanup(c.Function.DropFunctionFunc(t, id)) + t.Cleanup(c.Stage.RemoveFromUserStageFunc(t, jarName)) + + return &TmpFunction{ + FunctionId: id, + ClassName: className, + FuncName: funcName, + ArgName: argName, + ArgType: dataType, + JarName: jarName, + } +} + +func (c *TestClient) CreateSampleJavaProcedureAndJar(t *testing.T) *TmpFunction { + t.Helper() + ctx := context.Background() + + className := fmt.Sprintf("TestClassAbc%s", random.AlphaLowerN(3)) + funcName := fmt.Sprintf("echoVarchar%s", random.AlphaLowerN(3)) + argName := fmt.Sprintf("arg%s", random.AlphaLowerN(3)) + dataType := testdatatypes.DataTypeVarchar_100 + + id := c.Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + handler := fmt.Sprintf("%s.%s", className, funcName) + definition := c.Procedure.SampleJavaDefinition(t, className, funcName, argName) + jarName := fmt.Sprintf("tf-%d-%s.jar", time.Now().Unix(), random.AlphaN(5)) + targetPath := fmt.Sprintf("@~/%s", jarName) + packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0")} + + request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, handler). 
+ WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithTargetPath(targetPath). + WithProcedureDefinitionWrapped(definition) + + err := c.context.client.Procedures.CreateForJava(ctx, request) + require.NoError(t, err) + t.Cleanup(c.Procedure.DropProcedureFunc(t, id)) + t.Cleanup(c.Stage.RemoveFromUserStageFunc(t, jarName)) + + return &TmpFunction{ + FunctionId: id, + ClassName: className, + FuncName: funcName, + ArgName: argName, + ArgType: dataType, + JarName: jarName, + } +} + +func (c *TestClient) CreateSamplePythonFunctionAndModule(t *testing.T) *TmpFunction { + t.Helper() + ctx := context.Background() + + funcName := fmt.Sprintf("echo%s", random.AlphaLowerN(3)) + argName := fmt.Sprintf("arg%s", random.AlphaLowerN(3)) + dataType := testdatatypes.DataTypeVarchar_100 + + id := c.Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + definition := c.Function.SamplePythonDefinition(t, funcName, argName) + + request := sdk.NewCreateForPythonFunctionRequest(id.SchemaObjectId(), *returns, "3.8", funcName). + WithArguments([]sdk.FunctionArgumentRequest{*argument}). 
+ WithFunctionDefinitionWrapped(definition) + + err := c.context.client.Functions.CreateForPython(ctx, request) + require.NoError(t, err) + t.Cleanup(c.Function.DropFunctionFunc(t, id)) + + // using os.CreateTemp underneath - last * in pattern is replaced with random string + modulePattern := fmt.Sprintf("example*%s.py", random.AlphaLowerN(3)) + modulePath := c.Stage.PutOnUserStageWithContent(t, modulePattern, definition) + moduleFileName := filepath.Base(modulePath) + + return &TmpFunction{ + FunctionId: id, + ModuleName: strings.TrimSuffix(moduleFileName, ".py"), + FuncName: funcName, + ArgName: argName, + ArgType: dataType, + } +} + +type TmpFunction struct { + FunctionId sdk.SchemaObjectIdentifierWithArguments + ClassName string + ModuleName string + FuncName string + ArgName string + ArgType datatypes.DataType + JarName string +} + +func (f *TmpFunction) JarLocation() string { + return fmt.Sprintf("@~/%s", f.JarName) +} + +func (f *TmpFunction) PythonModuleLocation() string { + return fmt.Sprintf("@~/%s", f.PythonFileName()) +} + +func (f *TmpFunction) PythonFileName() string { + return fmt.Sprintf("%s.py", f.ModuleName) +} + +func (f *TmpFunction) JavaHandler() string { + return fmt.Sprintf("%s.%s", f.ClassName, f.FuncName) +} + +func (f *TmpFunction) PythonHandler() string { + return fmt.Sprintf("%s.%s", f.ModuleName, f.FuncName) +} diff --git a/pkg/acceptance/helpers/parameter_client.go b/pkg/acceptance/helpers/parameter_client.go index 70321379b1..1e3a24da0b 100644 --- a/pkg/acceptance/helpers/parameter_client.go +++ b/pkg/acceptance/helpers/parameter_client.go @@ -91,6 +91,28 @@ func (c *ParameterClient) ShowTaskParameters(t *testing.T, id sdk.SchemaObjectId return params } +func (c *ParameterClient) ShowFunctionParameters(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) []*sdk.Parameter { + t.Helper() + params, err := c.client().ShowParameters(context.Background(), &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Function: id, + }, + }) + 
require.NoError(t, err) + return params +} + +func (c *ParameterClient) ShowProcedureParameters(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) []*sdk.Parameter { + t.Helper() + params, err := c.client().ShowParameters(context.Background(), &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Procedure: id, + }, + }) + require.NoError(t, err) + return params +} + func (c *ParameterClient) UpdateAccountParameterTemporarily(t *testing.T, parameter sdk.AccountParameter, newValue string) func() { t.Helper() ctx := context.Background() diff --git a/pkg/acceptance/helpers/procedure_client.go b/pkg/acceptance/helpers/procedure_client.go index 34aec170f7..019d5f9299 100644 --- a/pkg/acceptance/helpers/procedure_client.go +++ b/pkg/acceptance/helpers/procedure_client.go @@ -2,9 +2,12 @@ package helpers import ( "context" + "fmt" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testdatatypes" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes" "github.com/stretchr/testify/require" ) @@ -24,6 +27,66 @@ func (c *ProcedureClient) client() sdk.Procedures { return c.context.client.Procedures } +func (c *ProcedureClient) CreateSql(t *testing.T) (*sdk.Procedure, func()) { + t.Helper() + dataType := testdatatypes.DataTypeFloat + id := c.ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + definition := c.SampleSqlDefinition(t) + return c.CreateSqlWithIdentifierAndArgument(t, id.SchemaObjectId(), dataType, definition) +} + +func (c *ProcedureClient) CreateSqlWithIdentifierAndArgument(t *testing.T, id sdk.SchemaObjectIdentifier, dataType datatypes.DataType, definition string) (*sdk.Procedure, func()) { + t.Helper() + ctx := context.Background() + + idWithArgs := sdk.NewSchemaObjectIdentifierWithArgumentsInSchema(id.SchemaId(), id.Name(), sdk.LegacyDataTypeFrom(dataType)) + argName := "x" + dt := 
sdk.NewProcedureReturnsResultDataTypeRequest(dataType) + returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + + request := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id, *returns, definition). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}) + + err := c.client().CreateForSQL(ctx, request) + require.NoError(t, err) + + procedure, err := c.client().ShowByID(ctx, idWithArgs) + require.NoError(t, err) + + return procedure, c.DropProcedureFunc(t, idWithArgs) +} + +func (c *ProcedureClient) CreateJava(t *testing.T) (*sdk.Procedure, func()) { + t.Helper() + ctx := context.Background() + + className := "TestFunc" + funcName := "echoVarchar" + argName := "x" + dataType := testdatatypes.DataTypeVarchar_100 + + id := c.ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + handler := fmt.Sprintf("%s.%s", className, funcName) + definition := c.SampleJavaDefinition(t, className, funcName, argName) + packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0")} + + request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, handler). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). 
+ WithProcedureDefinitionWrapped(definition) + + err := c.client().CreateForJava(ctx, request) + require.NoError(t, err) + + procedure, err := c.client().ShowByID(ctx, id) + require.NoError(t, err) + + return procedure, c.DropProcedureFunc(t, id) +} + func (c *ProcedureClient) Create(t *testing.T, arguments ...sdk.DataType) *sdk.Procedure { t.Helper() return c.CreateWithIdentifier(t, c.ids.RandomSchemaObjectIdentifierWithArguments(arguments...)) @@ -37,7 +100,7 @@ func (c *ProcedureClient) CreateWithIdentifier(t *testing.T, id sdk.SchemaObject argumentRequests[i] = *sdk.NewProcedureArgumentRequest(c.ids.Alpha(), nil).WithArgDataTypeOld(argumentDataType) } err := c.client().CreateForSQL(ctx, - sdk.NewCreateForSQLProcedureRequest( + sdk.NewCreateForSQLProcedureRequestDefinitionWrapped( id.SchemaObjectId(), *sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeInt)), `BEGIN RETURN 1; END`).WithArguments(argumentRequests),
https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-java#data-access-example +// More references: https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-java +func (c *ProcedureClient) SampleJavaDefinition(t *testing.T, className string, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf(` + import com.snowflake.snowpark_java.*; + class %[1]s { + public static String %[2]s(Session session, String %[3]s) { + return %[3]s; + } + } +`, className, funcName, argName) +} + +// For more references: https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-javascript +func (c *ProcedureClient) SampleJavascriptDefinition(t *testing.T, argName string) string { + t.Helper() + + return fmt.Sprintf(` + if (%[1]s <= 0) { + return 1; + } else { + var result = 1; + for (var i = 2; i <= %[1]s; i++) { + result = result * i; + } + return result; + } +`, argName) +} + +func (c *ProcedureClient) SamplePythonDefinition(t *testing.T, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf(` +def %[1]s(%[2]s): + result = "" + for a in range(5): + result += %[2]s + return result +`, funcName, argName) +} + +// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-scala +func (c *ProcedureClient) SampleScalaDefinition(t *testing.T, className string, funcName string, argName string) string { + t.Helper() + + return fmt.Sprintf(` + import com.snowflake.snowpark_java.Session + + class %[1]s { + def %[2]s(session : Session, %[3]s : String): String = { + return %[3]s + } + } +`, className, funcName, argName) +} + +func (c *ProcedureClient) SampleSqlDefinition(t *testing.T) string { + t.Helper() + + return ` +BEGIN + RETURN 3.141592654::FLOAT; +END; +` +} diff --git a/pkg/acceptance/helpers/random/certs.go b/pkg/acceptance/helpers/random/certs.go index c0e0142d7c..b314a0cbfa 100644 --- a/pkg/acceptance/helpers/random/certs.go +++ 
b/pkg/acceptance/helpers/random/certs.go @@ -60,15 +60,6 @@ func GenerateRSAPublicKeyFromPrivateKey(t *testing.T, key *rsa.PrivateKey) (stri return encode(t, "RSA PUBLIC KEY", b), hash(t, b) } -func GenerateRSAPublicKeyBasedOnPrivateKey(t *testing.T, key *rsa.PrivateKey) (string, string) { - t.Helper() - - pub := key.Public() - b, err := x509.MarshalPKIXPublicKey(pub.(*rsa.PublicKey)) - require.NoError(t, err) - return encode(t, "RSA PUBLIC KEY", b), hash(t, b) -} - // GenerateRSAPrivateKey returns an RSA private key. func GenerateRSAPrivateKey(t *testing.T) *rsa.PrivateKey { t.Helper() diff --git a/pkg/acceptance/helpers/random/random_helpers.go b/pkg/acceptance/helpers/random/random_helpers.go index fcfb5b7208..978e044174 100644 --- a/pkg/acceptance/helpers/random/random_helpers.go +++ b/pkg/acceptance/helpers/random/random_helpers.go @@ -1,6 +1,8 @@ package random import ( + "strings" + "github.com/brianvoe/gofakeit/v6" "github.com/hashicorp/go-uuid" ) @@ -22,7 +24,7 @@ func Password() string { // 090088 (22000): ADMIN_NAME can only contain letters, numbers and underscores. // 090089 (22000): ADMIN_NAME must start with a letter. 
func AdminName() string { - return AlphaN(1) + AlphanumericN(11) + return strings.ToUpper(AlphaN(1) + AlphanumericN(11)) } func Bool() bool { @@ -45,6 +47,10 @@ func AlphaN(num int) string { return gofakeit.Password(true, true, false, false, false, num) } +func AlphaLowerN(num int) string { + return gofakeit.Password(true, false, false, false, false, num) +} + func Email() string { return gofakeit.Email() } diff --git a/pkg/acceptance/helpers/stage_client.go b/pkg/acceptance/helpers/stage_client.go index 41eb0aea36..60bac47c90 100644 --- a/pkg/acceptance/helpers/stage_client.go +++ b/pkg/acceptance/helpers/stage_client.go @@ -8,6 +8,7 @@ import ( "testing" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/testhelpers" "github.com/stretchr/testify/require" ) @@ -96,6 +97,35 @@ func (c *StageClient) PutOnStage(t *testing.T, id sdk.SchemaObjectIdentifier, fi require.NoError(t, err) } +func (c *StageClient) PutOnUserStageWithContent(t *testing.T, filename string, content string) string { + t.Helper() + ctx := context.Background() + + path := testhelpers.TestFile(t, filename, []byte(content)) + + _, err := c.context.client.ExecForTests(ctx, fmt.Sprintf(`PUT file://%s @~/ AUTO_COMPRESS = FALSE OVERWRITE = TRUE`, path)) + require.NoError(t, err) + + t.Cleanup(c.RemoveFromUserStageFunc(t, path)) + + return path +} + +func (c *StageClient) RemoveFromUserStage(t *testing.T, pathOnStage string) { + t.Helper() + ctx := context.Background() + + _, err := c.context.client.ExecForTests(ctx, fmt.Sprintf(`REMOVE @~/%s`, pathOnStage)) + require.NoError(t, err) +} + +func (c *StageClient) RemoveFromUserStageFunc(t *testing.T, pathOnStage string) func() { + t.Helper() + return func() { + c.RemoveFromUserStage(t, pathOnStage) + } +} + func (c *StageClient) PutOnStageWithContent(t *testing.T, id sdk.SchemaObjectIdentifier, filename string, content string) { t.Helper() ctx := context.Background() diff --git 
a/pkg/acceptance/testdatatypes/testdatatypes.go b/pkg/acceptance/testdatatypes/testdatatypes.go new file mode 100644 index 0000000000..48aa8fde51 --- /dev/null +++ b/pkg/acceptance/testdatatypes/testdatatypes.go @@ -0,0 +1,11 @@ +package testdatatypes + +import "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes" + +// TODO [SNOW-1843440]: create using constructors (when we add them)? +var ( + DataTypeNumber_36_2, _ = datatypes.ParseDataType("NUMBER(36, 2)") + DataTypeVarchar_100, _ = datatypes.ParseDataType("VARCHAR(100)") + DataTypeFloat, _ = datatypes.ParseDataType("FLOAT") + DataTypeVariant, _ = datatypes.ParseDataType("VARIANT") +) diff --git a/pkg/acceptance/testenvs/testing_environment_variables.go b/pkg/acceptance/testenvs/testing_environment_variables.go index 07a69e6a34..22ffdeb072 100644 --- a/pkg/acceptance/testenvs/testing_environment_variables.go +++ b/pkg/acceptance/testenvs/testing_environment_variables.go @@ -35,6 +35,8 @@ const ( ConfigureClientOnce env = "SF_TF_ACC_TEST_CONFIGURE_CLIENT_ONCE" TestObjectsSuffix env = "TEST_SF_TF_TEST_OBJECT_SUFFIX" RequireTestObjectsSuffix env = "TEST_SF_TF_REQUIRE_TEST_OBJECT_SUFFIX" + + SimplifiedIntegrationTestsSetup env = "TEST_SF_TF_SIMPLIFIED_INTEGRATION_TESTS_SETUP" ) func GetOrSkipTest(t *testing.T, envName Env) string { diff --git a/pkg/datasources/connections.go b/pkg/datasources/connections.go index bc3c59c378..14f4f4f3d2 100644 --- a/pkg/datasources/connections.go +++ b/pkg/datasources/connections.go @@ -38,7 +38,7 @@ func Connections() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Connections, ReadConnections), Schema: connectionsSchema, - Description: "Datasource used to get details of filtered connections. Filtering is aligned with the current possibilities for [SHOW CONNECTIONS](https://docs.snowflake.com/en/sql-reference/sql/show-connections) query. 
The results of SHOW is encapsulated in one output collection `connections`.", + Description: "Data source used to get details of filtered connections. Filtering is aligned with the current possibilities for [SHOW CONNECTIONS](https://docs.snowflake.com/en/sql-reference/sql/show-connections) query. The results of SHOW are encapsulated in one output collection `connections`.", } } diff --git a/pkg/datasources/connections_acceptance_test.go b/pkg/datasources/connections_acceptance_test.go index 5e71034391..0e91f4b6bd 100644 --- a/pkg/datasources/connections_acceptance_test.go +++ b/pkg/datasources/connections_acceptance_test.go @@ -2,6 +2,7 @@ package datasources_test import ( "fmt" + "regexp" "strings" "testing" @@ -221,3 +222,33 @@ func connectionAndSecondaryConnectionDatasourceWithLike(like string) string { } `, like) } + +func TestAcc_Connections_NotFound_WithPostConditions(t *testing.T) { + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + Steps: []resource.TestStep{ + { + Config: connectionNonExisting(), + ExpectError: regexp.MustCompile("there should be at least one connection"), + }, + }, + }) +} + +func connectionNonExisting() string { + return ` +data "snowflake_connections" "test" { + like = "non-existing-connection" + + lifecycle { + postcondition { + condition = length(self.connections) > 0 + error_message = "there should be at least one connection" + } + } +} +` +} diff --git a/pkg/datasources/database_roles.go b/pkg/datasources/database_roles.go index 570b75fd60..43548bcb9b 100644 --- a/pkg/datasources/database_roles.go +++ b/pkg/datasources/database_roles.go @@ -68,7 +68,7 @@ func DatabaseRoles() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.DatabaseRoles, ReadDatabaseRoles), Schema: databaseRolesSchema, - Description: "Datasource used to get 
details of filtered database roles. Filtering is aligned with the current possibilities for [SHOW DATABASE ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-database-roles) query (`like` and `limit` are supported). The results of SHOW is encapsulated in show_output collection.", + Description: "Data source used to get details of filtered database roles. Filtering is aligned with the current possibilities for [SHOW DATABASE ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-database-roles) query (`like` and `limit` are supported). The results of SHOW are encapsulated in the `show_output` collection.", } } diff --git a/pkg/datasources/databases.go b/pkg/datasources/databases.go index 21fb414aed..f3173dec13 100644 --- a/pkg/datasources/databases.go +++ b/pkg/datasources/databases.go @@ -95,7 +95,7 @@ func Databases() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Databases, ReadDatabases), Schema: databasesSchema, - Description: "Datasource used to get details of filtered databases. Filtering is aligned with the current possibilities for [SHOW DATABASES](https://docs.snowflake.com/en/sql-reference/sql/show-databases) query (`like`, `starts_with`, and `limit` are all supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.", + Description: "Data source used to get details of filtered databases. Filtering is aligned with the current possibilities for [SHOW DATABASES](https://docs.snowflake.com/en/sql-reference/sql/show-databases) query (`like`, `starts_with`, and `limit` are all supported). 
The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.", } } diff --git a/pkg/datasources/functions.go b/pkg/datasources/functions.go index 480ed24b6f..8dc158b406 100644 --- a/pkg/datasources/functions.go +++ b/pkg/datasources/functions.go @@ -77,7 +77,7 @@ func ReadContextFunctions(ctx context.Context, d *schema.ResourceData, meta inte schemaName := d.Get("schema").(string) request := sdk.NewShowFunctionRequest() - request.WithIn(sdk.In{Schema: sdk.NewDatabaseObjectIdentifier(databaseName, schemaName)}) + request.WithIn(sdk.ExtendedIn{In: sdk.In{Schema: sdk.NewDatabaseObjectIdentifier(databaseName, schemaName)}}) functions, err := client.Functions.Show(ctx, request) if err != nil { id := d.Id() diff --git a/pkg/datasources/masking_policies.go b/pkg/datasources/masking_policies.go index 670e29b760..c757088430 100644 --- a/pkg/datasources/masking_policies.go +++ b/pkg/datasources/masking_policies.go @@ -121,7 +121,7 @@ func MaskingPolicies() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.MaskingPolicies, ReadMaskingPolicies), Schema: maskingPoliciesSchema, - Description: "Datasource used to get details of filtered masking policies. Filtering is aligned with the current possibilities for [SHOW MASKING POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-masking-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `masking_policies`.", + Description: "Data source used to get details of filtered masking policies. Filtering is aligned with the current possibilities for [SHOW MASKING POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-masking-policies) query. 
The results of SHOW and DESCRIBE are encapsulated in one output collection `masking_policies`.", } } diff --git a/pkg/datasources/network_policies.go b/pkg/datasources/network_policies.go index 6aa615d136..035372d0dc 100644 --- a/pkg/datasources/network_policies.go +++ b/pkg/datasources/network_policies.go @@ -56,7 +56,7 @@ func NetworkPolicies() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.NetworkPolicies, ReadNetworkPolicies), Schema: networkPoliciesSchema, - Description: "Datasource used to get details of filtered network policies. Filtering is aligned with the current possibilities for [SHOW NETWORK POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-network-policies) query (`like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection.", + Description: "Data source used to get details of filtered network policies. Filtering is aligned with the current possibilities for [SHOW NETWORK POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-network-policies) query (`like` is supported). 
The results of SHOW and DESCRIBE are encapsulated in one output collection.", } } diff --git a/pkg/datasources/procedures.go b/pkg/datasources/procedures.go index a20c338a68..c0dd714ff1 100644 --- a/pkg/datasources/procedures.go +++ b/pkg/datasources/procedures.go @@ -79,10 +79,10 @@ func ReadContextProcedures(ctx context.Context, d *schema.ResourceData, meta int req := sdk.NewShowProcedureRequest() if databaseName != "" { - req.WithIn(sdk.In{Database: sdk.NewAccountObjectIdentifier(databaseName)}) + req.WithIn(sdk.ExtendedIn{In: sdk.In{Database: sdk.NewAccountObjectIdentifier(databaseName)}}) } if schemaName != "" { - req.WithIn(sdk.In{Schema: sdk.NewDatabaseObjectIdentifier(databaseName, schemaName)}) + req.WithIn(sdk.ExtendedIn{In: sdk.In{Schema: sdk.NewDatabaseObjectIdentifier(databaseName, schemaName)}}) } procedures, err := client.Procedures.Show(ctx, req) if err != nil { diff --git a/pkg/datasources/resource_monitors.go b/pkg/datasources/resource_monitors.go index 8a3825034f..74cbfce6a9 100644 --- a/pkg/datasources/resource_monitors.go +++ b/pkg/datasources/resource_monitors.go @@ -43,7 +43,7 @@ func ResourceMonitors() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.ResourceMonitors, ReadResourceMonitors), Schema: resourceMonitorsSchema, - Description: "Datasource used to get details of filtered resource monitors. Filtering is aligned with the current possibilities for [SHOW RESOURCE MONITORS](https://docs.snowflake.com/en/sql-reference/sql/show-resource-monitors) query (`like` is supported). The results of SHOW is encapsulated in show_output collection.", + Description: "Data source used to get details of filtered resource monitors. Filtering is aligned with the current possibilities for [SHOW RESOURCE MONITORS](https://docs.snowflake.com/en/sql-reference/sql/show-resource-monitors) query (`like` is supported). 
The results of SHOW are encapsulated in the `show_output` collection.", } } diff --git a/pkg/datasources/roles.go b/pkg/datasources/roles.go index 7278988f7d..bdce14c43e 100644 --- a/pkg/datasources/roles.go +++ b/pkg/datasources/roles.go @@ -51,7 +51,7 @@ func Roles() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Roles, ReadRoles), Schema: rolesSchema, - Description: "Datasource used to get details of filtered roles. Filtering is aligned with the current possibilities for [SHOW ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-roles) query (`like` and `in_class` are all supported). The results of SHOW are encapsulated in one output collection.", + Description: "Data source used to get details of filtered roles. Filtering is aligned with the current possibilities for [SHOW ROLES](https://docs.snowflake.com/en/sql-reference/sql/show-roles) query (`like` and `in_class` are both supported). The results of SHOW are encapsulated in one output collection.", } } diff --git a/pkg/datasources/row_access_policies.go b/pkg/datasources/row_access_policies.go index 22d8607422..a5cd35ee87 100644 --- a/pkg/datasources/row_access_policies.go +++ b/pkg/datasources/row_access_policies.go @@ -117,7 +117,7 @@ func RowAccessPolicies() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.RowAccessPolicies, ReadRowAccessPolicies), Schema: rowAccessPoliciesSchema, - Description: "Datasource used to get details of filtered row access policies. Filtering is aligned with the current possibilities for [SHOW ROW ACCESS POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-row-access-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `row_access_policies`.", + Description: "Data source used to get details of filtered row access policies. 
Filtering is aligned with the current possibilities for [SHOW ROW ACCESS POLICIES](https://docs.snowflake.com/en/sql-reference/sql/show-row-access-policies) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `row_access_policies`.", } } diff --git a/pkg/datasources/schemas.go b/pkg/datasources/schemas.go index bab420e5e3..aec6eac8cc 100644 --- a/pkg/datasources/schemas.go +++ b/pkg/datasources/schemas.go @@ -131,7 +131,7 @@ func Schemas() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Schemas, ReadSchemas), Schema: schemasSchema, - Description: "Datasource used to get details of filtered schemas. Filtering is aligned with the current possibilities for [SHOW SCHEMAS](https://docs.snowflake.com/en/sql-reference/sql/show-schemas) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.", + Description: "Data source used to get details of filtered schemas. Filtering is aligned with the current possibilities for [SHOW SCHEMAS](https://docs.snowflake.com/en/sql-reference/sql/show-schemas) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.", } } diff --git a/pkg/datasources/secrets.go b/pkg/datasources/secrets.go index c40ec7aa6e..3102e6dba6 100644 --- a/pkg/datasources/secrets.go +++ b/pkg/datasources/secrets.go @@ -100,7 +100,7 @@ func Secrets() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Secrets, ReadSecrets), Schema: secretsSchema, - Description: "Datasource used to get details of filtered secrets. Filtering is aligned with the current possibilities for [SHOW SECRETS](https://docs.snowflake.com/en/sql-reference/sql/show-secrets) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `secrets`.", + Description: "Data source used to get details of filtered secrets. 
Filtering is aligned with the current possibilities for [SHOW SECRETS](https://docs.snowflake.com/en/sql-reference/sql/show-secrets) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `secrets`.", } } diff --git a/pkg/datasources/security_integrations.go b/pkg/datasources/security_integrations.go index 6418f6e4fc..6de6728a1c 100644 --- a/pkg/datasources/security_integrations.go +++ b/pkg/datasources/security_integrations.go @@ -56,7 +56,7 @@ func SecurityIntegrations() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.SecurityIntegrations, ReadSecurityIntegrations), Schema: securityIntegrationsSchema, - Description: "Datasource used to get details of filtered security integrations. Filtering is aligned with the current possibilities for [SHOW SECURITY INTEGRATIONS](https://docs.snowflake.com/en/sql-reference/sql/show-integrations) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `security_integrations`.", + Description: "Data source used to get details of filtered security integrations. Filtering is aligned with the current possibilities for [SHOW SECURITY INTEGRATIONS](https://docs.snowflake.com/en/sql-reference/sql/show-integrations) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `security_integrations`.", } } diff --git a/pkg/datasources/streamlits.go b/pkg/datasources/streamlits.go index 889a23548f..e9fb8a364e 100644 --- a/pkg/datasources/streamlits.go +++ b/pkg/datasources/streamlits.go @@ -104,7 +104,7 @@ func Streamlits() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Streamlits, ReadStreamlits), Schema: streamlitsSchema, - Description: "Datasource used to get details of filtered streamlits. 
Filtering is aligned with the current possibilities for [SHOW STREAMLITS](https://docs.snowflake.com/en/sql-reference/sql/show-streamlits) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `streamlits`.", + Description: "Data source used to get details of filtered streamlits. Filtering is aligned with the current possibilities for [SHOW STREAMLITS](https://docs.snowflake.com/en/sql-reference/sql/show-streamlits) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `streamlits`.", } } diff --git a/pkg/datasources/streams.go b/pkg/datasources/streams.go index 4323fb19d2..50ed824825 100644 --- a/pkg/datasources/streams.go +++ b/pkg/datasources/streams.go @@ -56,7 +56,7 @@ func Streams() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Streams, ReadStreams), Schema: streamsSchema, - Description: "Datasource used to get details of filtered streams. Filtering is aligned with the current possibilities for [SHOW STREAMS](https://docs.snowflake.com/en/sql-reference/sql/show-streams) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `streams`.", + Description: "Data source used to get details of filtered streams. Filtering is aligned with the current possibilities for [SHOW STREAMS](https://docs.snowflake.com/en/sql-reference/sql/show-streams) query. The results of SHOW and DESCRIBE are encapsulated in one output collection `streams`.", } } diff --git a/pkg/datasources/tags.go b/pkg/datasources/tags.go index 6fee2d84d6..20d755df39 100644 --- a/pkg/datasources/tags.go +++ b/pkg/datasources/tags.go @@ -40,7 +40,7 @@ func Tags() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Tags, ReadTags), Schema: tagsSchema, - Description: "Datasource used to get details of filtered tags. 
Filtering is aligned with the current possibilities for [SHOW TAGS](https://docs.snowflake.com/en/sql-reference/sql/show-tags) query. The results of SHOW are encapsulated in one output collection `tags`.", + Description: "Data source used to get details of filtered tags. Filtering is aligned with the current possibilities for [SHOW TAGS](https://docs.snowflake.com/en/sql-reference/sql/show-tags) query. The results of SHOW are encapsulated in one output collection `tags`.", } } diff --git a/pkg/datasources/users.go b/pkg/datasources/users.go index 5afe984886..4d5c3be40a 100644 --- a/pkg/datasources/users.go +++ b/pkg/datasources/users.go @@ -95,7 +95,7 @@ func Users() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Users, ReadUsers), Schema: usersSchema, - Description: "Datasource used to get details of filtered users. Filtering is aligned with the current possibilities for [SHOW USERS](https://docs.snowflake.com/en/sql-reference/sql/show-users) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. Important note is that when querying users you don't have permissions to, the querying options are limited. You won't get almost any field in `show_output` (only empty or default values), the DESCRIBE command cannot be called, so you have to set `with_describe = false`. Only `parameters` output is not affected by the lack of privileges.", + Description: "Data source used to get details of filtered users. Filtering is aligned with the current possibilities for [SHOW USERS](https://docs.snowflake.com/en/sql-reference/sql/show-users) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. Important note is that when querying users you don't have permissions to, the querying options are limited. 
You will get almost no fields in `show_output` (only empty or default values); the DESCRIBE command cannot be called, so you have to set `with_describe = false`. Only `parameters` output is not affected by the lack of privileges.", } } diff --git a/pkg/datasources/views.go b/pkg/datasources/views.go index 12a1bcdf14..64d64c7a68 100644 --- a/pkg/datasources/views.go +++ b/pkg/datasources/views.go @@ -110,7 +110,7 @@ func Views() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Views, ReadViews), Schema: viewsSchema, - Description: "Datasource used to get details of filtered views. Filtering is aligned with the current possibilities for [SHOW VIEWS](https://docs.snowflake.com/en/sql-reference/sql/show-views) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `views`.", + Description: "Data source used to get details of filtered views. Filtering is aligned with the current possibilities for [SHOW VIEWS](https://docs.snowflake.com/en/sql-reference/sql/show-views) query (only `like` is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection `views`.", } } diff --git a/pkg/datasources/warehouses.go b/pkg/datasources/warehouses.go index 9c42fb5e07..2399f33872 100644 --- a/pkg/datasources/warehouses.go +++ b/pkg/datasources/warehouses.go @@ -70,7 +70,7 @@ func Warehouses() *schema.Resource { return &schema.Resource{ ReadContext: TrackingReadWrapper(datasources.Warehouses, ReadWarehouses), Schema: warehousesSchema, - Description: "Datasource used to get details of filtered warehouses. Filtering is aligned with the current possibilities for [SHOW WAREHOUSES](https://docs.snowflake.com/en/sql-reference/sql/show-warehouses) query (only `like` is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.", + Description: "Data source used to get details of filtered warehouses. 
Filtering is aligned with the current possibilities for [SHOW WAREHOUSES](https://docs.snowflake.com/en/sql-reference/sql/show-warehouses) query (only `like` is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.", } } diff --git a/pkg/internal/tracking/context.go b/pkg/internal/tracking/context.go index 8fdbf8c03f..f228fa4773 100644 --- a/pkg/internal/tracking/context.go +++ b/pkg/internal/tracking/context.go @@ -10,7 +10,7 @@ import ( const ( CurrentSchemaVersion string = "1" - ProviderVersion string = "v0.99.0" // TODO(SNOW-1814934): Currently hardcoded, make it computed + ProviderVersion string = "v0.100.0" // TODO(SNOW-1814934): Currently hardcoded, make it computed MetadataPrefix string = "terraform_provider_usage_tracking" ) diff --git a/pkg/provider/provider.go b/pkg/provider/provider.go index 7e5f4c9370..5692d9a5eb 100644 --- a/pkg/provider/provider.go +++ b/pkg/provider/provider.go @@ -494,6 +494,11 @@ func getResources() map[string]*schema.Resource { "snowflake_failover_group": resources.FailoverGroup(), "snowflake_file_format": resources.FileFormat(), "snowflake_function": resources.Function(), + "snowflake_function_java": resources.FunctionJava(), + "snowflake_function_javascript": resources.FunctionJavascript(), + "snowflake_function_python": resources.FunctionPython(), + "snowflake_function_scala": resources.FunctionScala(), + "snowflake_function_sql": resources.FunctionSql(), "snowflake_grant_account_role": resources.GrantAccountRole(), "snowflake_grant_application_role": resources.GrantApplicationRole(), "snowflake_grant_database_role": resources.GrantDatabaseRole(), diff --git a/pkg/provider/resources/resources.go b/pkg/provider/resources/resources.go index 6991cbabe2..a8173a6d96 100644 --- a/pkg/provider/resources/resources.go +++ b/pkg/provider/resources/resources.go @@ -34,6 +34,11 @@ const ( GrantPrivilegesToDatabaseRole resource = "snowflake_grant_privileges_to_database_role" 
GrantPrivilegesToShare resource = "snowflake_grant_privileges_to_share" Function resource = "snowflake_function" + FunctionJava resource = "snowflake_function_java" + FunctionJavascript resource = "snowflake_function_javascript" + FunctionPython resource = "snowflake_function_python" + FunctionScala resource = "snowflake_function_scala" + FunctionSql resource = "snowflake_function_sql" LegacyServiceUser resource = "snowflake_legacy_service_user" ManagedAccount resource = "snowflake_managed_account" MaskingPolicy resource = "snowflake_masking_policy" diff --git a/pkg/resources/account.go b/pkg/resources/account.go index 8b687c7cd8..2f5a1e0249 100644 --- a/pkg/resources/account.go +++ b/pkg/resources/account.go @@ -2,18 +2,19 @@ package resources import ( "context" + "errors" "fmt" - "log" "strings" - "time" - "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider/docs" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/snowflakeroles" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/hashicorp/go-cty/cty" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/util" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" @@ -21,384 +22,460 @@ import ( "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" ) -// Note: no test case was created for account since we cannot actually delete them after creation, which is a critical part of the test suite. 
Instead, this resource -// was manually tested - var accountSchema = map[string]*schema.Schema{ "name": { Type: schema.TypeString, Required: true, - Description: "Specifies the identifier (i.e. name) for the account; must be unique within an organization, regardless of which Snowflake Region the account is in. In addition, the identifier must start with an alphabetic character and cannot contain spaces or special characters except for underscores (_). Note that if the account name includes underscores, features that do not accept account names with underscores (e.g. Okta SSO or SCIM) can reference a version of the account name that substitutes hyphens (-) for the underscores.", - // Name is automatically uppercase by Snowflake - StateFunc: func(val interface{}) string { - return strings.ToUpper(val.(string)) - }, - ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), + Description: "Specifies the identifier (i.e. name) for the account. It must be unique within an organization, regardless of which Snowflake Region the account is in and must start with an alphabetic character and cannot contain spaces or special characters except for underscores (_). Note that if the account name includes underscores, features that do not accept account names with underscores (e.g. Okta SSO or SCIM) can reference a version of the account name that substitutes hyphens (-) for the underscores.", }, "admin_name": { - Type: schema.TypeString, - Required: true, - Description: "Login name of the initial administrative user of the account. A new user is created in the new account with this name and password and granted the ACCOUNTADMIN role in the account. A login name can be any string consisting of letters, numbers, and underscores. 
Login names are always case-insensitive.", - // We have no way of assuming a role into this account to change the admin user name so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, + Type: schema.TypeString, + Required: true, + Sensitive: true, + Description: externalChangesNotDetectedFieldDescription("Login name of the initial administrative user of the account. A new user is created in the new account with this name and password and granted the ACCOUNTADMIN role in the account. A login name can be any string consisting of letters, numbers, and underscores. Login names are always case-insensitive."), + DiffSuppressFunc: IgnoreAfterCreation, }, "admin_password": { - Type: schema.TypeString, - Optional: true, - Sensitive: true, - Description: "Password for the initial administrative user of the account. Optional if the `ADMIN_RSA_PUBLIC_KEY` parameter is specified. For more information about passwords in Snowflake, see [Snowflake-provided Password Policy](https://docs.snowflake.com/en/sql-reference/sql/create-account.html#:~:text=Snowflake%2Dprovided%20Password%20Policy).", - AtLeastOneOf: []string{"admin_password", "admin_rsa_public_key"}, - // We have no way of assuming a role into this account to change the password so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. 
This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, + Type: schema.TypeString, + Optional: true, + Sensitive: true, + Description: externalChangesNotDetectedFieldDescription("Password for the initial administrative user of the account. Either admin_password or admin_rsa_public_key has to be specified. This field cannot be used whenever admin_user_type is set to SERVICE."), + DiffSuppressFunc: IgnoreAfterCreation, + AtLeastOneOf: []string{"admin_password", "admin_rsa_public_key"}, }, "admin_rsa_public_key": { - Type: schema.TypeString, - Optional: true, - Sensitive: true, - Description: "Assigns a public key to the initial administrative user of the account in order to implement [key pair authentication](https://docs.snowflake.com/en/sql-reference/sql/create-account.html#:~:text=key%20pair%20authentication) for the user. Optional if the `ADMIN_PASSWORD` parameter is specified.", - AtLeastOneOf: []string{"admin_password", "admin_rsa_public_key"}, - // We have no way of assuming a role into this account to change the admin rsa public key so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, - }, - "email": { - Type: schema.TypeString, - Required: true, - Sensitive: true, - Description: "Email address of the initial administrative user of the account. 
This email address is used to send any notifications about the account.", - // We have no way of assuming a role into this account to change the admin email so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, + Type: schema.TypeString, + Optional: true, + Description: externalChangesNotDetectedFieldDescription("Assigns a public key to the initial administrative user of the account. Either admin_password or admin_rsa_public_key has to be specified."), + DiffSuppressFunc: IgnoreAfterCreation, + AtLeastOneOf: []string{"admin_password", "admin_rsa_public_key"}, }, - "edition": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "[Snowflake Edition](https://docs.snowflake.com/en/user-guide/intro-editions.html) of the account. Valid values are: STANDARD | ENTERPRISE | BUSINESS_CRITICAL", - ValidateFunc: validation.StringInSlice([]string{string(sdk.EditionStandard), string(sdk.EditionEnterprise), string(sdk.EditionBusinessCritical)}, false), + "admin_user_type": { + Type: schema.TypeString, + Optional: true, + Description: externalChangesNotDetectedFieldDescription(fmt.Sprintf("Used for setting the type of the first user that is assigned the ACCOUNTADMIN role during account creation. 
Valid options are: %s", docs.PossibleValuesListed(sdk.AllUserTypes))), + DiffSuppressFunc: SuppressIfAny(IgnoreAfterCreation, NormalizeAndCompare(sdk.ToUserType)), + ValidateDiagFunc: sdkValidation(sdk.ToUserType), }, "first_name": { - Type: schema.TypeString, - Optional: true, - Sensitive: true, - Description: "First name of the initial administrative user of the account", - // We have no way of assuming a role into this account to change the admin first name so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, + Type: schema.TypeString, + Optional: true, + Sensitive: true, + Description: externalChangesNotDetectedFieldDescription("First name of the initial administrative user of the account. This field cannot be used whenever admin_user_type is set to SERVICE."), + DiffSuppressFunc: IgnoreAfterCreation, }, "last_name": { - Type: schema.TypeString, - Optional: true, - Sensitive: true, - Description: "Last name of the initial administrative user of the account", - // We have no way of assuming a role into this account to change the admin last name so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. 
This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, + Type: schema.TypeString, + Optional: true, + Sensitive: true, + Description: externalChangesNotDetectedFieldDescription("Last name of the initial administrative user of the account. This field cannot be used whenever admin_user_type is set to SERVICE."), + DiffSuppressFunc: IgnoreAfterCreation, + }, + "email": { + Type: schema.TypeString, + Required: true, + Sensitive: true, + Description: externalChangesNotDetectedFieldDescription("Email address of the initial administrative user of the account. This email address is used to send any notifications about the account."), + DiffSuppressFunc: IgnoreAfterCreation, }, "must_change_password": { - Type: schema.TypeBool, - Optional: true, - Default: false, - Description: "Specifies whether the new user created to administer the account is forced to change their password upon first login into the account.", - // We have no way of assuming a role into this account to change the admin password policy so this has to be ForceNew even though it's not ideal - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return old == "" - }, + Type: schema.TypeString, + Optional: true, + Default: BooleanDefault, + Description: externalChangesNotDetectedFieldDescription("Specifies whether the new user created to administer the account is forced to change their password upon first login into the account. 
This field cannot be used whenever admin_user_type is set to SERVICE."), + DiffSuppressFunc: IgnoreAfterCreation, + ValidateDiagFunc: validateBooleanString, + }, + "edition": { + Type: schema.TypeString, + Required: true, + ForceNew: true, + Description: fmt.Sprintf("Snowflake Edition of the account. See more about Snowflake Editions in the [official documentation](https://docs.snowflake.com/en/user-guide/intro-editions). Valid options are: %s", docs.PossibleValuesListed(sdk.AllAccountEditions)), + DiffSuppressFunc: NormalizeAndCompare(sdk.ToAccountEdition), + ValidateDiagFunc: sdkValidation(sdk.ToAccountEdition), }, "region_group": { - Type: schema.TypeString, - Optional: true, - Description: "ID of the Snowflake Region where the account is created. If no value is provided, Snowflake creates the account in the same Snowflake Region as the current account (i.e. the account in which the CREATE ACCOUNT statement is executed.)", - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return new == "" - }, + Type: schema.TypeString, + Optional: true, + ForceNew: true, + Description: "ID of the region group where the account is created. To retrieve the region group ID for existing accounts in your organization, execute the [SHOW REGIONS](https://docs.snowflake.com/en/sql-reference/sql/show-regions) command. For information about when you might need to specify region group, see [Region groups](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#label-region-groups).", }, "region": { - Type: schema.TypeString, - Optional: true, - Description: "ID of the Snowflake Region where the account is created. 
If no value is provided, Snowflake creates the account in the same Snowflake Region as the current account (i.e. the account in which the CREATE ACCOUNT statement is executed.)", - ForceNew: true, - DiffSuppressOnRefresh: true, - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - // For new resources always show the diff - if d.Id() == "" { - return false - } - // This suppresses the diff if the old value is empty. This would happen in the event of importing existing accounts since we have no way of reading this value - return new == "" - }, + Type: schema.TypeString, + Optional: true, + ForceNew: true, + Description: "[Snowflake Region ID](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#label-snowflake-region-ids) of the region where the account is created. If no value is provided, Snowflake creates the account in the same Snowflake Region as the current account (i.e. the account in which the CREATE ACCOUNT statement is executed.)", }, "comment": { Type: schema.TypeString, Optional: true, - Description: "Specifies a comment for the account.", ForceNew: true, + Description: "Specifies a comment for the account.", + DiffSuppressFunc: SuppressIfAny( + IgnoreChangeToCurrentSnowflakeValueInShow("comment"), + func(k, oldValue, newValue string, d *schema.ResourceData) bool { + return oldValue == "SNOWFLAKE" && newValue == "" + }, + ), }, "is_org_admin": { - Type: schema.TypeBool, - Computed: true, - Description: "Indicates whether the ORGADMIN role is enabled in an account. If TRUE, the role is enabled.", + Type: schema.TypeString, + Optional: true, + Default: BooleanDefault, + DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("is_org_admin"), + ValidateDiagFunc: validateBooleanString, + Description: "Sets an account property that determines whether the ORGADMIN role is enabled in the account. Only an organization administrator (i.e. 
user with the ORGADMIN role) can set the property.", }, "grace_period_in_days": { - Type: schema.TypeInt, - Optional: true, - Default: 3, - Description: "Specifies the number of days to wait before dropping the account. The default is 3 days.", + Type: schema.TypeInt, + Required: true, + Description: "Specifies the number of days during which the account can be restored (“undropped”). The minimum is 3 days and the maximum is 90 days.", + ValidateDiagFunc: validation.ToDiagFunc(validation.IntAtLeast(3)), }, FullyQualifiedNameAttributeName: schemas.FullyQualifiedNameSchema, + ShowOutputAttributeName: { + Type: schema.TypeList, + Computed: true, + Description: "Outputs the result of `SHOW ACCOUNTS` for the given account.", + Elem: &schema.Resource{ + Schema: schemas.ShowAccountSchema, + }, + }, } func Account() *schema.Resource { return &schema.Resource{ Description: "The account resource allows you to create and manage Snowflake accounts.", CreateContext: TrackingCreateWrapper(resources.Account, CreateAccount), - ReadContext: TrackingReadWrapper(resources.Account, ReadAccount), + ReadContext: TrackingReadWrapper(resources.Account, ReadAccount(true)), UpdateContext: TrackingUpdateWrapper(resources.Account, UpdateAccount), DeleteContext: TrackingDeleteWrapper(resources.Account, DeleteAccount), CustomizeDiff: TrackingCustomDiffWrapper(resources.Account, customdiff.All( ComputedIfAnyAttributeChanged(accountSchema, FullyQualifiedNameAttributeName, "name"), + ComputedIfAnyAttributeChanged(accountSchema, ShowOutputAttributeName, "name", "is_org_admin"), )), Schema: accountSchema, Importer: &schema.ResourceImporter{ - StateContext: schema.ImportStatePassthroughContext, + StateContext: TrackingImportWrapper(resources.Account, ImportAccount), + }, + + SchemaVersion: 1, + StateUpgraders: []schema.StateUpgrader{ + { + Version: 0, + // setting type to cty.EmptyObject is a bit hacky here but following 
https://developer.hashicorp.com/terraform/plugin/framework/migrating/resources/state-upgrade#sdkv2-1 would require lots of repetitive code; this should work with cty.EmptyObject + Type: cty.EmptyObject, + Upgrade: v0_99_0_AccountStateUpgrader, + }, }, } } -// CreateAccount implements schema.CreateFunc. -func CreateAccount(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { +func ImportAccount(ctx context.Context, d *schema.ResourceData, meta any) ([]*schema.ResourceData, error) { + client := meta.(*provider.Context).Client + + isOrgAdmin, err := client.ContextFunctions.IsRoleInSession(ctx, snowflakeroles.Orgadmin) + if err != nil { + return nil, err + } + if !isOrgAdmin { + return nil, errors.New("current user doesn't have the orgadmin role in session") + } + + id, err := sdk.ParseAccountIdentifier(d.Id()) + if err != nil { + return nil, err + } + + account, err := client.Accounts.ShowByID(ctx, id.AsAccountObjectIdentifier()) + if err != nil { + return nil, err + } + + if _, err := ImportName[sdk.AccountIdentifier](context.Background(), d, nil); err != nil { + return nil, err + } + + if account.RegionGroup != nil { + if err = d.Set("region_group", *account.RegionGroup); err != nil { + return nil, err + } + } + + if err := errors.Join( + d.Set("edition", string(*account.Edition)), + d.Set("region", account.SnowflakeRegion), + d.Set("comment", *account.Comment), + d.Set("is_org_admin", booleanStringFromBool(*account.IsOrgAdmin)), + ); err != nil { + return nil, err + } + + return []*schema.ResourceData{d}, nil +} + +func CreateAccount(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client - name := d.Get("name").(string) - objectIdentifier := sdk.NewAccountObjectIdentifier(name) + isOrgAdmin, err := client.ContextFunctions.IsRoleInSession(ctx, snowflakeroles.Orgadmin) + if err != nil { + return diag.FromErr(err) + } + if !isOrgAdmin { + return diag.FromErr(errors.New("current 
user doesn't have the orgadmin role in session")) + } + + id := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) - createOptions := &sdk.CreateAccountOptions{ + opts := &sdk.CreateAccountOptions{ AdminName: d.Get("admin_name").(string), Email: d.Get("email").(string), Edition: sdk.AccountEdition(d.Get("edition").(string)), } - // get optional fields. if v, ok := d.GetOk("admin_password"); ok { - createOptions.AdminPassword = sdk.String(v.(string)) + opts.AdminPassword = sdk.String(v.(string)) } if v, ok := d.GetOk("admin_rsa_public_key"); ok { - createOptions.AdminRSAPublicKey = sdk.String(v.(string)) + opts.AdminRSAPublicKey = sdk.String(v.(string)) + } + if v, ok := d.GetOk("admin_user_type"); ok { + userType, err := sdk.ToUserType(v.(string)) + if err != nil { + return diag.FromErr(err) + } + opts.AdminUserType = &userType } if v, ok := d.GetOk("first_name"); ok { - createOptions.FirstName = sdk.String(v.(string)) + opts.FirstName = sdk.String(v.(string)) } if v, ok := d.GetOk("last_name"); ok { - createOptions.LastName = sdk.String(v.(string)) + opts.LastName = sdk.String(v.(string)) } - - // Has default, don't fetch with GetOk because this can be falsey and valid - v := d.Get("must_change_password") - createOptions.MustChangePassword = sdk.Bool(v.(bool)) - - if v, ok := d.GetOk("region_group"); ok { - createOptions.RegionGroup = sdk.String(v.(string)) - } else { - // For organizations that have accounts in multiple region groups, returns <region_group>.<region> so we need to split on "."
- currentRegion, err := client.ContextFunctions.CurrentRegion(ctx) + if v := d.Get("must_change_password"); v != BooleanDefault { + parsedBool, err := booleanStringToBool(v.(string)) if err != nil { return diag.FromErr(err) } - regionParts := strings.Split(currentRegion, ".") - if len(regionParts) == 2 { - createOptions.RegionGroup = sdk.String(regionParts[0]) - } + opts.MustChangePassword = &parsedBool + } + if v, ok := d.GetOk("region_group"); ok { + opts.RegionGroup = sdk.String(v.(string)) } if v, ok := d.GetOk("region"); ok { - createOptions.Region = sdk.String(v.(string)) - } else { - // For organizations that have accounts in multiple region groups, returns . so we need to split on "." - currentRegion, err := client.ContextFunctions.CurrentRegion(ctx) - if err != nil { - return diag.FromErr(err) - } - regionParts := strings.Split(currentRegion, ".") - if len(regionParts) == 2 { - createOptions.Region = sdk.String(regionParts[1]) - } else { - createOptions.Region = sdk.String(currentRegion) - } + opts.Region = sdk.String(v.(string)) } if v, ok := d.GetOk("comment"); ok { - createOptions.Comment = sdk.String(v.(string)) + opts.Comment = sdk.String(v.(string)) } - err := client.Accounts.Create(ctx, objectIdentifier, createOptions) + createResponse, err := client.Accounts.Create(ctx, id, opts) if err != nil { return diag.FromErr(err) } - var account *sdk.Account - err = util.Retry(5, 3*time.Second, func() (error, bool) { - account, err = client.Accounts.ShowByID(ctx, objectIdentifier) + d.SetId(helpers.EncodeResourceIdentifier(sdk.NewAccountIdentifier(createResponse.OrganizationName, createResponse.AccountName))) + + if v, ok := d.GetOk("is_org_admin"); ok && v == BooleanTrue { + err := client.Accounts.Alter(ctx, &sdk.AlterAccountOptions{ + SetIsOrgAdmin: &sdk.AccountSetIsOrgAdmin{ + Name: id, + OrgAdmin: true, + }, + }) if err != nil { - log.Printf("[DEBUG] retryable operation resulted in error: %v\n", err) - return nil, false + return diag.FromErr(err) } - 
return nil, true - }) - if err != nil { - return diag.FromErr(err) - } - d.SetId(helpers.EncodeSnowflakeID(account.AccountLocator)) - return ReadAccount(ctx, d, meta) + return ReadAccount(false)(ctx, d, meta) } -// ReadAccount implements schema.ReadFunc. -func ReadAccount(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { - client := meta.(*provider.Context).Client +func ReadAccount(withExternalChangesMarking bool) schema.ReadContextFunc { + return func(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + client := meta.(*provider.Context).Client - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + isOrgAdmin, err := client.ContextFunctions.IsRoleInSession(ctx, snowflakeroles.Orgadmin) + if err != nil { + return diag.FromErr(err) + } + if !isOrgAdmin { + return diag.FromErr(errors.New("current user doesn't have the orgadmin role in session")) + } - var acc *sdk.Account - var err error - err = util.Retry(5, 3*time.Second, func() (error, bool) { - acc, err = client.Accounts.ShowByID(ctx, id) + id, err := sdk.ParseAccountIdentifier(d.Id()) if err != nil { - log.Printf("[DEBUG] retryable operation resulted in error: %v\n", err) - return nil, false + return diag.FromErr(err) } - return nil, true - }) - if err != nil { - return diag.FromErr(err) + + account, err := client.Accounts.ShowByID(ctx, id.AsAccountObjectIdentifier()) + if err != nil { + if errors.Is(err, sdk.ErrObjectNotFound) { + d.SetId("") + return diag.Diagnostics{ + diag.Diagnostic{ + Severity: diag.Warning, + Summary: "Failed to query account. Marking the resource as removed.", + Detail: fmt.Sprintf("Account: %s, Err: %s", id.FullyQualifiedName(), err), + }, + } + } + return diag.FromErr(err) + } + + if withExternalChangesMarking { + var regionGroup string + if account.RegionGroup != nil { + regionGroup = *account.RegionGroup + + // For organizations that have accounts in multiple region groups, returns <region_group>.<region> so we need to split on "."
+ parts := strings.Split(regionGroup, ".") + if len(parts) == 2 { + regionGroup = parts[0] + } + } + if err = handleExternalChangesToObjectInShow(d, + outputMapping{"edition", "edition", *account.Edition, *account.Edition, nil}, + outputMapping{"is_org_admin", "is_org_admin", *account.IsOrgAdmin, booleanStringFromBool(*account.IsOrgAdmin), nil}, + outputMapping{"region_group", "region_group", regionGroup, regionGroup, nil}, + outputMapping{"snowflake_region", "region", account.SnowflakeRegion, account.SnowflakeRegion, nil}, + outputMapping{"comment", "comment", *account.Comment, *account.Comment, nil}, + ); err != nil { + return diag.FromErr(err) + } + } else { + if err = setStateToValuesFromConfig(d, accountSchema, []string{ + "name", + "admin_name", + "admin_password", + "admin_rsa_public_key", + "admin_user_type", + "first_name", + "last_name", + "email", + "must_change_password", + "edition", + "region_group", + "region", + "comment", + "is_org_admin", + "grace_period_in_days", + }); err != nil { + return diag.FromErr(err) + } + } + + if errs := errors.Join( + d.Set(FullyQualifiedNameAttributeName, id.FullyQualifiedName()), + d.Set(ShowOutputAttributeName, []map[string]any{schemas.AccountToSchema(account)}), + ); errs != nil { + return diag.FromErr(errs) + } + + return nil } +} - if err := d.Set(FullyQualifiedNameAttributeName, id.FullyQualifiedName()); err != nil { +func UpdateAccount(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + client := meta.(*provider.Context).Client + + isOrgAdmin, err := client.ContextFunctions.IsRoleInSession(ctx, snowflakeroles.Orgadmin) + if err != nil { return diag.FromErr(err) } - - if err = d.Set("name", acc.AccountName); err != nil { - return diag.FromErr(fmt.Errorf("error setting name: %w", err)) + if !isOrgAdmin { + return diag.FromErr(errors.New("current user doesn't have the orgadmin role in session")) } - if err = d.Set("edition", acc.Edition); err != nil { - return diag.FromErr(fmt.Errorf("error 
setting edition: %w", err)) + id, err := sdk.ParseAccountIdentifier(d.Id()) + if err != nil { + return diag.FromErr(err) } - if err = d.Set("region_group", acc.RegionGroup); err != nil { - return diag.FromErr(fmt.Errorf("error setting region_group: %w", err)) - } + if d.HasChange("name") { + newId := sdk.NewAccountIdentifier(id.OrganizationName(), d.Get("name").(string)) - if err = d.Set("region", acc.SnowflakeRegion); err != nil { - return diag.FromErr(fmt.Errorf("error setting region: %w", err)) - } + err = client.Accounts.Alter(ctx, &sdk.AlterAccountOptions{ + Rename: &sdk.AccountRename{ + Name: id.AsAccountObjectIdentifier(), + NewName: newId.AsAccountObjectIdentifier(), + }, + }) + if err != nil { + return diag.FromErr(err) + } - if err = d.Set("comment", acc.Comment); err != nil { - return diag.FromErr(fmt.Errorf("error setting comment: %w", err)) + d.SetId(helpers.EncodeResourceIdentifier(newId)) + id = newId } - if err = d.Set("is_org_admin", acc.IsOrgAdmin); err != nil { - return diag.FromErr(fmt.Errorf("error setting is_org_admin: %w", err)) - } + if d.HasChange("is_org_admin") { + oldIsOrgAdmin, newIsOrgAdmin := d.GetChange("is_org_admin") - return nil -} + // Setting from default to false and vice versa is not allowed because Snowflake throws an error on already disabled IsOrgAdmin + canUpdate := true + if (oldIsOrgAdmin.(string) == BooleanFalse && newIsOrgAdmin.(string) == BooleanDefault) || + (oldIsOrgAdmin.(string) == BooleanDefault && newIsOrgAdmin.(string) == BooleanFalse) { + canUpdate = false + } -// UpdateAccount implements schema.UpdateFunc. 
-func UpdateAccount(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { - /* - todo: comments may eventually work again for accounts, so this can be uncommented when that happens - client := meta.(*provider.Context).Client - client := sdk.NewClientFromDB(db) - ctx := context.Background() - - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - - // Change comment - if d.HasChange("comment") { - // changing comment isn't supported for accounts - err := client.Comments.Set(ctx, &sdk.SetCommentOptions{ - ObjectType: sdk.ObjectTypeAccount, - ObjectName: sdk.NewAccountObjectIdentifier(d.Get("name").(string)), - Value: sdk.String(d.Get("comment").(string)), - }) - if err != nil { - return err + if canUpdate { + if newIsOrgAdmin.(string) != BooleanDefault { + parsed, err := booleanStringToBool(newIsOrgAdmin.(string)) + if err != nil { + return diag.FromErr(err) + } + if err := client.Accounts.Alter(ctx, &sdk.AlterAccountOptions{ + SetIsOrgAdmin: &sdk.AccountSetIsOrgAdmin{ + Name: id.AsAccountObjectIdentifier(), + OrgAdmin: parsed, + }, + }); err != nil { + return diag.FromErr(err) + } + } else { + // No unset available for this field (setting Snowflake default) + if err := client.Accounts.Alter(ctx, &sdk.AlterAccountOptions{ + SetIsOrgAdmin: &sdk.AccountSetIsOrgAdmin{ + Name: id.AsAccountObjectIdentifier(), + OrgAdmin: false, + }, + }); err != nil { + return diag.FromErr(err) + } } } - */ - return nil + } + + return ReadAccount(false)(ctx, d, meta) } -// DeleteAccount implements schema.DeleteFunc. 
-func DeleteAccount(ctx context.Context, d *schema.ResourceData, meta interface{}) diag.Diagnostics { +func DeleteAccount(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client - gracePeriodInDays := d.Get("grace_period_in_days").(int) - err := client.Accounts.Drop(ctx, helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier), gracePeriodInDays, &sdk.DropAccountOptions{ + + isOrgAdmin, err := client.ContextFunctions.IsRoleInSession(ctx, snowflakeroles.Orgadmin) + if err != nil { + return diag.FromErr(err) + } + if !isOrgAdmin { + return diag.FromErr(errors.New("current user doesn't have the orgadmin role in session")) + } + + id, err := sdk.ParseAccountIdentifier(d.Id()) + if err != nil { + return diag.FromErr(err) + } + + err = client.Accounts.Drop(ctx, id.AsAccountObjectIdentifier(), d.Get("grace_period_in_days").(int), &sdk.DropAccountOptions{ IfExists: sdk.Bool(true), }) - return diag.FromErr(err) + if err != nil { + return diag.FromErr(err) + } + + d.SetId("") + + return nil } diff --git a/pkg/resources/account_acceptance_test.go b/pkg/resources/account_acceptance_test.go index ceb1a5df64..3b2e699d9f 100644 --- a/pkg/resources/account_acceptance_test.go +++ b/pkg/resources/account_acceptance_test.go @@ -2,9 +2,24 @@ package resources_test import ( "fmt" + "regexp" "testing" + tfconfig "github.com/hashicorp/terraform-plugin-testing/config" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert/resourceassert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert/resourceshowoutputassert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config" + 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/config/model" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/snowflakeenvs" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/snowflakeroles" + r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-testing/plancheck" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" @@ -12,81 +27,650 @@ import ( "github.com/hashicorp/terraform-plugin-testing/tfversion" ) -func TestAcc_Account_complete(t *testing.T) { +func TestAcc_Account_Minimal(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + organizationName := acc.TestClient().Context.CurrentAccountId(t).OrganizationName() + id := random.AdminName() + accountId := sdk.NewAccountIdentifier(organizationName, id) + email := random.Email() + name := random.AdminName() + key, _ := random.GenerateRSAPublicKey(t) + region := acc.TestClient().Context.CurrentRegion(t) + + configModel := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). 
+ WithAdminRsaPublicKey(key) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + { + Config: config.FromModel(t, configModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModel.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminNameString(name). + HasAdminRsaPublicKeyString(key). + HasNoAdminUserType(). + HasEmailString(email). + HasNoFirstName(). + HasNoLastName(). + HasMustChangePasswordString(r.BooleanDefault). + HasNoRegionGroup(). + HasNoRegion(). + HasNoComment(). + HasIsOrgAdminString(r.BooleanDefault). + HasGracePeriodInDaysString("3"), + resourceshowoutputassert.AccountShowOutput(t, configModel.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id). + HasSnowflakeRegion(region). + HasRegionGroup(""). + HasEdition(sdk.EditionStandard). + HasAccountUrlNotEmpty(). + HasCreatedOnNotEmpty(). + HasComment("SNOWFLAKE"). + HasAccountLocatorNotEmpty(). + HasAccountLocatorUrlNotEmpty(). + HasManagedAccounts(0). + HasConsumptionBillingEntityNameNotEmpty(). + HasMarketplaceConsumerBillingEntityName(""). + HasMarketplaceProviderBillingEntityNameNotEmpty(). + HasOldAccountURL(""). + HasIsOrgAdmin(false). + HasAccountOldUrlSavedOnEmpty(). + HasAccountOldUrlLastUsedEmpty(). + HasOrganizationOldUrl(""). + HasOrganizationOldUrlSavedOnEmpty(). + HasOrganizationOldUrlLastUsedEmpty(). + HasIsEventsAccount(false). + HasIsOrganizationAccount(false). + HasDroppedOnEmpty(). + HasScheduledDeletionTimeEmpty(). + HasRestoredOnEmpty(). + HasMovedToOrganization(""). + HasMovedOn(""). 
+ HasOrganizationUrlExpirationOnEmpty(), + ), + }, + { + ResourceName: configModel.ResourceReference(), + Config: config.FromModel(t, configModel), + ImportState: true, + ImportStateCheck: assert.AssertThatImport(t, + resourceassert.ImportedAccountResource(t, helpers.EncodeResourceIdentifier(accountId)). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasNoAdminName(). + HasNoAdminRsaPublicKey(). + HasNoAdminUserType(). + HasNoEmail(). + HasNoFirstName(). + HasNoLastName(). + HasNoMustChangePassword(). + HasEditionString(string(sdk.EditionStandard)). + HasNoRegionGroup(). + HasRegionString(region). + HasCommentString("SNOWFLAKE"). + HasIsOrgAdminString(r.BooleanFalse). + HasNoGracePeriodInDays(), + ), + }, + }, + }) +} + +func TestAcc_Account_Complete(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - password := acc.TestClient().Ids.AlphaContaining("123ABC") + organizationName := acc.TestClient().Context.CurrentAccountId(t).OrganizationName() + id := random.AdminName() + accountId := sdk.NewAccountIdentifier(organizationName, id) + firstName := acc.TestClient().Ids.Alpha() + lastName := acc.TestClient().Ids.Alpha() + email := random.Email() + name := random.AdminName() + key, _ := random.GenerateRSAPublicKey(t) + region := acc.TestClient().Context.CurrentRegion(t) + comment := random.Comment() + + configModel := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypePerson). + WithAdminRsaPublicKey(key). + WithFirstName(firstName). + WithLastName(lastName). + WithMustChangePassword(r.BooleanTrue). + WithRegionGroup("PUBLIC"). + WithRegion(region). + WithComment(comment). 
+ WithIsOrgAdmin(r.BooleanFalse) resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - PreCheck: func() { acc.TestAccPreCheck(t) }, TerraformVersionChecks: []tfversion.TerraformVersionCheck{ tfversion.RequireAbove(tfversion.Version1_5_0), }, CheckDestroy: acc.CheckDestroy(t, resources.Account), - // this errors with: Error running post-test destroy, there may be dangling resources: exit status 1 - // unless we change the resource to return nil on destroy then this is unavoidable Steps: []resource.TestStep{ { - Config: accountConfig(id.Name(), password, "Terraform acceptance test", 3), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_account.test", "name", id.Name()), - resource.TestCheckResourceAttr("snowflake_account.test", "fully_qualified_name", id.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_account.test", "admin_name", "someadmin"), - resource.TestCheckResourceAttr("snowflake_account.test", "first_name", "Ad"), - resource.TestCheckResourceAttr("snowflake_account.test", "last_name", "Min"), - resource.TestCheckResourceAttr("snowflake_account.test", "email", "admin@example.com"), - resource.TestCheckResourceAttr("snowflake_account.test", "must_change_password", "false"), - resource.TestCheckResourceAttr("snowflake_account.test", "edition", "BUSINESS_CRITICAL"), - resource.TestCheckResourceAttr("snowflake_account.test", "comment", "Terraform acceptance test"), - resource.TestCheckResourceAttr("snowflake_account.test", "grace_period_in_days", "3"), + Config: config.FromModel(t, configModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModel.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(sdk.NewAccountIdentifier(organizationName, id).FullyQualifiedName()). + HasAdminNameString(name). + HasAdminRsaPublicKeyString(key). + HasAdminUserType(sdk.UserTypePerson). + HasEmailString(email). 
+ HasFirstNameString(firstName). + HasLastNameString(lastName). + HasMustChangePasswordString(r.BooleanTrue). + HasRegionGroupString("PUBLIC"). + HasRegionString(region). + HasCommentString(comment). + HasIsOrgAdminString(r.BooleanFalse). + HasGracePeriodInDaysString("3"), + resourceshowoutputassert.AccountShowOutput(t, configModel.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id). + HasSnowflakeRegion(region). + HasRegionGroup(""). + HasEdition(sdk.EditionStandard). + HasAccountUrlNotEmpty(). + HasCreatedOnNotEmpty(). + HasComment(comment). + HasAccountLocatorNotEmpty(). + HasAccountLocatorUrlNotEmpty(). + HasManagedAccounts(0). + HasConsumptionBillingEntityNameNotEmpty(). + HasMarketplaceConsumerBillingEntityName(""). + HasMarketplaceProviderBillingEntityNameNotEmpty(). + HasOldAccountURL(""). + HasIsOrgAdmin(false). + HasAccountOldUrlSavedOnEmpty(). + HasAccountOldUrlLastUsedEmpty(). + HasOrganizationOldUrl(""). + HasOrganizationOldUrlSavedOnEmpty(). + HasOrganizationOldUrlLastUsedEmpty(). + HasIsEventsAccount(false). + HasIsOrganizationAccount(false). + HasDroppedOnEmpty(). + HasScheduledDeletionTimeEmpty(). + HasRestoredOnEmpty(). + HasMovedToOrganization(""). + HasMovedOn(""). + HasOrganizationUrlExpirationOnEmpty(), ), - Destroy: false, }, - // Change Grace Period In Days { - Config: accountConfig(id.Name(), password, "Terraform acceptance test", 4), - Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_account.test", "grace_period_in_days", "4"), + ResourceName: configModel.ResourceReference(), + Config: config.FromModel(t, configModel), + ImportState: true, + ImportStateCheck: assert.AssertThatImport(t, + resourceassert.ImportedAccountResource(t, helpers.EncodeResourceIdentifier(accountId)). + HasNameString(id). + HasFullyQualifiedNameString(sdk.NewAccountIdentifier(organizationName, id).FullyQualifiedName()). + HasNoAdminName(). + HasNoAdminRsaPublicKey(). + HasNoEmail(). + HasNoFirstName(). 
+ HasNoLastName(). + HasNoAdminUserType(). + HasNoMustChangePassword(). + HasEditionString(string(sdk.EditionStandard)). + HasNoRegionGroup(). + HasRegionString(region). + HasCommentString(comment). + HasIsOrgAdminString(r.BooleanFalse). + HasNoGracePeriodInDays(), ), }, - // IMPORT - { - ResourceName: "snowflake_account.test", - ImportState: true, - ImportStateVerify: true, - ImportStateVerifyIgnore: []string{ - "admin_name", - "admin_password", - "admin_rsa_public_key", - "email", - "must_change_password", - "first_name", - "last_name", - "grace_period_in_days", + }, + }) +} + +func TestAcc_Account_Rename(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + organizationName := acc.TestClient().Context.CurrentAccountId(t).OrganizationName() + id := random.AdminName() + accountId := sdk.NewAccountIdentifier(organizationName, id) + + newId := random.AdminName() + newAccountId := sdk.NewAccountIdentifier(organizationName, newId) + + email := random.Email() + name := random.AdminName() + key, _ := random.GenerateRSAPublicKey(t) + + configModel := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminRsaPublicKey(key) + newConfigModel := model.Account("test", name, string(sdk.EditionStandard), email, 3, newId). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminRsaPublicKey(key) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + { + Config: config.FromModel(t, configModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModel.ResourceReference()). + HasNameString(id). 
+ HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminUserType(sdk.UserTypeService), + resourceshowoutputassert.AccountShowOutput(t, configModel.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id), + ), + }, + { + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction(newConfigModel.ResourceReference(), plancheck.ResourceActionUpdate), + }, }, + Config: config.FromModel(t, newConfigModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, newConfigModel.ResourceReference()). + HasNameString(newId). + HasFullyQualifiedNameString(newAccountId.FullyQualifiedName()). + HasAdminUserType(sdk.UserTypeService), + resourceshowoutputassert.AccountShowOutput(t, newConfigModel.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(newId), + ), }, }, }) } -func accountConfig(name string, password string, comment string, gracePeriodInDays int) string { - return fmt.Sprintf(` -data "snowflake_current_account" "current" {} +func TestAcc_Account_IsOrgAdmin(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + organizationName := acc.TestClient().Context.CurrentAccountId(t).OrganizationName() + id := random.AdminName() + accountId := sdk.NewAccountIdentifier(organizationName, id) + + email := random.Email() + name := random.AdminName() + key, _ := random.GenerateRSAPublicKey(t) + + configModelWithOrgAdminTrue := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminRsaPublicKey(key). + WithIsOrgAdmin(r.BooleanTrue) + + configModelWithOrgAdminFalse := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminRsaPublicKey(key). 
+ WithIsOrgAdmin(r.BooleanFalse) + + configModelWithoutOrgAdmin := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminRsaPublicKey(key) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + // Create with ORGADMIN enabled + { + Config: config.FromModel(t, configModelWithOrgAdminTrue), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModelWithOrgAdminTrue.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminUserType(sdk.UserTypeService). + HasIsOrgAdminString(r.BooleanTrue), + resourceshowoutputassert.AccountShowOutput(t, configModelWithOrgAdminTrue.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id). + HasIsOrgAdmin(true), + ), + }, + // Disable ORGADMIN + { + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction(configModelWithOrgAdminFalse.ResourceReference(), plancheck.ResourceActionUpdate), + }, + }, + Config: config.FromModel(t, configModelWithOrgAdminFalse), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModelWithOrgAdminFalse.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminUserType(sdk.UserTypeService). + HasIsOrgAdminString(r.BooleanFalse), + resourceshowoutputassert.AccountShowOutput(t, configModelWithOrgAdminFalse.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id). 
+ HasIsOrgAdmin(false), + ), + }, + // Remove is_org_admin from the config and go back to default (disabled) + { + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction(configModelWithoutOrgAdmin.ResourceReference(), plancheck.ResourceActionUpdate), + }, + }, + Config: config.FromModel(t, configModelWithoutOrgAdmin), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModelWithoutOrgAdmin.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminUserType(sdk.UserTypeService). + HasIsOrgAdminString(r.BooleanDefault), + resourceshowoutputassert.AccountShowOutput(t, configModelWithoutOrgAdmin.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id). + HasIsOrgAdmin(false), + ), + }, + // External change (enable ORGADMIN) + { + PreConfig: func() { + acc.TestClient().Account.Alter(t, &sdk.AlterAccountOptions{ + SetIsOrgAdmin: &sdk.AccountSetIsOrgAdmin{ + Name: accountId.AsAccountObjectIdentifier(), + OrgAdmin: true, + }, + }) + }, + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction(configModelWithoutOrgAdmin.ResourceReference(), plancheck.ResourceActionUpdate), + }, + }, + Config: config.FromModel(t, configModelWithoutOrgAdmin), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModelWithoutOrgAdmin.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminUserType(sdk.UserTypeService). + HasIsOrgAdminString(r.BooleanDefault), + resourceshowoutputassert.AccountShowOutput(t, configModelWithoutOrgAdmin.ResourceReference()). + HasOrganizationName(organizationName). + HasAccountName(id). 
+ HasIsOrgAdmin(false), + ), + }, + }, + }) +} + +func TestAcc_Account_IgnoreUpdateAfterCreationOnCertainFields(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + organizationName := acc.TestClient().Context.CurrentAccountId(t).OrganizationName() + id := random.AdminName() + accountId := sdk.NewAccountIdentifier(organizationName, id) + + firstName := random.AdminName() + lastName := random.AdminName() + email := random.Email() + name := random.AdminName() + pass := random.Password() + newFirstName := random.AdminName() + newLastName := random.AdminName() + newEmail := random.Email() + newName := random.AdminName() + newPass := random.Password() + + configModel := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypePerson). + WithFirstName(firstName). + WithLastName(lastName). + WithMustChangePassword(r.BooleanTrue). + WithAdminPassword(pass) + + newConfigModel := model.Account("test", newName, string(sdk.EditionStandard), newEmail, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminPassword(newPass). + WithFirstName(newFirstName). + WithLastName(newLastName) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + { + Config: config.FromModel(t, configModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModel.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminNameString(name). + HasAdminPasswordString(pass). + HasAdminUserType(sdk.UserTypePerson). + HasEmailString(email). + HasFirstNameString(firstName). + HasLastNameString(lastName). 
+ HasMustChangePasswordString(r.BooleanTrue), + ), + }, + { + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction(newConfigModel.ResourceReference(), plancheck.ResourceActionNoop), + }, + }, + Config: config.FromModel(t, newConfigModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, newConfigModel.ResourceReference()). + HasNameString(id). + HasFullyQualifiedNameString(accountId.FullyQualifiedName()). + HasAdminNameString(name). + HasAdminPasswordString(pass). + HasAdminUserType(sdk.UserTypePerson). + HasEmailString(email). + HasFirstNameString(firstName). + HasLastNameString(lastName). + HasMustChangePasswordString(r.BooleanTrue), + ), + }, + }, + }) +} + +func TestAcc_Account_TryToCreateWithoutOrgadmin(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + id := random.AdminName() + email := random.Email() + name := random.AdminName() + key, _ := random.GenerateRSAPublicKey(t) + + t.Setenv(string(testenvs.ConfigureClientOnce), "") + t.Setenv(snowflakeenvs.Role, snowflakeroles.Accountadmin.Name()) + + configModel := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). 
+ WithAdminRsaPublicKey(key) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + { + Config: config.FromModel(t, configModel), + ExpectError: regexp.MustCompile("Error: current user doesn't have the orgadmin role in session"), + }, + }, + }) +} + +func TestAcc_Account_InvalidValues(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + id := random.AdminName() + email := random.Email() + name := random.AdminName() + key, _ := random.GenerateRSAPublicKey(t) + + configModelInvalidUserType := model.Account("test", name, string(sdk.EditionStandard), email, 3, id). + WithAdminUserType("invalid_user_type"). + WithAdminRsaPublicKey(key) + + configModelInvalidAccountEdition := model.Account("test", name, "invalid_account_edition", email, 3, id). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminRsaPublicKey(key) + + configModelInvalidGracePeriodInDays := model.Account("test", name, string(sdk.EditionStandard), email, 2, id). + WithAdminUserTypeEnum(sdk.UserTypeService). 
+ WithAdminRsaPublicKey(key) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + { + Config: config.FromModel(t, configModelInvalidUserType), + ExpectError: regexp.MustCompile("invalid user type: invalid_user_type"), + }, + { + Config: config.FromModel(t, configModelInvalidAccountEdition), + ExpectError: regexp.MustCompile("unknown account edition: invalid_account_edition"), + }, + { + Config: config.FromModel(t, configModelInvalidGracePeriodInDays), + ExpectError: regexp.MustCompile(`Error: expected grace_period_in_days to be at least \(3\), got 2`), + }, + }, + }) +} + +func TestAcc_Account_UpgradeFrom_v0_99_0(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + _ = testenvs.GetOrSkipTest(t, testenvs.TestAccountCreate) + + email := random.Email() + name := random.AdminName() + adminName := random.AdminName() + adminPassword := random.Password() + firstName := random.AdminName() + lastName := random.AdminName() + region := acc.TestClient().Context.CurrentRegion(t) + comment := random.Comment() + + configModel := model.Account("test", adminName, string(sdk.EditionStandard), email, 3, name). + WithAdminUserTypeEnum(sdk.UserTypeService). + WithAdminPassword(adminPassword). + WithFirstName(firstName). + WithLastName(lastName). + WithMustChangePasswordValue(tfconfig.BoolVariable(true)). + WithRegion(region). + WithIsOrgAdmin(r.BooleanFalse). 
+ WithComment(comment) + + resource.Test(t, resource.TestCase{ + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Account), + Steps: []resource.TestStep{ + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.99.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: accountConfig_v0_99_0(name, adminName, adminPassword, email, sdk.EditionStandard, firstName, lastName, true, region, 3, comment), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: config.FromModel(t, configModel), + Check: assert.AssertThat(t, + resourceassert.AccountResource(t, configModel.ResourceReference()). + HasNameString(name). + HasAdminNameString(adminName). + HasAdminPasswordString(adminPassword). + HasEmailString(email). + HasFirstNameString(firstName). + HasLastNameString(lastName). + HasMustChangePasswordString(r.BooleanTrue). + HasRegionGroupString(""). + HasRegionString(region). + HasCommentString(comment). + HasIsOrgAdminString(r.BooleanFalse). 
+ HasGracePeriodInDaysString("3"), + ), + }, + }, + }) +} + +func accountConfig_v0_99_0( + name string, + adminName string, + adminPassword string, + email string, + edition sdk.AccountEdition, + firstName string, + lastName string, + mustChangePassword bool, + region string, + gracePeriodInDays int, + comment string, +) string { + return fmt.Sprintf(` resource "snowflake_account" "test" { - name = "%s" - admin_name = "someadmin" - admin_password = "%s" - first_name = "Ad" - last_name = "Min" - email = "admin@example.com" - must_change_password = false - edition = "BUSINESS_CRITICAL" - comment = "%s" - region = data.snowflake_current_account.current.region - grace_period_in_days = %d + name = "%[1]s" + admin_name = "%[2]s" + admin_password = "%[3]s" + email = "%[4]s" + edition = "%[5]s" + first_name = "%[6]s" + last_name = "%[7]s" + must_change_password = %[8]t + region = "%[9]s" + grace_period_in_days = %[10]d + comment = "%[11]s" } -`, name, password, comment, gracePeriodInDays) +`, + name, + adminName, + adminPassword, + email, + edition, + firstName, + lastName, + mustChangePassword, + region, + gracePeriodInDays, + comment, + ) } diff --git a/pkg/resources/account_state_upgraders.go b/pkg/resources/account_state_upgraders.go new file mode 100644 index 0000000000..bfd0b60bba --- /dev/null +++ b/pkg/resources/account_state_upgraders.go @@ -0,0 +1,28 @@ +package resources + +import ( + "context" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" +) + +func v0_99_0_AccountStateUpgrader(ctx context.Context, state map[string]any, meta any) (map[string]any, error) { + if state == nil { + return state, nil + } + + client := meta.(*provider.Context).Client + state["must_change_password"] = booleanStringFromBool(state["must_change_password"].(bool)) + state["is_org_admin"] = 
booleanStringFromBool(state["is_org_admin"].(bool)) + account, err := client.Accounts.ShowByID(ctx, sdk.NewAccountObjectIdentifier(state["name"].(string))) + if err != nil { + return nil, err + } + + state["id"] = helpers.EncodeResourceIdentifier(sdk.NewAccountIdentifier(account.OrganizationName, account.AccountName)) + + return state, nil +} diff --git a/pkg/resources/api_authentication_integration_common.go b/pkg/resources/api_authentication_integration_common.go index 8791a5d0df..b70f42cbc2 100644 --- a/pkg/resources/api_authentication_integration_common.go +++ b/pkg/resources/api_authentication_integration_common.go @@ -33,7 +33,7 @@ var apiAuthCommonSchema = map[string]*schema.Schema{ "oauth_client_secret": { Type: schema.TypeString, Required: true, - Description: "Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. The connector uses this to request an access token from the ServiceNow instance.", + Description: externalChangesNotDetectedFieldDescription("Specifies the client secret for the OAuth application in the ServiceNow instance from the previous step. 
The connector uses this to request an access token from the ServiceNow instance."), }, "oauth_token_endpoint": { Type: schema.TypeString, diff --git a/pkg/resources/common.go b/pkg/resources/common.go index 643524f9d9..4c84ac1c4c 100644 --- a/pkg/resources/common.go +++ b/pkg/resources/common.go @@ -60,7 +60,7 @@ func ctyValToSliceString(valueElems []cty.Value) []string { return elems } -func ImportName[T sdk.AccountObjectIdentifier | sdk.DatabaseObjectIdentifier | sdk.SchemaObjectIdentifier](ctx context.Context, d *schema.ResourceData, meta any) ([]*schema.ResourceData, error) { +func ImportName[T sdk.AccountObjectIdentifier | sdk.DatabaseObjectIdentifier | sdk.SchemaObjectIdentifier | sdk.AccountIdentifier](ctx context.Context, d *schema.ResourceData, meta any) ([]*schema.ResourceData, error) { switch any(new(T)).(type) { case *sdk.AccountObjectIdentifier: id, err := sdk.ParseAccountObjectIdentifier(d.Id()) @@ -101,6 +101,15 @@ func ImportName[T sdk.AccountObjectIdentifier | sdk.DatabaseObjectIdentifier | s if err := d.Set("schema", id.SchemaName()); err != nil { return nil, err } + case *sdk.AccountIdentifier: + id, err := sdk.ParseAccountIdentifier(d.Id()) + if err != nil { + return nil, err + } + + if err := d.Set("name", id.AccountName()); err != nil { + return nil, err + } } return []*schema.ResourceData{d}, nil diff --git a/pkg/resources/custom_diffs.go b/pkg/resources/custom_diffs.go index 872c7ed24f..eb33b246b7 100644 --- a/pkg/resources/custom_diffs.go +++ b/pkg/resources/custom_diffs.go @@ -275,7 +275,7 @@ func RecreateWhenStreamIsStale() schema.CustomizeDiffFunc { func RecreateWhenResourceBoolFieldChangedExternally(boolField string, wantValue bool) schema.CustomizeDiffFunc { return func(_ context.Context, diff *schema.ResourceDiff, _ interface{}) error { if n := diff.Get(boolField); n != nil { - logging.DebugLogger.Printf("[DEBUG] new external value for %v: %v\n", boolField, n.(bool)) + logging.DebugLogger.Printf("[DEBUG] new external value for %v: %v, 
recreating the resource...\n", boolField, n.(bool)) if n.(bool) != wantValue { return errors.Join(diff.SetNew(boolField, wantValue), diff.ForceNew(boolField)) diff --git a/pkg/resources/database.go b/pkg/resources/database.go index b9b0935bcc..2bd4784e78 100644 --- a/pkg/resources/database.go +++ b/pkg/resources/database.go @@ -9,6 +9,7 @@ import ( "time" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + providerresources "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/util" @@ -61,7 +62,7 @@ var databaseSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, // TODO(SNOW-1438810): Add account identifier validator - Description: "Specifies account identifier for which replication should be enabled. The account identifiers should be in the form of `\"\".\"\"`.", + Description: relatedResourceDescription("Specifies account identifier for which replication should be enabled. The account identifiers should be in the form of `\"\".\"\"`.", providerresources.Account), }, "with_failover": { Type: schema.TypeBool, @@ -477,6 +478,7 @@ func DeleteDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag. return diag.FromErr(err) } + // TODO(SNOW-1818849): unassign network policies inside the database before dropping err = client.Databases.Drop(ctx, id, &sdk.DropDatabaseOptions{ IfExists: sdk.Bool(true), }) diff --git a/pkg/resources/database_commons.go b/pkg/resources/database_commons.go index ab40bc7085..0b12163e17 100644 --- a/pkg/resources/database_commons.go +++ b/pkg/resources/database_commons.go @@ -89,19 +89,15 @@ func init() { Name: sdk.ObjectParameterLogLevel, Type: schema.TypeString, Description: fmt.Sprintf("Specifies the severity level of messages that should be ingested and made available in the active event table. 
Valid options are: %v. Messages at the specified level (and at more severe levels) are ingested. For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level).", sdk.AsStringList(sdk.AllLogLevels)), - ValidateDiag: StringInSlice(sdk.AsStringList(sdk.AllLogLevels), true), - DiffSuppress: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return strings.EqualFold(oldValue, newValue) - }, + ValidateDiag: sdkValidation(sdk.ToLogLevel), + DiffSuppress: NormalizeAndCompare(sdk.ToLogLevel), }, { Name: sdk.ObjectParameterTraceLevel, Type: schema.TypeString, Description: fmt.Sprintf("Controls how trace events are ingested into the event table. Valid options are: %v. For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level).", sdk.AsStringList(sdk.AllTraceLevels)), - ValidateDiag: StringInSlice(sdk.AsStringList(sdk.AllTraceLevels), true), - DiffSuppress: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return strings.EqualFold(oldValue, newValue) - }, + ValidateDiag: sdkValidation(sdk.ToTraceLevel), + DiffSuppress: NormalizeAndCompare(sdk.ToTraceLevel), }, { Name: sdk.ObjectParameterMaxDataExtensionTimeInDays, @@ -118,10 +114,8 @@ func init() { Name: sdk.ObjectParameterStorageSerializationPolicy, Type: schema.TypeString, Description: fmt.Sprintf("The storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: %v. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake. 
For more information, see [STORAGE_SERIALIZATION_POLICY](https://docs.snowflake.com/en/sql-reference/parameters#storage-serialization-policy).", sdk.AsStringList(sdk.AllStorageSerializationPolicies)), - ValidateDiag: StringInSlice(sdk.AsStringList(sdk.AllStorageSerializationPolicies), true), - DiffSuppress: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return strings.EqualFold(oldValue, newValue) - }, + ValidateDiag: sdkValidation(sdk.ToStorageSerializationPolicy), + DiffSuppress: NormalizeAndCompare(sdk.ToStorageSerializationPolicy), }, { Name: sdk.ObjectParameterSuspendTaskAfterNumFailures, diff --git a/pkg/resources/database_role.go b/pkg/resources/database_role.go index 8888c01f1d..86d9ab84e4 100644 --- a/pkg/resources/database_role.go +++ b/pkg/resources/database_role.go @@ -51,11 +51,15 @@ var databaseRoleSchema = map[string]*schema.Schema{ func DatabaseRole() *schema.Resource { return &schema.Resource{ + SchemaVersion: 1, + CreateContext: TrackingCreateWrapper(resources.DatabaseRole, CreateDatabaseRole), ReadContext: TrackingReadWrapper(resources.DatabaseRole, ReadDatabaseRole), UpdateContext: TrackingUpdateWrapper(resources.DatabaseRole, UpdateDatabaseRole), DeleteContext: TrackingDeleteWrapper(resources.DatabaseRole, DeleteDatabaseRole), + Description: "Resource used to manage database roles. 
For more information, check [database roles documentation](https://docs.snowflake.com/en/sql-reference/sql/create-database-role).", + Schema: databaseRoleSchema, Importer: &schema.ResourceImporter{ StateContext: TrackingImportWrapper(resources.DatabaseRole, ImportName[sdk.DatabaseObjectIdentifier]), @@ -63,9 +67,9 @@ func DatabaseRole() *schema.Resource { CustomizeDiff: TrackingCustomDiffWrapper(resources.DatabaseRole, customdiff.All( ComputedIfAnyAttributeChanged(databaseRoleSchema, ShowOutputAttributeName, "comment", "name"), + ComputedIfAnyAttributeChanged(databaseRoleSchema, FullyQualifiedNameAttributeName, "name"), )), - SchemaVersion: 1, StateUpgraders: []schema.StateUpgrader{ { Version: 0, diff --git a/pkg/resources/diff_suppressions.go b/pkg/resources/diff_suppressions.go index 14efa760b2..4e80bbf0e4 100644 --- a/pkg/resources/diff_suppressions.go +++ b/pkg/resources/diff_suppressions.go @@ -265,6 +265,27 @@ func IgnoreNewEmptyListOrSubfields(ignoredSubfields ...string) schema.SchemaDiff } } +// IgnoreMatchingColumnNameAndMaskingPolicyUsingFirstElem ignores the diff when the first element of USING matches the column name. 
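To make the key handling of this suppressor concrete, here is a runnable sketch of the same logic with the `*schema.ResourceData` lookup replaced by a plain map. `suppressUsingFirstElem` and the `state` map are illustrative stand-ins, not the provider's real API; only the key splitting and comparison mirror the diff above.

```go
package main

import (
	"fmt"
	"strings"
)

// suppressUsingFirstElem mirrors IgnoreMatchingColumnNameAndMaskingPolicyUsingFirstElem,
// with state lookups done against a plain map instead of *schema.ResourceData.
func suppressUsingFirstElem(k, old, new string, state map[string]string) bool {
	// keys look like: column.0.masking_policy.0.using.0 (or ...using.# for the element count)
	parts := strings.SplitN(k, ".", 6)
	if len(parts) < 6 {
		return false
	}
	// the count entry: USING was dropped from the config entirely
	if parts[5] == "#" && old == "1" && new == "0" {
		return true
	}
	// rebuild column.<i>.column_name and compare it with the removed first element
	colNameKey := strings.Join([]string{parts[0], parts[1], "column_name"}, ".")
	return new == "" && old == state[colNameKey]
}

func main() {
	state := map[string]string{"column.0.column_name": "foo"}
	fmt.Println(suppressUsingFirstElem("column.0.masking_policy.0.using.0", "foo", "", state))    // true
	fmt.Println(suppressUsingFirstElem("column.0.masking_policy.0.using.0", "foo", "bar", state)) // false
}
```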
+// see USING section in https://docs.snowflake.com/en/sql-reference/sql/create-view#optional-parameters +func IgnoreMatchingColumnNameAndMaskingPolicyUsingFirstElem() schema.SchemaDiffSuppressFunc { + return func(k, old, new string, d *schema.ResourceData) bool { + // suppress the diff when the column name matches the first USING element + parts := strings.SplitN(k, ".", 6) + if len(parts) < 6 { + log.Printf("[DEBUG] invalid resource key: %s", k) + return false + } + // the key addresses the element count + if parts[5] == "#" && old == "1" && new == "0" { + return true + } + colNameKey := strings.Join([]string{parts[0], parts[1], "column_name"}, ".") + colName := d.Get(colNameKey).(string) + + return new == "" && old == colName + } +} + func ignoreTrimSpaceSuppressFunc(_, old, new string, _ *schema.ResourceData) bool { return strings.TrimSpace(old) == strings.TrimSpace(new) } diff --git a/pkg/resources/diff_suppressions_test.go b/pkg/resources/diff_suppressions_test.go index 7c54f03938..e34a75114d 100644 --- a/pkg/resources/diff_suppressions_test.go +++ b/pkg/resources/diff_suppressions_test.go @@ -204,3 +204,85 @@ func Test_ignoreNewEmptyList(t *testing.T) { }) } } + +func Test_IgnoreMatchingColumnNameAndMaskingPolicyUsingFirstElem(t *testing.T) { + resourceSchema := map[string]*schema.Schema{ + "column": { + Type: schema.TypeList, + Optional: true, + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "column_name": { + Type: schema.TypeString, + Required: true, + }, + "masking_policy": { + Type: schema.TypeList, + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "using": { + Type: schema.TypeList, + Optional: true, + Elem: &schema.Schema{ + Type: schema.TypeString, + }, + }, + }, + }, + }, + }, + }, + }, + } + resourceData := func(using ...any) map[string]any { + return map[string]any{ + "column": []any{ + map[string]any{ + "column_name": "foo", + "masking_policy": []any{ + map[string]any{ + "using": using, + }, + }, + }, + }, + } + } + tests := 
[]struct { + name string + key string + old string + new string + resourceData *schema.ResourceData + wantSuppress bool + }{ + { + name: "suppress when USING is not specified in the config, but is in the state - check count", + key: "column.0.masking_policy.0.using.#", + old: "1", + new: "0", + resourceData: schema.TestResourceDataRaw(t, resourceSchema, resourceData("foo")), + wantSuppress: true, + }, + { + name: "suppress when USING is not specified in the config, but is in the state - check elem", + key: "column.0.masking_policy.0.using.0", + old: "foo", + new: "", + resourceData: schema.TestResourceDataRaw(t, resourceSchema, resourceData("foo")), + wantSuppress: true, + }, + { + name: "do not suppress when there is column name mismatch", + key: "column.0.masking_policy.0.using.0", + old: "foo", + new: "bar", + resourceData: schema.TestResourceDataRaw(t, resourceSchema, resourceData("foo")), + }, + } + for _, tt := range tests { + t.Run(tt.name, func(t *testing.T) { + require.Equal(t, tt.wantSuppress, resources.IgnoreMatchingColumnNameAndMaskingPolicyUsingFirstElem()(tt.key, tt.old, tt.new, tt.resourceData)) + }) + } +} diff --git a/pkg/resources/doc_helpers.go b/pkg/resources/doc_helpers.go index eec39dcf4a..eb437015f9 100644 --- a/pkg/resources/doc_helpers.go +++ b/pkg/resources/doc_helpers.go @@ -5,6 +5,7 @@ import ( "strings" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider/docs" + providerresources "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" ) func possibleValuesListed[T ~string | ~int](values []T) string { @@ -28,11 +29,11 @@ func externalChangesNotDetectedFieldDescription(description string) string { } func withPrivilegedRolesDescription(description, paramName string) string { - return fmt.Sprintf(`%s By default, this list includes the ACCOUNTADMIN, ORGADMIN and SECURITYADMIN roles. 
To remove these privileged roles from the list, use the ALTER ACCOUNT command to set the %s account parameter to FALSE. `, description, paramName) + return fmt.Sprintf(`%s By default, this list includes the ACCOUNTADMIN, ORGADMIN and SECURITYADMIN roles. To remove these privileged roles from the list, use the ALTER ACCOUNT command to set the %s account parameter to FALSE.`, description, paramName) } func blocklistedCharactersFieldDescription(description string) string { - return fmt.Sprintf(`%s Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: %s`, description, characterList([]rune{'|', '.', '"'})) + return fmt.Sprintf(`%s Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: %s.`, description, characterList([]rune{'|', '.', '"'})) } func diffSuppressStatementFieldDescription(description string) string { @@ -44,5 +45,13 @@ func dataTypeFieldDescription(description string) string { } func deprecatedResourceDescription(alternatives ...string) string { - return fmt.Sprintf(`This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: %s`, possibleValuesListed(alternatives)) + return fmt.Sprintf(`This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: %s.`, possibleValuesListed(alternatives)) +} + +func copyGrantsDescription(description string) string { + return fmt.Sprintf("%s This is used when the provider detects changes for fields that can not be changed by ALTER. 
This value has no effect when creating a new object with Terraform.", description) +} + +func relatedResourceDescription(description string, resource providerresources.Resource) string { + return fmt.Sprintf(`%s For more information about this resource, see [docs](./%s).`, description, strings.TrimPrefix(resource.String(), "snowflake_")) } diff --git a/pkg/resources/external_function.go b/pkg/resources/external_function.go index 2580fb6141..5330787394 100644 --- a/pkg/resources/external_function.go +++ b/pkg/resources/external_function.go @@ -275,7 +275,7 @@ func CreateContextExternalFunction(ctx context.Context, d *schema.ResourceData, case v.(string) == "CALLED ON NULL INPUT": req.WithNullInputBehavior(sdk.NullInputBehaviorCalledOnNullInput) case v.(string) == "RETURNS NULL ON NULL INPUT": - req.WithNullInputBehavior(sdk.NullInputBehaviorReturnNullInput) + req.WithNullInputBehavior(sdk.NullInputBehaviorReturnsNullInput) default: req.WithNullInputBehavior(sdk.NullInputBehaviorStrict) } @@ -501,11 +501,11 @@ func UpdateContextExternalFunction(ctx context.Context, d *schema.ResourceData, req := sdk.NewAlterFunctionRequest(id) if d.HasChange("comment") { - _, new := d.GetChange("comment") - if new == "" { - req.UnsetComment = sdk.Bool(true) + _, newComment := d.GetChange("comment") + if newComment.(string) == "" { + req.WithUnset(*sdk.NewFunctionUnsetRequest().WithComment(true)) } else { - req.SetComment = sdk.String(new.(string)) + req.WithSet(*sdk.NewFunctionSetRequest().WithComment(newComment.(string))) } err := client.Functions.Alter(ctx, req) if err != nil { diff --git a/pkg/resources/external_oauth_integration.go b/pkg/resources/external_oauth_integration.go index 8ec52d3327..785efca961 100644 --- a/pkg/resources/external_oauth_integration.go +++ b/pkg/resources/external_oauth_integration.go @@ -88,7 +88,7 @@ var externalOauthIntegrationSchema = map[string]*schema.Schema{ Type: schema.TypeSet, Elem: &schema.Schema{Type: schema.TypeString}, 
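The `relatedResourceDescription` helper introduced in this diff builds the documentation link by stripping the `snowflake_` prefix from the resource name. A self-contained sketch of that behavior, with `providerresources.Resource` stubbed as a plain string (an assumption for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

// relatedResourceDescription mimics the helper from doc_helpers.go: the
// resource name is assumed to render as "snowflake_<name>", and the docs
// link uses the name with that prefix removed.
func relatedResourceDescription(description, resourceName string) string {
	return fmt.Sprintf("%s For more information about this resource, see [docs](./%s).",
		description, strings.TrimPrefix(resourceName, "snowflake_"))
}

func main() {
	fmt.Println(relatedResourceDescription(
		"Specifies the list of roles that the client can set as the primary role.",
		"snowflake_account_role"))
}
```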
Optional: true, - Description: withPrivilegedRolesDescription("Specifies the list of roles that a client cannot set as the primary role.", string(sdk.AccountParameterExternalOAuthAddPrivilegedRolesToBlockedList)), + Description: relatedResourceDescription(withPrivilegedRolesDescription("Specifies the list of roles that a client cannot set as the primary role.", string(sdk.AccountParameterExternalOAuthAddPrivilegedRolesToBlockedList)), resources.AccountRole), DiffSuppressFunc: IgnoreValuesFromSetIfParamSet("external_oauth_blocked_roles_list", string(sdk.AccountParameterExternalOAuthAddPrivilegedRolesToBlockedList), privilegedRoles), ConflictsWith: []string{"external_oauth_allowed_roles_list"}, }, @@ -96,7 +96,7 @@ var externalOauthIntegrationSchema = map[string]*schema.Schema{ Type: schema.TypeSet, Elem: &schema.Schema{Type: schema.TypeString}, Optional: true, - Description: "Specifies the list of roles that the client can set as the primary role.", + Description: relatedResourceDescription("Specifies the list of roles that the client can set as the primary role.", resources.AccountRole), ConflictsWith: []string{"external_oauth_blocked_roles_list"}, }, "external_oauth_audience_list": { diff --git a/pkg/resources/function.go b/pkg/resources/function.go index ba91184217..38c6619a37 100644 --- a/pkg/resources/function.go +++ b/pkg/resources/function.go @@ -240,7 +240,7 @@ func createJavaFunction(ctx context.Context, d *schema.ResourceData, meta interf // create request with required request := sdk.NewCreateForJavaFunctionRequest(id, *returns, handler) functionDefinition := d.Get("statement").(string) - request.WithFunctionDefinition(functionDefinition) + request.WithFunctionDefinitionWrapped(functionDefinition) // Set optionals if v, ok := d.GetOk("is_secure"); ok { @@ -310,9 +310,16 @@ func createScalaFunction(ctx context.Context, d *schema.ResourceData, meta inter } functionDefinition := d.Get("statement").(string) handler := d.Get("handler").(string) + var 
runtimeVersion string + if v, ok := d.GetOk("runtime_version"); ok { + runtimeVersion = v.(string) + } else { + return diag.Errorf("Runtime version is required for Scala function") + } + // create request with required - request := sdk.NewCreateForScalaFunctionRequest(id, nil, handler).WithResultDataTypeOld(sdk.LegacyDataTypeFrom(returnDataType)) - request.WithFunctionDefinition(functionDefinition) + request := sdk.NewCreateForScalaFunctionRequest(id, nil, handler, runtimeVersion).WithResultDataTypeOld(sdk.LegacyDataTypeFrom(returnDataType)) + request.WithFunctionDefinitionWrapped(functionDefinition) // Set optionals if v, ok := d.GetOk("is_secure"); ok { @@ -331,9 +338,6 @@ func createScalaFunction(ctx context.Context, d *schema.ResourceData, meta inter if v, ok := d.GetOk("return_behavior"); ok { request.WithReturnResultsBehavior(sdk.ReturnResultsBehavior(v.(string))) } - if v, ok := d.GetOk("runtime_version"); ok { - request.WithRuntimeVersion(v.(string)) - } if v, ok := d.GetOk("comment"); ok { request.WithComment(v.(string)) } @@ -381,7 +385,7 @@ func createSQLFunction(ctx context.Context, d *schema.ResourceData, meta interfa } functionDefinition := d.Get("statement").(string) // create request with required - request := sdk.NewCreateForSQLFunctionRequest(id, *returns, functionDefinition) + request := sdk.NewCreateForSQLFunctionRequestDefinitionWrapped(id, *returns, functionDefinition) // Set optionals if v, ok := d.GetOk("is_secure"); ok { @@ -430,7 +434,7 @@ func createPythonFunction(ctx context.Context, d *schema.ResourceData, meta inte handler := d.Get("handler").(string) // create request with required request := sdk.NewCreateForPythonFunctionRequest(id, *returns, version, handler) - request.WithFunctionDefinition(functionDefinition) + request.WithFunctionDefinitionWrapped(functionDefinition) // Set optionals if v, ok := d.GetOk("is_secure"); ok { @@ -494,7 +498,7 @@ func createJavascriptFunction(ctx context.Context, d *schema.ResourceData, meta } 
functionDefinition := d.Get("statement").(string) // create request with required - request := sdk.NewCreateForJavascriptFunctionRequest(id, *returns, functionDefinition) + request := sdk.NewCreateForJavascriptFunctionRequestDefinitionWrapped(id, *returns, functionDefinition) // Set optionals if v, ok := d.GetOk("is_secure"); ok { @@ -568,10 +572,13 @@ func ReadContextFunction(ctx context.Context, d *schema.ResourceData, meta inter } } for _, desc := range functionDetails { + if desc.Value == nil { + continue + } switch desc.Property { case "signature": // Format in Snowflake DB is: (argName argType, argName argType, ...) - value := strings.ReplaceAll(strings.ReplaceAll(desc.Value, "(", ""), ")", "") + value := strings.ReplaceAll(strings.ReplaceAll(*desc.Value, "(", ""), ")", "") if value != "" { // Do nothing for functions without arguments pairs := strings.Split(value, ", ") @@ -588,22 +595,22 @@ func ReadContextFunction(ctx context.Context, d *schema.ResourceData, meta inter } } case "null handling": - if err := d.Set("null_input_behavior", desc.Value); err != nil { + if err := d.Set("null_input_behavior", *desc.Value); err != nil { diag.FromErr(err) } case "volatility": - if err := d.Set("return_behavior", desc.Value); err != nil { + if err := d.Set("return_behavior", *desc.Value); err != nil { diag.FromErr(err) } case "body": - if err := d.Set("statement", desc.Value); err != nil { + if err := d.Set("statement", *desc.Value); err != nil { diag.FromErr(err) } case "returns": // Format in Snowflake DB is returnType() re := regexp.MustCompile(`^(.*)\([0-9]*\)$`) - match := re.FindStringSubmatch(desc.Value) - rt := desc.Value + rt := *desc.Value + match := re.FindStringSubmatch(rt) if match != nil { rt = match[1] } @@ -611,15 +618,15 @@ func ReadContextFunction(ctx context.Context, d *schema.ResourceData, meta inter diag.FromErr(err) } case "language": - if snowflake.Contains(languages, strings.ToLower(desc.Value)) { - if err := d.Set("language", desc.Value); err 
!= nil { + if snowflake.Contains(languages, strings.ToLower(*desc.Value)) { + if err := d.Set("language", *desc.Value); err != nil { diag.FromErr(err) } } else { - log.Printf("[INFO] Unexpected language for function %v returned from Snowflake", desc.Value) + log.Printf("[INFO] Unexpected language for function %v returned from Snowflake", *desc.Value) } case "packages": - value := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(desc.Value, "[", ""), "]", ""), "'", "") + value := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(*desc.Value, "[", ""), "]", ""), "'", "") if value != "" { // Do nothing for Java / Python functions without packages packages := strings.Split(value, ",") if err := d.Set("packages", packages); err != nil { @@ -627,7 +634,7 @@ func ReadContextFunction(ctx context.Context, d *schema.ResourceData, meta inter } } case "imports": - value := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(desc.Value, "[", ""), "]", ""), "'", "") + value := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(*desc.Value, "[", ""), "]", ""), "'", "") if value != "" { // Do nothing for Java functions without imports imports := strings.Split(value, ",") if err := d.Set("imports", imports); err != nil { @@ -635,19 +642,19 @@ func ReadContextFunction(ctx context.Context, d *schema.ResourceData, meta inter } } case "handler": - if err := d.Set("handler", desc.Value); err != nil { + if err := d.Set("handler", *desc.Value); err != nil { diag.FromErr(err) } case "target_path": - if err := d.Set("target_path", desc.Value); err != nil { + if err := d.Set("target_path", *desc.Value); err != nil { diag.FromErr(err) } case "runtime_version": - if err := d.Set("runtime_version", desc.Value); err != nil { + if err := d.Set("runtime_version", *desc.Value); err != nil { diag.FromErr(err) } default: - log.Printf("[INFO] Unexpected function property %v returned from Snowflake with value %v", desc.Property, desc.Value) + log.Printf("[INFO] Unexpected 
function property %v returned from Snowflake with value %v", desc.Property, *desc.Value) } } @@ -702,11 +709,11 @@ func UpdateContextFunction(ctx context.Context, d *schema.ResourceData, meta int if d.HasChange("comment") { comment := d.Get("comment") if comment != "" { - if err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSetComment(comment.(string))); err != nil { + if err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSet(*sdk.NewFunctionSetRequest().WithComment(comment.(string)))); err != nil { return diag.FromErr(err) } } else { - if err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnsetComment(true)); err != nil { + if err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnset(*sdk.NewFunctionUnsetRequest().WithComment(true))); err != nil { return diag.FromErr(err) } } diff --git a/pkg/resources/function_acceptance_test.go b/pkg/resources/function_acceptance_test.go index 52d60e3717..df8bb28014 100644 --- a/pkg/resources/function_acceptance_test.go +++ b/pkg/resources/function_acceptance_test.go @@ -142,7 +142,7 @@ func TestAcc_Function_complex(t *testing.T) { resource.TestCheckResourceAttr(resourceName, "comment", "Terraform acceptance test"), resource.TestCheckResourceAttr(resourceName, "statement", statement), resource.TestCheckResourceAttr(resourceName, "arguments.#", "1"), - resource.TestCheckResourceAttr(resourceName, "arguments.0.name", "D"), + resource.TestCheckResourceAttr(resourceName, "arguments.0.name", "d"), resource.TestCheckResourceAttr(resourceName, "arguments.0.type", "FLOAT"), resource.TestCheckResourceAttr(resourceName, "return_behavior", "VOLATILE"), resource.TestCheckResourceAttr(resourceName, "return_type", "FLOAT"), diff --git a/pkg/resources/function_commons.go b/pkg/resources/function_commons.go new file mode 100644 index 0000000000..d2e703a0e6 --- /dev/null +++ b/pkg/resources/function_commons.go @@ -0,0 +1,344 @@ +package resources + +import ( + "fmt" 
+ "slices" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +func init() { + javaFunctionSchema = setUpFunctionSchema(javaFunctionSchemaDefinition) + javascriptFunctionSchema = setUpFunctionSchema(javascriptFunctionSchemaDefinition) + pythonFunctionSchema = setUpFunctionSchema(pythonFunctionSchemaDefinition) + scalaFunctionSchema = setUpFunctionSchema(scalaFunctionSchemaDefinition) + sqlFunctionSchema = setUpFunctionSchema(sqlFunctionSchemaDefinition) +} + +type functionSchemaDef struct { + additionalArguments []string + functionDefinitionDescription string + runtimeVersionRequired bool + runtimeVersionDescription string + importsDescription string + packagesDescription string + handlerDescription string + targetPathDescription string +} + +func setUpFunctionSchema(definition functionSchemaDef) map[string]*schema.Schema { + currentSchema := make(map[string]*schema.Schema) + for k, v := range functionBaseSchema { + v := v + if slices.Contains(definition.additionalArguments, k) || slices.Contains(commonFunctionArguments, k) { + currentSchema[k] = &v + } + } + if v, ok := currentSchema["function_definition"]; ok && v != nil { + v.Description = definition.functionDefinitionDescription + } + if v, ok := currentSchema["runtime_version"]; ok && v != nil { + if definition.runtimeVersionRequired { + v.Required = true + } else { + v.Optional = true + } + v.Description = definition.runtimeVersionDescription + } + if v, ok := currentSchema["imports"]; ok && v != nil { + v.Description = definition.importsDescription + } + if v, ok := currentSchema["packages"]; ok && v != nil { + v.Description = definition.packagesDescription + } + if v, ok := currentSchema["handler"]; ok && v != nil { + v.Description = definition.handlerDescription + } + if v, ok := currentSchema["target_path"]; ok && v != nil { + v.Description = 
definition.targetPathDescription + } + return currentSchema +} + +func functionDefinitionTemplate(language string, linkUrl string) string { + return fmt.Sprintf("Defines the handler code executed when the UDF is called. Wrapping `$$` signs are added by the provider automatically; do not include them. The `function_definition` value must be %[1]s source code. For more information, see [Introduction to %[1]s UDFs](%[2]s).", language, linkUrl) +} + +var ( + commonFunctionArguments = []string{ + "database", + "schema", + "name", + "is_secure", + "arguments", + "return_type", + "null_input_behavior", + "return_behavior", + "comment", + "function_definition", + "function_language", + ShowOutputAttributeName, + ParametersAttributeName, + FullyQualifiedNameAttributeName, + } + javaFunctionSchemaDefinition = functionSchemaDef{ + additionalArguments: []string{ + "runtime_version", + "imports", + "packages", + "handler", + "external_access_integrations", + "secrets", + "target_path", + }, + functionDefinitionDescription: functionDefinitionTemplate("Java", "https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-introduction"), + runtimeVersionRequired: false, + runtimeVersionDescription: "Specifies the Java JDK runtime version to use. The supported versions of Java are 11.x and 17.x. If RUNTIME_VERSION is not set, Java JDK 11 is used.", + importsDescription: "The location (stage), path, and name of the file(s) to import. A file can be a JAR file or another type of file. If the file is a JAR file, it can contain one or more .class files and zero or more resource files. JNI (Java Native Interface) is not supported. Snowflake prohibits loading libraries that contain native code (as opposed to Java bytecode). Java UDFs can also read non-JAR files. For an example, see [Reading a file specified statically in IMPORTS](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-cookbook.html#label-reading-file-from-java-udf-imports). 
Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#java).", + packagesDescription: "The name and version number of Snowflake system packages required as dependencies. The value should be of the form `package_name:version_number`, where `package_name` is `snowflake_domain:package`.", + handlerDescription: "The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`. If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class.", + targetPathDescription: "The TARGET_PATH clause specifies the location to which Snowflake should write the compiled code (JAR file) after compiling the source code specified in the `function_definition`. If this clause is included, the user should manually remove the JAR file when it is no longer needed (typically when the Java UDF is dropped). If this clause is omitted, Snowflake re-compiles the source code each time the code is needed. The JAR file is not stored permanently, and the user does not need to clean up the JAR file. 
Snowflake returns an error if the TARGET_PATH matches an existing file; you cannot use TARGET_PATH to overwrite an existing file.", + } + javascriptFunctionSchemaDefinition = functionSchemaDef{ + additionalArguments: []string{}, + functionDefinitionDescription: functionDefinitionTemplate("JavaScript", "https://docs.snowflake.com/en/developer-guide/udf/javascript/udf-javascript-introduction"), + } + pythonFunctionSchemaDefinition = functionSchemaDef{ + additionalArguments: []string{ + "is_aggregate", + "runtime_version", + "imports", + "packages", + "handler", + "external_access_integrations", + "secrets", + }, + functionDefinitionDescription: functionDefinitionTemplate("Python", "https://docs.snowflake.com/en/developer-guide/udf/python/udf-python-introduction"), + runtimeVersionRequired: true, + runtimeVersionDescription: "Specifies the Python version to use. The supported versions of Python are: 3.9, 3.10, and 3.11.", + importsDescription: "The location (stage), path, and name of the file(s) to import. A file can be a `.py` file or another type of file. Python UDFs can also read non-Python files, such as text files. For an example, see [Reading a file](https://docs.snowflake.com/en/developer-guide/udf/python/udf-python-examples.html#label-udf-python-read-files). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#python).", + packagesDescription: "The name and version number of packages required as dependencies. The value should be of the form `package_name==version_number`.", + handlerDescription: "The name of the handler function or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a function name. If the handler code is in-line with the CREATE FUNCTION statement, you can use the function name alone. When the handler code is referenced at a stage, this value should be qualified with the module name, as in the following form: `my_module.my_function`. 
If the handler is for a tabular UDF, the HANDLER value should be the name of a handler class.", + } + scalaFunctionSchemaDefinition = functionSchemaDef{ + additionalArguments: []string{ + "runtime_version", + "imports", + "packages", + "handler", + "external_access_integrations", + "secrets", + "target_path", + }, + functionDefinitionDescription: functionDefinitionTemplate("Scala", "https://docs.snowflake.com/en/developer-guide/udf/scala/udf-scala-introduction"), + runtimeVersionRequired: true, + runtimeVersionDescription: "Specifies the Scala runtime version to use. The supported versions of Scala are: 2.12.", + importsDescription: "The location (stage), path, and name of the file(s) to import, such as a JAR or other kind of file. The JAR file might contain handler dependency libraries. It can contain one or more .class files and zero or more resource files. JNI (Java Native Interface) is not supported. Snowflake prohibits loading libraries that contain native code (as opposed to Java bytecode). A non-JAR file might be a file read by handler code. For an example, see [Reading a file specified statically in IMPORTS](https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-cookbook.html#label-reading-file-from-java-udf-imports). Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#scala).", + packagesDescription: "The name and version number of Snowflake system packages required as dependencies. The value should be of the form `package_name:version_number`, where `package_name` is `snowflake_domain:package`.", + handlerDescription: "The name of the handler method or class. If the handler is for a scalar UDF, returning a non-tabular value, the HANDLER value should be a method name, as in the following form: `MyClass.myMethod`.", + targetPathDescription: "The TARGET_PATH clause specifies the location to which Snowflake should write the compiled code (JAR file) after compiling the source code specified in the `function_definition`. 
If this clause is included, you should manually remove the JAR file when it is no longer needed (typically when the UDF is dropped). If this clause is omitted, Snowflake re-compiles the source code each time the code is needed. The JAR file is not stored permanently, and you do not need to clean up the JAR file. Snowflake returns an error if the TARGET_PATH matches an existing file; you cannot use TARGET_PATH to overwrite an existing file.", + } + sqlFunctionSchemaDefinition = functionSchemaDef{ + additionalArguments: []string{}, + functionDefinitionDescription: functionDefinitionTemplate("SQL", "https://docs.snowflake.com/en/developer-guide/udf/sql/udf-sql-introduction"), + } +) + +var ( + javaFunctionSchema map[string]*schema.Schema + javascriptFunctionSchema map[string]*schema.Schema + pythonFunctionSchema map[string]*schema.Schema + scalaFunctionSchema map[string]*schema.Schema + sqlFunctionSchema map[string]*schema.Schema +) + +// TODO [SNOW-1348103]: add null/not null +// TODO [SNOW-1348103]: currently database and schema are ForceNew but based on the docs it is possible to rename with moving to different db/schema +// TODO [SNOW-1348103]: copyGrants and orReplace logic omitted for now, will be added to the limitations docs +// TODO [SNOW-1348103]: temporary is not supported because it creates a per-session object; add to limitations/design decisions +var functionBaseSchema = map[string]schema.Schema{ + "database": { + Type: schema.TypeString, + Required: true, + ForceNew: true, + DiffSuppressFunc: suppressIdentifierQuoting, + Description: blocklistedCharactersFieldDescription("The database in which to create the function."), + }, + "schema": { + Type: schema.TypeString, + Required: true, + ForceNew: true, + DiffSuppressFunc: suppressIdentifierQuoting, + Description: blocklistedCharactersFieldDescription("The schema in which to create the function."), + }, + "name": { + Type: schema.TypeString, + Required: true, + Description: 
blocklistedCharactersFieldDescription("The name of the function; the identifier does not need to be unique for the schema in which the function is created because UDFs are identified and resolved by the combination of the name and argument types. Check the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages)."), + DiffSuppressFunc: suppressIdentifierQuoting, + }, + "is_secure": { + Type: schema.TypeString, + Optional: true, + Default: BooleanDefault, + ValidateDiagFunc: validateBooleanString, + DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("is_secure"), + Description: booleanStringFieldDescription("Specifies that the function is secure. By design, the Snowflake's `SHOW FUNCTIONS` command does not provide information about secure views (consult [function docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#id1) and [Protecting Sensitive Information with Secure UDFs and Stored Procedures](https://docs.snowflake.com/en/developer-guide/secure-udf-procedure)) which is essential to manage/import function with Terraform. Use the role owning the function while managing secure functions."), + }, + "is_aggregate": { + Type: schema.TypeString, + Optional: true, + Default: BooleanDefault, + ValidateDiagFunc: validateBooleanString, + DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("is_aggregate"), + Description: booleanStringFieldDescription("Specifies that the function is an aggregate function. For more information about user-defined aggregate functions, see [Python user-defined aggregate functions](https://docs.snowflake.com/en/developer-guide/udf/python/udf-python-aggregate-functions)."), + }, + "arguments": { + Type: schema.TypeList, + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "arg_name": { + Type: schema.TypeString, + Required: true, + // TODO [SNOW-1348103]: adjust diff suppression accordingly. 
+ Description: "The argument name.", + }, + // TODO [SNOW-1348103]: after testing weird names add limitations to the docs and add validation here + "arg_data_type": { + Type: schema.TypeString, + Required: true, + ValidateDiagFunc: IsDataTypeValid, + DiffSuppressFunc: DiffSuppressDataTypes, + Description: "The argument type.", + }, + }, + }, + Optional: true, + ForceNew: true, + Description: "List of the arguments for the function. Consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages) for more details.", + }, + // TODO [SNOW-1348103]: for now, the proposal is to leave return type as string, add TABLE to data types, and here always parse (easier handling and diff suppression) + "return_type": { + Type: schema.TypeString, + Required: true, + ForceNew: true, + ValidateDiagFunc: IsDataTypeValid, + DiffSuppressFunc: DiffSuppressDataTypes, + Description: "Specifies the results returned by the UDF, which determines the UDF type. Use `` to create a scalar UDF that returns a single value with the specified data type. Use `TABLE (col_name col_data_type, ...)` to create a table UDF that returns tabular results with the specified table column(s) and column type(s). For details, consult the [docs](https://docs.snowflake.com/en/sql-reference/sql/create-function#all-languages).", + // TODO [SNOW-1348103]: adjust DiffSuppressFunc + }, + "null_input_behavior": { + Type: schema.TypeString, + Optional: true, + ForceNew: true, + ValidateDiagFunc: sdkValidation(sdk.ToNullInputBehavior), + DiffSuppressFunc: SuppressIfAny(NormalizeAndCompare(sdk.ToNullInputBehavior), IgnoreChangeToCurrentSnowflakeValueInShow("null_input_behavior")), + Description: fmt.Sprintf("Specifies the behavior of the function when called with null inputs. 
Valid values are (case-insensitive): %s.", possibleValuesListed(sdk.AllAllowedNullInputBehaviors)), + }, + "return_behavior": { + Type: schema.TypeString, + Optional: true, + ForceNew: true, + ValidateDiagFunc: sdkValidation(sdk.ToReturnResultsBehavior), + DiffSuppressFunc: SuppressIfAny(NormalizeAndCompare(sdk.ToReturnResultsBehavior), IgnoreChangeToCurrentSnowflakeValueInShow("return_behavior")), + Description: fmt.Sprintf("Specifies the behavior of the function when returning results. Valid values are (case-insensitive): %s.", possibleValuesListed(sdk.AllAllowedReturnResultsBehaviors)), + }, + "runtime_version": { + Type: schema.TypeString, + Optional: true, + ForceNew: true, + // TODO [SNOW-1348103]: may be optional for java without consequences; if it is not set, DESCRIBE does not return any version. + }, + "comment": { + Type: schema.TypeString, + Optional: true, + // TODO [SNOW-1348103]: handle dynamic comment - this is a workaround for now + Default: "user-defined function", + Description: "Specifies a comment for the function.", + }, + // TODO [SNOW-1348103]: because of https://docs.snowflake.com/en/sql-reference/sql/create-function#id6, maybe it will be better to split into stage_name + target_path + "imports": { + Type: schema.TypeSet, + Elem: &schema.Schema{Type: schema.TypeString}, + Optional: true, + ForceNew: true, + }, + // TODO [SNOW-1348103]: what do we do with the version "latest".
+ "packages": { + Type: schema.TypeSet, + Elem: &schema.Schema{Type: schema.TypeString}, + Optional: true, + ForceNew: true, + }, + "handler": { + Type: schema.TypeString, + Required: true, + ForceNew: true, + }, + // TODO [SNOW-1348103]: use suppress from network policies when adding logic + "external_access_integrations": { + Type: schema.TypeSet, + Elem: &schema.Schema{ + Type: schema.TypeString, + ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), + }, + Optional: true, + ForceNew: true, + Description: "The names of [external access integrations](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) needed in order for this function’s handler code to access external networks. An external access integration specifies [network rules](https://docs.snowflake.com/en/sql-reference/sql/create-network-rule) and [secrets](https://docs.snowflake.com/en/sql-reference/sql/create-secret) that specify external locations and credentials (if any) allowed for use by handler code when making requests of an external network, such as an external REST API.", + }, + "secrets": { + Type: schema.TypeSet, + Optional: true, + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "secret_variable_name": { + Type: schema.TypeString, + Required: true, + Description: "The variable that will be used in handler code when retrieving information from the secret.", + }, + "secret_id": { + Type: schema.TypeString, + Required: true, + Description: "Fully qualified name of the allowed secret. You will receive an error if you specify a SECRETS value whose secret isn’t also included in an integration specified by the EXTERNAL_ACCESS_INTEGRATIONS parameter.", + DiffSuppressFunc: suppressIdentifierQuoting, + }, + }, + }, + Description: "Assigns the names of secrets to variables so that you can use the variables to reference the secrets when retrieving information from secrets in handler code. 
Secrets you specify here must be allowed by the [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) specified as a value of this CREATE FUNCTION command’s EXTERNAL_ACCESS_INTEGRATIONS parameter.", + }, + // TODO [SNOW-1348103]: because of https://docs.snowflake.com/en/sql-reference/sql/create-function#id6, maybe it will be better to split into stage + path + "target_path": { + Type: schema.TypeString, + Optional: true, + ForceNew: true, + }, + "function_definition": { + Type: schema.TypeString, + Required: true, + ForceNew: true, + DiffSuppressFunc: DiffSuppressStatement, + }, + "function_language": { + Type: schema.TypeString, + Computed: true, + Description: "Specifies the language of the function. Used to detect external changes.", + }, + ShowOutputAttributeName: { + Type: schema.TypeList, + Computed: true, + Description: "Outputs the result of `SHOW FUNCTIONS` for the given function.", + Elem: &schema.Resource{ + Schema: schemas.ShowFunctionSchema, + }, + }, + ParametersAttributeName: { + Type: schema.TypeList, + Computed: true, + Description: "Outputs the result of `SHOW PARAMETERS IN FUNCTION` for the given function.", + Elem: &schema.Resource{ + Schema: functionParametersSchema, + }, + }, + FullyQualifiedNameAttributeName: *schemas.FullyQualifiedNameSchema, +} diff --git a/pkg/resources/function_java.go b/pkg/resources/function_java.go new file mode 100644 index 0000000000..5e05d3007f --- /dev/null +++ b/pkg/resources/function_java.go @@ -0,0 +1,52 @@ +package resources + +import ( + "context" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" +
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +func FunctionJava() *schema.Resource { + return &schema.Resource{ + CreateContext: TrackingCreateWrapper(resources.FunctionJava, CreateContextFunctionJava), + ReadContext: TrackingReadWrapper(resources.FunctionJava, ReadContextFunctionJava), + UpdateContext: TrackingUpdateWrapper(resources.FunctionJava, UpdateContextFunctionJava), + DeleteContext: TrackingDeleteWrapper(resources.FunctionJava, DeleteContextFunctionJava), + Description: "Resource used to manage java function objects. For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function).", + + CustomizeDiff: TrackingCustomDiffWrapper(resources.FunctionJava, customdiff.All( + // TODO[SNOW-1348103]: ComputedIfAnyAttributeChanged(javaFunctionSchema, ShowOutputAttributeName, ...), + ComputedIfAnyAttributeChanged(javaFunctionSchema, FullyQualifiedNameAttributeName, "name"), + ComputedIfAnyAttributeChanged(functionParametersSchema, ParametersAttributeName, collections.Map(sdk.AsStringList(sdk.AllFunctionParameters), strings.ToLower)...), + functionParametersCustomDiff, + // TODO[SNOW-1348103]: recreate when type changed externally + )), + + Schema: collections.MergeMaps(javaFunctionSchema, functionParametersSchema), + Importer: &schema.ResourceImporter{ + StateContext: schema.ImportStatePassthroughContext, + }, + } +} + +func CreateContextFunctionJava(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func ReadContextFunctionJava(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func UpdateContextFunctionJava(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func DeleteContextFunctionJava(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} diff --git a/pkg/resources/function_javascript.go b/pkg/resources/function_javascript.go new 
file mode 100644 index 0000000000..f1b3e17e2a --- /dev/null +++ b/pkg/resources/function_javascript.go @@ -0,0 +1,52 @@ +package resources + +import ( + "context" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +func FunctionJavascript() *schema.Resource { + return &schema.Resource{ + CreateContext: TrackingCreateWrapper(resources.FunctionJavascript, CreateContextFunctionJavascript), + ReadContext: TrackingReadWrapper(resources.FunctionJavascript, ReadContextFunctionJavascript), + UpdateContext: TrackingUpdateWrapper(resources.FunctionJavascript, UpdateContextFunctionJavascript), + DeleteContext: TrackingDeleteWrapper(resources.FunctionJavascript, DeleteContextFunctionJavascript), + Description: "Resource used to manage javascript function objects. 
For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function).", + + CustomizeDiff: TrackingCustomDiffWrapper(resources.FunctionJavascript, customdiff.All( + // TODO[SNOW-1348103]: ComputedIfAnyAttributeChanged(javascriptFunctionSchema, ShowOutputAttributeName, ...), + ComputedIfAnyAttributeChanged(javascriptFunctionSchema, FullyQualifiedNameAttributeName, "name"), + ComputedIfAnyAttributeChanged(functionParametersSchema, ParametersAttributeName, collections.Map(sdk.AsStringList(sdk.AllFunctionParameters), strings.ToLower)...), + functionParametersCustomDiff, + // TODO[SNOW-1348103]: recreate when type changed externally + )), + + Schema: collections.MergeMaps(javascriptFunctionSchema, functionParametersSchema), + Importer: &schema.ResourceImporter{ + StateContext: schema.ImportStatePassthroughContext, + }, + } +} + +func CreateContextFunctionJavascript(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func ReadContextFunctionJavascript(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func UpdateContextFunctionJavascript(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func DeleteContextFunctionJavascript(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} diff --git a/pkg/resources/function_parameters.go b/pkg/resources/function_parameters.go new file mode 100644 index 0000000000..bccbe0666a --- /dev/null +++ b/pkg/resources/function_parameters.go @@ -0,0 +1,99 @@ +package resources + +import ( + "context" + "strconv" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +var ( + functionParametersSchema = 
make(map[string]*schema.Schema) + functionParametersCustomDiff = ParametersCustomDiff( + functionParametersProvider, + parameter[sdk.FunctionParameter]{sdk.FunctionParameterEnableConsoleOutput, valueTypeBool, sdk.ParameterTypeFunction}, + parameter[sdk.FunctionParameter]{sdk.FunctionParameterLogLevel, valueTypeString, sdk.ParameterTypeFunction}, + parameter[sdk.FunctionParameter]{sdk.FunctionParameterMetricLevel, valueTypeString, sdk.ParameterTypeFunction}, + parameter[sdk.FunctionParameter]{sdk.FunctionParameterTraceLevel, valueTypeString, sdk.ParameterTypeFunction}, + ) +) + +func init() { + functionParameterFields := []parameterDef[sdk.FunctionParameter]{ + // session params + {Name: sdk.FunctionParameterEnableConsoleOutput, Type: schema.TypeBool, Description: "Enable stdout/stderr fast path logging for anonymous stored procedures. This is a public parameter (similar to LOG_LEVEL)."}, + {Name: sdk.FunctionParameterLogLevel, Type: schema.TypeString, Description: "LOG_LEVEL to use when filtering events"}, + {Name: sdk.FunctionParameterMetricLevel, Type: schema.TypeString, ValidateDiag: sdkValidation(sdk.ToMetricLevel), DiffSuppress: NormalizeAndCompare(sdk.ToMetricLevel), Description: "METRIC_LEVEL value to control whether to emit metrics to Event Table"}, + {Name: sdk.FunctionParameterTraceLevel, Type: schema.TypeString, ValidateDiag: sdkValidation(sdk.ToTraceLevel), DiffSuppress: NormalizeAndCompare(sdk.ToTraceLevel), Description: "Trace level value to use when generating/filtering trace events"}, + } + + for _, field := range functionParameterFields { + fieldName := strings.ToLower(string(field.Name)) + + functionParametersSchema[fieldName] = &schema.Schema{ + Type: field.Type, + Description: enrichWithReferenceToParameterDocs(field.Name, field.Description), + Computed: true, + Optional: true, + ValidateDiagFunc: field.ValidateDiag, + DiffSuppressFunc: field.DiffSuppress, + ConflictsWith: field.ConflictsWith, + } + } +} + +func functionParametersProvider(ctx
context.Context, d ResourceIdProvider, meta any) ([]*sdk.Parameter, error) { + return parametersProvider(ctx, d, meta.(*provider.Context), functionParametersProviderFunc, sdk.ParseSchemaObjectIdentifierWithArguments) +} + +func functionParametersProviderFunc(c *sdk.Client) showParametersFunc[sdk.SchemaObjectIdentifierWithArguments] { + return c.Functions.ShowParameters +} + +func handleFunctionParameterRead(d *schema.ResourceData, functionParameters []*sdk.Parameter) error { + for _, p := range functionParameters { + switch p.Key { + case + string(sdk.FunctionParameterLogLevel), + string(sdk.FunctionParameterMetricLevel), + string(sdk.FunctionParameterTraceLevel): + if err := d.Set(strings.ToLower(p.Key), p.Value); err != nil { + return err + } + case + string(sdk.FunctionParameterEnableConsoleOutput): + value, err := strconv.ParseBool(p.Value) + if err != nil { + return err + } + if err := d.Set(strings.ToLower(p.Key), value); err != nil { + return err + } + } + } + + return nil +} + +// These parameters do not work in CREATE, so they are set in ALTER instead. +func handleFunctionParametersCreate(d *schema.ResourceData, alterOpts *sdk.FunctionSet) diag.Diagnostics { + return JoinDiags( + handleParameterCreate(d, sdk.FunctionParameterEnableConsoleOutput, &alterOpts.EnableConsoleOutput), + handleParameterCreateWithMapping(d, sdk.FunctionParameterLogLevel, &alterOpts.LogLevel, stringToStringEnumProvider(sdk.ToLogLevel)), + handleParameterCreateWithMapping(d, sdk.FunctionParameterMetricLevel, &alterOpts.MetricLevel, stringToStringEnumProvider(sdk.ToMetricLevel)), + handleParameterCreateWithMapping(d, sdk.FunctionParameterTraceLevel, &alterOpts.TraceLevel, stringToStringEnumProvider(sdk.ToTraceLevel)), + ) +} + +func handleFunctionParametersUpdate(d *schema.ResourceData, set *sdk.FunctionSet, unset *sdk.FunctionUnset) diag.Diagnostics { + return JoinDiags( + handleParameterUpdate(d, sdk.FunctionParameterEnableConsoleOutput, &set.EnableConsoleOutput, &unset.EnableConsoleOutput), +
handleParameterUpdateWithMapping(d, sdk.FunctionParameterLogLevel, &set.LogLevel, &unset.LogLevel, stringToStringEnumProvider(sdk.ToLogLevel)), + handleParameterUpdateWithMapping(d, sdk.FunctionParameterMetricLevel, &set.MetricLevel, &unset.MetricLevel, stringToStringEnumProvider(sdk.ToMetricLevel)), + handleParameterUpdateWithMapping(d, sdk.FunctionParameterTraceLevel, &set.TraceLevel, &unset.TraceLevel, stringToStringEnumProvider(sdk.ToTraceLevel)), + ) +} diff --git a/pkg/resources/function_python.go b/pkg/resources/function_python.go new file mode 100644 index 0000000000..e270f80ef6 --- /dev/null +++ b/pkg/resources/function_python.go @@ -0,0 +1,52 @@ +package resources + +import ( + "context" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +func FunctionPython() *schema.Resource { + return &schema.Resource{ + CreateContext: TrackingCreateWrapper(resources.FunctionPython, CreateContextFunctionPython), + ReadContext: TrackingReadWrapper(resources.FunctionPython, ReadContextFunctionPython), + UpdateContext: TrackingUpdateWrapper(resources.FunctionPython, UpdateContextFunctionPython), + DeleteContext: TrackingDeleteWrapper(resources.FunctionPython, DeleteContextFunctionPython), + Description: "Resource used to manage python function objects. 
For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function).", + + CustomizeDiff: TrackingCustomDiffWrapper(resources.FunctionPython, customdiff.All( + // TODO[SNOW-1348103]: ComputedIfAnyAttributeChanged(pythonFunctionSchema, ShowOutputAttributeName, ...), + ComputedIfAnyAttributeChanged(pythonFunctionSchema, FullyQualifiedNameAttributeName, "name"), + ComputedIfAnyAttributeChanged(functionParametersSchema, ParametersAttributeName, collections.Map(sdk.AsStringList(sdk.AllFunctionParameters), strings.ToLower)...), + functionParametersCustomDiff, + // TODO[SNOW-1348103]: recreate when type changed externally + )), + + Schema: collections.MergeMaps(pythonFunctionSchema, functionParametersSchema), + Importer: &schema.ResourceImporter{ + StateContext: schema.ImportStatePassthroughContext, + }, + } +} + +func CreateContextFunctionPython(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func ReadContextFunctionPython(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func UpdateContextFunctionPython(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func DeleteContextFunctionPython(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} diff --git a/pkg/resources/function_scala.go b/pkg/resources/function_scala.go new file mode 100644 index 0000000000..2c3adf0bc3 --- /dev/null +++ b/pkg/resources/function_scala.go @@ -0,0 +1,52 @@ +package resources + +import ( + "context" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" + 
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +func FunctionScala() *schema.Resource { + return &schema.Resource{ + CreateContext: TrackingCreateWrapper(resources.FunctionScala, CreateContextFunctionScala), + ReadContext: TrackingReadWrapper(resources.FunctionScala, ReadContextFunctionScala), + UpdateContext: TrackingUpdateWrapper(resources.FunctionScala, UpdateContextFunctionScala), + DeleteContext: TrackingDeleteWrapper(resources.FunctionScala, DeleteContextFunctionScala), + Description: "Resource used to manage scala function objects. For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function).", + + CustomizeDiff: TrackingCustomDiffWrapper(resources.FunctionScala, customdiff.All( + // TODO[SNOW-1348103]: ComputedIfAnyAttributeChanged(scalaFunctionSchema, ShowOutputAttributeName, ...), + ComputedIfAnyAttributeChanged(scalaFunctionSchema, FullyQualifiedNameAttributeName, "name"), + ComputedIfAnyAttributeChanged(functionParametersSchema, ParametersAttributeName, collections.Map(sdk.AsStringList(sdk.AllFunctionParameters), strings.ToLower)...), + functionParametersCustomDiff, + // TODO[SNOW-1348103]: recreate when type changed externally + )), + + Schema: collections.MergeMaps(scalaFunctionSchema, functionParametersSchema), + Importer: &schema.ResourceImporter{ + StateContext: schema.ImportStatePassthroughContext, + }, + } +} + +func CreateContextFunctionScala(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func ReadContextFunctionScala(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func UpdateContextFunctionScala(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func DeleteContextFunctionScala(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} diff --git a/pkg/resources/function_sql.go b/pkg/resources/function_sql.go 
new file mode 100644 index 0000000000..48ea385f71 --- /dev/null +++ b/pkg/resources/function_sql.go @@ -0,0 +1,52 @@ +package resources + +import ( + "context" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +func FunctionSql() *schema.Resource { + return &schema.Resource{ + CreateContext: TrackingCreateWrapper(resources.FunctionSql, CreateContextFunctionSql), + ReadContext: TrackingReadWrapper(resources.FunctionSql, ReadContextFunctionSql), + UpdateContext: TrackingUpdateWrapper(resources.FunctionSql, UpdateContextFunctionSql), + DeleteContext: TrackingDeleteWrapper(resources.FunctionSql, DeleteContextFunctionSql), + Description: "Resource used to manage sql function objects. 
For more information, check [function documentation](https://docs.snowflake.com/en/sql-reference/sql/create-function).", + + CustomizeDiff: TrackingCustomDiffWrapper(resources.FunctionSql, customdiff.All( + // TODO[SNOW-1348103]: ComputedIfAnyAttributeChanged(sqlFunctionSchema, ShowOutputAttributeName, ...), + ComputedIfAnyAttributeChanged(sqlFunctionSchema, FullyQualifiedNameAttributeName, "name"), + ComputedIfAnyAttributeChanged(functionParametersSchema, ParametersAttributeName, collections.Map(sdk.AsStringList(sdk.AllFunctionParameters), strings.ToLower)...), + functionParametersCustomDiff, + // TODO[SNOW-1348103]: recreate when type changed externally + )), + + Schema: collections.MergeMaps(sqlFunctionSchema, functionParametersSchema), + Importer: &schema.ResourceImporter{ + StateContext: schema.ImportStatePassthroughContext, + }, + } +} + +func CreateContextFunctionSql(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func ReadContextFunctionSql(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func UpdateContextFunctionSql(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} + +func DeleteContextFunctionSql(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + return nil +} diff --git a/pkg/resources/grant_account_role.go b/pkg/resources/grant_account_role.go index ce02dc6f9c..1fd010f3f7 100644 --- a/pkg/resources/grant_account_role.go +++ b/pkg/resources/grant_account_role.go @@ -21,14 +21,14 @@ var grantAccountRoleSchema = map[string]*schema.Schema{ "role_name": { Type: schema.TypeString, Required: true, - Description: "The fully qualified name of the role which will be granted to the user or parent role.", + Description: relatedResourceDescription("The fully qualified name of the role which will be granted to the user or parent role.", resources.AccountRole), ForceNew: true, ValidateDiagFunc: 
IsValidIdentifier[sdk.AccountObjectIdentifier](), }, "user_name": { Type: schema.TypeString, Optional: true, - Description: "The fully qualified name of the user on which specified role will be granted.", + Description: relatedResourceDescription("The fully qualified name of the user on which specified role will be granted.", resources.User), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), ExactlyOneOf: []string{ @@ -39,7 +39,7 @@ var grantAccountRoleSchema = map[string]*schema.Schema{ "parent_role_name": { Type: schema.TypeString, Optional: true, - Description: "The fully qualified name of the parent role which will create a parent-child relationship between the roles.", + Description: relatedResourceDescription("The fully qualified name of the parent role which will create a parent-child relationship between the roles.", resources.AccountRole), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), ExactlyOneOf: []string{ diff --git a/pkg/resources/grant_application_role.go b/pkg/resources/grant_application_role.go index d1f12ebf54..952649e9d6 100644 --- a/pkg/resources/grant_application_role.go +++ b/pkg/resources/grant_application_role.go @@ -28,7 +28,7 @@ var grantApplicationRoleSchema = map[string]*schema.Schema{ "parent_account_role_name": { Type: schema.TypeString, Optional: true, - Description: "The fully qualified name of the account role on which application role will be granted.", + Description: relatedResourceDescription("The fully qualified name of the account role on which application role will be granted.", resources.AccountRole), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, diff --git a/pkg/resources/grant_database_role.go b/pkg/resources/grant_database_role.go index e67946fc5d..226cba801b 100644 --- a/pkg/resources/grant_database_role.go +++ b/pkg/resources/grant_database_role.go @@ -20,7 +20,7 @@ var 
grantDatabaseRoleSchema = map[string]*schema.Schema{ "database_role_name": { Type: schema.TypeString, Required: true, - Description: "The fully qualified name of the database role which will be granted to share or parent role.", + Description: relatedResourceDescription("The fully qualified name of the database role which will be granted to share or parent role.", resources.DatabaseRole), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.DatabaseObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, @@ -28,7 +28,7 @@ var grantDatabaseRoleSchema = map[string]*schema.Schema{ "parent_role_name": { Type: schema.TypeString, Optional: true, - Description: "The fully qualified name of the parent account role which will create a parent-child relationship between the roles.", + Description: relatedResourceDescription("The fully qualified name of the parent account role which will create a parent-child relationship between the roles.", resources.AccountRole), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, @@ -41,7 +41,7 @@ var grantDatabaseRoleSchema = map[string]*schema.Schema{ "parent_database_role_name": { Type: schema.TypeString, Optional: true, - Description: "The fully qualified name of the parent database role which will create a parent-child relationship between the roles.", + Description: relatedResourceDescription("The fully qualified name of the parent database role which will create a parent-child relationship between the roles.", resources.DatabaseRole), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.DatabaseObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, @@ -54,7 +54,7 @@ var grantDatabaseRoleSchema = map[string]*schema.Schema{ "share_name": { Type: schema.TypeString, Optional: true, - Description: "The fully qualified name of the share on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified 
name of the share on which privileges will be granted.", resources.Share), ForceNew: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, diff --git a/pkg/resources/grant_ownership.go b/pkg/resources/grant_ownership.go index 24077bcd48..887d991613 100644 --- a/pkg/resources/grant_ownership.go +++ b/pkg/resources/grant_ownership.go @@ -21,7 +21,7 @@ var grantOwnershipSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the account role to which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the account role to which privileges will be granted.", resources.AccountRole), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: []string{ @@ -33,7 +33,7 @@ var grantOwnershipSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the database role to which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the database role to which privileges will be granted.", resources.DatabaseRole), ValidateDiagFunc: IsValidIdentifier[sdk.DatabaseObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: []string{ @@ -132,7 +132,7 @@ func grantOwnershipBulkOperationSchema(branchName string) map[string]*schema.Sch Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the database.", + Description: relatedResourceDescription("The fully qualified name of the database.", resources.Database), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: []string{ @@ -144,7 +144,7 @@ func grantOwnershipBulkOperationSchema(branchName string) 
map[string]*schema.Sch Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the schema.", + Description: relatedResourceDescription("The fully qualified name of the schema.", resources.Schema), ValidateDiagFunc: IsValidIdentifier[sdk.DatabaseObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: []string{ diff --git a/pkg/resources/grant_privileges_to_account_role.go b/pkg/resources/grant_privileges_to_account_role.go index ccce92e80b..33a3860cc8 100644 --- a/pkg/resources/grant_privileges_to_account_role.go +++ b/pkg/resources/grant_privileges_to_account_role.go @@ -25,7 +25,7 @@ var grantPrivilegesToAccountRoleSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "The fully qualified name of the account role to which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the account role to which privileges will be granted.", resources.AccountRole), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -36,7 +36,7 @@ var grantPrivilegesToAccountRoleSchema = map[string]*schema.Schema{ "privileges": { Type: schema.TypeSet, Optional: true, - Description: "The privileges to grant on the account role.", + Description: "The privileges to grant on the account role. 
This field is case-sensitive; use only upper-case privileges.", ExactlyOneOf: []string{ "privileges", "all_privileges", diff --git a/pkg/resources/grant_privileges_to_database_role.go b/pkg/resources/grant_privileges_to_database_role.go index 04e4375c91..ec6ca642e1 100644 --- a/pkg/resources/grant_privileges_to_database_role.go +++ b/pkg/resources/grant_privileges_to_database_role.go @@ -25,7 +25,7 @@ var grantPrivilegesToDatabaseRoleSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "The fully qualified name of the database role to which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the database role to which privileges will be granted.", resources.DatabaseRole), ValidateDiagFunc: IsValidIdentifier[sdk.DatabaseObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -75,7 +75,7 @@ var grantPrivilegesToDatabaseRoleSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the database on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the database on which privileges will be granted.", resources.Database), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: []string{ diff --git a/pkg/resources/grant_privileges_to_share.go b/pkg/resources/grant_privileges_to_share.go index 30d4cc2e71..c83ef72137 100644 --- a/pkg/resources/grant_privileges_to_share.go +++ b/pkg/resources/grant_privileges_to_share.go @@ -31,7 +31,7 @@ var grantPrivilegesToShareSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "The fully qualified name of the share on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the share on which privileges will be 
granted.", resources.Share), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -45,7 +45,7 @@ var grantPrivilegesToShareSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the database on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the database on which privileges will be granted.", resources.Database), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: grantPrivilegesToShareGrantExactlyOneOfValidation, @@ -54,7 +54,7 @@ var grantPrivilegesToShareSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the schema on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the schema on which privileges will be granted.", resources.Schema), ValidateDiagFunc: IsValidIdentifier[sdk.DatabaseObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: grantPrivilegesToShareGrantExactlyOneOfValidation, @@ -63,7 +63,7 @@ var grantPrivilegesToShareSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the table on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the table on which privileges will be granted.", resources.Table), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: grantPrivilegesToShareGrantExactlyOneOfValidation, @@ -81,7 +81,7 @@ var grantPrivilegesToShareSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the tag on which 
privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the tag on which privileges will be granted.", resources.Tag), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: grantPrivilegesToShareGrantExactlyOneOfValidation, @@ -90,7 +90,7 @@ var grantPrivilegesToShareSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, ForceNew: true, - Description: "The fully qualified name of the view on which privileges will be granted.", + Description: relatedResourceDescription("The fully qualified name of the view on which privileges will be granted.", resources.View), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, ExactlyOneOf: grantPrivilegesToShareGrantExactlyOneOfValidation, diff --git a/pkg/resources/network_policy.go b/pkg/resources/network_policy.go index 26ab6a1b31..318df5001b 100644 --- a/pkg/resources/network_policy.go +++ b/pkg/resources/network_policy.go @@ -34,7 +34,7 @@ var networkPolicySchema = map[string]*schema.Schema{ }, DiffSuppressFunc: NormalizeAndCompareIdentifiersInSet("allowed_network_rule_list"), Optional: true, - Description: "Specifies a list of fully qualified network rules that contain the network identifiers that are allowed access to Snowflake.", + Description: relatedResourceDescription("Specifies a list of fully qualified network rules that contain the network identifiers that are allowed access to Snowflake.", resources.NetworkRule), }, "blocked_network_rule_list": { Type: schema.TypeSet, @@ -44,7 +44,7 @@ var networkPolicySchema = map[string]*schema.Schema{ }, DiffSuppressFunc: NormalizeAndCompareIdentifiersInSet("blocked_network_rule_list"), Optional: true, - Description: "Specifies a list of fully qualified network rules that contain the network identifiers that are denied access to Snowflake.", + Description: 
relatedResourceDescription("Specifies a list of fully qualified network rules that contain the network identifiers that are denied access to Snowflake.", resources.NetworkRule), }, "allowed_ip_list": { Type: schema.TypeSet, diff --git a/pkg/resources/network_rule.go b/pkg/resources/network_rule.go index cc965da040..5ba1a49c55 100644 --- a/pkg/resources/network_rule.go +++ b/pkg/resources/network_rule.go @@ -223,6 +223,7 @@ func DeleteContextNetworkRule(ctx context.Context, d *schema.ResourceData, meta client := meta.(*provider.Context).Client id := helpers.DecodeSnowflakeID(name).(sdk.SchemaObjectIdentifier) + // TODO(SNOW-1818849): unassign network rules before dropping if err := client.NetworkRules.Drop(ctx, sdk.NewDropNetworkRuleRequest(id).WithIfExists(sdk.Bool(true))); err != nil { diag.FromErr(err) } diff --git a/pkg/resources/oauth_integration_for_custom_clients.go b/pkg/resources/oauth_integration_for_custom_clients.go index 8f737d8dc6..0d3b6e6040 100644 --- a/pkg/resources/oauth_integration_for_custom_clients.go +++ b/pkg/resources/oauth_integration_for_custom_clients.go @@ -81,7 +81,7 @@ var oauthIntegrationForCustomClientsSchema = map[string]*schema.Schema{ ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), }, Optional: true, - Description: "A set of Snowflake roles that a user does not need to explicitly consent to using after authenticating.", + Description: relatedResourceDescription("A set of Snowflake roles that a user does not need to explicitly consent to using after authenticating.", resources.AccountRole), }, "blocked_roles_list": { Type: schema.TypeSet, @@ -91,7 +91,7 @@ var oauthIntegrationForCustomClientsSchema = map[string]*schema.Schema{ }, // TODO(SNOW-1517937): Check if can make optional Required: true, - Description: "A set of Snowflake roles that a user cannot explicitly consent to using after authenticating.", + Description: relatedResourceDescription("A set of Snowflake roles that a user cannot explicitly consent to 
using after authenticating.", resources.AccountRole), }, "oauth_issue_refresh_tokens": { Type: schema.TypeString, @@ -111,7 +111,7 @@ var oauthIntegrationForCustomClientsSchema = map[string]*schema.Schema{ "network_policy": { Type: schema.TypeString, Optional: true, - Description: "Specifies an existing network policy. This network policy controls network traffic that is attempting to exchange an authorization code for an access or refresh token or to use a refresh token to obtain a new access token.", + Description: relatedResourceDescription("Specifies an existing network policy. This network policy controls network traffic that is attempting to exchange an authorization code for an access or refresh token or to use a refresh token to obtain a new access token.", resources.NetworkPolicy), ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -119,13 +119,13 @@ var oauthIntegrationForCustomClientsSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, DiffSuppressFunc: ignoreTrimSpaceSuppressFunc, - Description: "Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource using `terraform taint`.", + Description: externalChangesNotDetectedFieldDescription("Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers."), }, "oauth_client_rsa_public_key_2": { Type: schema.TypeString, Optional: true, DiffSuppressFunc: ignoreTrimSpaceSuppressFunc, - Description: "Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers. External changes for this field won't be detected. 
In case you want to apply external changes, you can re-create the resource using `terraform taint`.", + Description: externalChangesNotDetectedFieldDescription("Specifies a Base64-encoded RSA public key, without the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- headers."), }, "comment": { Type: schema.TypeString, diff --git a/pkg/resources/oauth_integration_for_partner_applications.go b/pkg/resources/oauth_integration_for_partner_applications.go index 3fc0d5a586..7781c1973f 100644 --- a/pkg/resources/oauth_integration_for_partner_applications.go +++ b/pkg/resources/oauth_integration_for_partner_applications.go @@ -83,7 +83,7 @@ var oauthIntegrationForPartnerApplicationsSchema = map[string]*schema.Schema{ }, // TODO(SNOW-1517937): Check if can make optional Required: true, - Description: "A set of Snowflake roles that a user cannot explicitly consent to using after authenticating.", + Description: relatedResourceDescription("A set of Snowflake roles that a user cannot explicitly consent to using after authenticating.", resources.AccountRole), DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeListValueInDescribe("blocked_roles_list"), }, "comment": { diff --git a/pkg/resources/primary_connection.go b/pkg/resources/primary_connection.go index 3dc4f8444b..3c4487d855 100644 --- a/pkg/resources/primary_connection.go +++ b/pkg/resources/primary_connection.go @@ -32,7 +32,7 @@ var primaryConnectionSchema = map[string]*schema.Schema{ "enable_failover_to_accounts": { Type: schema.TypeList, Optional: true, - Description: "Enables failover for given connection to provided accounts. Specifies a list of accounts in your organization where a secondary connection for this primary connection can be promoted to serve as the primary connection. Include your organization name for each account in the list.", + Description: relatedResourceDescription("Enables failover for given connection to provided accounts. 
Specifies a list of accounts in your organization where a secondary connection for this primary connection can be promoted to serve as the primary connection. Include your organization name for each account in the list.", resources.Account), Elem: &schema.Schema{ Type: schema.TypeString, DiffSuppressFunc: suppressIdentifierQuoting, diff --git a/pkg/resources/procedure.go b/pkg/resources/procedure.go index f7577833f9..8665f71d09 100644 --- a/pkg/resources/procedure.go +++ b/pkg/resources/procedure.go @@ -267,7 +267,7 @@ func createJavaProcedure(ctx context.Context, d *schema.ResourceData, meta inter } handler := d.Get("handler").(string) req := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, runtimeVersion, packages, handler) - req.WithProcedureDefinition(procedureDefinition) + req.WithProcedureDefinitionWrapped(procedureDefinition) if len(args) > 0 { req.WithArguments(args) } @@ -322,7 +322,7 @@ func createJavaScriptProcedure(ctx context.Context, d *schema.ResourceData, meta return diags } procedureDefinition := d.Get("statement").(string) - req := sdk.NewCreateForJavaScriptProcedureRequest(id.SchemaObjectId(), nil, procedureDefinition).WithResultDataTypeOld(sdk.LegacyDataTypeFrom(returnDataType)) + req := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), nil, procedureDefinition).WithResultDataTypeOld(sdk.LegacyDataTypeFrom(returnDataType)) if len(args) > 0 { req.WithArguments(args) } @@ -379,7 +379,7 @@ func createScalaProcedure(ctx context.Context, d *schema.ResourceData, meta inte } handler := d.Get("handler").(string) req := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, runtimeVersion, packages, handler) - req.WithProcedureDefinition(procedureDefinition) + req.WithProcedureDefinitionWrapped(procedureDefinition) if len(args) > 0 { req.WithArguments(args) } @@ -433,7 +433,7 @@ func createSQLProcedure(ctx context.Context, d *schema.ResourceData, meta interf return diags } 
procedureDefinition := d.Get("statement").(string) - req := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, procedureDefinition) + req := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id.SchemaObjectId(), *returns, procedureDefinition) if len(args) > 0 { req.WithArguments(args) } @@ -490,7 +490,7 @@ func createPythonProcedure(ctx context.Context, d *schema.ResourceData, meta int } handler := d.Get("handler").(string) req := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, runtimeVersion, packages, handler) - req.WithProcedureDefinition(procedureDefinition) + req.WithProcedureDefinitionWrapped(procedureDefinition) if len(args) > 0 { req.WithArguments(args) } @@ -570,10 +570,13 @@ func ReadContextProcedure(ctx context.Context, d *schema.ResourceData, meta inte } } for _, desc := range procedureDetails { + if desc.Value == nil { + continue + } switch desc.Property { case "signature": // Format in Snowflake DB is: (argName argType, argName argType, ...) 
- args := strings.ReplaceAll(strings.ReplaceAll(desc.Value, "(", ""), ")", "") + args := strings.ReplaceAll(strings.ReplaceAll(*desc.Value, "(", ""), ")", "") if args != "" { // Do nothing for functions without arguments argPairs := strings.Split(args, ", ") @@ -593,31 +596,31 @@ func ReadContextProcedure(ctx context.Context, d *schema.ResourceData, meta inte } } case "null handling": - if err := d.Set("null_input_behavior", desc.Value); err != nil { + if err := d.Set("null_input_behavior", *desc.Value); err != nil { return diag.FromErr(err) } case "body": - if err := d.Set("statement", desc.Value); err != nil { + if err := d.Set("statement", *desc.Value); err != nil { return diag.FromErr(err) } case "execute as": - if err := d.Set("execute_as", desc.Value); err != nil { + if err := d.Set("execute_as", *desc.Value); err != nil { return diag.FromErr(err) } case "returns": - if err := d.Set("return_type", desc.Value); err != nil { + if err := d.Set("return_type", *desc.Value); err != nil { return diag.FromErr(err) } case "language": - if err := d.Set("language", desc.Value); err != nil { + if err := d.Set("language", *desc.Value); err != nil { return diag.FromErr(err) } case "runtime_version": - if err := d.Set("runtime_version", desc.Value); err != nil { + if err := d.Set("runtime_version", *desc.Value); err != nil { return diag.FromErr(err) } case "packages": - packagesString := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(desc.Value, "[", ""), "]", ""), "'", "") + packagesString := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(*desc.Value, "[", ""), "]", ""), "'", "") if packagesString != "" { // Do nothing for Java / Python functions without packages packages := strings.Split(packagesString, ",") if err := d.Set("packages", packages); err != nil { @@ -625,7 +628,7 @@ func ReadContextProcedure(ctx context.Context, d *schema.ResourceData, meta inte } } case "imports": - importsString := 
strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(desc.Value, "[", ""), "]", ""), "'", ""), " ", "") + importsString := strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(strings.ReplaceAll(*desc.Value, "[", ""), "]", ""), "'", ""), " ", "") if importsString != "" { // Do nothing for Java functions without imports imports := strings.Split(importsString, ",") if err := d.Set("imports", imports); err != nil { @@ -633,15 +636,15 @@ func ReadContextProcedure(ctx context.Context, d *schema.ResourceData, meta inte } } case "handler": - if err := d.Set("handler", desc.Value); err != nil { + if err := d.Set("handler", *desc.Value); err != nil { return diag.FromErr(err) } case "volatility": - if err := d.Set("return_behavior", desc.Value); err != nil { + if err := d.Set("return_behavior", *desc.Value); err != nil { return diag.FromErr(err) } default: - log.Printf("[INFO] Unexpected procedure property %v returned from Snowflake with value %v", desc.Property, desc.Value) + log.Printf("[INFO] Unexpected procedure property %v returned from Snowflake with value %v", desc.Property, *desc.Value) } } @@ -685,11 +688,11 @@ func UpdateContextProcedure(ctx context.Context, d *schema.ResourceData, meta in if d.HasChange("comment") { comment := d.Get("comment") if comment != "" { - if err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithSetComment(comment.(string))); err != nil { + if err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithSet(*sdk.NewProcedureSetRequest().WithComment(comment.(string)))); err != nil { return diag.FromErr(err) } } else { - if err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithUnsetComment(true)); err != nil { + if err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithUnset(*sdk.NewProcedureUnsetRequest().WithComment(true))); err != nil { return diag.FromErr(err) } } diff --git a/pkg/resources/procedure_acceptance_test.go 
b/pkg/resources/procedure_acceptance_test.go index 05cbfbfd73..1039ebc459 100644 --- a/pkg/resources/procedure_acceptance_test.go +++ b/pkg/resources/procedure_acceptance_test.go @@ -157,9 +157,9 @@ func TestAcc_Procedure_complex(t *testing.T) { resource.TestCheckResourceAttr(resourceName, "statement", statement), resource.TestCheckResourceAttr(resourceName, "execute_as", "CALLER"), resource.TestCheckResourceAttr(resourceName, "arguments.#", "2"), - resource.TestCheckResourceAttr(resourceName, "arguments.0.name", "ARG1"), + resource.TestCheckResourceAttr(resourceName, "arguments.0.name", "arg1"), resource.TestCheckResourceAttr(resourceName, "arguments.0.type", "VARCHAR"), - resource.TestCheckResourceAttr(resourceName, "arguments.1.name", "ARG2"), + resource.TestCheckResourceAttr(resourceName, "arguments.1.name", "arg2"), resource.TestCheckResourceAttr(resourceName, "arguments.1.type", "DATE"), resource.TestCheckResourceAttr(resourceName, "null_input_behavior", "RETURNS NULL ON NULL INPUT"), diff --git a/pkg/resources/procedure_parameters.go b/pkg/resources/procedure_parameters.go new file mode 100644 index 0000000000..eba2a378b2 --- /dev/null +++ b/pkg/resources/procedure_parameters.go @@ -0,0 +1,99 @@ +package resources + +import ( + "context" + "strconv" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +var ( + procedureParametersSchema = make(map[string]*schema.Schema) + procedureParametersCustomDiff = ParametersCustomDiff( + procedureParametersProvider, + parameter[sdk.ProcedureParameter]{sdk.ProcedureParameterEnableConsoleOutput, valueTypeBool, sdk.ParameterTypeProcedure}, + parameter[sdk.ProcedureParameter]{sdk.ProcedureParameterLogLevel, valueTypeString, sdk.ParameterTypeProcedure}, + 
parameter[sdk.ProcedureParameter]{sdk.ProcedureParameterMetricLevel, valueTypeString, sdk.ParameterTypeProcedure}, + parameter[sdk.ProcedureParameter]{sdk.ProcedureParameterTraceLevel, valueTypeString, sdk.ParameterTypeProcedure}, + ) +) + +func init() { + procedureParameterFields := []parameterDef[sdk.ProcedureParameter]{ + // session params + {Name: sdk.ProcedureParameterEnableConsoleOutput, Type: schema.TypeBool, Description: "Enable stdout/stderr fast path logging for anonymous stored procs. This is a public parameter (similar to LOG_LEVEL)."}, + {Name: sdk.ProcedureParameterLogLevel, Type: schema.TypeString, Description: "LOG_LEVEL to use when filtering events"}, + {Name: sdk.ProcedureParameterMetricLevel, Type: schema.TypeString, ValidateDiag: sdkValidation(sdk.ToMetricLevel), DiffSuppress: NormalizeAndCompare(sdk.ToMetricLevel), Description: "METRIC_LEVEL value to control whether to emit metrics to Event Table"}, + {Name: sdk.ProcedureParameterTraceLevel, Type: schema.TypeString, ValidateDiag: sdkValidation(sdk.ToTraceLevel), DiffSuppress: NormalizeAndCompare(sdk.ToTraceLevel), Description: "Trace level value to use when generating/filtering trace events"}, + } + + for _, field := range procedureParameterFields { + fieldName := strings.ToLower(string(field.Name)) + + procedureParametersSchema[fieldName] = &schema.Schema{ + Type: field.Type, + Description: enrichWithReferenceToParameterDocs(field.Name, field.Description), + Computed: true, + Optional: true, + ValidateDiagFunc: field.ValidateDiag, + DiffSuppressFunc: field.DiffSuppress, + ConflictsWith: field.ConflictsWith, + } + } +} + +func procedureParametersProvider(ctx context.Context, d ResourceIdProvider, meta any) ([]*sdk.Parameter, error) { + return parametersProvider(ctx, d, meta.(*provider.Context), procedureParametersProviderFunc, sdk.ParseSchemaObjectIdentifierWithArguments) +} + +func procedureParametersProviderFunc(c *sdk.Client) showParametersFunc[sdk.SchemaObjectIdentifierWithArguments] {
return c.Procedures.ShowParameters +} + +func handleProcedureParameterRead(d *schema.ResourceData, procedureParameters []*sdk.Parameter) error { + for _, p := range procedureParameters { + switch p.Key { + case + string(sdk.ProcedureParameterLogLevel), + string(sdk.ProcedureParameterMetricLevel), + string(sdk.ProcedureParameterTraceLevel): + if err := d.Set(strings.ToLower(p.Key), p.Value); err != nil { + return err + } + case + string(sdk.ProcedureParameterEnableConsoleOutput): + value, err := strconv.ParseBool(p.Value) + if err != nil { + return err + } + if err := d.Set(strings.ToLower(p.Key), value); err != nil { + return err + } + } + } + + return nil +} + +// These parameters do not work in CREATE, which is why they are set in ALTER +func handleProcedureParametersCreate(d *schema.ResourceData, alterOpts *sdk.ProcedureSet) diag.Diagnostics { + return JoinDiags( + handleParameterCreate(d, sdk.ProcedureParameterEnableConsoleOutput, &alterOpts.EnableConsoleOutput), + handleParameterCreateWithMapping(d, sdk.ProcedureParameterLogLevel, &alterOpts.LogLevel, stringToStringEnumProvider(sdk.ToLogLevel)), + handleParameterCreateWithMapping(d, sdk.ProcedureParameterMetricLevel, &alterOpts.MetricLevel, stringToStringEnumProvider(sdk.ToMetricLevel)), + handleParameterCreateWithMapping(d, sdk.ProcedureParameterTraceLevel, &alterOpts.TraceLevel, stringToStringEnumProvider(sdk.ToTraceLevel)), + ) +} + +func handleProcedureParametersUpdate(d *schema.ResourceData, set *sdk.ProcedureSet, unset *sdk.ProcedureUnset) diag.Diagnostics { + return JoinDiags( + handleParameterUpdate(d, sdk.ProcedureParameterEnableConsoleOutput, &set.EnableConsoleOutput, &unset.EnableConsoleOutput), + handleParameterUpdateWithMapping(d, sdk.ProcedureParameterLogLevel, &set.LogLevel, &unset.LogLevel, stringToStringEnumProvider(sdk.ToLogLevel)), + handleParameterUpdateWithMapping(d, sdk.ProcedureParameterMetricLevel, &set.MetricLevel, &unset.MetricLevel, stringToStringEnumProvider(sdk.ToMetricLevel)),
handleParameterUpdateWithMapping(d, sdk.ProcedureParameterTraceLevel, &set.TraceLevel, &unset.TraceLevel, stringToStringEnumProvider(sdk.ToTraceLevel)), + ) +} diff --git a/pkg/resources/resource_monitor.go b/pkg/resources/resource_monitor.go index 53c07c77e4..ab9f9c7082 100644 --- a/pkg/resources/resource_monitor.go +++ b/pkg/resources/resource_monitor.go @@ -30,9 +30,10 @@ var resourceMonitorSchema = map[string]*schema.Schema{ "notify_users": { Type: schema.TypeSet, Optional: true, - Description: "Specifies the list of users (their identifiers) to receive email notifications on resource monitors.", + Description: relatedResourceDescription("Specifies the list of users (their identifiers) to receive email notifications on resource monitors.", resources.User), Elem: &schema.Schema{ - Type: schema.TypeString, + Type: schema.TypeString, + DiffSuppressFunc: suppressIdentifierQuoting, }, }, "credit_quota": { diff --git a/pkg/resources/saml2_integration.go b/pkg/resources/saml2_integration.go index cde54cfd63..b62f2317f7 100644 --- a/pkg/resources/saml2_integration.go +++ b/pkg/resources/saml2_integration.go @@ -161,7 +161,7 @@ func SAML2Integration() *schema.Resource { ReadContext: TrackingReadWrapper(resources.Saml2SecurityIntegration, ReadContextSAML2Integration(true)), UpdateContext: TrackingUpdateWrapper(resources.Saml2SecurityIntegration, UpdateContextSAML2Integration), DeleteContext: TrackingDeleteWrapper(resources.Saml2SecurityIntegration, DeleteContextSAM2LIntegration), - Description: "Resource used to manage saml2 security integration objects. For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-saml2).", + Description: "Resource used to manage SAML2 security integration objects. 
For more information, check [security integrations documentation](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-saml2).", Schema: saml2IntegrationSchema, Importer: &schema.ResourceImporter{ diff --git a/pkg/resources/schema.go b/pkg/resources/schema.go index 66e4be7a28..32fd784e5d 100644 --- a/pkg/resources/schema.go +++ b/pkg/resources/schema.go @@ -25,13 +25,13 @@ var schemaSchema = map[string]*schema.Schema{ "name": { Type: schema.TypeString, Required: true, - Description: "Specifies the identifier for the schema; must be unique for the database in which the schema is created. When the name is `PUBLIC`, during creation the provider checks if this schema has already been created and, in such case, `ALTER` is used to match the desired state.", + Description: blocklistedCharactersFieldDescription("Specifies the identifier for the schema; must be unique for the database in which the schema is created. When the name is `PUBLIC`, during creation the provider checks if this schema has already been created and, in such case, `ALTER` is used to match the desired state."), DiffSuppressFunc: suppressIdentifierQuoting, }, "database": { Type: schema.TypeString, Required: true, - Description: "The database in which to create the schema.", + Description: blocklistedCharactersFieldDescription("The database in which to create the schema."), ForceNew: true, DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -91,6 +91,8 @@ var schemaSchema = map[string]*schema.Schema{ // Schema returns a pointer to the resource representing a schema. 
func Schema() *schema.Resource { return &schema.Resource{ + SchemaVersion: 2, + CreateContext: TrackingCreateWrapper(resources.Schema, CreateContextSchema), ReadContext: TrackingReadWrapper(resources.Schema, ReadContextSchema(true)), UpdateContext: TrackingUpdateWrapper(resources.Schema, UpdateContextSchema), @@ -110,7 +112,6 @@ func Schema() *schema.Resource { StateContext: TrackingImportWrapper(resources.Schema, ImportSchema), }, - SchemaVersion: 2, StateUpgraders: []schema.StateUpgrader{ { Version: 0, diff --git a/pkg/resources/scim_integration.go b/pkg/resources/scim_integration.go index bf97e28f1a..f5482200fa 100644 --- a/pkg/resources/scim_integration.go +++ b/pkg/resources/scim_integration.go @@ -41,8 +41,8 @@ var scimIntegrationSchema = map[string]*schema.Schema{ Required: true, ForceNew: true, Description: fmt.Sprintf("Specifies the client type for the scim integration. Valid options are: %v.", possibleValuesListed(sdk.AllScimSecurityIntegrationScimClients)), - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllScimSecurityIntegrationScimClients), true), - DiffSuppressFunc: ignoreCaseAndTrimSpaceSuppressFunc, + ValidateDiagFunc: sdkValidation(sdk.ToScimSecurityIntegrationScimClientOption), + DiffSuppressFunc: NormalizeAndCompare(sdk.ToScimSecurityIntegrationScimClientOption), }, "run_as_role": { Type: schema.TypeString, @@ -50,19 +50,14 @@ var scimIntegrationSchema = map[string]*schema.Schema{ ForceNew: true, Description: fmt.Sprintf("Specify the SCIM role in Snowflake that owns any users and roles that are imported from the identity provider into Snowflake using SCIM."+ " Provider assumes that the specified role is already provided. 
Valid options are: %v.", possibleValuesListed(sdk.AllScimSecurityIntegrationRunAsRoles)), - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllScimSecurityIntegrationRunAsRoles), true), - DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { - normalize := func(s string) string { - return strings.ToUpper(strings.ReplaceAll(s, "-", "")) - } - return normalize(old) == normalize(new) - }, + ValidateDiagFunc: sdkValidation(sdk.ToScimSecurityIntegrationRunAsRoleOption), + DiffSuppressFunc: NormalizeAndCompare(sdk.ToScimSecurityIntegrationRunAsRoleOption), }, "network_policy": { Type: schema.TypeString, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), Optional: true, - Description: "Specifies an existing network policy that controls SCIM network traffic.", + Description: relatedResourceDescription("Specifies an existing network policy that controls SCIM network traffic.", resources.NetworkPolicy), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeListValueInDescribe("network_policy")), }, "sync_password": { diff --git a/pkg/resources/scim_integration_acceptance_test.go b/pkg/resources/scim_integration_acceptance_test.go index aa8f838cdc..277dbed973 100644 --- a/pkg/resources/scim_integration_acceptance_test.go +++ b/pkg/resources/scim_integration_acceptance_test.go @@ -282,7 +282,7 @@ func TestAcc_ScimIntegration_InvalidScimClient(t *testing.T) { { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_ScimIntegration/complete"), ConfigVariables: m(), - ExpectError: regexp.MustCompile(`expected \[{{} scim_client}] to be one of \["OKTA" "AZURE" "GENERIC"], got invalid`), + ExpectError: regexp.MustCompile(`invalid ScimSecurityIntegrationScimClientOption: INVALID`), }, }, }) @@ -311,7 +311,7 @@ func TestAcc_ScimIntegration_InvalidRunAsRole(t *testing.T) { { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_ScimIntegration/complete"), ConfigVariables: m(), - ExpectError: 
regexp.MustCompile(`expected \[{{} run_as_role}] to be one of \["OKTA_PROVISIONER" "AAD_PROVISIONER" "GENERIC_SCIM_PROVISIONER"], got invalid`), + ExpectError: regexp.MustCompile(`invalid ScimSecurityIntegrationRunAsRoleOption: INVALID`), }, }, }) diff --git a/pkg/resources/secondary_connection.go b/pkg/resources/secondary_connection.go index 6f9dbca91c..b948b106f1 100644 --- a/pkg/resources/secondary_connection.go +++ b/pkg/resources/secondary_connection.go @@ -33,7 +33,7 @@ var secondaryConnectionSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "Specifies the identifier for a primary connection from which to create a replica (i.e. a secondary connection).", + Description: relatedResourceDescription("Specifies the identifier for a primary connection from which to create a replica (i.e. a secondary connection).", resources.PrimaryConnection), DiffSuppressFunc: suppressIdentifierQuoting, }, "comment": { diff --git a/pkg/resources/secondary_database.go b/pkg/resources/secondary_database.go index 448f9e5179..528a6ebf4d 100644 --- a/pkg/resources/secondary_database.go +++ b/pkg/resources/secondary_database.go @@ -28,7 +28,7 @@ var secondaryDatabaseSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "A fully qualified path to a database to create a replica from. A fully qualified path follows the format of `\"\".\"\".\"\"`.", + Description: relatedResourceDescription("A fully qualified path to a database to create a replica from. A fully qualified path follows the format of `\"\".\"\".\"\"`.", resources.Database), // TODO(SNOW-1495079): Add validation when ExternalObjectIdentifier will be available in IsValidIdentifierDescription: "A fully qualified path to a database to create a replica from. 
A fully qualified path follows the format of `\"\".\"\".\"\"`.", DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -237,6 +237,7 @@ func DeleteSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta a return diag.FromErr(err) } + // TODO(SNOW-1818849): unassign network policies inside the database before dropping err = client.Databases.Drop(ctx, id, &sdk.DropDatabaseOptions{ IfExists: sdk.Bool(true), }) diff --git a/pkg/resources/secret_with_oauth_authorization_code_grant.go b/pkg/resources/secret_with_oauth_authorization_code_grant.go index 6a9eaa85cf..e823b741df 100644 --- a/pkg/resources/secret_with_oauth_authorization_code_grant.go +++ b/pkg/resources/secret_with_oauth_authorization_code_grant.go @@ -36,7 +36,7 @@ var secretAuthorizationCodeGrantSchema = func() map[string]*schema.Schema { Type: schema.TypeString, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), Required: true, - Description: "Specifies the name value of the Snowflake security integration that connects Snowflake to an external service.", + Description: relatedResourceDescription("Specifies the name value of the Snowflake security integration that connects Snowflake to an external service.", resources.ApiAuthenticationIntegrationWithAuthorizationCodeGrant), DiffSuppressFunc: suppressIdentifierQuoting, }, } diff --git a/pkg/resources/secret_with_oauth_client_credentials.go b/pkg/resources/secret_with_oauth_client_credentials.go index 1df7c77feb..816385e3e2 100644 --- a/pkg/resources/secret_with_oauth_client_credentials.go +++ b/pkg/resources/secret_with_oauth_client_credentials.go @@ -24,7 +24,7 @@ var secretClientCredentialsSchema = func() map[string]*schema.Schema { Type: schema.TypeString, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), Required: true, - Description: "Specifies the name value of the Snowflake security integration that connects Snowflake to an external service.", + Description: relatedResourceDescription("Specifies the name 
value of the Snowflake security integration that connects Snowflake to an external service.", resources.ApiAuthenticationIntegrationWithClientCredentials), DiffSuppressFunc: suppressIdentifierQuoting, }, "oauth_scopes": { diff --git a/pkg/resources/shared_database.go b/pkg/resources/shared_database.go index 537bafb03a..50cc1d34bc 100644 --- a/pkg/resources/shared_database.go +++ b/pkg/resources/shared_database.go @@ -28,7 +28,7 @@ var sharedDatabaseSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `\"\".\"\".\"\"`.", + Description: relatedResourceDescription("A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `\"\".\"\".\"\"`.", resources.Share), // TODO(SNOW-1495079): Add validation when ExternalObjectIdentifier will be available in IsValidIdentifier
DiffSuppressFunc: suppressIdentifierQuoting, }, @@ -37,7 +37,7 @@ var sharedDatabaseSchema = map[string]*schema.Schema{ Optional: true, Description: "Specifies a comment for the database.", }, - // TODO(SNOW-1325381): Add it as an item to discuss and either remove or uncomment (and implement) it + // TODO(SNOW-1843347): Add it as an item to discuss and either remove or uncomment (and implement) it // "is_transient": { // Type: schema.TypeBool, // Optional: true, @@ -80,7 +80,7 @@ func CreateSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) } opts := &sdk.CreateSharedDatabaseOptions{ - // TODO(SNOW-1325381) + // TODO(SNOW-1843347) // Transient: GetPropertyAsPointer[bool](d, "is_transient"), Comment: GetConfigPropertyAsPointerAllowingZeroValue[string](d, "comment"), } @@ -179,7 +179,7 @@ func ReadSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) d } } - // TODO(SNOW-1325381) + // TODO(SNOW-1843347) // if err := d.Set("is_transient", database.Transient); err != nil { // return diag.FromErr(err) // } @@ -207,6 +207,7 @@ func DeleteSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) return diag.FromErr(err) } + // TODO(SNOW-1818849): unassign network policies inside the database before dropping err = client.Databases.Drop(ctx, id, &sdk.DropDatabaseOptions{ IfExists: sdk.Bool(true), }) diff --git a/pkg/resources/shared_database_acceptance_test.go b/pkg/resources/shared_database_acceptance_test.go index 5c05b8dce2..2108bfb776 100644 --- a/pkg/resources/shared_database_acceptance_test.go +++ b/pkg/resources/shared_database_acceptance_test.go @@ -262,9 +262,10 @@ func TestAcc_CreateSharedDatabase_InvalidValues(t *testing.T) { { ConfigVariables: configVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SharedDatabase/complete"), - ExpectError: regexp.MustCompile(`(expected \[{{} log_level}\] to be one of \[\"TRACE\" \"DEBUG\" \"INFO\"
\"WARN\" \"ERROR\" \"FATAL\" \"OFF\"\], got invalid_value)|` + - `(expected \[{{} trace_level}\] to be one of \[\"ALWAYS\" \"ON_EVENT\" \"OFF\"\], got invalid_value)|` + - `(expected \[{{} storage_serialization_policy}\] to be one of \[\"COMPATIBLE\" \"OPTIMIZED\"\], got invalid_value)`), + ExpectError: regexp.MustCompile(`(unknown log level: invalid_value)|` + + `(unknown trace level: invalid_value)|` + + `(unknown storage serialization policy: invalid_value)|` + + `(invalid warehouse size:)`), }, }, }) diff --git a/pkg/resources/stream_common.go b/pkg/resources/stream_common.go index dde23c0ea8..1c6b039fe0 100644 --- a/pkg/resources/stream_common.go +++ b/pkg/resources/stream_common.go @@ -38,7 +38,7 @@ var streamCommonSchema = map[string]*schema.Schema{ Type: schema.TypeBool, Optional: true, Default: false, - Description: "Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. That is sometimes used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new stream.", + Description: copyGrantsDescription("Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause."), // Changing ONLY copy grants should have no effect. It is only used as an "option" during CREATE OR REPLACE - when other attributes change, it's not an object state. There is no point in recreating the object when only this field is changed. 
DiffSuppressFunc: IgnoreAfterCreation, }, diff --git a/pkg/resources/stream_on_directory_table.go b/pkg/resources/stream_on_directory_table.go index c341f6b9f7..b3e726f97e 100644 --- a/pkg/resources/stream_on_directory_table.go +++ b/pkg/resources/stream_on_directory_table.go @@ -23,7 +23,7 @@ var streamOnDirectoryTableSchema = func() map[string]*schema.Schema { "stage": { Type: schema.TypeString, Required: true, - Description: blocklistedCharactersFieldDescription("Specifies an identifier for the stage the stream will monitor. Due to Snowflake limitations, the provider can not read the stage's database and schema. For stages, Snowflake returns only partially qualified name instead of fully qualified name. Please use stages located in the same schema as the stream."), + Description: relatedResourceDescription(blocklistedCharactersFieldDescription("Specifies an identifier for the stage the stream will monitor. Due to Snowflake limitations, the provider can not read the stage's database and schema. For stages, Snowflake returns only partially qualified name instead of fully qualified name. 
Please use stages located in the same schema as the stream."), resources.Stage), // TODO (SNOW-1733130): the returned value is not a fully qualified name DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuotingPartiallyQualifiedName, IgnoreChangeToCurrentSnowflakeValueInShow("stage")), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), diff --git a/pkg/resources/stream_on_external_table.go b/pkg/resources/stream_on_external_table.go index 05d4b3289a..be12b44638 100644 --- a/pkg/resources/stream_on_external_table.go +++ b/pkg/resources/stream_on_external_table.go @@ -23,7 +23,7 @@ var streamOnExternalTableSchema = func() map[string]*schema.Schema { "external_table": { Type: schema.TypeString, Required: true, - Description: blocklistedCharactersFieldDescription("Specifies an identifier for the external table the stream will monitor."), + Description: relatedResourceDescription(blocklistedCharactersFieldDescription("Specifies an identifier for the external table the stream will monitor."), resources.ExternalTable), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInShow("table_name")), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), }, diff --git a/pkg/resources/stream_on_table.go b/pkg/resources/stream_on_table.go index cc9b56f371..76506f0b4a 100644 --- a/pkg/resources/stream_on_table.go +++ b/pkg/resources/stream_on_table.go @@ -23,7 +23,7 @@ var streamOnTableSchema = func() map[string]*schema.Schema { "table": { Type: schema.TypeString, Required: true, - Description: blocklistedCharactersFieldDescription("Specifies an identifier for the table the stream will monitor."), + Description: relatedResourceDescription(blocklistedCharactersFieldDescription("Specifies an identifier for the table the stream will monitor."), resources.Table), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInShow("table_name")), ValidateDiagFunc: 
IsValidIdentifier[sdk.SchemaObjectIdentifier](), }, diff --git a/pkg/resources/stream_on_view.go b/pkg/resources/stream_on_view.go index b093ed726b..016a63eba0 100644 --- a/pkg/resources/stream_on_view.go +++ b/pkg/resources/stream_on_view.go @@ -23,7 +23,7 @@ var StreamOnViewSchema = func() map[string]*schema.Schema { "view": { Type: schema.TypeString, Required: true, - Description: blocklistedCharactersFieldDescription("Specifies an identifier for the view the stream will monitor."), + Description: relatedResourceDescription(blocklistedCharactersFieldDescription("Specifies an identifier for the view the stream will monitor."), resources.View), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInShow("table_name")), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), }, diff --git a/pkg/resources/streamlit.go b/pkg/resources/streamlit.go index c7b8f84ca1..ff18febaca 100644 --- a/pkg/resources/streamlit.go +++ b/pkg/resources/streamlit.go @@ -24,27 +24,27 @@ var streamlitSchema = map[string]*schema.Schema{ "name": { Type: schema.TypeString, Required: true, - Description: "String that specifies the identifier (i.e. name) for the streamlit; must be unique in your account.", + Description: blocklistedCharactersFieldDescription("String that specifies the identifier (i.e. 
name) for the streamlit; must be unique in your account."), DiffSuppressFunc: suppressIdentifierQuoting, }, "database": { Type: schema.TypeString, Required: true, - Description: "The database in which to create the streamlit", + Description: blocklistedCharactersFieldDescription("The database in which to create the streamlit"), ForceNew: true, DiffSuppressFunc: suppressIdentifierQuoting, }, "schema": { Type: schema.TypeString, Required: true, - Description: "The schema in which to create the streamlit.", + Description: blocklistedCharactersFieldDescription("The schema in which to create the streamlit."), ForceNew: true, DiffSuppressFunc: suppressIdentifierQuoting, }, "stage": { Type: schema.TypeString, Required: true, - Description: "The stage in which streamlit files are located.", + Description: relatedResourceDescription("The stage in which streamlit files are located.", resources.Stage), ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInDescribe("root_location")), }, @@ -57,14 +57,14 @@ var streamlitSchema = map[string]*schema.Schema{ "main_file": { Type: schema.TypeString, Required: true, - Description: "Specifies the filename of the Streamlit Python application. This filename is relative to the value of `root_location`", + Description: "Specifies the filename of the Streamlit Python application. This filename is relative to the value of `directory_location`", DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInDescribe("main_file"), }, "query_warehouse": { Type: schema.TypeString, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), Optional: true, - Description: "Specifies the warehouse where SQL queries issued by the Streamlit application are run.", + Description: relatedResourceDescription("Specifies the warehouse where SQL queries issued by the Streamlit application are run. 
Due to Snowflake limitations warehouse identifier can consist of only upper-cased letters.", resources.Warehouse), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInShow("query_warehouse")), }, "external_access_integrations": { @@ -75,7 +75,7 @@ var streamlitSchema = map[string]*schema.Schema{ }, Optional: true, Description: "External access integrations connected to the Streamlit.", - DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInDescribe("external_access_integrations")), + DiffSuppressFunc: SuppressIfAny(NormalizeAndCompareIdentifiersInSet("external_access_integrations"), IgnoreChangeToCurrentSnowflakeValueInDescribe("external_access_integrations")), }, "title": { Type: schema.TypeString, @@ -108,6 +108,8 @@ var streamlitSchema = map[string]*schema.Schema{ func Streamlit() *schema.Resource { return &schema.Resource{ + SchemaVersion: 1, + CreateContext: TrackingCreateWrapper(resources.Streamlit, CreateContextStreamlit), ReadContext: TrackingReadWrapper(resources.Streamlit, ReadContextStreamlit), UpdateContext: TrackingUpdateWrapper(resources.Streamlit, UpdateContextStreamlit), @@ -125,7 +127,6 @@ func Streamlit() *schema.Resource { ComputedIfAnyAttributeChanged(streamlitSchema, DescribeOutputAttributeName, "title", "comment", "root_location", "main_file", "query_warehouse", "external_access_integrations"), )), - SchemaVersion: 1, StateUpgraders: []schema.StateUpgrader{ { Version: 0, diff --git a/pkg/resources/tag.go b/pkg/resources/tag.go index 668617450e..7695fd318b 100644 --- a/pkg/resources/tag.go +++ b/pkg/resources/tag.go @@ -59,7 +59,7 @@ var tagSchema = map[string]*schema.Schema{ }, Optional: true, DiffSuppressFunc: NormalizeAndCompareIdentifiersInSet("masking_policies"), - Description: "Set of masking policies for the tag. A tag can support one masking policy for each data type. 
If masking policies are assigned to the tag, before dropping the tag, the provider automatically unassigns them.", + Description: relatedResourceDescription("Set of masking policies for the tag. A tag can support one masking policy for each data type. If masking policies are assigned to the tag, before dropping the tag, the provider automatically unassigns them.", resources.MaskingPolicy), }, FullyQualifiedNameAttributeName: schemas.FullyQualifiedNameSchema, ShowOutputAttributeName: { @@ -114,7 +114,7 @@ func Tag() *schema.Resource { ReadContext: TrackingReadWrapper(resources.Tag, ReadContextTag), UpdateContext: TrackingUpdateWrapper(resources.Tag, UpdateContextTag), DeleteContext: TrackingDeleteWrapper(resources.Tag, DeleteContextTag), - Description: "Resource used to manage tags. For more information, check [tag documentation](https://docs.snowflake.com/en/sql-reference/sql/create-tag).", + Description: "Resource used to manage tags. For more information, check [tag documentation](https://docs.snowflake.com/en/sql-reference/sql/create-tag). For assigning tags to Snowflake objects, see [tag_association resource](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/tag_association).", CustomizeDiff: TrackingCustomDiffWrapper(resources.Tag, customdiff.All( ComputedIfAnyAttributeChanged(tagSchema, ShowOutputAttributeName, "name", "comment", "allowed_values"), @@ -190,7 +190,6 @@ func ReadContextTag(ctx context.Context, d *schema.ResourceData, meta any) diag.
return diag.FromErr(err) } errs := errors.Join( - d.Set("name", tag.Name), d.Set(FullyQualifiedNameAttributeName, id.FullyQualifiedName()), d.Set(ShowOutputAttributeName, []map[string]any{schemas.TagToSchema(tag)}), d.Set("comment", tag.Comment), diff --git a/pkg/resources/task.go b/pkg/resources/task.go index f9f697fda3..8cab5404f8 100644 --- a/pkg/resources/task.go +++ b/pkg/resources/task.go @@ -66,7 +66,7 @@ var taskSchema = map[string]*schema.Schema{ Optional: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: suppressIdentifierQuoting, - Description: "The warehouse the task will use. Omit this parameter to use Snowflake-managed compute resources for runs of this task. Due to Snowflake limitations warehouse identifier can consist of only upper-cased letters. (Conflicts with user_task_managed_initial_warehouse_size)", + Description: relatedResourceDescription("The warehouse the task will use. Omit this parameter to use Snowflake-managed compute resources for runs of this task. Due to Snowflake limitations warehouse identifier can consist of only upper-cased letters. 
(Conflicts with user_task_managed_initial_warehouse_size)", resources.Warehouse), ConflictsWith: []string{"user_task_managed_initial_warehouse_size"}, }, "schedule": { @@ -113,7 +113,7 @@ var taskSchema = map[string]*schema.Schema{ Optional: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInShow("error_integration")), - Description: blocklistedCharactersFieldDescription("Specifies the name of the notification integration used for error notifications."), + Description: relatedResourceDescription(blocklistedCharactersFieldDescription("Specifies the name of the notification integration used for error notifications."), resources.NotificationIntegration), }, "comment": { Type: schema.TypeString, diff --git a/pkg/resources/task_parameters.go b/pkg/resources/task_parameters.go index 0c6f24d66f..5609bca254 100644 --- a/pkg/resources/task_parameters.go +++ b/pkg/resources/task_parameters.go @@ -85,7 +85,7 @@ func init() { // task parameters {Name: sdk.TaskParameterSuspendTaskAfterNumFailures, Type: schema.TypeInt, ValidateDiag: validation.ToDiagFunc(validation.IntAtLeast(0)), Description: "Specifies the number of consecutive failed task runs after which the current task is suspended automatically. The default is 0 (no automatic suspension)."}, {Name: sdk.TaskParameterTaskAutoRetryAttempts, Type: schema.TypeInt, ValidateDiag: validation.ToDiagFunc(validation.IntAtLeast(0)), Description: "Specifies the number of automatic task graph retry attempts. 
If any task graphs complete in a FAILED state, Snowflake can automatically retry the task graphs from the last task in the graph that failed."}, - {Name: sdk.TaskParameterUserTaskManagedInitialWarehouseSize, Type: schema.TypeString, ValidateDiag: sdkValidation(sdk.ToWarehouseSize), DiffSuppress: NormalizeAndCompare(sdk.ToWarehouseSize), ConflictsWith: []string{"warehouse"}, Description: "Specifies the size of the compute resources to provision for the first run of the task, before a task history is available for Snowflake to determine an ideal size. Once a task has successfully completed a few runs, Snowflake ignores this parameter setting. Valid values are (case-insensitive): %s. (Conflicts with warehouse)"}, + {Name: sdk.TaskParameterUserTaskManagedInitialWarehouseSize, Type: schema.TypeString, ValidateDiag: sdkValidation(sdk.ToWarehouseSize), DiffSuppress: NormalizeAndCompare(sdk.ToWarehouseSize), ConflictsWith: []string{"warehouse"}, Description: "Specifies the size of the compute resources to provision for the first run of the task, before a task history is available for Snowflake to determine an ideal size. Once a task has successfully completed a few runs, Snowflake ignores this parameter setting. Valid values are (case-insensitive): %s. (Conflicts with warehouse). 
For more information about warehouses, see [docs](./warehouse)."}, {Name: sdk.TaskParameterUserTaskMinimumTriggerIntervalInSeconds, Type: schema.TypeInt, ValidateDiag: validation.ToDiagFunc(validation.IntAtLeast(0)), Description: "Minimum amount of time between Triggered Task executions in seconds"}, {Name: sdk.TaskParameterUserTaskTimeoutMs, Type: schema.TypeInt, ValidateDiag: validation.ToDiagFunc(validation.IntAtLeast(0)), Description: "Specifies the time limit on a single run of the task before it times out (in milliseconds)."}, // session params diff --git a/pkg/resources/testdata/TestAcc_View/columns/test.tf b/pkg/resources/testdata/TestAcc_View/columns/test.tf index fd5b201fe7..7c76773ae6 100644 --- a/pkg/resources/testdata/TestAcc_View/columns/test.tf +++ b/pkg/resources/testdata/TestAcc_View/columns/test.tf @@ -13,7 +13,7 @@ resource "snowflake_view" "test" { masking_policy { policy_name = var.masking_name - using = var.masking_using + using = try(var.masking_using, null) } } diff --git a/pkg/resources/testdata/TestAcc_View/columns/variables.tf b/pkg/resources/testdata/TestAcc_View/columns/variables.tf index ba6e4bfe0d..462b89d908 100644 --- a/pkg/resources/testdata/TestAcc_View/columns/variables.tf +++ b/pkg/resources/testdata/TestAcc_View/columns/variables.tf @@ -23,5 +23,6 @@ variable "masking_name" { } variable "masking_using" { - type = list(string) + type = list(string) + default = null } diff --git a/pkg/resources/user.go b/pkg/resources/user.go index 8bf0fde1b0..1fb6f15127 100644 --- a/pkg/resources/user.go +++ b/pkg/resources/user.go @@ -33,7 +33,7 @@ var userSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, Sensitive: true, - Description: "Password for the user. **WARNING:** this will put the password in the terraform state file. Use carefully.", + Description: externalChangesNotDetectedFieldDescription("Password for the user. **WARNING:** this will put the password in the terraform state file. 
Use carefully."), }, "login_name": { Type: schema.TypeString, @@ -106,7 +106,7 @@ var userSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, DiffSuppressFunc: suppressIdentifierQuoting, - Description: "Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists.", + Description: relatedResourceDescription("Specifies the virtual warehouse that is active by default for the user’s session upon login. Note that the CREATE USER operation does not verify that the warehouse exists.", resources.Warehouse), }, "default_namespace": { Type: schema.TypeString, @@ -118,7 +118,7 @@ var userSchema = map[string]*schema.Schema{ Type: schema.TypeString, Optional: true, DiffSuppressFunc: suppressIdentifierQuoting, - Description: "Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. In addition, the CREATE USER operation does not verify that the role exists.", + Description: relatedResourceDescription("Specifies the role that is active by default for the user’s session upon login. Note that specifying a default role for a user does **not** grant the role to the user. The role must be granted explicitly to the user using the [GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role) command. 
In addition, the CREATE USER operation does not verify that the role exists.", resources.AccountRole), }, "default_secondary_roles_option": { Type: schema.TypeString, diff --git a/pkg/resources/user_parameters.go b/pkg/resources/user_parameters.go index 05ecc15e0c..2c4f34f2ea 100644 --- a/pkg/resources/user_parameters.go +++ b/pkg/resources/user_parameters.go @@ -174,8 +174,8 @@ func userParametersProviderFunc(c *sdk.Client) showParametersFunc[sdk.AccountObj } // TODO [SNOW-1645342]: make generic based on type definition -func handleUserParameterRead(d *schema.ResourceData, warehouseParameters []*sdk.Parameter) error { - for _, p := range warehouseParameters { +func handleUserParameterRead(d *schema.ResourceData, userParameters []*sdk.Parameter) error { + for _, p := range userParameters { switch p.Key { case string(sdk.UserParameterClientMemoryLimit), diff --git a/pkg/resources/view.go b/pkg/resources/view.go index aaa77b53d3..71717adf96 100644 --- a/pkg/resources/view.go +++ b/pkg/resources/view.go @@ -49,7 +49,7 @@ var viewSchema = map[string]*schema.Schema{ Type: schema.TypeBool, Optional: true, Default: false, - Description: "Retains the access permissions from the original view when a new view is created using the OR REPLACE clause.", + Description: copyGrantsDescription("Retains the access permissions from the original view when a view is recreated using the OR REPLACE clause."), DiffSuppressFunc: IgnoreAfterCreation, }, "is_secure": { @@ -94,6 +94,7 @@ var viewSchema = map[string]*schema.Schema{ Required: true, Description: "Identifier of the data metric function to add to the table or view or drop from the table or view. 
This function identifier must be provided without arguments in parenthesis.", DiffSuppressFunc: suppressIdentifierQuoting, + ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), }, "on": { Type: schema.TypeSet, @@ -159,7 +160,8 @@ var viewSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, DiffSuppressFunc: suppressIdentifierQuoting, - Description: "Specifies the masking policy to set on a column.", + ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), + Description: relatedResourceDescription("Specifies the masking policy to set on a column.", resources.MaskingPolicy), }, "using": { Type: schema.TypeList, @@ -167,7 +169,8 @@ var viewSchema = map[string]*schema.Schema{ Elem: &schema.Schema{ Type: schema.TypeString, }, - Description: "Specifies the arguments to pass into the conditional masking policy SQL expression. The first column in the list specifies the column for the policy conditions to mask or tokenize the data and must match the column to which the masking policy is set. The additional columns specify the columns to evaluate to determine whether to mask or tokenize the data in each row of the query result when a query is made on the first column. If the USING clause is omitted, Snowflake treats the conditional masking policy as a normal masking policy.", + DiffSuppressFunc: IgnoreMatchingColumnNameAndMaskingPolicyUsingFirstElem(), + Description: "Specifies the arguments to pass into the conditional masking policy SQL expression. The first column in the list specifies the column for the policy conditions to mask or tokenize the data and must match the column to which the masking policy is set. The additional columns specify the columns to evaluate to determine whether to mask or tokenize the data in each row of the query result when a query is made on the first column. 
If the USING clause is omitted, Snowflake treats the conditional masking policy as a normal masking policy.", }, }, }, @@ -182,6 +185,7 @@ var viewSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, DiffSuppressFunc: suppressIdentifierQuoting, + ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), Description: "Specifies the projection policy to set on a column.", }, }, @@ -212,7 +216,8 @@ var viewSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, DiffSuppressFunc: suppressIdentifierQuoting, - Description: "Row access policy name.", + ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), + Description: relatedResourceDescription("Row access policy name.", resources.RowAccessPolicy), }, "on": { Type: schema.TypeSet, @@ -236,6 +241,7 @@ var viewSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, DiffSuppressFunc: suppressIdentifierQuoting, + ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](), Description: "Aggregation policy name.", }, "entity_key": { @@ -253,7 +259,7 @@ var viewSchema = map[string]*schema.Schema{ "statement": { Type: schema.TypeString, Required: true, - Description: "Specifies the query used to create the view.", + Description: diffSuppressStatementFieldDescription("Specifies the query used to create the view."), DiffSuppressFunc: DiffSuppressStatement, }, ShowOutputAttributeName: { @@ -917,6 +923,7 @@ func UpdateView(ctx context.Context, d *schema.ResourceData, meta any) diag.Diag return diag.FromErr(fmt.Errorf("error setting change_tracking for view %v: %w", d.Id(), err)) } } else { + // No UNSET for CHANGE_TRACKING, so set false instead. 
err := client.Views.Alter(ctx, sdk.NewAlterViewRequest(id).WithSetChangeTracking(false)) if err != nil { return diag.FromErr(fmt.Errorf("error unsetting change_tracking for view %v: %w", d.Id(), err)) diff --git a/pkg/resources/view_acceptance_test.go b/pkg/resources/view_acceptance_test.go index b64f342a1a..b72822016e 100644 --- a/pkg/resources/view_acceptance_test.go +++ b/pkg/resources/view_acceptance_test.go @@ -770,6 +770,106 @@ end;; }) } +func TestAcc_View_columnsWithMaskingPolicyWithoutUsing(t *testing.T) { + t.Setenv(string(testenvs.ConfigureClientOnce), "") + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) + acc.TestAccPreCheck(t) + + id := acc.TestClient().Ids.RandomSchemaObjectIdentifier() + table, tableCleanup := acc.TestClient().Table.CreateWithColumns(t, []sdk.TableColumnRequest{ + *sdk.NewTableColumnRequest("id", sdk.DataTypeNumber), + *sdk.NewTableColumnRequest("foo", sdk.DataTypeNumber), + *sdk.NewTableColumnRequest("bar", sdk.DataTypeNumber), + }) + t.Cleanup(tableCleanup) + statement := fmt.Sprintf("SELECT id, foo FROM %s", table.ID().FullyQualifiedName()) + + maskingPolicy, maskingPolicyCleanup := acc.TestClient().MaskingPolicy.CreateMaskingPolicyWithOptions(t, + []sdk.TableColumnSignature{ + { + Name: "One", + Type: sdk.DataTypeNumber, + }, + }, + sdk.DataTypeNumber, + ` +case + when One > 0 then One + else 0 +end;; +`, + new(sdk.CreateMaskingPolicyOptions), + ) + t.Cleanup(maskingPolicyCleanup) + + projectionPolicy, projectionPolicyCleanup := acc.TestClient().ProjectionPolicy.CreateProjectionPolicy(t) + t.Cleanup(projectionPolicyCleanup) + + // generators currently don't handle lists of objects, so use the old way + viewWithPolicies := func() config.Variables { + conf := config.Variables{ + "name": config.StringVariable(id.Name()), + "database": config.StringVariable(id.DatabaseName()), + "schema": config.StringVariable(id.SchemaName()), + "statement": config.StringVariable(statement), + } + conf["projection_name"] = 
config.StringVariable(projectionPolicy.FullyQualifiedName()) + conf["masking_name"] = config.StringVariable(maskingPolicy.ID().FullyQualifiedName()) + return conf + } + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.View), + Steps: []resource.TestStep{ + // With all policies on columns + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_View/columns"), + ConfigVariables: viewWithPolicies(), + Check: assert.AssertThat(t, + resourceassert.ViewResource(t, "snowflake_view.test"). + HasNameString(id.Name()). + HasStatementString(statement). + HasDatabaseString(id.DatabaseName()). + HasSchemaString(id.SchemaName()). + HasColumnLength(2), + objectassert.View(t, id). + HasMaskingPolicyReferences(acc.TestClient(), 1). + HasProjectionPolicyReferences(acc.TestClient(), 1), + ), + }, + // Remove policies on columns externally + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_View/columns"), + ConfigVariables: viewWithPolicies(), + PreConfig: func() { + acc.TestClient().View.Alter(t, sdk.NewAlterViewRequest(id).WithUnsetMaskingPolicyOnColumn(*sdk.NewViewUnsetColumnMaskingPolicyRequest("ID"))) + acc.TestClient().View.Alter(t, sdk.NewAlterViewRequest(id).WithUnsetProjectionPolicyOnColumn(*sdk.NewViewUnsetProjectionPolicyRequest("ID"))) + }, + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction("snowflake_view.test", plancheck.ResourceActionUpdate), + }, + }, + Check: assert.AssertThat(t, + resourceassert.ViewResource(t, "snowflake_view.test"). + HasNameString(id.Name()). + HasStatementString(statement). + HasDatabaseString(id.DatabaseName()). + HasSchemaString(id.SchemaName()). + HasColumnLength(2), + objectassert.View(t, id). + HasMaskingPolicyReferences(acc.TestClient(), 1). 
+ HasProjectionPolicyReferences(acc.TestClient(), 1), + ), + }, + }, + }) +} + func TestAcc_View_Rename(t *testing.T) { t.Setenv(string(testenvs.ConfigureClientOnce), "") statement := "SELECT ROLE_NAME, ROLE_OWNER FROM INFORMATION_SCHEMA.APPLICABLE_ROLES" diff --git a/pkg/resources/warehouse.go b/pkg/resources/warehouse.go index 2c6f5bd57a..8925a5ef95 100644 --- a/pkg/resources/warehouse.go +++ b/pkg/resources/warehouse.go @@ -90,7 +90,7 @@ var warehouseSchema = map[string]*schema.Schema{ Optional: true, ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), DiffSuppressFunc: SuppressIfAny(suppressIdentifierQuoting, IgnoreChangeToCurrentSnowflakeValueInShow("resource_monitor")), - Description: "Specifies the name of a resource monitor that is explicitly assigned to the warehouse.", + Description: relatedResourceDescription("Specifies the name of a resource monitor that is explicitly assigned to the warehouse.", resources.ResourceMonitor), }, "comment": { Type: schema.TypeString, diff --git a/pkg/schemas/account_gen.go b/pkg/schemas/account_gen.go index 715e1fb9cf..e6f4413875 100644 --- a/pkg/schemas/account_gen.go +++ b/pkg/schemas/account_gen.go @@ -17,11 +17,11 @@ var ShowAccountSchema = map[string]*schema.Schema{ Type: schema.TypeString, Computed: true, }, - "region_group": { + "snowflake_region": { Type: schema.TypeString, Computed: true, }, - "snowflake_region": { + "region_group": { Type: schema.TypeString, Computed: true, }, @@ -73,6 +73,58 @@ var ShowAccountSchema = map[string]*schema.Schema{ Type: schema.TypeBool, Computed: true, }, + "account_old_url_saved_on": { + Type: schema.TypeString, + Computed: true, + }, + "account_old_url_last_used": { + Type: schema.TypeString, + Computed: true, + }, + "organization_old_url": { + Type: schema.TypeString, + Computed: true, + }, + "organization_old_url_saved_on": { + Type: schema.TypeString, + Computed: true, + }, + "organization_old_url_last_used": { + Type: schema.TypeString, + Computed: true, + }, 
+ "is_events_account": { + Type: schema.TypeBool, + Computed: true, + }, + "is_organization_account": { + Type: schema.TypeBool, + Computed: true, + }, + "dropped_on": { + Type: schema.TypeString, + Computed: true, + }, + "scheduled_deletion_time": { + Type: schema.TypeString, + Computed: true, + }, + "restored_on": { + Type: schema.TypeString, + Computed: true, + }, + "moved_to_organization": { + Type: schema.TypeString, + Computed: true, + }, + "moved_on": { + Type: schema.TypeString, + Computed: true, + }, + "organization_url_expiration_on": { + Type: schema.TypeString, + Computed: true, + }, } var _ = ShowAccountSchema @@ -81,20 +133,82 @@ func AccountToSchema(account *sdk.Account) map[string]any { accountSchema := make(map[string]any) accountSchema["organization_name"] = account.OrganizationName accountSchema["account_name"] = account.AccountName - accountSchema["region_group"] = account.RegionGroup accountSchema["snowflake_region"] = account.SnowflakeRegion - accountSchema["edition"] = account.Edition - accountSchema["account_url"] = account.AccountURL - accountSchema["created_on"] = account.CreatedOn.String() - accountSchema["comment"] = account.Comment + if account.RegionGroup != nil { + accountSchema["region_group"] = account.RegionGroup + } + if account.Edition != nil { + // Manually modified, please don't re-generate + accountSchema["edition"] = string(*account.Edition) + } + if account.AccountURL != nil { + accountSchema["account_url"] = account.AccountURL + } + if account.CreatedOn != nil { + accountSchema["created_on"] = account.CreatedOn.String() + } + if account.Comment != nil { + accountSchema["comment"] = account.Comment + } accountSchema["account_locator"] = account.AccountLocator - accountSchema["account_locator_url"] = account.AccountLocatorURL - accountSchema["managed_accounts"] = account.ManagedAccounts - accountSchema["consumption_billing_entity_name"] = account.ConsumptionBillingEntityName - 
accountSchema["marketplace_consumer_billing_entity_name"] = account.MarketplaceConsumerBillingEntityName - accountSchema["marketplace_provider_billing_entity_name"] = account.MarketplaceProviderBillingEntityName - accountSchema["old_account_url"] = account.OldAccountURL - accountSchema["is_org_admin"] = account.IsOrgAdmin + if account.AccountLocatorUrl != nil { + accountSchema["account_locator_url"] = account.AccountLocatorUrl + } + if account.ManagedAccounts != nil { + accountSchema["managed_accounts"] = account.ManagedAccounts + } + if account.ConsumptionBillingEntityName != nil { + accountSchema["consumption_billing_entity_name"] = account.ConsumptionBillingEntityName + } + if account.MarketplaceConsumerBillingEntityName != nil { + accountSchema["marketplace_consumer_billing_entity_name"] = account.MarketplaceConsumerBillingEntityName + } + if account.MarketplaceProviderBillingEntityName != nil { + accountSchema["marketplace_provider_billing_entity_name"] = account.MarketplaceProviderBillingEntityName + } + if account.OldAccountURL != nil { + accountSchema["old_account_url"] = account.OldAccountURL + } + if account.IsOrgAdmin != nil { + accountSchema["is_org_admin"] = account.IsOrgAdmin + } + if account.AccountOldUrlSavedOn != nil { + accountSchema["account_old_url_saved_on"] = account.AccountOldUrlSavedOn.String() + } + if account.AccountOldUrlLastUsed != nil { + accountSchema["account_old_url_last_used"] = account.AccountOldUrlLastUsed.String() + } + if account.OrganizationOldUrl != nil { + accountSchema["organization_old_url"] = account.OrganizationOldUrl + } + if account.OrganizationOldUrlSavedOn != nil { + accountSchema["organization_old_url_saved_on"] = account.OrganizationOldUrlSavedOn.String() + } + if account.OrganizationOldUrlLastUsed != nil { + accountSchema["organization_old_url_last_used"] = account.OrganizationOldUrlLastUsed.String() + } + if account.IsEventsAccount != nil { + accountSchema["is_events_account"] = account.IsEventsAccount + } + 
accountSchema["is_organization_account"] = account.IsOrganizationAccount + if account.DroppedOn != nil { + accountSchema["dropped_on"] = account.DroppedOn.String() + } + if account.ScheduledDeletionTime != nil { + accountSchema["scheduled_deletion_time"] = account.ScheduledDeletionTime.String() + } + if account.RestoredOn != nil { + accountSchema["restored_on"] = account.RestoredOn.String() + } + if account.MovedToOrganization != nil { + accountSchema["moved_to_organization"] = account.MovedToOrganization + } + if account.MovedOn != nil { + accountSchema["moved_on"] = account.MovedOn + } + if account.OrganizationUrlExpirationOn != nil { + accountSchema["organization_url_expiration_on"] = account.OrganizationUrlExpirationOn.String() + } return accountSchema } diff --git a/pkg/schemas/account_parameters.go b/pkg/schemas/account_parameters.go new file mode 100644 index 0000000000..e5885b7967 --- /dev/null +++ b/pkg/schemas/account_parameters.go @@ -0,0 +1,71 @@ +package schemas + +import ( + "slices" + "strings" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +var ( + ShowAccountParametersSchema = make(map[string]*schema.Schema) + accountParameters = []sdk.AccountParameter{ + // TODO(SNOW-1348092 - next prs): Add parameters + // session parameters + sdk.AccountParameterAbortDetachedQuery, + sdk.AccountParameterAutocommit, + sdk.AccountParameterBinaryInputFormat, + sdk.AccountParameterBinaryOutputFormat, + sdk.AccountParameterClientMetadataRequestUseConnectionCtx, + sdk.AccountParameterClientResultColumnCaseInsensitive, + sdk.AccountParameterDateInputFormat, + sdk.AccountParameterDateOutputFormat, + sdk.AccountParameterErrorOnNondeterministicMerge, + sdk.AccountParameterErrorOnNondeterministicUpdate, + sdk.AccountParameterGeographyOutputFormat, + sdk.AccountParameterLockTimeout, + sdk.AccountParameterLogLevel, + sdk.AccountParameterMultiStatementCount, + 
sdk.AccountParameterQueryTag, + sdk.AccountParameterQuotedIdentifiersIgnoreCase, + sdk.AccountParameterRowsPerResultset, + sdk.AccountParameterS3StageVpceDnsName, + sdk.AccountParameterStatementQueuedTimeoutInSeconds, + sdk.AccountParameterStatementTimeoutInSeconds, + sdk.AccountParameterTimestampDayIsAlways24h, + sdk.AccountParameterTimestampInputFormat, + sdk.AccountParameterTimestampLtzOutputFormat, + sdk.AccountParameterTimestampNtzOutputFormat, + sdk.AccountParameterTimestampOutputFormat, + sdk.AccountParameterTimestampTypeMapping, + sdk.AccountParameterTimestampTzOutputFormat, + sdk.AccountParameterTimezone, + sdk.AccountParameterTimeInputFormat, + sdk.AccountParameterTimeOutputFormat, + sdk.AccountParameterTraceLevel, + sdk.AccountParameterTransactionAbortOnError, + sdk.AccountParameterTransactionDefaultIsolationLevel, + sdk.AccountParameterTwoDigitCenturyStart, + sdk.AccountParameterUnsupportedDdlAction, + sdk.AccountParameterUseCachedResult, + sdk.AccountParameterWeekOfYearPolicy, + sdk.AccountParameterWeekStart, + } +) + +func init() { + for _, param := range accountParameters { + ShowAccountParametersSchema[strings.ToLower(string(param))] = ParameterListSchema + } +} + +func AccountParametersToSchema(parameters []*sdk.Parameter) map[string]any { + accountParametersValue := make(map[string]any) + for _, param := range parameters { + if slices.Contains(accountParameters, sdk.AccountParameter(param.Key)) { + accountParametersValue[strings.ToLower(param.Key)] = []map[string]any{ParameterToSchema(param)} + } + } + return accountParametersValue +} diff --git a/pkg/schemas/function_gen.go b/pkg/schemas/function_gen.go index a211866ea5..f8daa18ecf 100644 --- a/pkg/schemas/function_gen.go +++ b/pkg/schemas/function_gen.go @@ -41,10 +41,10 @@ var ShowFunctionSchema = map[string]*schema.Schema{ Type: schema.TypeInt, Computed: true, }, - "arguments": { - Type: schema.TypeInvalid, - Computed: true, - }, + //"arguments_old": { + // Type: schema.TypeInvalid, + // 
Computed: true, + //}, "arguments_raw": { Type: schema.TypeString, Computed: true, @@ -69,6 +69,14 @@ var ShowFunctionSchema = map[string]*schema.Schema{ Type: schema.TypeBool, Computed: true, }, + "secrets": { + Type: schema.TypeString, + Computed: true, + }, + "external_access_integrations": { + Type: schema.TypeString, + Computed: true, + }, "is_external_function": { Type: schema.TypeBool, Computed: true, @@ -81,6 +89,10 @@ var ShowFunctionSchema = map[string]*schema.Schema{ Type: schema.TypeBool, Computed: true, }, + "is_data_metric": { + Type: schema.TypeBool, + Computed: true, + }, } var _ = ShowFunctionSchema @@ -95,16 +107,23 @@ func FunctionToSchema(function *sdk.Function) map[string]any { functionSchema["is_ansi"] = function.IsAnsi functionSchema["min_num_arguments"] = function.MinNumArguments functionSchema["max_num_arguments"] = function.MaxNumArguments - functionSchema["arguments"] = function.ArgumentsOld + // functionSchema["arguments_old"] = function.ArgumentsOld functionSchema["arguments_raw"] = function.ArgumentsRaw functionSchema["description"] = function.Description functionSchema["catalog_name"] = function.CatalogName functionSchema["is_table_function"] = function.IsTableFunction functionSchema["valid_for_clustering"] = function.ValidForClustering functionSchema["is_secure"] = function.IsSecure + if function.Secrets != nil { + functionSchema["secrets"] = function.Secrets + } + if function.ExternalAccessIntegrations != nil { + functionSchema["external_access_integrations"] = function.ExternalAccessIntegrations + } functionSchema["is_external_function"] = function.IsExternalFunction functionSchema["language"] = function.Language functionSchema["is_memoizable"] = function.IsMemoizable + functionSchema["is_data_metric"] = function.IsDataMetric return functionSchema } diff --git a/pkg/schemas/procedure_gen.go b/pkg/schemas/procedure_gen.go index 38d5937273..cd90b95298 100644 --- a/pkg/schemas/procedure_gen.go +++ b/pkg/schemas/procedure_gen.go @@ 
-41,10 +41,10 @@ var ShowProcedureSchema = map[string]*schema.Schema{ Type: schema.TypeInt, Computed: true, }, - "arguments": { - Type: schema.TypeInvalid, - Computed: true, - }, + //"arguments_old": { + // Type: schema.TypeInvalid, + // Computed: true, + //}, "arguments_raw": { Type: schema.TypeString, Computed: true, @@ -69,6 +69,14 @@ var ShowProcedureSchema = map[string]*schema.Schema{ Type: schema.TypeBool, Computed: true, }, + "secrets": { + Type: schema.TypeString, + Computed: true, + }, + "external_access_integrations": { + Type: schema.TypeString, + Computed: true, + }, } var _ = ShowProcedureSchema @@ -83,13 +91,19 @@ func ProcedureToSchema(procedure *sdk.Procedure) map[string]any { procedureSchema["is_ansi"] = procedure.IsAnsi procedureSchema["min_num_arguments"] = procedure.MinNumArguments procedureSchema["max_num_arguments"] = procedure.MaxNumArguments - procedureSchema["arguments"] = procedure.ArgumentsOld + // procedureSchema["arguments_old"] = procedure.ArgumentsOld procedureSchema["arguments_raw"] = procedure.ArgumentsRaw procedureSchema["description"] = procedure.Description procedureSchema["catalog_name"] = procedure.CatalogName procedureSchema["is_table_function"] = procedure.IsTableFunction procedureSchema["valid_for_clustering"] = procedure.ValidForClustering procedureSchema["is_secure"] = procedure.IsSecure + if procedure.Secrets != nil { + procedureSchema["secrets"] = procedure.Secrets + } + if procedure.ExternalAccessIntegrations != nil { + procedureSchema["external_access_integrations"] = procedure.ExternalAccessIntegrations + } return procedureSchema } diff --git a/pkg/sdk/accounts.go b/pkg/sdk/accounts.go index 00557cba7a..997c5f2086 100644 --- a/pkg/sdk/accounts.go +++ b/pkg/sdk/accounts.go @@ -3,9 +3,15 @@ package sdk import ( "context" "database/sql" + "encoding/json" "errors" + "fmt" + "log" + "strings" "time" + "github.com/snowflakedb/gosnowflake" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" ) 
@@ -16,7 +22,7 @@ var ( ) type Accounts interface { - Create(ctx context.Context, id AccountObjectIdentifier, opts *CreateAccountOptions) error + Create(ctx context.Context, id AccountObjectIdentifier, opts *CreateAccountOptions) (*AccountCreateResponse, error) Alter(ctx context.Context, opts *AlterAccountOptions) error Show(ctx context.Context, opts *ShowAccountOptions) ([]Account, error) ShowByID(ctx context.Context, id AccountObjectIdentifier) (*Account, error) @@ -39,6 +45,21 @@ var ( EditionBusinessCritical AccountEdition = "BUSINESS_CRITICAL" ) +var AllAccountEditions = []AccountEdition{ + EditionStandard, + EditionEnterprise, + EditionBusinessCritical, +} + +func ToAccountEdition(edition string) (AccountEdition, error) { + switch typedEdition := AccountEdition(strings.ToUpper(edition)); typedEdition { + case EditionStandard, EditionEnterprise, EditionBusinessCritical: + return typedEdition, nil + default: + return "", fmt.Errorf("unknown account edition: %s", edition) + } +} + // CreateAccountOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-account. type CreateAccountOptions struct { create bool `ddl:"static" sql:"CREATE"` @@ -81,12 +102,59 @@ func (opts *CreateAccountOptions) validate() error { return errors.Join(errs...) 
} -func (c *accounts) Create(ctx context.Context, id AccountObjectIdentifier, opts *CreateAccountOptions) error { +type AccountCreateResponse struct { + AccountLocator string `json:"accountLocator,omitempty"` + AccountLocatorUrl string `json:"accountLocatorUrl,omitempty"` + OrganizationName string + AccountName string `json:"accountName,omitempty"` + Url string `json:"url,omitempty"` + Edition AccountEdition `json:"edition,omitempty"` + RegionGroup string `json:"regionGroup,omitempty"` + Cloud string `json:"cloud,omitempty"` + Region string `json:"region,omitempty"` +} + +func ToAccountCreateResponse(v string) (*AccountCreateResponse, error) { + var res AccountCreateResponse + err := json.Unmarshal([]byte(v), &res) + if err != nil { + return nil, err + } + if len(res.Url) > 0 { + url := strings.TrimPrefix(res.Url, `https://`) + url = strings.TrimPrefix(url, `http://`) + parts := strings.SplitN(url, "-", 2) + if len(parts) == 2 { + res.OrganizationName = strings.ToUpper(parts[0]) + } + } + return &res, nil +} + +func (c *accounts) Create(ctx context.Context, id AccountObjectIdentifier, opts *CreateAccountOptions) (*AccountCreateResponse, error) { if opts == nil { opts = &CreateAccountOptions{} } opts.name = id - return validateAndExec(c.client, ctx, opts) + queryChanId := make(chan string, 1) + err := validateAndExec(c.client, gosnowflake.WithQueryIDChan(ctx, queryChanId), opts) + if err != nil { + return nil, err + } + + queryId := <-queryChanId + rows, err := c.client.QueryUnsafe(gosnowflake.WithFetchResultByID(ctx, queryId), "") + if err != nil { + log.Printf("[WARN] Unable to retrieve create account output, err = %v", err) + } + + if len(rows) == 1 && rows[0]["status"] != nil { + if status, ok := (*rows[0]["status"]).(string); ok { + return ToAccountCreateResponse(status) + } + } + + return nil, nil } // AlterAccountOptions is based on https://docs.snowflake.com/en/sql-reference/sql/alter-account. 
@@ -299,7 +367,7 @@ type Account struct { CreatedOn *time.Time Comment *string AccountLocator string - AccountLocatorURL *string + AccountLocatorUrl *string ManagedAccounts *int ConsumptionBillingEntityName *string MarketplaceConsumerBillingEntityName *string @@ -387,7 +455,7 @@ func (row accountDBRow) convert() *Account { acc.Comment = &row.Comment.String } if row.AccountLocatorURL.Valid { - acc.AccountLocatorURL = &row.AccountLocatorURL.String + acc.AccountLocatorUrl = &row.AccountLocatorURL.String } if row.ManagedAccounts.Valid { acc.ManagedAccounts = Int(int(row.ManagedAccounts.Int32)) diff --git a/pkg/sdk/accounts_test.go b/pkg/sdk/accounts_test.go index d275c82883..e072eabd71 100644 --- a/pkg/sdk/accounts_test.go +++ b/pkg/sdk/accounts_test.go @@ -1,10 +1,13 @@ package sdk import ( + "encoding/json" "fmt" "testing" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" ) func TestAccountCreate(t *testing.T) { @@ -404,3 +407,155 @@ func TestAccountShow(t *testing.T) { assertOptsValidAndSQLEquals(t, opts, `SHOW ACCOUNTS LIKE 'myaccount'`) }) } + +func TestToAccountCreateResponse(t *testing.T) { + testCases := []struct { + Name string + RawInput string + Input AccountCreateResponse + ExpectedOutput *AccountCreateResponse + Error string + }{ + { + Name: "validation: empty input", + RawInput: "", + Error: "unexpected end of JSON input", + }, + { + Name: "validation: only a few fields filled", + Input: AccountCreateResponse{ + AccountName: "acc_name", + Url: `https://org_name-acc_name.snowflakecomputing.com`, + Edition: EditionStandard, + RegionGroup: "region_group", + Cloud: "cloud", + Region: "region", + }, + ExpectedOutput: &AccountCreateResponse{ + AccountName: "acc_name", + Url: `https://org_name-acc_name.snowflakecomputing.com`, + OrganizationName: "ORG_NAME", + Edition: EditionStandard, + RegionGroup: "region_group", + Cloud: "cloud", + Region: 
"region", + }, + }, + { + Name: "validation: invalid url", + Input: AccountCreateResponse{ + Url: `https://org_name_acc_name.snowflake.computing.com`, + }, + ExpectedOutput: &AccountCreateResponse{ + Url: `https://org_name_acc_name.snowflake.computing.com`, + // OrganizationName is not filled + }, + }, + { + Name: "validation: valid url", + Input: AccountCreateResponse{ + Url: `https://org_name-acc_name.snowflakecomputing.com`, + }, + ExpectedOutput: &AccountCreateResponse{ + Url: `https://org_name-acc_name.snowflakecomputing.com`, + OrganizationName: "ORG_NAME", + }, + }, + { + Name: "validation: valid http url", + Input: AccountCreateResponse{ + Url: `http://org_name-acc_name.snowflakecomputing.com`, + }, + ExpectedOutput: &AccountCreateResponse{ + Url: `http://org_name-acc_name.snowflakecomputing.com`, + OrganizationName: "ORG_NAME", + }, + }, + { + Name: "complete", + Input: AccountCreateResponse{ + AccountLocator: "locator", + AccountLocatorUrl: "locator_url", + AccountName: "acc_name", + Url: `https://org_name-acc_name.snowflakecomputing.com`, + Edition: EditionBusinessCritical, + RegionGroup: "region_group", + Cloud: "cloud", + Region: "region", + }, + ExpectedOutput: &AccountCreateResponse{ + AccountLocator: "locator", + AccountLocatorUrl: "locator_url", + AccountName: "acc_name", + Url: `https://org_name-acc_name.snowflakecomputing.com`, + OrganizationName: "ORG_NAME", + Edition: EditionBusinessCritical, + RegionGroup: "region_group", + Cloud: "cloud", + Region: "region", + }, + }, + } + + for _, tc := range testCases { + t.Run(tc.Name, func(t *testing.T) { + input := tc.RawInput + if tc.Input != (AccountCreateResponse{}) { + bytes, err := json.Marshal(tc.Input) + if err != nil { + assert.Fail(t, err.Error()) + } + input = string(bytes) + } + + createResponse, err := ToAccountCreateResponse(input) + + if tc.Error != "" { + assert.EqualError(t, err, tc.Error) + assert.Nil(t, createResponse) + } else { + assert.NoError(t, err) + assert.Equal(t, 
tc.ExpectedOutput, createResponse) + } + }) + } +} + +func TestToAccountEdition(t *testing.T) { + type test struct { + input string + want AccountEdition + } + + valid := []test{ + // case insensitive. + {input: "standard", want: EditionStandard}, + + // Supported Values + {input: "STANDARD", want: EditionStandard}, + {input: "ENTERPRISE", want: EditionEnterprise}, + {input: "BUSINESS_CRITICAL", want: EditionBusinessCritical}, + } + + invalid := []test{ + // bad values + {input: ""}, + {input: "foo"}, + {input: "businesscritical"}, + } + + for _, tc := range valid { + t.Run(tc.input, func(t *testing.T) { + got, err := ToAccountEdition(tc.input) + require.NoError(t, err) + require.Equal(t, tc.want, got) + }) + } + + for _, tc := range invalid { + t.Run(tc.input, func(t *testing.T) { + _, err := ToAccountEdition(tc.input) + require.Error(t, err) + }) + } +} diff --git a/pkg/sdk/common_types.go b/pkg/sdk/common_types.go index 7a4975a78e..4276fe58d9 100644 --- a/pkg/sdk/common_types.go +++ b/pkg/sdk/common_types.go @@ -233,17 +233,50 @@ func NullInputBehaviorPointer(v NullInputBehavior) *NullInputBehavior { const ( NullInputBehaviorCalledOnNullInput NullInputBehavior = "CALLED ON NULL INPUT" - NullInputBehaviorReturnNullInput NullInputBehavior = "RETURN NULL ON NULL INPUT" + NullInputBehaviorReturnsNullInput NullInputBehavior = "RETURNS NULL ON NULL INPUT" NullInputBehaviorStrict NullInputBehavior = "STRICT" ) +// ToNullInputBehavior maps STRICT to RETURNS NULL ON NULL INPUT, because Snowflake returns RETURNS NULL ON NULL INPUT for either of these options +func ToNullInputBehavior(value string) (NullInputBehavior, error) { + switch strings.ToUpper(value) { + case string(NullInputBehaviorCalledOnNullInput): + return NullInputBehaviorCalledOnNullInput, nil + case string(NullInputBehaviorReturnsNullInput), string(NullInputBehaviorStrict): + return NullInputBehaviorReturnsNullInput, nil + default: + return "", fmt.Errorf("unknown null input behavior: %s", value) + } +} +
+var AllAllowedNullInputBehaviors = []NullInputBehavior{ + NullInputBehaviorCalledOnNullInput, + NullInputBehaviorReturnsNullInput, +} + type ReturnResultsBehavior string -var ( +const ( ReturnResultsBehaviorVolatile ReturnResultsBehavior = "VOLATILE" ReturnResultsBehaviorImmutable ReturnResultsBehavior = "IMMUTABLE" ) +func ToReturnResultsBehavior(value string) (ReturnResultsBehavior, error) { + switch strings.ToUpper(value) { + case string(ReturnResultsBehaviorVolatile): + return ReturnResultsBehaviorVolatile, nil + case string(ReturnResultsBehaviorImmutable): + return ReturnResultsBehaviorImmutable, nil + default: + return "", fmt.Errorf("unknown return results behavior: %s", value) + } +} + +var AllAllowedReturnResultsBehaviors = []ReturnResultsBehavior{ + ReturnResultsBehaviorVolatile, + ReturnResultsBehaviorImmutable, +} + func ReturnResultsBehaviorPointer(v ReturnResultsBehavior) *ReturnResultsBehavior { return &v } @@ -260,8 +293,9 @@ func ReturnNullValuesPointer(v ReturnNullValues) *ReturnNullValues { } type SecretReference struct { - VariableName string `ddl:"keyword,single_quotes"` - Name string `ddl:"parameter,no_quotes"` + VariableName string `ddl:"keyword,single_quotes"` + equals bool `ddl:"static" sql:"="` + Name SchemaObjectIdentifier `ddl:"identifier"` } type ValuesBehavior string @@ -356,6 +390,60 @@ var AllTraceLevels = []TraceLevel{ TraceLevelOff, } +type MetricLevel string + +const ( + MetricLevelAll MetricLevel = "ALL" + MetricLevelNone MetricLevel = "NONE" +) + +func ToMetricLevel(value string) (MetricLevel, error) { + switch strings.ToUpper(value) { + case string(MetricLevelAll): + return MetricLevelAll, nil + case string(MetricLevelNone): + return MetricLevelNone, nil + default: + return "", fmt.Errorf("unknown metric level: %s", value) + } +} + +var AllMetricLevels = []MetricLevel{ + MetricLevelAll, + MetricLevelNone, +} + +type AutoEventLogging string + +const ( + AutoEventLoggingLogging AutoEventLogging = "LOGGING" + 
AutoEventLoggingTracing AutoEventLogging = "TRACING" + AutoEventLoggingAll AutoEventLogging = "ALL" + AutoEventLoggingOff AutoEventLogging = "OFF" +) + +func ToAutoEventLogging(value string) (AutoEventLogging, error) { + switch strings.ToUpper(value) { + case string(AutoEventLoggingLogging): + return AutoEventLoggingLogging, nil + case string(AutoEventLoggingTracing): + return AutoEventLoggingTracing, nil + case string(AutoEventLoggingAll): + return AutoEventLoggingAll, nil + case string(AutoEventLoggingOff): + return AutoEventLoggingOff, nil + default: + return "", fmt.Errorf("unknown auto event logging: %s", value) + } +} + +var AllAutoEventLoggings = []AutoEventLogging{ + AutoEventLoggingLogging, + AutoEventLoggingTracing, + AutoEventLoggingAll, + AutoEventLoggingOff, +} + // StringAllowEmpty is a wrapper on string to allow using empty strings in SQL. type StringAllowEmpty struct { Value string `ddl:"keyword,single_quotes"` diff --git a/pkg/sdk/common_types_test.go b/pkg/sdk/common_types_test.go index 1c0e785a88..2cb2f55665 100644 --- a/pkg/sdk/common_types_test.go +++ b/pkg/sdk/common_types_test.go @@ -262,6 +262,71 @@ func TestToLogLevel(t *testing.T) { } } +func Test_ToNullInputBehavior(t *testing.T) { + testCases := []struct { + Name string + Input string + Expected NullInputBehavior + Error string + }{ + {Input: string(NullInputBehaviorCalledOnNullInput), Expected: NullInputBehaviorCalledOnNullInput}, + {Input: string(NullInputBehaviorReturnsNullInput), Expected: NullInputBehaviorReturnsNullInput}, + {Input: string(NullInputBehaviorStrict), Expected: NullInputBehaviorReturnsNullInput}, + {Name: "validation: incorrect null input behavior", Input: "incorrect", Error: "unknown null input behavior: incorrect"}, + {Name: "validation: empty input", Input: "", Error: "unknown null input behavior: "}, + {Name: "validation: lower case input", Input: "called on null input", Expected: NullInputBehaviorCalledOnNullInput}, + } + + for _, testCase := range testCases { + 
name := testCase.Name + if name == "" { + name = fmt.Sprintf("%v null input behavior", testCase.Input) + } + t.Run(name, func(t *testing.T) { + value, err := ToNullInputBehavior(testCase.Input) + if testCase.Error != "" { + assert.Empty(t, value) + assert.ErrorContains(t, err, testCase.Error) + } else { + assert.NoError(t, err) + assert.Equal(t, testCase.Expected, value) + } + }) + } +} + +func Test_ToReturnResultsBehavior(t *testing.T) { + testCases := []struct { + Name string + Input string + Expected ReturnResultsBehavior + Error string + }{ + {Input: string(ReturnResultsBehaviorVolatile), Expected: ReturnResultsBehaviorVolatile}, + {Input: string(ReturnResultsBehaviorImmutable), Expected: ReturnResultsBehaviorImmutable}, + {Name: "validation: incorrect return results behavior", Input: "incorrect", Error: "unknown return results behavior: incorrect"}, + {Name: "validation: empty input", Input: "", Error: "unknown return results behavior: "}, + {Name: "validation: lower case input", Input: "volatile", Expected: ReturnResultsBehaviorVolatile}, + } + + for _, testCase := range testCases { + name := testCase.Name + if name == "" { + name = fmt.Sprintf("%v return results behavior", testCase.Input) + } + t.Run(name, func(t *testing.T) { + value, err := ToReturnResultsBehavior(testCase.Input) + if testCase.Error != "" { + assert.Empty(t, value) + assert.ErrorContains(t, err, testCase.Error) + } else { + assert.NoError(t, err) + assert.Equal(t, testCase.Expected, value) + } + }) + } +} + func TestToTraceLevel(t *testing.T) { testCases := []struct { Name string @@ -294,3 +359,71 @@ func TestToTraceLevel(t *testing.T) { }) } } + +func Test_ToMetricLevel(t *testing.T) { + testCases := []struct { + Name string + Input string + Expected MetricLevel + ExpectedError string + }{ + {Input: string(MetricLevelAll), Expected: MetricLevelAll}, + {Input: string(MetricLevelNone), Expected: MetricLevelNone}, + {Name: "validation: incorrect metric level", Input: "incorrect", ExpectedError:
"unknown metric level: incorrect"}, + {Name: "validation: empty input", Input: "", ExpectedError: "unknown metric level: "}, + {Name: "validation: lower case input", Input: "all", Expected: MetricLevelAll}, + } + + for _, tc := range testCases { + tc := tc + name := tc.Name + if name == "" { + name = fmt.Sprintf("%v metric level", tc.Input) + } + t.Run(name, func(t *testing.T) { + value, err := ToMetricLevel(tc.Input) + if tc.ExpectedError != "" { + assert.Empty(t, value) + assert.ErrorContains(t, err, tc.ExpectedError) + } else { + assert.NoError(t, err) + assert.Equal(t, tc.Expected, value) + } + }) + } +} + +func Test_ToAutoEventLogging(t *testing.T) { + testCases := []struct { + Name string + Input string + Expected AutoEventLogging + ExpectedError string + }{ + {Input: string(AutoEventLoggingLogging), Expected: AutoEventLoggingLogging}, + {Input: string(AutoEventLoggingTracing), Expected: AutoEventLoggingTracing}, + {Input: string(AutoEventLoggingAll), Expected: AutoEventLoggingAll}, + {Input: string(AutoEventLoggingOff), Expected: AutoEventLoggingOff}, + {Name: "validation: incorrect auto event logging", Input: "incorrect", ExpectedError: "unknown auto event logging: incorrect"}, + {Name: "validation: empty input", Input: "", ExpectedError: "unknown auto event logging: "}, + {Name: "validation: lower case input", Input: "all", Expected: AutoEventLoggingAll}, + } + + for _, tc := range testCases { + tc := tc + name := tc.Name + if name == "" { + name = fmt.Sprintf("%v auto event logging", tc.Input) + } + t.Run(name, func(t *testing.T) { + value, err := ToAutoEventLogging(tc.Input) + if tc.ExpectedError != "" { + assert.Empty(t, value) + assert.ErrorContains(t, err, tc.ExpectedError) + } else { + assert.NoError(t, err) + assert.Equal(t, tc.Expected, value) + } + }) + } +} diff --git a/pkg/sdk/functions_def.go b/pkg/sdk/functions_def.go index 825c1d2551..1c92cd4078 100644 --- a/pkg/sdk/functions_def.go +++ b/pkg/sdk/functions_def.go @@ -5,14 +5,14 @@ import g 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/poc/gen //go:generate go run ./poc/main.go var functionArgument = g.NewQueryStruct("FunctionArgument"). - Text("ArgName", g.KeywordOptions().NoQuotes().Required()). + Text("ArgName", g.KeywordOptions().DoubleQuotes().Required()). PredefinedQueryStructField("ArgDataTypeOld", "DataType", g.KeywordOptions().NoQuotes()). PredefinedQueryStructField("ArgDataType", "datatypes.DataType", g.ParameterOptions().NoQuotes().NoEquals().Required()). PredefinedQueryStructField("DefaultValue", "*string", g.ParameterOptions().NoEquals().SQL("DEFAULT")). WithValidation(g.ExactlyOneValueSet, "ArgDataTypeOld", "ArgDataType") var functionColumn = g.NewQueryStruct("FunctionColumn"). - Text("ColumnName", g.KeywordOptions().NoQuotes().Required()). + Text("ColumnName", g.KeywordOptions().DoubleQuotes().Required()). PredefinedQueryStructField("ColumnDataTypeOld", "DataType", g.KeywordOptions().NoQuotes()). PredefinedQueryStructField("ColumnDataType", "datatypes.DataType", g.ParameterOptions().NoQuotes().NoEquals().Required()). WithValidation(g.ExactlyOneValueSet, "ColumnDataTypeOld", "ColumnDataType") @@ -38,8 +38,10 @@ var functionReturns = g.NewQueryStruct("FunctionReturns"). ).WithValidation(g.ExactlyOneValueSet, "ResultDataType", "Table") var ( - functionImports = g.NewQueryStruct("FunctionImport").Text("Import", g.KeywordOptions().SingleQuotes()) - functionPackages = g.NewQueryStruct("FunctionPackage").Text("Package", g.KeywordOptions().SingleQuotes()) + functionImports = g.NewQueryStruct("FunctionImport").Text("Import", g.KeywordOptions().SingleQuotes()) + functionPackages = g.NewQueryStruct("FunctionPackage").Text("Package", g.KeywordOptions().SingleQuotes()) + functionSecretsListWrapper = g.NewQueryStruct("SecretsList"). 
+ List("SecretsList", "SecretReference", g.ListOptions().Required().MustParentheses()) ) var FunctionsDef = g.NewInterface( @@ -87,7 +89,11 @@ var FunctionsDef = g.NewInterface( ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). ListAssignment("SECRETS", "SecretReference", g.ParameterOptions().Parentheses()). OptionalTextAssignment("TARGET_PATH", g.ParameterOptions().SingleQuotes()). - PredefinedQueryStructField("FunctionDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). + OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + PredefinedQueryStructField("FunctionDefinition", "*string", g.ParameterOptions().NoEquals().SQL("AS")). WithValidation(g.ValidIdentifier, "name"). WithValidation(g.ValidateValueSet, "Handler"). WithValidation(g.ConflictingFields, "OrReplace", "IfNotExists"), @@ -116,7 +122,11 @@ var FunctionsDef = g.NewInterface( PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). - PredefinedQueryStructField("FunctionDefinition", "string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS").Required()). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). + OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). 
+ OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + PredefinedQueryStructField("FunctionDefinition", "string", g.ParameterOptions().NoEquals().SQL("AS").Required()). WithValidation(g.ValidateValueSet, "FunctionDefinition"). WithValidation(g.ValidIdentifier, "name"), ).CustomOperation( @@ -127,6 +137,7 @@ var FunctionsDef = g.NewInterface( OrReplace(). OptionalSQL("TEMPORARY"). OptionalSQL("SECURE"). + OptionalSQL("AGGREGATE"). SQL("FUNCTION"). IfNotExists(). Identifier("name", g.KindOfT[SchemaObjectIdentifier](), g.IdentifierOptions().Required()). @@ -159,7 +170,11 @@ var FunctionsDef = g.NewInterface( TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). ListAssignment("SECRETS", "SecretReference", g.ParameterOptions().Parentheses()). - PredefinedQueryStructField("FunctionDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). + OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + PredefinedQueryStructField("FunctionDefinition", "*string", g.ParameterOptions().NoEquals().SQL("AS")). WithValidation(g.ValidIdentifier, "name"). WithValidation(g.ValidateValueSet, "RuntimeVersion"). WithValidation(g.ValidateValueSet, "Handler"). @@ -187,7 +202,7 @@ var FunctionsDef = g.NewInterface( SQL("LANGUAGE SCALA"). PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). 
- OptionalTextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes()). + TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). ListQueryStructField( "Imports", @@ -200,8 +215,14 @@ var FunctionsDef = g.NewInterface( g.ParameterOptions().Parentheses().SQL("PACKAGES"), ). TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). + ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). + ListAssignment("SECRETS", "SecretReference", g.ParameterOptions().Parentheses()). OptionalTextAssignment("TARGET_PATH", g.ParameterOptions().SingleQuotes()). - PredefinedQueryStructField("FunctionDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). + OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + PredefinedQueryStructField("FunctionDefinition", "*string", g.ParameterOptions().NoEquals().SQL("AS")). WithValidation(g.ValidIdentifier, "name"). WithValidation(g.ValidateValueSet, "Handler"). WithValidation(g.ConflictingFields, "OrReplace", "IfNotExists"). @@ -230,7 +251,11 @@ var FunctionsDef = g.NewInterface( PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). OptionalSQL("MEMOIZABLE"). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). - PredefinedQueryStructField("FunctionDefinition", "string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS").Required()). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). 
+ OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + PredefinedQueryStructField("FunctionDefinition", "string", g.ParameterOptions().NoEquals().SQL("AS").Required()). WithValidation(g.ValidateValueSet, "FunctionDefinition"). WithValidation(g.ValidIdentifier, "name"), ).AlterOperation( @@ -241,19 +266,38 @@ var FunctionsDef = g.NewInterface( IfExists(). Name(). Identifier("RenameTo", g.KindOfTPointer[SchemaObjectIdentifier](), g.IdentifierOptions().SQL("RENAME TO")). - OptionalTextAssignment("SET COMMENT", g.ParameterOptions().SingleQuotes()). - OptionalTextAssignment("SET LOG_LEVEL", g.ParameterOptions().SingleQuotes()). - OptionalTextAssignment("SET TRACE_LEVEL", g.ParameterOptions().SingleQuotes()). + OptionalQueryStructField( + "Set", + g.NewQueryStruct("FunctionSet"). + OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). + ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). + OptionalQueryStructField("SecretsList", functionSecretsListWrapper, g.ParameterOptions().SQL("SECRETS").Parentheses()). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). + OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + WithValidation(g.AtLeastOneValueSet, "Comment", "ExternalAccessIntegrations", "SecretsList", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel"), + g.ListOptions().SQL("SET"), + ). 
+ OptionalQueryStructField( + "Unset", + g.NewQueryStruct("FunctionUnset"). + OptionalSQL("COMMENT"). + OptionalSQL("EXTERNAL_ACCESS_INTEGRATIONS"). + OptionalSQL("ENABLE_CONSOLE_OUTPUT"). + OptionalSQL("LOG_LEVEL"). + OptionalSQL("METRIC_LEVEL"). + OptionalSQL("TRACE_LEVEL"). + WithValidation(g.AtLeastOneValueSet, "Comment", "ExternalAccessIntegrations", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel"), + g.ListOptions().SQL("UNSET"), + ). OptionalSQL("SET SECURE"). OptionalSQL("UNSET SECURE"). - OptionalSQL("UNSET LOG_LEVEL"). - OptionalSQL("UNSET TRACE_LEVEL"). - OptionalSQL("UNSET COMMENT"). OptionalSetTags(). OptionalUnsetTags(). WithValidation(g.ValidIdentifier, "name"). WithValidation(g.ValidIdentifierIfSet, "RenameTo"). - WithValidation(g.ExactlyOneValueSet, "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "SetSecure", "UnsetLogLevel", "UnsetTraceLevel", "UnsetSecure", "UnsetComment", "SetTags", "UnsetTags"), + WithValidation(g.ExactlyOneValueSet, "RenameTo", "Set", "Unset", "SetSecure", "UnsetSecure", "SetTags", "UnsetTags"), ).DropOperation( "https://docs.snowflake.com/en/sql-reference/sql/drop-function", g.NewQueryStruct("DropFunction"). @@ -279,9 +323,12 @@ var FunctionsDef = g.NewInterface( Field("is_table_function", "string"). Field("valid_for_clustering", "string"). Field("is_secure", "sql.NullString"). + OptionalText("secrets"). + OptionalText("external_access_integrations"). Field("is_external_function", "string"). Field("language", "string"). - Field("is_memoizable", "sql.NullString"), + Field("is_memoizable", "sql.NullString"). + Field("is_data_metric", "sql.NullString"), g.PlainStruct("Function"). Field("CreatedOn", "string"). Field("Name", "string"). @@ -297,14 +344,17 @@ var FunctionsDef = g.NewInterface( Field("IsTableFunction", "bool"). Field("ValidForClustering", "bool"). Field("IsSecure", "bool"). + OptionalText("Secrets"). + OptionalText("ExternalAccessIntegrations"). Field("IsExternalFunction", "bool"). 
Field("Language", "string"). - Field("IsMemoizable", "bool"), + Field("IsMemoizable", "bool"). + Field("IsDataMetric", "bool"), g.NewQueryStruct("ShowFunctions"). Show(). SQL("USER FUNCTIONS"). OptionalLike(). - OptionalIn(), + OptionalExtendedIn(), ).ShowByIdOperation().DescribeOperation( g.DescriptionMappingKindSlice, "https://docs.snowflake.com/en/sql-reference/sql/desc-function", @@ -312,8 +362,8 @@ var FunctionsDef = g.NewInterface( Field("property", "string"). Field("value", "sql.NullString"), g.PlainStruct("FunctionDetail"). - Field("Property", "string"). - Field("Value", "string"), + Text("Property"). + OptionalText("Value"), g.NewQueryStruct("DescribeFunction"). Describe(). SQL("FUNCTION"). diff --git a/pkg/sdk/functions_dto_builders_gen.go b/pkg/sdk/functions_dto_builders_gen.go index 3bb40dfd0e..7d0a49180a 100644 --- a/pkg/sdk/functions_dto_builders_gen.go +++ b/pkg/sdk/functions_dto_builders_gen.go @@ -99,6 +99,26 @@ func (s *CreateForJavaFunctionRequest) WithTargetPath(TargetPath string) *Create return s } +func (s *CreateForJavaFunctionRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *CreateForJavaFunctionRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *CreateForJavaFunctionRequest) WithLogLevel(LogLevel LogLevel) *CreateForJavaFunctionRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *CreateForJavaFunctionRequest) WithMetricLevel(MetricLevel MetricLevel) *CreateForJavaFunctionRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *CreateForJavaFunctionRequest) WithTraceLevel(TraceLevel TraceLevel) *CreateForJavaFunctionRequest { + s.TraceLevel = &TraceLevel + return s +} + func (s *CreateForJavaFunctionRequest) WithFunctionDefinition(FunctionDefinition string) *CreateForJavaFunctionRequest { s.FunctionDefinition = &FunctionDefinition return s @@ -250,6 +270,26 @@ func (s *CreateForJavascriptFunctionRequest) WithComment(Comment string) *Create return s } +func (s 
*CreateForJavascriptFunctionRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *CreateForJavascriptFunctionRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *CreateForJavascriptFunctionRequest) WithLogLevel(LogLevel LogLevel) *CreateForJavascriptFunctionRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *CreateForJavascriptFunctionRequest) WithMetricLevel(MetricLevel MetricLevel) *CreateForJavascriptFunctionRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *CreateForJavascriptFunctionRequest) WithTraceLevel(TraceLevel TraceLevel) *CreateForJavascriptFunctionRequest { + s.TraceLevel = &TraceLevel + return s +} + func NewCreateForPythonFunctionRequest( name SchemaObjectIdentifier, Returns FunctionReturnsRequest, @@ -279,6 +319,11 @@ func (s *CreateForPythonFunctionRequest) WithSecure(Secure bool) *CreateForPytho return s } +func (s *CreateForPythonFunctionRequest) WithAggregate(Aggregate bool) *CreateForPythonFunctionRequest { + s.Aggregate = &Aggregate + return s +} + func (s *CreateForPythonFunctionRequest) WithIfNotExists(IfNotExists bool) *CreateForPythonFunctionRequest { s.IfNotExists = &IfNotExists return s @@ -334,6 +379,26 @@ func (s *CreateForPythonFunctionRequest) WithSecrets(Secrets []SecretReference) return s } +func (s *CreateForPythonFunctionRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *CreateForPythonFunctionRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *CreateForPythonFunctionRequest) WithLogLevel(LogLevel LogLevel) *CreateForPythonFunctionRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *CreateForPythonFunctionRequest) WithMetricLevel(MetricLevel MetricLevel) *CreateForPythonFunctionRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *CreateForPythonFunctionRequest) WithTraceLevel(TraceLevel TraceLevel) *CreateForPythonFunctionRequest { + s.TraceLevel = &TraceLevel + return s +} + func (s 
*CreateForPythonFunctionRequest) WithFunctionDefinition(FunctionDefinition string) *CreateForPythonFunctionRequest { s.FunctionDefinition = &FunctionDefinition return s @@ -343,11 +408,13 @@ func NewCreateForScalaFunctionRequest( name SchemaObjectIdentifier, ResultDataType datatypes.DataType, Handler string, + RuntimeVersion string, ) *CreateForScalaFunctionRequest { s := CreateForScalaFunctionRequest{} s.name = name s.ResultDataType = ResultDataType s.Handler = Handler + s.RuntimeVersion = RuntimeVersion return &s } @@ -401,11 +468,6 @@ func (s *CreateForScalaFunctionRequest) WithReturnResultsBehavior(ReturnResultsB return s } -func (s *CreateForScalaFunctionRequest) WithRuntimeVersion(RuntimeVersion string) *CreateForScalaFunctionRequest { - s.RuntimeVersion = &RuntimeVersion - return s -} - func (s *CreateForScalaFunctionRequest) WithComment(Comment string) *CreateForScalaFunctionRequest { s.Comment = &Comment return s @@ -421,11 +483,41 @@ func (s *CreateForScalaFunctionRequest) WithPackages(Packages []FunctionPackageR return s } +func (s *CreateForScalaFunctionRequest) WithExternalAccessIntegrations(ExternalAccessIntegrations []AccountObjectIdentifier) *CreateForScalaFunctionRequest { + s.ExternalAccessIntegrations = ExternalAccessIntegrations + return s +} + +func (s *CreateForScalaFunctionRequest) WithSecrets(Secrets []SecretReference) *CreateForScalaFunctionRequest { + s.Secrets = Secrets + return s +} + func (s *CreateForScalaFunctionRequest) WithTargetPath(TargetPath string) *CreateForScalaFunctionRequest { s.TargetPath = &TargetPath return s } +func (s *CreateForScalaFunctionRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *CreateForScalaFunctionRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *CreateForScalaFunctionRequest) WithLogLevel(LogLevel LogLevel) *CreateForScalaFunctionRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *CreateForScalaFunctionRequest) WithMetricLevel(MetricLevel MetricLevel) 
*CreateForScalaFunctionRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *CreateForScalaFunctionRequest) WithTraceLevel(TraceLevel TraceLevel) *CreateForScalaFunctionRequest { + s.TraceLevel = &TraceLevel + return s +} + func (s *CreateForScalaFunctionRequest) WithFunctionDefinition(FunctionDefinition string) *CreateForScalaFunctionRequest { s.FunctionDefinition = &FunctionDefinition return s @@ -488,6 +580,26 @@ func (s *CreateForSQLFunctionRequest) WithComment(Comment string) *CreateForSQLF return s } +func (s *CreateForSQLFunctionRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *CreateForSQLFunctionRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *CreateForSQLFunctionRequest) WithLogLevel(LogLevel LogLevel) *CreateForSQLFunctionRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *CreateForSQLFunctionRequest) WithMetricLevel(MetricLevel MetricLevel) *CreateForSQLFunctionRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *CreateForSQLFunctionRequest) WithTraceLevel(TraceLevel TraceLevel) *CreateForSQLFunctionRequest { + s.TraceLevel = &TraceLevel + return s +} + func NewAlterFunctionRequest( name SchemaObjectIdentifierWithArguments, ) *AlterFunctionRequest { @@ -506,18 +618,13 @@ func (s *AlterFunctionRequest) WithRenameTo(RenameTo SchemaObjectIdentifier) *Al return s } -func (s *AlterFunctionRequest) WithSetComment(SetComment string) *AlterFunctionRequest { - s.SetComment = &SetComment - return s -} - -func (s *AlterFunctionRequest) WithSetLogLevel(SetLogLevel string) *AlterFunctionRequest { - s.SetLogLevel = &SetLogLevel +func (s *AlterFunctionRequest) WithSet(Set FunctionSetRequest) *AlterFunctionRequest { + s.Set = &Set return s } -func (s *AlterFunctionRequest) WithSetTraceLevel(SetTraceLevel string) *AlterFunctionRequest { - s.SetTraceLevel = &SetTraceLevel +func (s *AlterFunctionRequest) WithUnset(Unset FunctionUnsetRequest) *AlterFunctionRequest { + s.Unset = &Unset return s 
} @@ -531,28 +638,94 @@ func (s *AlterFunctionRequest) WithUnsetSecure(UnsetSecure bool) *AlterFunctionR return s } -func (s *AlterFunctionRequest) WithUnsetLogLevel(UnsetLogLevel bool) *AlterFunctionRequest { - s.UnsetLogLevel = &UnsetLogLevel +func (s *AlterFunctionRequest) WithSetTags(SetTags []TagAssociation) *AlterFunctionRequest { + s.SetTags = SetTags return s } -func (s *AlterFunctionRequest) WithUnsetTraceLevel(UnsetTraceLevel bool) *AlterFunctionRequest { - s.UnsetTraceLevel = &UnsetTraceLevel +func (s *AlterFunctionRequest) WithUnsetTags(UnsetTags []ObjectIdentifier) *AlterFunctionRequest { + s.UnsetTags = UnsetTags return s } -func (s *AlterFunctionRequest) WithUnsetComment(UnsetComment bool) *AlterFunctionRequest { - s.UnsetComment = &UnsetComment +func NewFunctionSetRequest() *FunctionSetRequest { + return &FunctionSetRequest{} +} + +func (s *FunctionSetRequest) WithComment(Comment string) *FunctionSetRequest { + s.Comment = &Comment return s } -func (s *AlterFunctionRequest) WithSetTags(SetTags []TagAssociation) *AlterFunctionRequest { - s.SetTags = SetTags +func (s *FunctionSetRequest) WithExternalAccessIntegrations(ExternalAccessIntegrations []AccountObjectIdentifier) *FunctionSetRequest { + s.ExternalAccessIntegrations = ExternalAccessIntegrations return s } -func (s *AlterFunctionRequest) WithUnsetTags(UnsetTags []ObjectIdentifier) *AlterFunctionRequest { - s.UnsetTags = UnsetTags +func (s *FunctionSetRequest) WithSecretsList(SecretsList SecretsListRequest) *FunctionSetRequest { + s.SecretsList = &SecretsList + return s +} + +func (s *FunctionSetRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *FunctionSetRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *FunctionSetRequest) WithLogLevel(LogLevel LogLevel) *FunctionSetRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *FunctionSetRequest) WithMetricLevel(MetricLevel MetricLevel) *FunctionSetRequest { + s.MetricLevel = &MetricLevel + return s +} + 
+func (s *FunctionSetRequest) WithTraceLevel(TraceLevel TraceLevel) *FunctionSetRequest { + s.TraceLevel = &TraceLevel + return s +} + +func NewSecretsListRequest( + SecretsList []SecretReference, +) *SecretsListRequest { + s := SecretsListRequest{} + s.SecretsList = SecretsList + return &s +} + +func NewFunctionUnsetRequest() *FunctionUnsetRequest { + return &FunctionUnsetRequest{} +} + +func (s *FunctionUnsetRequest) WithComment(Comment bool) *FunctionUnsetRequest { + s.Comment = &Comment + return s +} + +func (s *FunctionUnsetRequest) WithExternalAccessIntegrations(ExternalAccessIntegrations bool) *FunctionUnsetRequest { + s.ExternalAccessIntegrations = &ExternalAccessIntegrations + return s +} + +func (s *FunctionUnsetRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *FunctionUnsetRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *FunctionUnsetRequest) WithLogLevel(LogLevel bool) *FunctionUnsetRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *FunctionUnsetRequest) WithMetricLevel(MetricLevel bool) *FunctionUnsetRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *FunctionUnsetRequest) WithTraceLevel(TraceLevel bool) *FunctionUnsetRequest { + s.TraceLevel = &TraceLevel return s } @@ -578,7 +751,7 @@ func (s *ShowFunctionRequest) WithLike(Like Like) *ShowFunctionRequest { return s } -func (s *ShowFunctionRequest) WithIn(In In) *ShowFunctionRequest { +func (s *ShowFunctionRequest) WithIn(In ExtendedIn) *ShowFunctionRequest { s.In = &In return s } diff --git a/pkg/sdk/functions_dto_gen.go b/pkg/sdk/functions_dto_gen.go index 4ff74dcd73..14e86f6260 100644 --- a/pkg/sdk/functions_dto_gen.go +++ b/pkg/sdk/functions_dto_gen.go @@ -36,6 +36,10 @@ type CreateForJavaFunctionRequest struct { ExternalAccessIntegrations []AccountObjectIdentifier Secrets []SecretReference TargetPath *string + EnableConsoleOutput *bool + LogLevel *LogLevel + MetricLevel *MetricLevel + TraceLevel *TraceLevel FunctionDefinition 
*string } @@ -86,6 +90,10 @@ type CreateForJavascriptFunctionRequest struct { NullInputBehavior *NullInputBehavior ReturnResultsBehavior *ReturnResultsBehavior Comment *string + EnableConsoleOutput *bool + LogLevel *LogLevel + MetricLevel *MetricLevel + TraceLevel *TraceLevel FunctionDefinition string // required } @@ -93,6 +101,7 @@ type CreateForPythonFunctionRequest struct { OrReplace *bool Temporary *bool Secure *bool + Aggregate *bool IfNotExists *bool name SchemaObjectIdentifier // required Arguments []FunctionArgumentRequest @@ -108,29 +117,39 @@ type CreateForPythonFunctionRequest struct { Handler string // required ExternalAccessIntegrations []AccountObjectIdentifier Secrets []SecretReference + EnableConsoleOutput *bool + LogLevel *LogLevel + MetricLevel *MetricLevel + TraceLevel *TraceLevel FunctionDefinition *string } type CreateForScalaFunctionRequest struct { - OrReplace *bool - Temporary *bool - Secure *bool - IfNotExists *bool - name SchemaObjectIdentifier // required - Arguments []FunctionArgumentRequest - CopyGrants *bool - ResultDataTypeOld DataType - ResultDataType datatypes.DataType // required - ReturnNullValues *ReturnNullValues - NullInputBehavior *NullInputBehavior - ReturnResultsBehavior *ReturnResultsBehavior - RuntimeVersion *string - Comment *string - Imports []FunctionImportRequest - Packages []FunctionPackageRequest - Handler string // required - TargetPath *string - FunctionDefinition *string + OrReplace *bool + Temporary *bool + Secure *bool + IfNotExists *bool + name SchemaObjectIdentifier // required + Arguments []FunctionArgumentRequest + CopyGrants *bool + ResultDataTypeOld DataType + ResultDataType datatypes.DataType // required + ReturnNullValues *ReturnNullValues + NullInputBehavior *NullInputBehavior + ReturnResultsBehavior *ReturnResultsBehavior + RuntimeVersion string // required + Comment *string + Imports []FunctionImportRequest + Packages []FunctionPackageRequest + Handler string // required + ExternalAccessIntegrations 
[]AccountObjectIdentifier + Secrets []SecretReference + TargetPath *string + EnableConsoleOutput *bool + LogLevel *LogLevel + MetricLevel *MetricLevel + TraceLevel *TraceLevel + FunctionDefinition *string } type CreateForSQLFunctionRequest struct { @@ -145,23 +164,46 @@ type CreateForSQLFunctionRequest struct { ReturnResultsBehavior *ReturnResultsBehavior Memoizable *bool Comment *string + EnableConsoleOutput *bool + LogLevel *LogLevel + MetricLevel *MetricLevel + TraceLevel *TraceLevel FunctionDefinition string // required } type AlterFunctionRequest struct { - IfExists *bool - name SchemaObjectIdentifierWithArguments // required - RenameTo *SchemaObjectIdentifier - SetComment *string - SetLogLevel *string - SetTraceLevel *string - SetSecure *bool - UnsetSecure *bool - UnsetLogLevel *bool - UnsetTraceLevel *bool - UnsetComment *bool - SetTags []TagAssociation - UnsetTags []ObjectIdentifier + IfExists *bool + name SchemaObjectIdentifierWithArguments // required + RenameTo *SchemaObjectIdentifier + Set *FunctionSetRequest + Unset *FunctionUnsetRequest + SetSecure *bool + UnsetSecure *bool + SetTags []TagAssociation + UnsetTags []ObjectIdentifier +} + +type FunctionSetRequest struct { + Comment *string + ExternalAccessIntegrations []AccountObjectIdentifier + SecretsList *SecretsListRequest + EnableConsoleOutput *bool + LogLevel *LogLevel + MetricLevel *MetricLevel + TraceLevel *TraceLevel +} + +type SecretsListRequest struct { + SecretsList []SecretReference // required +} + +type FunctionUnsetRequest struct { + Comment *bool + ExternalAccessIntegrations *bool + EnableConsoleOutput *bool + LogLevel *bool + MetricLevel *bool + TraceLevel *bool } type DropFunctionRequest struct { @@ -171,7 +213,7 @@ type DropFunctionRequest struct { type ShowFunctionRequest struct { Like *Like - In *In + In *ExtendedIn } type DescribeFunctionRequest struct { diff --git a/pkg/sdk/functions_ext.go b/pkg/sdk/functions_ext.go index 4fe8a9524d..531ddfd9fa 100644 --- 
a/pkg/sdk/functions_ext.go +++ b/pkg/sdk/functions_ext.go @@ -1,5 +1,149 @@ package sdk +import ( + "context" + "errors" + "fmt" + "strconv" +) + +const DefaultFunctionComment = "user-defined function" + func (v *Function) ID() SchemaObjectIdentifierWithArguments { return NewSchemaObjectIdentifierWithArguments(v.CatalogName, v.SchemaName, v.Name, v.ArgumentsOld...) } + +// FunctionDetails contains aggregated describe results for the given function. +type FunctionDetails struct { + Signature string // present for all function types + Returns string // present for all function types + Language string // present for all function types + Body *string // present for all function types (hidden when SECURE) + NullHandling *string // present for all function types but SQL + Volatility *string // present for all function types but SQL + ExternalAccessIntegrations *string // list present for python, java, and scala + Secrets *string // map present for python, java, and scala + Imports *string // list present for python, java, and scala (hidden when SECURE) + Handler *string // present for python, java, and scala (hidden when SECURE) + RuntimeVersion *string // present for python, java, and scala (hidden when SECURE) + Packages *string // list // present for python, java, and scala + TargetPath *string // list present for scala and java (hidden when SECURE) + InstalledPackages *string // list present for python (hidden when SECURE) + IsAggregate *bool // present for python +} + +func functionDetailsFromRows(rows []FunctionDetail) (*FunctionDetails, error) { + v := &FunctionDetails{} + var errs []error + for _, row := range rows { + switch row.Property { + case "signature": + errs = append(errs, row.setStringValueOrError("signature", &v.Signature)) + case "returns": + errs = append(errs, row.setStringValueOrError("returns", &v.Returns)) + case "language": + errs = append(errs, row.setStringValueOrError("language", &v.Language)) + case "null handling": + v.NullHandling = 
row.Value + case "volatility": + v.Volatility = row.Value + case "body": + v.Body = row.Value + case "external_access_integrations": + v.ExternalAccessIntegrations = row.Value + case "secrets": + v.Secrets = row.Value + case "imports": + v.Imports = row.Value + case "handler": + v.Handler = row.Value + case "runtime_version": + v.RuntimeVersion = row.Value + case "packages": + v.Packages = row.Value + case "installed_packages": + v.InstalledPackages = row.Value + case "is_aggregate": + errs = append(errs, row.setOptionalBoolValueOrError("is_aggregate", &v.IsAggregate)) + case "target_path": + v.TargetPath = row.Value + } + } + return v, errors.Join(errs...) +} + +func (v *functions) DescribeDetails(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*FunctionDetails, error) { + rows, err := v.Describe(ctx, id) + if err != nil { + return nil, err + } + return functionDetailsFromRows(rows) +} + +func (v *functions) ShowParameters(ctx context.Context, id SchemaObjectIdentifierWithArguments) ([]*Parameter, error) { + return v.client.Parameters.ShowParameters(ctx, &ShowParametersOptions{ + In: &ParametersIn{ + Function: id, + }, + }) +} + +func (d *FunctionDetail) setStringValueOrError(property string, field *string) error { + if d.Value == nil { + return fmt.Errorf("value expected for field %s", property) + } else { + *field = *d.Value + } + return nil +} + +func (d *FunctionDetail) setOptionalBoolValueOrError(property string, field **bool) error { + if d.Value != nil && *d.Value != "" { + v, err := strconv.ParseBool(*d.Value) + if err != nil { + return fmt.Errorf("invalid value for field %s, err: %w", property, err) + } else { + *field = Bool(v) + } + } + return nil +} + +func (s *CreateForJavaFunctionRequest) WithFunctionDefinitionWrapped(functionDefinition string) *CreateForJavaFunctionRequest { + s.FunctionDefinition = String(fmt.Sprintf(`$$%s$$`, functionDefinition)) + return s +} + +func (s *CreateForPythonFunctionRequest) 
WithFunctionDefinitionWrapped(functionDefinition string) *CreateForPythonFunctionRequest { + s.FunctionDefinition = String(fmt.Sprintf(`$$%s$$`, functionDefinition)) + return s +} + +func (s *CreateForScalaFunctionRequest) WithFunctionDefinitionWrapped(functionDefinition string) *CreateForScalaFunctionRequest { + s.FunctionDefinition = String(fmt.Sprintf(`$$%s$$`, functionDefinition)) + return s +} + +func NewCreateForSQLFunctionRequestDefinitionWrapped( + name SchemaObjectIdentifier, + returns FunctionReturnsRequest, + functionDefinition string, +) *CreateForSQLFunctionRequest { + s := CreateForSQLFunctionRequest{} + s.name = name + s.Returns = returns + s.FunctionDefinition = fmt.Sprintf(`$$%s$$`, functionDefinition) + return &s +} + +func NewCreateForJavascriptFunctionRequestDefinitionWrapped( + name SchemaObjectIdentifier, + returns FunctionReturnsRequest, + functionDefinition string, +) *CreateForJavascriptFunctionRequest { + s := CreateForJavascriptFunctionRequest{} + s.name = name + s.Returns = returns + s.FunctionDefinition = fmt.Sprintf(`$$%s$$`, functionDefinition) + return &s +} diff --git a/pkg/sdk/functions_gen.go b/pkg/sdk/functions_gen.go index ab7ca62170..ad2ec72844 100644 --- a/pkg/sdk/functions_gen.go +++ b/pkg/sdk/functions_gen.go @@ -19,6 +19,10 @@ type Functions interface { Show(ctx context.Context, request *ShowFunctionRequest) ([]Function, error) ShowByID(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*Function, error) Describe(ctx context.Context, id SchemaObjectIdentifierWithArguments) ([]FunctionDetail, error) + + // DescribeDetails is added manually; it returns aggregated describe results for the given function. 
+ DescribeDetails(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*FunctionDetails, error) + ShowParameters(ctx context.Context, id SchemaObjectIdentifierWithArguments) ([]*Parameter, error) } // CreateForJavaFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-function#java-handler. @@ -45,11 +49,15 @@ type CreateForJavaFunctionOptions struct { ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"` Secrets []SecretReference `ddl:"parameter,parentheses" sql:"SECRETS"` TargetPath *string `ddl:"parameter,single_quotes" sql:"TARGET_PATH"` - FunctionDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + FunctionDefinition *string `ddl:"parameter,no_equals" sql:"AS"` } type FunctionArgument struct { - ArgName string `ddl:"keyword,no_quotes"` + ArgName string `ddl:"keyword,double_quotes"` ArgDataTypeOld DataType `ddl:"keyword,no_quotes"` ArgDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"` DefaultValue *string `ddl:"parameter,no_equals" sql:"DEFAULT"` @@ -70,7 +78,7 @@ type FunctionReturnsTable struct { } type FunctionColumn struct { - ColumnName string `ddl:"keyword,no_quotes"` + ColumnName string `ddl:"keyword,double_quotes"` ColumnDataTypeOld DataType `ddl:"keyword,no_quotes"` ColumnDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"` } @@ -99,7 +107,11 @@ type CreateForJavascriptFunctionOptions struct { NullInputBehavior *NullInputBehavior `ddl:"keyword"` ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"` Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` - FunctionDefinition string 
`ddl:"parameter,single_quotes,no_equals" sql:"AS"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + FunctionDefinition string `ddl:"parameter,no_equals" sql:"AS"` } // CreateForPythonFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-function#python-handler. @@ -108,6 +120,7 @@ type CreateForPythonFunctionOptions struct { OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` Temporary *bool `ddl:"keyword" sql:"TEMPORARY"` Secure *bool `ddl:"keyword" sql:"SECURE"` + Aggregate *bool `ddl:"keyword" sql:"AGGREGATE"` function bool `ddl:"static" sql:"FUNCTION"` IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` name SchemaObjectIdentifier `ddl:"identifier"` @@ -125,34 +138,44 @@ type CreateForPythonFunctionOptions struct { Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"` ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"` Secrets []SecretReference `ddl:"parameter,parentheses" sql:"SECRETS"` - FunctionDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + FunctionDefinition *string `ddl:"parameter,no_equals" sql:"AS"` } // CreateForScalaFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-function#scala-handler. 
type CreateForScalaFunctionOptions struct { - create bool `ddl:"static" sql:"CREATE"` - OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` - Temporary *bool `ddl:"keyword" sql:"TEMPORARY"` - Secure *bool `ddl:"keyword" sql:"SECURE"` - function bool `ddl:"static" sql:"FUNCTION"` - IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` - name SchemaObjectIdentifier `ddl:"identifier"` - Arguments []FunctionArgument `ddl:"list,must_parentheses"` - CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"` - returns bool `ddl:"static" sql:"RETURNS"` - ResultDataTypeOld DataType `ddl:"parameter,no_equals"` - ResultDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"` - ReturnNullValues *ReturnNullValues `ddl:"keyword"` - languageScala bool `ddl:"static" sql:"LANGUAGE SCALA"` - NullInputBehavior *NullInputBehavior `ddl:"keyword"` - ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"` - RuntimeVersion *string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"` - Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` - Imports []FunctionImport `ddl:"parameter,parentheses" sql:"IMPORTS"` - Packages []FunctionPackage `ddl:"parameter,parentheses" sql:"PACKAGES"` - Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"` - TargetPath *string `ddl:"parameter,single_quotes" sql:"TARGET_PATH"` - FunctionDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"` + create bool `ddl:"static" sql:"CREATE"` + OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` + Temporary *bool `ddl:"keyword" sql:"TEMPORARY"` + Secure *bool `ddl:"keyword" sql:"SECURE"` + function bool `ddl:"static" sql:"FUNCTION"` + IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` + name SchemaObjectIdentifier `ddl:"identifier"` + Arguments []FunctionArgument `ddl:"list,must_parentheses"` + CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"` + returns bool `ddl:"static" sql:"RETURNS"` + ResultDataTypeOld DataType `ddl:"parameter,no_equals"` + ResultDataType datatypes.DataType 
`ddl:"parameter,no_quotes,no_equals"` + ReturnNullValues *ReturnNullValues `ddl:"keyword"` + languageScala bool `ddl:"static" sql:"LANGUAGE SCALA"` + NullInputBehavior *NullInputBehavior `ddl:"keyword"` + ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"` + RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"` + Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` + Imports []FunctionImport `ddl:"parameter,parentheses" sql:"IMPORTS"` + Packages []FunctionPackage `ddl:"parameter,parentheses" sql:"PACKAGES"` + Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"` + ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"` + Secrets []SecretReference `ddl:"parameter,parentheses" sql:"SECRETS"` + TargetPath *string `ddl:"parameter,single_quotes" sql:"TARGET_PATH"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + FunctionDefinition *string `ddl:"parameter,no_equals" sql:"AS"` } // CreateForSQLFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-function#sql-handler. 
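Throughout these hunks the `AS` clause drops the `single_quotes` ddl tag: function bodies are now emitted between `$$` delimiters (dollar-quoting) rather than single quotes, so embedded quotes in the body need no escaping. The test file below adds a `wrapFunctionDefinition` helper for this. A minimal standalone sketch of that helper (the `main` usage is illustrative, not part of the patch):

```go
package main

import "fmt"

// wrapFunctionDefinition mirrors the test helper added in this change:
// it wraps a function body in $$ delimiters, the dollar-quoting form
// Snowflake accepts for the AS clause of CREATE FUNCTION.
func wrapFunctionDefinition(def string) string {
	return fmt.Sprintf("$$%s$$", def)
}

func main() {
	// A body containing single quotes needs no escaping when dollar-quoted.
	fmt.Println(wrapFunctionDefinition("return 'id' + name;"))
}
```

With single-quoted bodies, a definition like `return 'x';` would have required escaping; dollar-quoting sidesteps that, which is why the generated SQL in the tests changes from `AS 'return id + name;'` to `AS $$return id + name;$$`.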
@@ -170,26 +193,49 @@ type CreateForSQLFunctionOptions struct { ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"` Memoizable *bool `ddl:"keyword" sql:"MEMOIZABLE"` Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` - FunctionDefinition string `ddl:"parameter,single_quotes,no_equals" sql:"AS"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + FunctionDefinition string `ddl:"parameter,no_equals" sql:"AS"` } // AlterFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/alter-function. type AlterFunctionOptions struct { - alter bool `ddl:"static" sql:"ALTER"` - function bool `ddl:"static" sql:"FUNCTION"` - IfExists *bool `ddl:"keyword" sql:"IF EXISTS"` - name SchemaObjectIdentifierWithArguments `ddl:"identifier"` - RenameTo *SchemaObjectIdentifier `ddl:"identifier" sql:"RENAME TO"` - SetComment *string `ddl:"parameter,single_quotes" sql:"SET COMMENT"` - SetLogLevel *string `ddl:"parameter,single_quotes" sql:"SET LOG_LEVEL"` - SetTraceLevel *string `ddl:"parameter,single_quotes" sql:"SET TRACE_LEVEL"` - SetSecure *bool `ddl:"keyword" sql:"SET SECURE"` - UnsetSecure *bool `ddl:"keyword" sql:"UNSET SECURE"` - UnsetLogLevel *bool `ddl:"keyword" sql:"UNSET LOG_LEVEL"` - UnsetTraceLevel *bool `ddl:"keyword" sql:"UNSET TRACE_LEVEL"` - UnsetComment *bool `ddl:"keyword" sql:"UNSET COMMENT"` - SetTags []TagAssociation `ddl:"keyword" sql:"SET TAG"` - UnsetTags []ObjectIdentifier `ddl:"keyword" sql:"UNSET TAG"` + alter bool `ddl:"static" sql:"ALTER"` + function bool `ddl:"static" sql:"FUNCTION"` + IfExists *bool `ddl:"keyword" sql:"IF EXISTS"` + name SchemaObjectIdentifierWithArguments `ddl:"identifier"` + RenameTo *SchemaObjectIdentifier `ddl:"identifier" sql:"RENAME TO"` + Set *FunctionSet 
`ddl:"list" sql:"SET"` + Unset *FunctionUnset `ddl:"list" sql:"UNSET"` + SetSecure *bool `ddl:"keyword" sql:"SET SECURE"` + UnsetSecure *bool `ddl:"keyword" sql:"UNSET SECURE"` + SetTags []TagAssociation `ddl:"keyword" sql:"SET TAG"` + UnsetTags []ObjectIdentifier `ddl:"keyword" sql:"UNSET TAG"` +} + +type FunctionSet struct { + Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` + ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"` + SecretsList *SecretsList `ddl:"parameter,parentheses" sql:"SECRETS"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` +} + +type SecretsList struct { + SecretsList []SecretReference `ddl:"list,must_parentheses"` +} + +type FunctionUnset struct { + Comment *bool `ddl:"keyword" sql:"COMMENT"` + ExternalAccessIntegrations *bool `ddl:"keyword" sql:"EXTERNAL_ACCESS_INTEGRATIONS"` + EnableConsoleOutput *bool `ddl:"keyword" sql:"ENABLE_CONSOLE_OUTPUT"` + LogLevel *bool `ddl:"keyword" sql:"LOG_LEVEL"` + MetricLevel *bool `ddl:"keyword" sql:"METRIC_LEVEL"` + TraceLevel *bool `ddl:"keyword" sql:"TRACE_LEVEL"` } // DropFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/drop-function. @@ -202,51 +248,57 @@ type DropFunctionOptions struct { // ShowFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/show-user-functions. 
type ShowFunctionOptions struct { - show bool `ddl:"static" sql:"SHOW"` - userFunctions bool `ddl:"static" sql:"USER FUNCTIONS"` - Like *Like `ddl:"keyword" sql:"LIKE"` - In *In `ddl:"keyword" sql:"IN"` + show bool `ddl:"static" sql:"SHOW"` + userFunctions bool `ddl:"static" sql:"USER FUNCTIONS"` + Like *Like `ddl:"keyword" sql:"LIKE"` + In *ExtendedIn `ddl:"keyword" sql:"IN"` } type functionRow struct { - CreatedOn string `db:"created_on"` - Name string `db:"name"` - SchemaName string `db:"schema_name"` - IsBuiltin string `db:"is_builtin"` - IsAggregate string `db:"is_aggregate"` - IsAnsi string `db:"is_ansi"` - MinNumArguments int `db:"min_num_arguments"` - MaxNumArguments int `db:"max_num_arguments"` - Arguments string `db:"arguments"` - Description string `db:"description"` - CatalogName string `db:"catalog_name"` - IsTableFunction string `db:"is_table_function"` - ValidForClustering string `db:"valid_for_clustering"` - IsSecure sql.NullString `db:"is_secure"` - IsExternalFunction string `db:"is_external_function"` - Language string `db:"language"` - IsMemoizable sql.NullString `db:"is_memoizable"` + CreatedOn string `db:"created_on"` + Name string `db:"name"` + SchemaName string `db:"schema_name"` + IsBuiltin string `db:"is_builtin"` + IsAggregate string `db:"is_aggregate"` + IsAnsi string `db:"is_ansi"` + MinNumArguments int `db:"min_num_arguments"` + MaxNumArguments int `db:"max_num_arguments"` + Arguments string `db:"arguments"` + Description string `db:"description"` + CatalogName string `db:"catalog_name"` + IsTableFunction string `db:"is_table_function"` + ValidForClustering string `db:"valid_for_clustering"` + IsSecure sql.NullString `db:"is_secure"` + Secrets sql.NullString `db:"secrets"` + ExternalAccessIntegrations sql.NullString `db:"external_access_integrations"` + IsExternalFunction string `db:"is_external_function"` + Language string `db:"language"` + IsMemoizable sql.NullString `db:"is_memoizable"` + IsDataMetric sql.NullString 
`db:"is_data_metric"` } type Function struct { - CreatedOn string - Name string - SchemaName string - IsBuiltin bool - IsAggregate bool - IsAnsi bool - MinNumArguments int - MaxNumArguments int - ArgumentsOld []DataType - ArgumentsRaw string - Description string - CatalogName string - IsTableFunction bool - ValidForClustering bool - IsSecure bool - IsExternalFunction bool - Language string - IsMemoizable bool + CreatedOn string + Name string + SchemaName string + IsBuiltin bool + IsAggregate bool + IsAnsi bool + MinNumArguments int + MaxNumArguments int + ArgumentsOld []DataType + ArgumentsRaw string + Description string + CatalogName string + IsTableFunction bool + ValidForClustering bool + IsSecure bool + Secrets *string + ExternalAccessIntegrations *string + IsExternalFunction bool + Language string + IsMemoizable bool + IsDataMetric bool } // DescribeFunctionOptions is based on https://docs.snowflake.com/en/sql-reference/sql/desc-function. @@ -263,5 +315,5 @@ type functionDetailRow struct { type FunctionDetail struct { Property string - Value string + Value *string } diff --git a/pkg/sdk/functions_gen_test.go b/pkg/sdk/functions_gen_test.go index 95c21d9204..7d8d8d9a79 100644 --- a/pkg/sdk/functions_gen_test.go +++ b/pkg/sdk/functions_gen_test.go @@ -1,11 +1,18 @@ package sdk import ( + "fmt" "testing" ) +func wrapFunctionDefinition(def string) string { + return fmt.Sprintf(`$$%s$$`, def) +} + func TestFunctions_CreateForJava(t *testing.T) { id := randomSchemaObjectIdentifier() + secretId := randomSchemaObjectIdentifier() + secretId2 := randomSchemaObjectIdentifier() defaultOpts := func() *CreateForJavaFunctionOptions { return &CreateForJavaFunctionOptions{ @@ -18,12 +25,29 @@ func TestFunctions_CreateForJava(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = 
emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) + t.Run("validation: [opts.Handler] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{ + ResultDataType: dataTypeVarchar, + }, + } + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForJavaFunctionOptions", "Handler")) + }) + + t.Run("validation: conflicting fields for [opts.OrReplace opts.IfNotExists]", func(t *testing.T) { + opts := defaultOpts() + opts.OrReplace = Bool(true) + opts.IfNotExists = Bool(true) + assertOptsInvalidJoinedErrors(t, opts, errOneOf("CreateForJavaFunctionOptions", "OrReplace", "IfNotExists")) + }) + t.Run("validation: exactly one field from [opts.Arguments.ArgDataTypeOld opts.Arguments.ArgDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Arguments = []FunctionArgument{ @@ -49,7 +73,7 @@ func TestFunctions_CreateForJava(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavaFunctionOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) - t.Run("validation: returns", func(t *testing.T) { + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = FunctionReturns{} assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavaFunctionOptions.Returns", "ResultDataType", "Table")) @@ -120,20 +144,9 @@ func TestFunctions_CreateForJava(t *testing.T) { }, } assertOptsInvalidJoinedErrors(t, opts, NewError("TARGET_PATH must be nil when AS is nil")) - assertOptsInvalidJoinedErrors(t, opts, NewError("PACKAGES must be empty when AS is nil")) assertOptsInvalidJoinedErrors(t, opts, NewError("IMPORTS must not be empty when AS is nil")) }) - t.Run("validation: options are missing", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{ - ResultDataType: 
&FunctionReturnsResultDataType{ - ResultDataType: dataTypeVarchar, - }, - } - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForJavaFunctionOptions", "Handler")) - }) - // TODO [SNOW-1348103]: remove with old function removal for V1 t.Run("all options - old data types", func(t *testing.T) { opts := defaultOpts() @@ -188,16 +201,16 @@ func TestFunctions_CreateForJava(t *testing.T) { opts.Secrets = []SecretReference{ { VariableName: "variable1", - Name: "name1", + Name: secretId, }, { VariableName: "variable2", - Name: "name2", + Name: secretId2, }, } opts.TargetPath = String("@~/testfunc.jar") - opts.FunctionDefinition = String("return id + name;") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (id NUMBER, name VARCHAR DEFAULT 'test') COPY GRANTS RETURNS TABLE (country_code VARCHAR, country_name VARCHAR) NOT NULL LANGUAGE JAVA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@~/my_decrement_udf_package_dir/my_decrement_udf_jar.jar') PACKAGES = ('com.snowflake:snowpark:1.2.0') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) TARGET_PATH = '@~/testfunc.jar' AS 'return id + name;'`, id.FullyQualifiedName()) + opts.FunctionDefinition = String(wrapFunctionDefinition("return id + name;")) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("id" NUMBER, "name" VARCHAR DEFAULT 'test') COPY GRANTS RETURNS TABLE ("country_code" VARCHAR, "country_name" VARCHAR) NOT NULL LANGUAGE JAVA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@~/my_decrement_udf_package_dir/my_decrement_udf_jar.jar') PACKAGES = ('com.snowflake:snowpark:1.2.0') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) TARGET_PATH = '@~/testfunc.jar' AS $$return id + name;$$`, 
id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -253,16 +266,16 @@ func TestFunctions_CreateForJava(t *testing.T) { opts.Secrets = []SecretReference{ { VariableName: "variable1", - Name: "name1", + Name: secretId, }, { VariableName: "variable2", - Name: "name2", + Name: secretId2, }, } opts.TargetPath = String("@~/testfunc.jar") - opts.FunctionDefinition = String("return id + name;") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (id NUMBER(36, 2), name VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS TABLE (country_code VARCHAR(100), country_name VARCHAR(100)) NOT NULL LANGUAGE JAVA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@~/my_decrement_udf_package_dir/my_decrement_udf_jar.jar') PACKAGES = ('com.snowflake:snowpark:1.2.0') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) TARGET_PATH = '@~/testfunc.jar' AS 'return id + name;'`, id.FullyQualifiedName()) + opts.FunctionDefinition = String(wrapFunctionDefinition("return id + name;")) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("id" NUMBER(36, 2), "name" VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS TABLE ("country_code" VARCHAR(100), "country_name" VARCHAR(100)) NOT NULL LANGUAGE JAVA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@~/my_decrement_udf_package_dir/my_decrement_udf_jar.jar') PACKAGES = ('com.snowflake:snowpark:1.2.0') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) TARGET_PATH = '@~/testfunc.jar' AS $$return id + name;$$`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName()) }) } @@ -280,7 +293,17 @@ func TestFunctions_CreateForJavascript(t 
*testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: [opts.FunctionDefinition] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{ + ResultDataType: dataTypeVarchar, + }, + } + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForJavascriptFunctionOptions", "FunctionDefinition")) + }) + + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) @@ -311,6 +334,21 @@ func TestFunctions_CreateForJavascript(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavascriptFunctionOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavascriptFunctionOptions.Returns", "ResultDataType", "Table")) + }) + + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{}, + Table: &FunctionReturnsTable{}, + } + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavascriptFunctionOptions.Returns", "ResultDataType", "Table")) + }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = FunctionReturns{ @@ -367,22 +405,6 @@ func TestFunctions_CreateForJavascript(t *testing.T) { 
assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Returns.Table.Columns", "ColumnDataTypeOld", "ColumnDataType")) }) - t.Run("validation: returns", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{} - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavascriptFunctionOptions.Returns", "ResultDataType", "Table")) - }) - - t.Run("validation: options are missing", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{ - ResultDataType: &FunctionReturnsResultDataType{ - ResultDataType: dataTypeVarchar, - }, - } - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForJavascriptFunctionOptions", "FunctionDefinition")) - }) - // TODO [SNOW-1348103]: remove with old function removal for V1 t.Run("all options - old data types", func(t *testing.T) { opts := defaultOpts() @@ -406,8 +428,8 @@ func TestFunctions_CreateForJavascript(t *testing.T) { opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorCalledOnNullInput) opts.ReturnResultsBehavior = ReturnResultsBehaviorPointer(ReturnResultsBehaviorImmutable) opts.Comment = String("comment") - opts.FunctionDefinition = "return 1;" - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (d FLOAT DEFAULT 1.0) COPY GRANTS RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT CALLED ON NULL INPUT IMMUTABLE COMMENT = 'comment' AS 'return 1;'`, id.FullyQualifiedName()) + opts.FunctionDefinition = wrapFunctionDefinition("return 1;") + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("d" FLOAT DEFAULT 1.0) COPY GRANTS RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT CALLED ON NULL INPUT IMMUTABLE COMMENT = 'comment' AS $$return 1;$$`, id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -432,13 +454,15 @@ func TestFunctions_CreateForJavascript(t *testing.T) { opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorCalledOnNullInput) 
opts.ReturnResultsBehavior = ReturnResultsBehaviorPointer(ReturnResultsBehaviorImmutable) opts.Comment = String("comment") - opts.FunctionDefinition = "return 1;" - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (d FLOAT DEFAULT 1.0) COPY GRANTS RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT CALLED ON NULL INPUT IMMUTABLE COMMENT = 'comment' AS 'return 1;'`, id.FullyQualifiedName()) + opts.FunctionDefinition = wrapFunctionDefinition("return 1;") + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("d" FLOAT DEFAULT 1.0) COPY GRANTS RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT CALLED ON NULL INPUT IMMUTABLE COMMENT = 'comment' AS $$return 1;$$`, id.FullyQualifiedName()) }) } func TestFunctions_CreateForPython(t *testing.T) { id := randomSchemaObjectIdentifier() + secretId := randomSchemaObjectIdentifier() + secretId2 := randomSchemaObjectIdentifier() defaultOpts := func() *CreateForPythonFunctionOptions { return &CreateForPythonFunctionOptions{ @@ -451,12 +475,39 @@ func TestFunctions_CreateForPython(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) + t.Run("validation: [opts.RuntimeVersion] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{ + ResultDataType: dataTypeVarchar, + }, + } + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForPythonFunctionOptions", "RuntimeVersion")) + }) + + t.Run("validation: [opts.Handler] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{ + ResultDataType: dataTypeVarchar, + }, + } + 
assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForPythonFunctionOptions", "Handler")) + }) + + t.Run("validation: conflicting fields for [opts.OrReplace opts.IfNotExists]", func(t *testing.T) { + opts := defaultOpts() + opts.OrReplace = Bool(true) + opts.IfNotExists = Bool(true) + assertOptsInvalidJoinedErrors(t, opts, errOneOf("CreateForPythonFunctionOptions", "OrReplace", "IfNotExists")) + }) + t.Run("validation: exactly one field from [opts.Arguments.ArgDataTypeOld opts.Arguments.ArgDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Arguments = []FunctionArgument{ @@ -482,6 +533,21 @@ func TestFunctions_CreateForPython(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonFunctionOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonFunctionOptions.Returns", "ResultDataType", "Table")) + }) + + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{}, + Table: &FunctionReturnsTable{}, + } + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonFunctionOptions.Returns", "ResultDataType", "Table")) + }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = FunctionReturns{ @@ -538,23 +604,6 @@ func TestFunctions_CreateForPython(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Returns.Table.Columns", 
"ColumnDataTypeOld", "ColumnDataType")) }) - t.Run("validation: returns", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{} - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonFunctionOptions.Returns", "ResultDataType", "Table")) - }) - - t.Run("validation: options are missing", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{ - ResultDataType: &FunctionReturnsResultDataType{ - ResultDataType: dataTypeVarchar, - }, - } - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForPythonFunctionOptions", "RuntimeVersion")) - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForPythonFunctionOptions", "Handler")) - }) - t.Run("validation: function definition", func(t *testing.T) { opts := defaultOpts() opts.Packages = []FunctionPackage{ @@ -612,15 +661,15 @@ func TestFunctions_CreateForPython(t *testing.T) { opts.Secrets = []SecretReference{ { VariableName: "variable1", - Name: "name1", + Name: secretId, }, { VariableName: "variable2", - Name: "name2", + Name: secretId2, }, } - opts.FunctionDefinition = String("import numpy as np") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (i NUMBER DEFAULT 1) COPY GRANTS RETURNS VARIANT NOT NULL LANGUAGE PYTHON CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '3.8' COMMENT = 'comment' IMPORTS = ('numpy', 'pandas') PACKAGES = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) AS 'import numpy as np'`, id.FullyQualifiedName()) + opts.FunctionDefinition = String(wrapFunctionDefinition("import numpy as np")) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("i" NUMBER DEFAULT 1) COPY GRANTS RETURNS VARIANT NOT NULL LANGUAGE PYTHON CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '3.8' COMMENT = 'comment' IMPORTS = ('numpy', 'pandas') PACKAGES = ('numpy', 'pandas') HANDLER = 'udf' 
EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) AS $$import numpy as np$$`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -669,15 +718,15 @@ func TestFunctions_CreateForPython(t *testing.T) { opts.Secrets = []SecretReference{ { VariableName: "variable1", - Name: "name1", + Name: secretId, }, { VariableName: "variable2", - Name: "name2", + Name: secretId2, }, } - opts.FunctionDefinition = String("import numpy as np") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (i NUMBER(36, 2) DEFAULT 1) COPY GRANTS RETURNS VARIANT NOT NULL LANGUAGE PYTHON CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '3.8' COMMENT = 'comment' IMPORTS = ('numpy', 'pandas') PACKAGES = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) AS 'import numpy as np'`, id.FullyQualifiedName()) + opts.FunctionDefinition = String(wrapFunctionDefinition("import numpy as np")) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("i" NUMBER(36, 2) DEFAULT 1) COPY GRANTS RETURNS VARIANT NOT NULL LANGUAGE PYTHON CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '3.8' COMMENT = 'comment' IMPORTS = ('numpy', 'pandas') PACKAGES = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) AS $$import numpy as np$$`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName()) }) } @@ -695,12 +744,37 @@ func TestFunctions_CreateForScala(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier 
assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) + t.Run("validation: [opts.Handler] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.ResultDataType = dataTypeVarchar + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForScalaFunctionOptions", "Handler")) + }) + + t.Run("validation: conflicting fields for [opts.OrReplace opts.IfNotExists]", func(t *testing.T) { + opts := defaultOpts() + opts.OrReplace = Bool(true) + opts.IfNotExists = Bool(true) + assertOptsInvalidJoinedErrors(t, opts, errOneOf("CreateForScalaFunctionOptions", "OrReplace", "IfNotExists")) + }) + + t.Run("validation: exactly one field from [opts.ResultDataTypeOld opts.ResultDataType] should be present", func(t *testing.T) { + opts := defaultOpts() + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaFunctionOptions", "ResultDataTypeOld", "ResultDataType")) + }) + + t.Run("validation: exactly one field from [opts.ResultDataTypeOld opts.ResultDataType] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.ResultDataTypeOld = DataTypeFloat + opts.ResultDataType = dataTypeFloat + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaFunctionOptions", "ResultDataTypeOld", "ResultDataType")) + }) + t.Run("validation: exactly one field from [opts.Arguments.ArgDataTypeOld opts.Arguments.ArgDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Arguments = []FunctionArgument{ @@ -726,18 +800,6 @@ func TestFunctions_CreateForScala(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaFunctionOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) - t.Run("validation: exactly one field from [opts.ResultDataTypeOld opts.ResultDataType] should be present", func(t *testing.T) { - opts := defaultOpts() - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaFunctionOptions", "ResultDataTypeOld", "ResultDataType")) - }) - - 
t.Run("validation: exactly one field from [opts.ResultDataTypeOld opts.ResultDataType] should be present - two present", func(t *testing.T) { - opts := defaultOpts() - opts.ResultDataTypeOld = DataTypeFloat - opts.ResultDataType = dataTypeFloat - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaFunctionOptions", "ResultDataTypeOld", "ResultDataType")) - }) - t.Run("validation: function definition", func(t *testing.T) { opts := defaultOpts() opts.TargetPath = String("@~/testfunc.jar") @@ -747,16 +809,9 @@ func TestFunctions_CreateForScala(t *testing.T) { }, } assertOptsInvalidJoinedErrors(t, opts, NewError("TARGET_PATH must be nil when AS is nil")) - assertOptsInvalidJoinedErrors(t, opts, NewError("PACKAGES must be empty when AS is nil")) assertOptsInvalidJoinedErrors(t, opts, NewError("IMPORTS must not be empty when AS is nil")) }) - t.Run("validation: options are missing", func(t *testing.T) { - opts := defaultOpts() - opts.ResultDataType = dataTypeVarchar - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForScalaFunctionOptions", "Handler")) - }) - // TODO [SNOW-1348103]: remove with old function removal for V1 t.Run("all options - old data types", func(t *testing.T) { opts := defaultOpts() @@ -775,7 +830,7 @@ func TestFunctions_CreateForScala(t *testing.T) { opts.ReturnNullValues = ReturnNullValuesPointer(ReturnNullValuesNotNull) opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorCalledOnNullInput) opts.ReturnResultsBehavior = ReturnResultsBehaviorPointer(ReturnResultsBehaviorImmutable) - opts.RuntimeVersion = String("2.0") + opts.RuntimeVersion = "2.0" opts.Comment = String("comment") opts.Imports = []FunctionImport{ { @@ -783,8 +838,8 @@ func TestFunctions_CreateForScala(t *testing.T) { }, } opts.Handler = "Echo.echoVarchar" - opts.FunctionDefinition = String("return x") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (x VARCHAR DEFAULT 'test') COPY GRANTS RETURNS VARCHAR NOT 
NULL LANGUAGE SCALA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' AS 'return x'`, id.FullyQualifiedName()) + opts.FunctionDefinition = String(wrapFunctionDefinition("return x")) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("x" VARCHAR DEFAULT 'test') COPY GRANTS RETURNS VARCHAR NOT NULL LANGUAGE SCALA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' AS $$return x$$`, id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -804,7 +859,7 @@ func TestFunctions_CreateForScala(t *testing.T) { opts.ReturnNullValues = ReturnNullValuesPointer(ReturnNullValuesNotNull) opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorCalledOnNullInput) opts.ReturnResultsBehavior = ReturnResultsBehaviorPointer(ReturnResultsBehaviorImmutable) - opts.RuntimeVersion = String("2.0") + opts.RuntimeVersion = "2.0" opts.Comment = String("comment") opts.Imports = []FunctionImport{ { @@ -812,8 +867,8 @@ func TestFunctions_CreateForScala(t *testing.T) { }, } opts.Handler = "Echo.echoVarchar" - opts.FunctionDefinition = String("return x") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (x VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS VARCHAR(100) NOT NULL LANGUAGE SCALA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' AS 'return x'`, id.FullyQualifiedName()) + opts.FunctionDefinition = String(wrapFunctionDefinition("return x")) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("x" VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS VARCHAR(100) NOT NULL LANGUAGE SCALA CALLED ON NULL INPUT IMMUTABLE RUNTIME_VERSION = '2.0' COMMENT = 'comment' IMPORTS = 
('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' AS $$return x$$`, id.FullyQualifiedName()) }) } @@ -831,7 +886,17 @@ func TestFunctions_CreateForSQL(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: [opts.FunctionDefinition] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{ + ResultDataType: dataTypeVarchar, + }, + } + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForSQLFunctionOptions", "FunctionDefinition")) + }) + + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) @@ -862,6 +927,21 @@ func TestFunctions_CreateForSQL(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Returns", "ResultDataType", "Table")) + }) + + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = FunctionReturns{ + ResultDataType: &FunctionReturnsResultDataType{}, + Table: &FunctionReturnsTable{}, + } + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Returns", "ResultDataType", "Table")) + }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present", func(t *testing.T) { opts := 
defaultOpts() opts.Returns = FunctionReturns{ @@ -918,22 +998,6 @@ func TestFunctions_CreateForSQL(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Returns.Table.Columns", "ColumnDataTypeOld", "ColumnDataType")) }) - t.Run("validation: returns", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{} - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLFunctionOptions.Returns", "ResultDataType", "Table")) - }) - - t.Run("validation: options are missing", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = FunctionReturns{ - ResultDataType: &FunctionReturnsResultDataType{ - ResultDataType: dataTypeVarchar, - }, - } - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForSQLFunctionOptions", "FunctionDefinition")) - }) - t.Run("create with no arguments", func(t *testing.T) { opts := defaultOpts() opts.Returns = FunctionReturns{ @@ -941,8 +1005,8 @@ func TestFunctions_CreateForSQL(t *testing.T) { ResultDataType: dataTypeFloat, }, } - opts.FunctionDefinition = "3.141592654::FLOAT" - assertOptsValidAndSQLEquals(t, opts, `CREATE FUNCTION %s () RETURNS FLOAT AS '3.141592654::FLOAT'`, id.FullyQualifiedName()) + opts.FunctionDefinition = wrapFunctionDefinition("3.141592654::FLOAT") + assertOptsValidAndSQLEquals(t, opts, `CREATE FUNCTION %s () RETURNS FLOAT AS $$3.141592654::FLOAT$$`, id.FullyQualifiedName()) }) // TODO [SNOW-1348103]: remove with old function removal for V1 @@ -968,8 +1032,8 @@ func TestFunctions_CreateForSQL(t *testing.T) { opts.ReturnResultsBehavior = ReturnResultsBehaviorPointer(ReturnResultsBehaviorImmutable) opts.Memoizable = Bool(true) opts.Comment = String("comment") - opts.FunctionDefinition = "3.141592654::FLOAT" - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (message VARCHAR DEFAULT 'test') COPY GRANTS RETURNS FLOAT NOT NULL IMMUTABLE MEMOIZABLE COMMENT = 'comment' AS '3.141592654::FLOAT'`, 
id.FullyQualifiedName()) + opts.FunctionDefinition = wrapFunctionDefinition("3.141592654::FLOAT") + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("message" VARCHAR DEFAULT 'test') COPY GRANTS RETURNS FLOAT NOT NULL IMMUTABLE MEMOIZABLE COMMENT = 'comment' AS $$3.141592654::FLOAT$$`, id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -994,79 +1058,62 @@ func TestFunctions_CreateForSQL(t *testing.T) { opts.ReturnResultsBehavior = ReturnResultsBehaviorPointer(ReturnResultsBehaviorImmutable) opts.Memoizable = Bool(true) opts.Comment = String("comment") - opts.FunctionDefinition = "3.141592654::FLOAT" - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s (message VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS FLOAT NOT NULL IMMUTABLE MEMOIZABLE COMMENT = 'comment' AS '3.141592654::FLOAT'`, id.FullyQualifiedName()) + opts.FunctionDefinition = wrapFunctionDefinition("3.141592654::FLOAT") + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TEMPORARY SECURE FUNCTION %s ("message" VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS FLOAT NOT NULL IMMUTABLE MEMOIZABLE COMMENT = 'comment' AS $$3.141592654::FLOAT$$`, id.FullyQualifiedName()) }) } -func TestFunctions_Drop(t *testing.T) { - noArgsId := randomSchemaObjectIdentifierWithArguments() +func TestFunctions_Alter(t *testing.T) { id := randomSchemaObjectIdentifierWithArguments(DataTypeVARCHAR, DataTypeNumber) + secretId := randomSchemaObjectIdentifier() - defaultOpts := func() *DropFunctionOptions { - return &DropFunctionOptions{ - name: id, + defaultOpts := func() *AlterFunctionOptions { + return &AlterFunctionOptions{ + name: id, + IfExists: Bool(true), } } t.Run("validation: nil options", func(t *testing.T) { - opts := (*DropFunctionOptions)(nil) + opts := (*AlterFunctionOptions)(nil) assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + 
t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifierWithArguments assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) - t.Run("no arguments", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.RenameTo] if set", func(t *testing.T) { opts := defaultOpts() - opts.name = noArgsId - assertOptsValidAndSQLEquals(t, opts, `DROP FUNCTION %s`, noArgsId.FullyQualifiedName()) - }) - - t.Run("all options", func(t *testing.T) { - opts := &DropFunctionOptions{ - name: id, - } - opts.IfExists = Bool(true) - assertOptsValidAndSQLEquals(t, opts, `DROP FUNCTION IF EXISTS %s`, id.FullyQualifiedName()) + target := emptySchemaObjectIdentifier + opts.RenameTo = &target + assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) -} - -func TestFunctions_Alter(t *testing.T) { - id := randomSchemaObjectIdentifierWithArguments(DataTypeVARCHAR, DataTypeNumber) - noArgsId := randomSchemaObjectIdentifierWithArguments() - defaultOpts := func() *AlterFunctionOptions { - return &AlterFunctionOptions{ - name: id, - IfExists: Bool(true), - } - } - - t.Run("validation: nil options", func(t *testing.T) { - opts := (*AlterFunctionOptions)(nil) - assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) + t.Run("validation: exactly one field from [opts.RenameTo opts.Set opts.Unset opts.SetSecure opts.UnsetSecure opts.SetTags opts.UnsetTags] should be present", func(t *testing.T) { + opts := defaultOpts() + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterFunctionOptions", "RenameTo", "Set", "Unset", "SetSecure", "UnsetSecure", "SetTags", "UnsetTags")) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: exactly one field from [opts.RenameTo opts.Set opts.Unset opts.SetSecure opts.UnsetSecure opts.SetTags opts.UnsetTags] should be present - two present", func(t *testing.T) { opts := defaultOpts() - opts.name = 
emptySchemaObjectIdentifierWithArguments - assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) + opts.Set = &FunctionSet{} + opts.Unset = &FunctionUnset{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterFunctionOptions", "RenameTo", "Set", "Unset", "SetSecure", "UnsetSecure", "SetTags", "UnsetTags")) }) - t.Run("validation: exactly one field should be present", func(t *testing.T) { + t.Run("validation: at least one of the fields [opts.Set.Comment opts.Set.ExternalAccessIntegrations opts.Set.SecretsList opts.Set.EnableConsoleOutput opts.Set.LogLevel opts.Set.MetricLevel opts.Set.TraceLevel] should be set", func(t *testing.T) { opts := defaultOpts() - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterFunctionOptions", "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "SetSecure", "UnsetLogLevel", "UnsetTraceLevel", "UnsetSecure", "UnsetComment", "SetTags", "UnsetTags")) + opts.Set = &FunctionSet{} + assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf("AlterFunctionOptions.Set", "Comment", "ExternalAccessIntegrations", "SecretsList", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel")) }) - t.Run("validation: exactly one field should be present", func(t *testing.T) { + t.Run("validation: at least one of the fields [opts.Unset.Comment opts.Unset.ExternalAccessIntegrations opts.Unset.EnableConsoleOutput opts.Unset.LogLevel opts.Unset.MetricLevel opts.Unset.TraceLevel] should be set", func(t *testing.T) { opts := defaultOpts() - opts.SetLogLevel = String("DEBUG") - opts.UnsetComment = Bool(true) - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterFunctionOptions", "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "SetSecure", "UnsetLogLevel", "UnsetTraceLevel", "UnsetSecure", "UnsetComment", "SetTags", "UnsetTags")) + opts.Unset = &FunctionUnset{} + assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf("AlterFunctionOptions.Unset", "Comment", "ExternalAccessIntegrations", 
"EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel")) }) t.Run("alter: rename to", func(t *testing.T) { @@ -1076,29 +1123,42 @@ func TestFunctions_Alter(t *testing.T) { assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s RENAME TO %s`, id.FullyQualifiedName(), opts.RenameTo.FullyQualifiedName()) }) - t.Run("alter: set log level with no arguments", func(t *testing.T) { + t.Run("alter: set", func(t *testing.T) { opts := defaultOpts() - opts.name = noArgsId - opts.SetLogLevel = String("DEBUG") - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET LOG_LEVEL = 'DEBUG'`, noArgsId.FullyQualifiedName()) + opts.Set = &FunctionSet{ + Comment: String("comment"), + TraceLevel: Pointer(TraceLevelOff), + } + assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET COMMENT = 'comment', TRACE_LEVEL = 'OFF'`, id.FullyQualifiedName()) }) - t.Run("alter: set log level", func(t *testing.T) { + t.Run("alter: set empty secrets", func(t *testing.T) { opts := defaultOpts() - opts.SetLogLevel = String("DEBUG") - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET LOG_LEVEL = 'DEBUG'`, id.FullyQualifiedName()) + opts.Set = &FunctionSet{ + SecretsList: &SecretsList{}, + } + assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET SECRETS = ()`, id.FullyQualifiedName()) }) - t.Run("alter: set trace level", func(t *testing.T) { + t.Run("alter: set non-empty secrets", func(t *testing.T) { opts := defaultOpts() - opts.SetTraceLevel = String("DEBUG") - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET TRACE_LEVEL = 'DEBUG'`, id.FullyQualifiedName()) + opts.Set = &FunctionSet{ + SecretsList: &SecretsList{ + []SecretReference{ + {VariableName: "abc", Name: secretId}, + }, + }, + } + assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET SECRETS = ('abc' = %s)`, id.FullyQualifiedName(), secretId.FullyQualifiedName()) }) - t.Run("alter: set comment", func(t *testing.T) { + 
t.Run("alter: unset", func(t *testing.T) { opts := defaultOpts() - opts.SetComment = String("comment") - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET COMMENT = 'comment'`, id.FullyQualifiedName()) + opts.Unset = &FunctionUnset{ + Comment: Bool(true), + TraceLevel: Bool(true), + } + assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s UNSET COMMENT, TRACE_LEVEL`, id.FullyQualifiedName()) }) t.Run("alter: set secure", func(t *testing.T) { @@ -1107,30 +1167,12 @@ func TestFunctions_Alter(t *testing.T) { assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s SET SECURE`, id.FullyQualifiedName()) }) - t.Run("alter: unset log level", func(t *testing.T) { - opts := defaultOpts() - opts.UnsetLogLevel = Bool(true) - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s UNSET LOG_LEVEL`, id.FullyQualifiedName()) - }) - - t.Run("alter: unset trace level", func(t *testing.T) { - opts := defaultOpts() - opts.UnsetTraceLevel = Bool(true) - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s UNSET TRACE_LEVEL`, id.FullyQualifiedName()) - }) - t.Run("alter: unset secure", func(t *testing.T) { opts := defaultOpts() opts.UnsetSecure = Bool(true) assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s UNSET SECURE`, id.FullyQualifiedName()) }) - t.Run("alter: unset comment", func(t *testing.T) { - opts := defaultOpts() - opts.UnsetComment = Bool(true) - assertOptsValidAndSQLEquals(t, opts, `ALTER FUNCTION IF EXISTS %s UNSET COMMENT`, id.FullyQualifiedName()) - }) - t.Run("alter: set tags", func(t *testing.T) { opts := defaultOpts() opts.SetTags = []TagAssociation{ @@ -1152,6 +1194,42 @@ func TestFunctions_Alter(t *testing.T) { }) } +func TestFunctions_Drop(t *testing.T) { + noArgsId := randomSchemaObjectIdentifierWithArguments() + id := randomSchemaObjectIdentifierWithArguments(DataTypeVARCHAR, DataTypeNumber) + + defaultOpts := func() *DropFunctionOptions { + return &DropFunctionOptions{ + name: id, 
+ } + } + + t.Run("validation: nil options", func(t *testing.T) { + opts := (*DropFunctionOptions)(nil) + assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) + }) + + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { + opts := defaultOpts() + opts.name = emptySchemaObjectIdentifierWithArguments + assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) + }) + + t.Run("basic", func(t *testing.T) { + opts := defaultOpts() + opts.name = noArgsId + assertOptsValidAndSQLEquals(t, opts, `DROP FUNCTION %s`, noArgsId.FullyQualifiedName()) + }) + + t.Run("all options", func(t *testing.T) { + opts := &DropFunctionOptions{ + name: id, + } + opts.IfExists = Bool(true) + assertOptsValidAndSQLEquals(t, opts, `DROP FUNCTION IF EXISTS %s`, id.FullyQualifiedName()) + }) +} + func TestFunctions_Show(t *testing.T) { defaultOpts := func() *ShowFunctionOptions { return &ShowFunctionOptions{} @@ -1177,8 +1255,10 @@ func TestFunctions_Show(t *testing.T) { t.Run("show with in", func(t *testing.T) { opts := defaultOpts() - opts.In = &In{ - Account: Bool(true), + opts.In = &ExtendedIn{ + In: In{ + Account: Bool(true), + }, } assertOptsValidAndSQLEquals(t, opts, `SHOW USER FUNCTIONS IN ACCOUNT`) }) @@ -1186,7 +1266,6 @@ func TestFunctions_Show(t *testing.T) { func TestFunctions_Describe(t *testing.T) { id := randomSchemaObjectIdentifierWithArguments(DataTypeVARCHAR, DataTypeNumber) - noArgsId := randomSchemaObjectIdentifierWithArguments() defaultOpts := func() *DescribeFunctionOptions { return &DescribeFunctionOptions{ @@ -1199,18 +1278,12 @@ func TestFunctions_Describe(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifierWithArguments assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) - t.Run("no arguments", func(t 
*testing.T) { - opts := defaultOpts() - opts.name = noArgsId - assertOptsValidAndSQLEquals(t, opts, `DESCRIBE FUNCTION %s`, noArgsId.FullyQualifiedName()) - }) - t.Run("all options", func(t *testing.T) { opts := defaultOpts() assertOptsValidAndSQLEquals(t, opts, `DESCRIBE FUNCTION %s`, id.FullyQualifiedName()) diff --git a/pkg/sdk/functions_impl_gen.go b/pkg/sdk/functions_impl_gen.go index ca17781139..051a8fc994 100644 --- a/pkg/sdk/functions_impl_gen.go +++ b/pkg/sdk/functions_impl_gen.go @@ -60,7 +60,7 @@ func (v *functions) Show(ctx context.Context, request *ShowFunctionRequest) ([]F } func (v *functions) ShowByID(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*Function, error) { - functions, err := v.Show(ctx, NewShowFunctionRequest().WithIn(In{Schema: id.SchemaId()}).WithLike(Like{String(id.Name())})) + functions, err := v.Show(ctx, NewShowFunctionRequest().WithIn(ExtendedIn{In: In{Schema: id.SchemaId()}}).WithLike(Like{String(id.Name())})) if err != nil { return nil, err } @@ -98,12 +98,21 @@ func (r *CreateForJavaFunctionRequest) toOpts() *CreateForJavaFunctionOptions { ExternalAccessIntegrations: r.ExternalAccessIntegrations, Secrets: r.Secrets, TargetPath: r.TargetPath, + EnableConsoleOutput: r.EnableConsoleOutput, + LogLevel: r.LogLevel, + MetricLevel: r.MetricLevel, + TraceLevel: r.TraceLevel, FunctionDefinition: r.FunctionDefinition, } if r.Arguments != nil { s := make([]FunctionArgument, len(r.Arguments)) for i, v := range r.Arguments { - s[i] = FunctionArgument(v) + s[i] = FunctionArgument{ + ArgName: v.ArgName, + ArgDataTypeOld: v.ArgDataTypeOld, + ArgDataType: v.ArgDataType, + DefaultValue: v.DefaultValue, + } } opts.Arguments = s } @@ -119,7 +128,11 @@ func (r *CreateForJavaFunctionRequest) toOpts() *CreateForJavaFunctionOptions { if r.Returns.Table.Columns != nil { s := make([]FunctionColumn, len(r.Returns.Table.Columns)) for i, v := range r.Returns.Table.Columns { - s[i] = FunctionColumn(v) + s[i] = FunctionColumn{ + ColumnName: 
v.ColumnName, + ColumnDataTypeOld: v.ColumnDataTypeOld, + ColumnDataType: v.ColumnDataType, + } } opts.Returns.Table.Columns = s } @@ -127,14 +140,18 @@ func (r *CreateForJavaFunctionRequest) toOpts() *CreateForJavaFunctionOptions { if r.Imports != nil { s := make([]FunctionImport, len(r.Imports)) for i, v := range r.Imports { - s[i] = FunctionImport(v) + s[i] = FunctionImport{ + Import: v.Import, + } } opts.Imports = s } if r.Packages != nil { s := make([]FunctionPackage, len(r.Packages)) for i, v := range r.Packages { - s[i] = FunctionPackage(v) + s[i] = FunctionPackage{ + Package: v.Package, + } } opts.Packages = s } @@ -154,12 +171,21 @@ func (r *CreateForJavascriptFunctionRequest) toOpts() *CreateForJavascriptFuncti NullInputBehavior: r.NullInputBehavior, ReturnResultsBehavior: r.ReturnResultsBehavior, Comment: r.Comment, + EnableConsoleOutput: r.EnableConsoleOutput, + LogLevel: r.LogLevel, + MetricLevel: r.MetricLevel, + TraceLevel: r.TraceLevel, FunctionDefinition: r.FunctionDefinition, } if r.Arguments != nil { s := make([]FunctionArgument, len(r.Arguments)) for i, v := range r.Arguments { - s[i] = FunctionArgument(v) + s[i] = FunctionArgument{ + ArgName: v.ArgName, + ArgDataTypeOld: v.ArgDataTypeOld, + ArgDataType: v.ArgDataType, + DefaultValue: v.DefaultValue, + } } opts.Arguments = s } @@ -175,7 +201,11 @@ func (r *CreateForJavascriptFunctionRequest) toOpts() *CreateForJavascriptFuncti if r.Returns.Table.Columns != nil { s := make([]FunctionColumn, len(r.Returns.Table.Columns)) for i, v := range r.Returns.Table.Columns { - s[i] = FunctionColumn(v) + s[i] = FunctionColumn{ + ColumnName: v.ColumnName, + ColumnDataTypeOld: v.ColumnDataTypeOld, + ColumnDataType: v.ColumnDataType, + } } opts.Returns.Table.Columns = s } @@ -188,6 +218,7 @@ func (r *CreateForPythonFunctionRequest) toOpts() *CreateForPythonFunctionOption OrReplace: r.OrReplace, Temporary: r.Temporary, Secure: r.Secure, + Aggregate: r.Aggregate, IfNotExists: r.IfNotExists, name: r.name, @@ 
-202,12 +233,21 @@ func (r *CreateForPythonFunctionRequest) toOpts() *CreateForPythonFunctionOption Handler: r.Handler, ExternalAccessIntegrations: r.ExternalAccessIntegrations, Secrets: r.Secrets, + EnableConsoleOutput: r.EnableConsoleOutput, + LogLevel: r.LogLevel, + MetricLevel: r.MetricLevel, + TraceLevel: r.TraceLevel, FunctionDefinition: r.FunctionDefinition, } if r.Arguments != nil { s := make([]FunctionArgument, len(r.Arguments)) for i, v := range r.Arguments { - s[i] = FunctionArgument(v) + s[i] = FunctionArgument{ + ArgName: v.ArgName, + ArgDataTypeOld: v.ArgDataTypeOld, + ArgDataType: v.ArgDataType, + DefaultValue: v.DefaultValue, + } } opts.Arguments = s } @@ -223,7 +263,11 @@ func (r *CreateForPythonFunctionRequest) toOpts() *CreateForPythonFunctionOption if r.Returns.Table.Columns != nil { s := make([]FunctionColumn, len(r.Returns.Table.Columns)) for i, v := range r.Returns.Table.Columns { - s[i] = FunctionColumn(v) + s[i] = FunctionColumn{ + ColumnName: v.ColumnName, + ColumnDataTypeOld: v.ColumnDataTypeOld, + ColumnDataType: v.ColumnDataType, + } } opts.Returns.Table.Columns = s } @@ -231,14 +275,18 @@ func (r *CreateForPythonFunctionRequest) toOpts() *CreateForPythonFunctionOption if r.Imports != nil { s := make([]FunctionImport, len(r.Imports)) for i, v := range r.Imports { - s[i] = FunctionImport(v) + s[i] = FunctionImport{ + Import: v.Import, + } } opts.Imports = s } if r.Packages != nil { s := make([]FunctionPackage, len(r.Packages)) for i, v := range r.Packages { - s[i] = FunctionPackage(v) + s[i] = FunctionPackage{ + Package: v.Package, + } } opts.Packages = s } @@ -262,28 +310,43 @@ func (r *CreateForScalaFunctionRequest) toOpts() *CreateForScalaFunctionOptions RuntimeVersion: r.RuntimeVersion, Comment: r.Comment, - Handler: r.Handler, - TargetPath: r.TargetPath, - FunctionDefinition: r.FunctionDefinition, + Handler: r.Handler, + ExternalAccessIntegrations: r.ExternalAccessIntegrations, + Secrets: r.Secrets, + TargetPath: r.TargetPath, + 
EnableConsoleOutput: r.EnableConsoleOutput, + LogLevel: r.LogLevel, + MetricLevel: r.MetricLevel, + TraceLevel: r.TraceLevel, + FunctionDefinition: r.FunctionDefinition, } if r.Arguments != nil { s := make([]FunctionArgument, len(r.Arguments)) for i, v := range r.Arguments { - s[i] = FunctionArgument(v) + s[i] = FunctionArgument{ + ArgName: v.ArgName, + ArgDataTypeOld: v.ArgDataTypeOld, + ArgDataType: v.ArgDataType, + DefaultValue: v.DefaultValue, + } } opts.Arguments = s } if r.Imports != nil { s := make([]FunctionImport, len(r.Imports)) for i, v := range r.Imports { - s[i] = FunctionImport(v) + s[i] = FunctionImport{ + Import: v.Import, + } } opts.Imports = s } if r.Packages != nil { s := make([]FunctionPackage, len(r.Packages)) for i, v := range r.Packages { - s[i] = FunctionPackage(v) + s[i] = FunctionPackage{ + Package: v.Package, + } } opts.Packages = s } @@ -303,12 +366,21 @@ func (r *CreateForSQLFunctionRequest) toOpts() *CreateForSQLFunctionOptions { ReturnResultsBehavior: r.ReturnResultsBehavior, Memoizable: r.Memoizable, Comment: r.Comment, + EnableConsoleOutput: r.EnableConsoleOutput, + LogLevel: r.LogLevel, + MetricLevel: r.MetricLevel, + TraceLevel: r.TraceLevel, FunctionDefinition: r.FunctionDefinition, } if r.Arguments != nil { s := make([]FunctionArgument, len(r.Arguments)) for i, v := range r.Arguments { - s[i] = FunctionArgument(v) + s[i] = FunctionArgument{ + ArgName: v.ArgName, + ArgDataTypeOld: v.ArgDataTypeOld, + ArgDataType: v.ArgDataType, + DefaultValue: v.DefaultValue, + } } opts.Arguments = s } @@ -324,7 +396,11 @@ func (r *CreateForSQLFunctionRequest) toOpts() *CreateForSQLFunctionOptions { if r.Returns.Table.Columns != nil { s := make([]FunctionColumn, len(r.Returns.Table.Columns)) for i, v := range r.Returns.Table.Columns { - s[i] = FunctionColumn(v) + s[i] = FunctionColumn{ + ColumnName: v.ColumnName, + ColumnDataTypeOld: v.ColumnDataTypeOld, + ColumnDataType: v.ColumnDataType, + } } opts.Returns.Table.Columns = s } @@ -334,19 +410,39 
@@ func (r *CreateForSQLFunctionRequest) toOpts() *CreateForSQLFunctionOptions { func (r *AlterFunctionRequest) toOpts() *AlterFunctionOptions { opts := &AlterFunctionOptions{ - IfExists: r.IfExists, - name: r.name, - RenameTo: r.RenameTo, - SetComment: r.SetComment, - SetLogLevel: r.SetLogLevel, - SetTraceLevel: r.SetTraceLevel, - SetSecure: r.SetSecure, - UnsetSecure: r.UnsetSecure, - UnsetLogLevel: r.UnsetLogLevel, - UnsetTraceLevel: r.UnsetTraceLevel, - UnsetComment: r.UnsetComment, - SetTags: r.SetTags, - UnsetTags: r.UnsetTags, + IfExists: r.IfExists, + name: r.name, + RenameTo: r.RenameTo, + SetSecure: r.SetSecure, + UnsetSecure: r.UnsetSecure, + SetTags: r.SetTags, + UnsetTags: r.UnsetTags, + } + if r.Set != nil { + opts.Set = &FunctionSet{ + Comment: r.Set.Comment, + ExternalAccessIntegrations: r.Set.ExternalAccessIntegrations, + + EnableConsoleOutput: r.Set.EnableConsoleOutput, + LogLevel: r.Set.LogLevel, + MetricLevel: r.Set.MetricLevel, + TraceLevel: r.Set.TraceLevel, + } + if r.Set.SecretsList != nil { + opts.Set.SecretsList = &SecretsList{ + SecretsList: r.Set.SecretsList.SecretsList, + } + } + } + if r.Unset != nil { + opts.Unset = &FunctionUnset{ + Comment: r.Unset.Comment, + ExternalAccessIntegrations: r.Unset.ExternalAccessIntegrations, + EnableConsoleOutput: r.Unset.EnableConsoleOutput, + LogLevel: r.Unset.LogLevel, + MetricLevel: r.Unset.MetricLevel, + TraceLevel: r.Unset.TraceLevel, + } } return opts } @@ -371,7 +467,7 @@ func (r functionRow) convert() *Function { e := &Function{ CreatedOn: r.CreatedOn, Name: r.Name, - SchemaName: r.SchemaName, + SchemaName: strings.Trim(r.SchemaName, `"`), IsBuiltin: r.IsBuiltin == "Y", IsAggregate: r.IsAggregate == "Y", IsAnsi: r.IsAnsi == "Y", @@ -379,7 +475,7 @@ func (r functionRow) convert() *Function { MaxNumArguments: r.MaxNumArguments, ArgumentsRaw: r.Arguments, Description: r.Description, - CatalogName: r.CatalogName, + CatalogName: strings.Trim(r.CatalogName, `"`), IsTableFunction: r.IsTableFunction 
== "Y", ValidForClustering: r.ValidForClustering == "Y", IsExternalFunction: r.IsExternalFunction == "Y", @@ -397,9 +493,18 @@ func (r functionRow) convert() *Function { if r.IsSecure.Valid { e.IsSecure = r.IsSecure.String == "Y" } + if r.Secrets.Valid { + e.Secrets = String(r.Secrets.String) + } + if r.ExternalAccessIntegrations.Valid { + e.ExternalAccessIntegrations = String(r.ExternalAccessIntegrations.String) + } if r.IsMemoizable.Valid { e.IsMemoizable = r.IsMemoizable.String == "Y" } + if r.IsDataMetric.Valid { + e.IsDataMetric = r.IsDataMetric.String == "Y" + } return e } @@ -414,8 +519,8 @@ func (r functionDetailRow) convert() *FunctionDetail { e := &FunctionDetail{ Property: r.Property, } - if r.Value.Valid { - e.Value = r.Value.String + if r.Value.Valid && r.Value.String != "null" { + e.Value = String(r.Value.String) } return e } diff --git a/pkg/sdk/functions_validations_gen.go b/pkg/sdk/functions_validations_gen.go index 78970158e8..651515e1cf 100644 --- a/pkg/sdk/functions_validations_gen.go +++ b/pkg/sdk/functions_validations_gen.go @@ -59,9 +59,6 @@ func (opts *CreateForJavaFunctionOptions) validate() error { if opts.TargetPath != nil { errs = append(errs, NewError("TARGET_PATH must be nil when AS is nil")) } - if len(opts.Packages) > 0 { - errs = append(errs, NewError("PACKAGES must be empty when AS is nil")) - } if len(opts.Imports) == 0 { errs = append(errs, NewError("IMPORTS must not be empty when AS is nil")) } @@ -195,9 +192,6 @@ func (opts *CreateForScalaFunctionOptions) validate() error { if opts.TargetPath != nil { errs = append(errs, NewError("TARGET_PATH must be nil when AS is nil")) } - if len(opts.Packages) > 0 { - errs = append(errs, NewError("PACKAGES must be empty when AS is nil")) - } if len(opts.Imports) == 0 { errs = append(errs, NewError("IMPORTS must not be empty when AS is nil")) } @@ -258,8 +252,18 @@ func (opts *AlterFunctionOptions) validate() error { if opts.RenameTo != nil && !ValidObjectIdentifier(opts.RenameTo) { errs = 
append(errs, ErrInvalidObjectIdentifier) } - if !exactlyOneValueSet(opts.RenameTo, opts.SetComment, opts.SetLogLevel, opts.SetTraceLevel, opts.SetSecure, opts.UnsetLogLevel, opts.UnsetTraceLevel, opts.UnsetSecure, opts.UnsetComment, opts.SetTags, opts.UnsetTags) { - errs = append(errs, errExactlyOneOf("AlterFunctionOptions", "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "SetSecure", "UnsetLogLevel", "UnsetTraceLevel", "UnsetSecure", "UnsetComment", "SetTags", "UnsetTags")) + if !exactlyOneValueSet(opts.RenameTo, opts.Set, opts.Unset, opts.SetSecure, opts.UnsetSecure, opts.SetTags, opts.UnsetTags) { + errs = append(errs, errExactlyOneOf("AlterFunctionOptions", "RenameTo", "Set", "Unset", "SetSecure", "UnsetSecure", "SetTags", "UnsetTags")) + } + if valueSet(opts.Set) { + if !anyValueSet(opts.Set.Comment, opts.Set.ExternalAccessIntegrations, opts.Set.SecretsList, opts.Set.EnableConsoleOutput, opts.Set.LogLevel, opts.Set.MetricLevel, opts.Set.TraceLevel) { + errs = append(errs, errAtLeastOneOf("AlterFunctionOptions.Set", "Comment", "ExternalAccessIntegrations", "SecretsList", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel")) + } + } + if valueSet(opts.Unset) { + if !anyValueSet(opts.Unset.Comment, opts.Unset.ExternalAccessIntegrations, opts.Unset.EnableConsoleOutput, opts.Unset.LogLevel, opts.Unset.MetricLevel, opts.Unset.TraceLevel) { + errs = append(errs, errAtLeastOneOf("AlterFunctionOptions.Unset", "Comment", "ExternalAccessIntegrations", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel")) + } } return JoinErrors(errs...) 
} diff --git a/pkg/sdk/identifier_helpers.go b/pkg/sdk/identifier_helpers.go index 90d1acdf44..1609593d71 100644 --- a/pkg/sdk/identifier_helpers.go +++ b/pkg/sdk/identifier_helpers.go @@ -124,6 +124,10 @@ func (i AccountIdentifier) AccountName() string { return i.accountName } +func (i AccountIdentifier) AsAccountObjectIdentifier() AccountObjectIdentifier { + return NewAccountObjectIdentifier(i.accountName) +} + func (i AccountIdentifier) Name() string { if i.organizationName != "" && i.accountName != "" { return fmt.Sprintf("%s.%s", i.organizationName, i.accountName) @@ -324,7 +328,12 @@ func NewSchemaObjectIdentifierWithArguments(databaseName, schemaName, name strin if err != nil { log.Printf("[DEBUG] failed to normalize argument %d: %v, err = %v", i, argument, err) } - normalizedArguments[i] = LegacyDataTypeFrom(normalizedArgument) + // TODO [SNOW-1348103]: temporary workaround to fix panic resulting from TestAcc_Grants_To_AccountRole test (because of unsupported TABLE data type) + if normalizedArgument != nil { + normalizedArguments[i] = LegacyDataTypeFrom(normalizedArgument) + } else { + normalizedArguments[i] = "" + } } return SchemaObjectIdentifierWithArguments{ databaseName: strings.Trim(databaseName, `"`), diff --git a/pkg/sdk/parameters.go b/pkg/sdk/parameters.go index cf29fa1da4..44f73e9e55 100644 --- a/pkg/sdk/parameters.go +++ b/pkg/sdk/parameters.go @@ -833,6 +833,39 @@ const ( DatabaseParameterEnableConsoleOutput DatabaseParameter = "ENABLE_CONSOLE_OUTPUT" ) +type FunctionParameter string + +const ( + FunctionParameterEnableConsoleOutput FunctionParameter = "ENABLE_CONSOLE_OUTPUT" + FunctionParameterLogLevel FunctionParameter = "LOG_LEVEL" + FunctionParameterMetricLevel FunctionParameter = "METRIC_LEVEL" + FunctionParameterTraceLevel FunctionParameter = "TRACE_LEVEL" +) + +var AllFunctionParameters = []FunctionParameter{ + FunctionParameterEnableConsoleOutput, + FunctionParameterLogLevel, + FunctionParameterMetricLevel, + 
FunctionParameterTraceLevel, +} + +type ProcedureParameter string + +const ( + ProcedureParameterAutoEventLogging ProcedureParameter = "AUTO_EVENT_LOGGING" + ProcedureParameterEnableConsoleOutput ProcedureParameter = "ENABLE_CONSOLE_OUTPUT" + ProcedureParameterLogLevel ProcedureParameter = "LOG_LEVEL" + ProcedureParameterMetricLevel ProcedureParameter = "METRIC_LEVEL" + ProcedureParameterTraceLevel ProcedureParameter = "TRACE_LEVEL" +) + +var AllProcedureParameters = []ProcedureParameter{ + ProcedureParameterEnableConsoleOutput, + ProcedureParameterLogLevel, + ProcedureParameterMetricLevel, + ProcedureParameterTraceLevel, +} + // AccountParameters is based on https://docs.snowflake.com/en/sql-reference/parameters#account-parameters. type AccountParameters struct { // Account Parameters @@ -1341,19 +1374,21 @@ func (opts *ShowParametersOptions) validate() error { } type ParametersIn struct { - Session *bool `ddl:"keyword" sql:"SESSION"` - Account *bool `ddl:"keyword" sql:"ACCOUNT"` - User AccountObjectIdentifier `ddl:"identifier" sql:"USER"` - Warehouse AccountObjectIdentifier `ddl:"identifier" sql:"WAREHOUSE"` - Database AccountObjectIdentifier `ddl:"identifier" sql:"DATABASE"` - Schema DatabaseObjectIdentifier `ddl:"identifier" sql:"SCHEMA"` - Task SchemaObjectIdentifier `ddl:"identifier" sql:"TASK"` - Table SchemaObjectIdentifier `ddl:"identifier" sql:"TABLE"` + Session *bool `ddl:"keyword" sql:"SESSION"` + Account *bool `ddl:"keyword" sql:"ACCOUNT"` + User AccountObjectIdentifier `ddl:"identifier" sql:"USER"` + Warehouse AccountObjectIdentifier `ddl:"identifier" sql:"WAREHOUSE"` + Database AccountObjectIdentifier `ddl:"identifier" sql:"DATABASE"` + Schema DatabaseObjectIdentifier `ddl:"identifier" sql:"SCHEMA"` + Task SchemaObjectIdentifier `ddl:"identifier" sql:"TASK"` + Table SchemaObjectIdentifier `ddl:"identifier" sql:"TABLE"` + Function SchemaObjectIdentifierWithArguments `ddl:"identifier" sql:"FUNCTION"` + Procedure SchemaObjectIdentifierWithArguments 
`ddl:"identifier" sql:"PROCEDURE"` } func (v *ParametersIn) validate() error { - if !anyValueSet(v.Session, v.Account, v.User, v.Warehouse, v.Database, v.Schema, v.Task, v.Table) { - return errors.Join(errAtLeastOneOf("Session", "Account", "User", "Warehouse", "Database", "Schema", "Task", "Table")) + if !anyValueSet(v.Session, v.Account, v.User, v.Warehouse, v.Database, v.Schema, v.Task, v.Table, v.Function, v.Procedure) { + return errors.Join(errAtLeastOneOf("Session", "Account", "User", "Warehouse", "Database", "Schema", "Task", "Table", "Function", "Procedure")) } return nil } @@ -1370,6 +1405,8 @@ const ( ParameterTypeDatabase ParameterType = "DATABASE" ParameterTypeSchema ParameterType = "SCHEMA" ParameterTypeTask ParameterType = "TASK" + ParameterTypeFunction ParameterType = "FUNCTION" + ParameterTypeProcedure ParameterType = "PROCEDURE" ) type Parameter struct { @@ -1498,6 +1535,10 @@ func (v *parameters) ShowObjectParameter(ctx context.Context, parameter ObjectPa opts.In.Table = object.Name.(SchemaObjectIdentifier) case ObjectTypeUser: opts.In.User = object.Name.(AccountObjectIdentifier) + case ObjectTypeFunction: + opts.In.Function = object.Name.(SchemaObjectIdentifierWithArguments) + case ObjectTypeProcedure: + opts.In.Procedure = object.Name.(SchemaObjectIdentifierWithArguments) default: return nil, fmt.Errorf("unsupported object type %s", object.Name) } diff --git a/pkg/sdk/poc/README.md b/pkg/sdk/poc/README.md index 46cb6e16b9..eabaf74e55 100644 --- a/pkg/sdk/poc/README.md +++ b/pkg/sdk/poc/README.md @@ -110,6 +110,7 @@ find a better solution to solve the issue (add more logic to the templates ?) - more clear definition of lists that can be empty vs cannot be empty - add empty ids in generated tests (TODO in random_test.go) - add optional imports (currently they have to be added manually, e.g. 
`datatypes.DataType`) +- add fourth type of quotes - double dollars ($$..$$) -> used for functions, procedures, and tasks ##### Known issues - generating two converts when Show and Desc use the same data structure diff --git a/pkg/sdk/poc/generator/templates/sub_templates/to_opts_mapping.tmpl b/pkg/sdk/poc/generator/templates/sub_templates/to_opts_mapping.tmpl index 57bf0ec3bb..0efd3fde29 100644 --- a/pkg/sdk/poc/generator/templates/sub_templates/to_opts_mapping.tmpl +++ b/pkg/sdk/poc/generator/templates/sub_templates/to_opts_mapping.tmpl @@ -10,13 +10,13 @@ {{- range .Fields }} {{- if .ShouldBeInDto }} {{- if .IsStruct }} - {{ if or .IsPointer .IsSlice }} + {{- if or .IsPointer .IsSlice }} if r{{ .Path }} != nil { - {{ end }} + {{- end -}} {{- if not .IsSlice -}} opts{{ .Path }} = {{ if .IsPointer }}&{{end}}{{ template "toOptsMapping" . -}}{{/* Recursive call */}} - {{- else }} + {{- else -}} s := make({{ .Kind }}, len(r{{ .Path }})) for i, v := range r{{ .Path }} { s[i] = {{ .KindNoSlice }}{ @@ -26,11 +26,11 @@ } } opts{{ .Path }} = s - {{ end -}} + {{- end -}} - {{ if or .IsPointer .IsSlice -}} + {{- if or .IsPointer .IsSlice -}} } {{- end -}} {{- end -}} - {{ end -}} -{{ end }} + {{- end }} +{{- end }} diff --git a/pkg/sdk/procedures_def.go b/pkg/sdk/procedures_def.go index 0485b7711b..636da06187 100644 --- a/pkg/sdk/procedures_def.go +++ b/pkg/sdk/procedures_def.go @@ -5,14 +5,14 @@ import g "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/poc/gen //go:generate go run ./poc/main.go var procedureArgument = g.NewQueryStruct("ProcedureArgument"). - Text("ArgName", g.KeywordOptions().NoQuotes().Required()). + Text("ArgName", g.KeywordOptions().DoubleQuotes().Required()). PredefinedQueryStructField("ArgDataTypeOld", "DataType", g.KeywordOptions().NoQuotes()). PredefinedQueryStructField("ArgDataType", "datatypes.DataType", g.ParameterOptions().NoQuotes().NoEquals().Required()). 
PredefinedQueryStructField("DefaultValue", "*string", g.ParameterOptions().NoEquals().SQL("DEFAULT")). WithValidation(g.ExactlyOneValueSet, "ArgDataTypeOld", "ArgDataType") var procedureColumn = g.NewQueryStruct("ProcedureColumn"). - Text("ColumnName", g.KeywordOptions().NoQuotes().Required()). + Text("ColumnName", g.KeywordOptions().DoubleQuotes().Required()). PredefinedQueryStructField("ColumnDataTypeOld", "DataType", g.KeywordOptions().NoQuotes()). PredefinedQueryStructField("ColumnDataType", "datatypes.DataType", g.ParameterOptions().NoQuotes().NoEquals().Required()). WithValidation(g.ExactlyOneValueSet, "ColumnDataTypeOld", "ColumnDataType") @@ -96,6 +96,8 @@ var ProceduresDef = g.NewInterface( g.KeywordOptions().SQL("RETURNS").Required(), ). SQL("LANGUAGE JAVA"). + PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). ListQueryStructField( "Packages", @@ -111,10 +113,9 @@ var ProceduresDef = g.NewInterface( ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). ListAssignment("SECRETS", "SecretReference", g.ParameterOptions().Parentheses()). OptionalTextAssignment("TARGET_PATH", g.ParameterOptions().SingleQuotes()). - PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). PredefinedQueryStructField("ExecuteAs", "*ExecuteAs", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). + PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SQL("AS")). WithValidation(g.ValidateValueSet, "RuntimeVersion"). WithValidation(g.ValidateValueSet, "Packages"). 
WithValidation(g.ValidateValueSet, "Handler"). @@ -140,9 +141,10 @@ var ProceduresDef = g.NewInterface( OptionalSQL("NOT NULL"). SQL("LANGUAGE JAVASCRIPT"). PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). PredefinedQueryStructField("ExecuteAs", "*ExecuteAs", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS").Required()). + PredefinedQueryStructField("ProcedureDefinition", "string", g.ParameterOptions().NoEquals().SQL("AS").Required()). WithValidation(g.ValidateValueSet, "ProcedureDefinition"). WithValidation(g.ValidIdentifier, "name"). WithValidation(g.ExactlyOneValueSet, "ResultDataTypeOld", "ResultDataType"), @@ -167,6 +169,8 @@ var ProceduresDef = g.NewInterface( g.KeywordOptions().SQL("RETURNS").Required(), ). SQL("LANGUAGE PYTHON"). + PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). ListQueryStructField( "Packages", @@ -181,10 +185,9 @@ var ProceduresDef = g.NewInterface( TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). ListAssignment("SECRETS", "SecretReference", g.ParameterOptions().Parentheses()). - PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). PredefinedQueryStructField("ExecuteAs", "*ExecuteAs", g.KeywordOptions()). 
- PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). + PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SQL("AS")). WithValidation(g.ValidateValueSet, "RuntimeVersion"). WithValidation(g.ValidateValueSet, "Packages"). WithValidation(g.ValidateValueSet, "Handler"). @@ -210,6 +213,8 @@ var ProceduresDef = g.NewInterface( g.KeywordOptions().SQL("RETURNS").Required(), ). SQL("LANGUAGE SCALA"). + PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). ListQueryStructField( "Packages", @@ -222,11 +227,12 @@ var ProceduresDef = g.NewInterface( g.ParameterOptions().Parentheses().SQL("IMPORTS"), ). TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). + ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). + ListAssignment("SECRETS", "SecretReference", g.ParameterOptions().Parentheses()). OptionalTextAssignment("TARGET_PATH", g.ParameterOptions().SingleQuotes()). - PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). PredefinedQueryStructField("ExecuteAs", "*ExecuteAs", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). + PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SQL("AS")). WithValidation(g.ValidateValueSet, "RuntimeVersion"). WithValidation(g.ValidateValueSet, "Packages"). WithValidation(g.ValidateValueSet, "Handler"). @@ -253,9 +259,10 @@ var ProceduresDef = g.NewInterface( ). SQL("LANGUAGE SQL"). 
PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ReturnResultsBehavior", "*ReturnResultsBehavior", g.KeywordOptions()). OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). PredefinedQueryStructField("ExecuteAs", "*ExecuteAs", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS").Required()). + PredefinedQueryStructField("ProcedureDefinition", "string", g.ParameterOptions().NoEquals().SQL("AS").Required()). WithValidation(g.ValidateValueSet, "ProcedureDefinition"). WithValidation(g.ValidIdentifier, "name"), ).AlterOperation( @@ -266,16 +273,39 @@ var ProceduresDef = g.NewInterface( IfExists(). Name(). OptionalIdentifier("RenameTo", g.KindOfT[SchemaObjectIdentifier](), g.IdentifierOptions().SQL("RENAME TO")). - OptionalTextAssignment("SET COMMENT", g.ParameterOptions().SingleQuotes()). - OptionalTextAssignment("SET LOG_LEVEL", g.ParameterOptions().SingleQuotes()). - OptionalTextAssignment("SET TRACE_LEVEL", g.ParameterOptions().SingleQuotes()). - OptionalSQL("UNSET COMMENT"). + OptionalQueryStructField( + "Set", + g.NewQueryStruct("ProcedureSet"). + OptionalTextAssignment("COMMENT", g.ParameterOptions().SingleQuotes()). + ListAssignment("EXTERNAL_ACCESS_INTEGRATIONS", "AccountObjectIdentifier", g.ParameterOptions().Parentheses()). + OptionalQueryStructField("SecretsList", functionSecretsListWrapper, g.ParameterOptions().SQL("SECRETS").Parentheses()). + OptionalAssignment("AUTO_EVENT_LOGGING", g.KindOfTPointer[AutoEventLogging](), g.ParameterOptions().SingleQuotes()). + OptionalBooleanAssignment("ENABLE_CONSOLE_OUTPUT", nil). + OptionalAssignment("LOG_LEVEL", g.KindOfTPointer[LogLevel](), g.ParameterOptions().SingleQuotes()). + OptionalAssignment("METRIC_LEVEL", g.KindOfTPointer[MetricLevel](), g.ParameterOptions().SingleQuotes()). 
+ OptionalAssignment("TRACE_LEVEL", g.KindOfTPointer[TraceLevel](), g.ParameterOptions().SingleQuotes()). + WithValidation(g.AtLeastOneValueSet, "Comment", "ExternalAccessIntegrations", "SecretsList", "AutoEventLogging", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel"), + g.ListOptions().SQL("SET"), + ). + OptionalQueryStructField( + "Unset", + g.NewQueryStruct("ProcedureUnset"). + OptionalSQL("COMMENT"). + OptionalSQL("EXTERNAL_ACCESS_INTEGRATIONS"). + OptionalSQL("AUTO_EVENT_LOGGING"). + OptionalSQL("ENABLE_CONSOLE_OUTPUT"). + OptionalSQL("LOG_LEVEL"). + OptionalSQL("METRIC_LEVEL"). + OptionalSQL("TRACE_LEVEL"). + WithValidation(g.AtLeastOneValueSet, "Comment", "ExternalAccessIntegrations", "AutoEventLogging", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel"), + g.ListOptions().SQL("UNSET"), + ). OptionalSetTags(). OptionalUnsetTags(). PredefinedQueryStructField("ExecuteAs", "*ExecuteAs", g.KeywordOptions()). WithValidation(g.ValidIdentifier, "name"). WithValidation(g.ValidIdentifierIfSet, "RenameTo"). - WithValidation(g.ExactlyOneValueSet, "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "UnsetComment", "SetTags", "UnsetTags", "ExecuteAs"), + WithValidation(g.ExactlyOneValueSet, "RenameTo", "Set", "Unset", "SetTags", "UnsetTags", "ExecuteAs"), ).DropOperation( "https://docs.snowflake.com/en/sql-reference/sql/drop-procedure", g.NewQueryStruct("DropProcedure"). @@ -300,7 +330,9 @@ var ProceduresDef = g.NewInterface( Field("catalog_name", "string"). Field("is_table_function", "string"). Field("valid_for_clustering", "string"). - Field("is_secure", "sql.NullString"), + Field("is_secure", "sql.NullString"). + OptionalText("secrets"). + OptionalText("external_access_integrations"), g.PlainStruct("Procedure"). Field("CreatedOn", "string"). Field("Name", "string"). @@ -315,12 +347,14 @@ var ProceduresDef = g.NewInterface( Field("CatalogName", "string"). Field("IsTableFunction", "bool"). Field("ValidForClustering", "bool"). 
- Field("IsSecure", "bool"), + Field("IsSecure", "bool"). + OptionalText("Secrets"). + OptionalText("ExternalAccessIntegrations"), g.NewQueryStruct("ShowProcedures"). Show(). SQL("PROCEDURES"). OptionalLike(). - OptionalIn(), // TODO: 'In' struct for procedures not support keyword "CLASS" now + OptionalExtendedIn(), ).ShowByIdOperation().DescribeOperation( g.DescriptionMappingKindSlice, "https://docs.snowflake.com/en/sql-reference/sql/desc-procedure", @@ -329,7 +363,7 @@ var ProceduresDef = g.NewInterface( Field("value", "sql.NullString"), g.PlainStruct("ProcedureDetail"). Field("Property", "string"). - Field("Value", "string"), + OptionalText("Value"), g.NewQueryStruct("DescribeProcedure"). Describe(). SQL("PROCEDURE"). @@ -362,6 +396,8 @@ var ProceduresDef = g.NewInterface( g.KeywordOptions().SQL("RETURNS").Required(), ). SQL("LANGUAGE JAVA"). + PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). ListQueryStructField( "Packages", @@ -374,8 +410,6 @@ var ProceduresDef = g.NewInterface( g.ParameterOptions().Parentheses().SQL("IMPORTS"), ). TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). - PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). OptionalQueryStructField( "WithClause", procedureWithClause, @@ -408,6 +442,8 @@ var ProceduresDef = g.NewInterface( g.KeywordOptions().SQL("RETURNS").Required(), ). SQL("LANGUAGE SCALA"). + PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). 
TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). ListQueryStructField( "Packages", @@ -420,8 +456,6 @@ var ProceduresDef = g.NewInterface( g.ParameterOptions().Parentheses().SQL("IMPORTS"), ). TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). - PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). ListQueryStructField( "WithClauses", procedureWithClause, @@ -486,6 +520,8 @@ var ProceduresDef = g.NewInterface( g.KeywordOptions().SQL("RETURNS").Required(), ). SQL("LANGUAGE PYTHON"). + PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). + PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). TextAssignment("RUNTIME_VERSION", g.ParameterOptions().SingleQuotes().Required()). ListQueryStructField( "Packages", @@ -498,8 +534,6 @@ var ProceduresDef = g.NewInterface( g.ParameterOptions().Parentheses().SQL("IMPORTS"), ). TextAssignment("HANDLER", g.ParameterOptions().SingleQuotes().Required()). - PredefinedQueryStructField("NullInputBehavior", "*NullInputBehavior", g.KeywordOptions()). - PredefinedQueryStructField("ProcedureDefinition", "*string", g.ParameterOptions().NoEquals().SingleQuotes().SQL("AS")). 
ListQueryStructField( "WithClauses", procedureWithClause, diff --git a/pkg/sdk/procedures_dto_builders_gen.go b/pkg/sdk/procedures_dto_builders_gen.go index 373852a62a..c9be49ffce 100644 --- a/pkg/sdk/procedures_dto_builders_gen.go +++ b/pkg/sdk/procedures_dto_builders_gen.go @@ -68,6 +68,11 @@ func (s *CreateForJavaProcedureRequest) WithNullInputBehavior(NullInputBehavior return s } +func (s *CreateForJavaProcedureRequest) WithReturnResultsBehavior(ReturnResultsBehavior ReturnResultsBehavior) *CreateForJavaProcedureRequest { + s.ReturnResultsBehavior = &ReturnResultsBehavior + return s +} + func (s *CreateForJavaProcedureRequest) WithComment(Comment string) *CreateForJavaProcedureRequest { s.Comment = &Comment return s @@ -227,6 +232,11 @@ func (s *CreateForJavaScriptProcedureRequest) WithNullInputBehavior(NullInputBeh return s } +func (s *CreateForJavaScriptProcedureRequest) WithReturnResultsBehavior(ReturnResultsBehavior ReturnResultsBehavior) *CreateForJavaScriptProcedureRequest { + s.ReturnResultsBehavior = &ReturnResultsBehavior + return s +} + func (s *CreateForJavaScriptProcedureRequest) WithComment(Comment string) *CreateForJavaScriptProcedureRequest { s.Comment = &Comment return s @@ -293,6 +303,11 @@ func (s *CreateForPythonProcedureRequest) WithNullInputBehavior(NullInputBehavio return s } +func (s *CreateForPythonProcedureRequest) WithReturnResultsBehavior(ReturnResultsBehavior ReturnResultsBehavior) *CreateForPythonProcedureRequest { + s.ReturnResultsBehavior = &ReturnResultsBehavior + return s +} + func (s *CreateForPythonProcedureRequest) WithComment(Comment string) *CreateForPythonProcedureRequest { s.Comment = &Comment return s @@ -349,6 +364,16 @@ func (s *CreateForScalaProcedureRequest) WithImports(Imports []ProcedureImportRe return s } +func (s *CreateForScalaProcedureRequest) WithExternalAccessIntegrations(ExternalAccessIntegrations []AccountObjectIdentifier) *CreateForScalaProcedureRequest { + s.ExternalAccessIntegrations = 
ExternalAccessIntegrations + return s +} + +func (s *CreateForScalaProcedureRequest) WithSecrets(Secrets []SecretReference) *CreateForScalaProcedureRequest { + s.Secrets = Secrets + return s +} + func (s *CreateForScalaProcedureRequest) WithTargetPath(TargetPath string) *CreateForScalaProcedureRequest { s.TargetPath = &TargetPath return s @@ -359,6 +384,11 @@ func (s *CreateForScalaProcedureRequest) WithNullInputBehavior(NullInputBehavior return s } +func (s *CreateForScalaProcedureRequest) WithReturnResultsBehavior(ReturnResultsBehavior ReturnResultsBehavior) *CreateForScalaProcedureRequest { + s.ReturnResultsBehavior = &ReturnResultsBehavior + return s +} + func (s *CreateForScalaProcedureRequest) WithComment(Comment string) *CreateForScalaProcedureRequest { s.Comment = &Comment return s @@ -411,6 +441,11 @@ func (s *CreateForSQLProcedureRequest) WithNullInputBehavior(NullInputBehavior N return s } +func (s *CreateForSQLProcedureRequest) WithReturnResultsBehavior(ReturnResultsBehavior ReturnResultsBehavior) *CreateForSQLProcedureRequest { + s.ReturnResultsBehavior = &ReturnResultsBehavior + return s +} + func (s *CreateForSQLProcedureRequest) WithComment(Comment string) *CreateForSQLProcedureRequest { s.Comment = &Comment return s @@ -458,23 +493,13 @@ func (s *AlterProcedureRequest) WithRenameTo(RenameTo SchemaObjectIdentifier) *A return s } -func (s *AlterProcedureRequest) WithSetComment(SetComment string) *AlterProcedureRequest { - s.SetComment = &SetComment - return s -} - -func (s *AlterProcedureRequest) WithSetLogLevel(SetLogLevel string) *AlterProcedureRequest { - s.SetLogLevel = &SetLogLevel +func (s *AlterProcedureRequest) WithSet(Set ProcedureSetRequest) *AlterProcedureRequest { + s.Set = &Set return s } -func (s *AlterProcedureRequest) WithSetTraceLevel(SetTraceLevel string) *AlterProcedureRequest { - s.SetTraceLevel = &SetTraceLevel - return s -} - -func (s *AlterProcedureRequest) WithUnsetComment(UnsetComment bool) *AlterProcedureRequest { - 
s.UnsetComment = &UnsetComment +func (s *AlterProcedureRequest) WithUnset(Unset ProcedureUnsetRequest) *AlterProcedureRequest { + s.Unset = &Unset return s } @@ -493,6 +518,91 @@ func (s *AlterProcedureRequest) WithExecuteAs(ExecuteAs ExecuteAs) *AlterProcedu return s } +func NewProcedureSetRequest() *ProcedureSetRequest { + return &ProcedureSetRequest{} +} + +func (s *ProcedureSetRequest) WithComment(Comment string) *ProcedureSetRequest { + s.Comment = &Comment + return s +} + +func (s *ProcedureSetRequest) WithExternalAccessIntegrations(ExternalAccessIntegrations []AccountObjectIdentifier) *ProcedureSetRequest { + s.ExternalAccessIntegrations = ExternalAccessIntegrations + return s +} + +func (s *ProcedureSetRequest) WithSecretsList(SecretsList SecretsListRequest) *ProcedureSetRequest { + s.SecretsList = &SecretsList + return s +} + +func (s *ProcedureSetRequest) WithAutoEventLogging(AutoEventLogging AutoEventLogging) *ProcedureSetRequest { + s.AutoEventLogging = &AutoEventLogging + return s +} + +func (s *ProcedureSetRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *ProcedureSetRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *ProcedureSetRequest) WithLogLevel(LogLevel LogLevel) *ProcedureSetRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *ProcedureSetRequest) WithMetricLevel(MetricLevel MetricLevel) *ProcedureSetRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *ProcedureSetRequest) WithTraceLevel(TraceLevel TraceLevel) *ProcedureSetRequest { + s.TraceLevel = &TraceLevel + return s +} + +// NewSecretsListRequest removed manually - redeclared in functions + +func NewProcedureUnsetRequest() *ProcedureUnsetRequest { + return &ProcedureUnsetRequest{} +} + +func (s *ProcedureUnsetRequest) WithComment(Comment bool) *ProcedureUnsetRequest { + s.Comment = &Comment + return s +} + +func (s *ProcedureUnsetRequest) WithExternalAccessIntegrations(ExternalAccessIntegrations bool) *ProcedureUnsetRequest { 
+ s.ExternalAccessIntegrations = &ExternalAccessIntegrations + return s +} + +func (s *ProcedureUnsetRequest) WithAutoEventLogging(AutoEventLogging bool) *ProcedureUnsetRequest { + s.AutoEventLogging = &AutoEventLogging + return s +} + +func (s *ProcedureUnsetRequest) WithEnableConsoleOutput(EnableConsoleOutput bool) *ProcedureUnsetRequest { + s.EnableConsoleOutput = &EnableConsoleOutput + return s +} + +func (s *ProcedureUnsetRequest) WithLogLevel(LogLevel bool) *ProcedureUnsetRequest { + s.LogLevel = &LogLevel + return s +} + +func (s *ProcedureUnsetRequest) WithMetricLevel(MetricLevel bool) *ProcedureUnsetRequest { + s.MetricLevel = &MetricLevel + return s +} + +func (s *ProcedureUnsetRequest) WithTraceLevel(TraceLevel bool) *ProcedureUnsetRequest { + s.TraceLevel = &TraceLevel + return s +} + func NewDropProcedureRequest( name SchemaObjectIdentifierWithArguments, ) *DropProcedureRequest { @@ -515,7 +625,7 @@ func (s *ShowProcedureRequest) WithLike(Like Like) *ShowProcedureRequest { return s } -func (s *ShowProcedureRequest) WithIn(In In) *ShowProcedureRequest { +func (s *ShowProcedureRequest) WithIn(In ExtendedIn) *ShowProcedureRequest { s.In = &In return s } diff --git a/pkg/sdk/procedures_dto_gen.go b/pkg/sdk/procedures_dto_gen.go index bf3e0a8d72..75d57c2448 100644 --- a/pkg/sdk/procedures_dto_gen.go +++ b/pkg/sdk/procedures_dto_gen.go @@ -38,6 +38,7 @@ type CreateForJavaProcedureRequest struct { Secrets []SecretReference TargetPath *string NullInputBehavior *NullInputBehavior + ReturnResultsBehavior *ReturnResultsBehavior Comment *string ExecuteAs *ExecuteAs ProcedureDefinition *string @@ -81,18 +82,19 @@ type ProcedureImportRequest struct { } type CreateForJavaScriptProcedureRequest struct { - OrReplace *bool - Secure *bool - name SchemaObjectIdentifier // required - Arguments []ProcedureArgumentRequest - CopyGrants *bool - ResultDataTypeOld DataType - ResultDataType datatypes.DataType // required - NotNull *bool - NullInputBehavior *NullInputBehavior - 
Comment *string - ExecuteAs *ExecuteAs - ProcedureDefinition string // required + OrReplace *bool + Secure *bool + name SchemaObjectIdentifier // required + Arguments []ProcedureArgumentRequest + CopyGrants *bool + ResultDataTypeOld DataType + ResultDataType datatypes.DataType // required + NotNull *bool + NullInputBehavior *NullInputBehavior + ReturnResultsBehavior *ReturnResultsBehavior + Comment *string + ExecuteAs *ExecuteAs + ProcedureDefinition string // required } type CreateForPythonProcedureRequest struct { @@ -109,40 +111,45 @@ type CreateForPythonProcedureRequest struct { ExternalAccessIntegrations []AccountObjectIdentifier Secrets []SecretReference NullInputBehavior *NullInputBehavior + ReturnResultsBehavior *ReturnResultsBehavior Comment *string ExecuteAs *ExecuteAs ProcedureDefinition *string } type CreateForScalaProcedureRequest struct { - OrReplace *bool - Secure *bool - name SchemaObjectIdentifier // required - Arguments []ProcedureArgumentRequest - CopyGrants *bool - Returns ProcedureReturnsRequest // required - RuntimeVersion string // required - Packages []ProcedurePackageRequest // required - Imports []ProcedureImportRequest - Handler string // required - TargetPath *string - NullInputBehavior *NullInputBehavior - Comment *string - ExecuteAs *ExecuteAs - ProcedureDefinition *string + OrReplace *bool + Secure *bool + name SchemaObjectIdentifier // required + Arguments []ProcedureArgumentRequest + CopyGrants *bool + Returns ProcedureReturnsRequest // required + RuntimeVersion string // required + Packages []ProcedurePackageRequest // required + Imports []ProcedureImportRequest + Handler string // required + ExternalAccessIntegrations []AccountObjectIdentifier + Secrets []SecretReference + TargetPath *string + NullInputBehavior *NullInputBehavior + ReturnResultsBehavior *ReturnResultsBehavior + Comment *string + ExecuteAs *ExecuteAs + ProcedureDefinition *string } type CreateForSQLProcedureRequest struct { - OrReplace *bool - Secure *bool - name 
	SchemaObjectIdentifier // required
-	Arguments []ProcedureArgumentRequest
-	CopyGrants *bool
-	Returns ProcedureSQLReturnsRequest // required
-	NullInputBehavior *NullInputBehavior
-	Comment *string
-	ExecuteAs *ExecuteAs
-	ProcedureDefinition string // required
+	OrReplace *bool
+	Secure *bool
+	name SchemaObjectIdentifier // required
+	Arguments []ProcedureArgumentRequest
+	CopyGrants *bool
+	Returns ProcedureSQLReturnsRequest // required
+	NullInputBehavior *NullInputBehavior
+	ReturnResultsBehavior *ReturnResultsBehavior
+	Comment *string
+	ExecuteAs *ExecuteAs
+	ProcedureDefinition string // required
 }
 
 type ProcedureSQLReturnsRequest struct {
@@ -152,16 +159,37 @@ type ProcedureSQLReturnsRequest struct {
 }
 
 type AlterProcedureRequest struct {
-	IfExists *bool
-	name SchemaObjectIdentifierWithArguments // required
-	RenameTo *SchemaObjectIdentifier
-	SetComment *string
-	SetLogLevel *string
-	SetTraceLevel *string
-	UnsetComment *bool
-	SetTags []TagAssociation
-	UnsetTags []ObjectIdentifier
-	ExecuteAs *ExecuteAs
+	IfExists *bool
+	name SchemaObjectIdentifierWithArguments // required
+	RenameTo *SchemaObjectIdentifier
+	Set *ProcedureSetRequest
+	Unset *ProcedureUnsetRequest
+	SetTags []TagAssociation
+	UnsetTags []ObjectIdentifier
+	ExecuteAs *ExecuteAs
+}
+
+type ProcedureSetRequest struct {
+	Comment *string
+	ExternalAccessIntegrations []AccountObjectIdentifier
+	SecretsList *SecretsListRequest
+	AutoEventLogging *AutoEventLogging
+	EnableConsoleOutput *bool
+	LogLevel *LogLevel
+	MetricLevel *MetricLevel
+	TraceLevel *TraceLevel
+}
+
+// SecretsListRequest removed manually - redeclaration with function
+
+type ProcedureUnsetRequest struct {
+	Comment *bool
+	ExternalAccessIntegrations *bool
+	AutoEventLogging *bool
+	EnableConsoleOutput *bool
+	LogLevel *bool
+	MetricLevel *bool
+	TraceLevel *bool
 }
 
 type DropProcedureRequest struct {
@@ -171,7 +199,7 @@ type DropProcedureRequest struct {
 
 type ShowProcedureRequest struct {
 	Like *Like
-	In *In
+	In *ExtendedIn
 }
 
 type DescribeProcedureRequest struct {
diff --git a/pkg/sdk/procedures_ext.go b/pkg/sdk/procedures_ext.go
index 31307bc2fb..a8ee2844bf 100644
--- a/pkg/sdk/procedures_ext.go
+++ b/pkg/sdk/procedures_ext.go
@@ -1,5 +1,151 @@
 package sdk
 
+import (
+	"context"
+	"errors"
+	"fmt"
+	"strconv"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes"
+)
+
+const DefaultProcedureComment = "user-defined procedure"
+
 func (v *Procedure) ID() SchemaObjectIdentifierWithArguments {
 	return NewSchemaObjectIdentifierWithArguments(v.CatalogName, v.SchemaName, v.Name, v.ArgumentsOld...)
 }
+
+// ProcedureDetails contains aggregated describe results for the given procedure.
+type ProcedureDetails struct {
+	Signature string // present for all procedure types
+	Returns string // present for all procedure types
+	Language string // present for all procedure types
+	NullHandling *string // present for all procedure types but SQL
+	Body *string // present for all procedure types (hidden when SECURE)
+	Volatility *string // present for all procedure types but SQL
+	ExternalAccessIntegrations *string // list present for python, java, and scala
+	Secrets *string // map present for python, java, and scala
+	Imports *string // list present for python, java, and scala (hidden when SECURE)
+	Handler *string // present for python, java, and scala (hidden when SECURE)
+	RuntimeVersion *string // present for python, java, and scala (hidden when SECURE)
+	Packages *string // list present for python, java, and scala (hidden when SECURE)
+	TargetPath *string // list present for scala and java (hidden when SECURE)
+	InstalledPackages *string // list present for python (hidden when SECURE)
+	ExecuteAs string // present for all procedure types
+}
+
+func procedureDetailsFromRows(rows []ProcedureDetail) (*ProcedureDetails, error) {
+	v := &ProcedureDetails{}
+	var errs []error
+	for _, row := range rows {
+		switch row.Property {
+		case "signature":
+			errs = append(errs, row.setStringValueOrError("signature", &v.Signature))
+		case "returns":
+			errs = append(errs, row.setStringValueOrError("returns", &v.Returns))
+		case "language":
+			errs = append(errs, row.setStringValueOrError("language", &v.Language))
+		case "execute as":
+			errs = append(errs, row.setStringValueOrError("execute as", &v.ExecuteAs))
+		case "null handling":
+			v.NullHandling = row.Value
+		case "volatility":
+			v.Volatility = row.Value
+		case "body":
+			v.Body = row.Value
+		case "external_access_integrations":
+			v.ExternalAccessIntegrations = row.Value
+		case "secrets":
+			v.Secrets = row.Value
+		case "imports":
+			v.Imports = row.Value
+		case "handler":
+			v.Handler = row.Value
+		case "runtime_version":
+			v.RuntimeVersion = row.Value
+		case "packages":
+			v.Packages = row.Value
+		case "installed_packages":
+			v.InstalledPackages = row.Value
+		case "target_path":
+			v.TargetPath = row.Value
+		}
+	}
+	return v, errors.Join(errs...)
+}
+
+func (d *ProcedureDetail) setStringValueOrError(property string, field *string) error {
+	if d.Value == nil {
+		return fmt.Errorf("value expected for field %s", property)
+	} else {
+		*field = *d.Value
+	}
+	return nil
+}
+
+func (d *ProcedureDetail) setOptionalBoolValueOrError(property string, field **bool) error {
+	if d.Value != nil && *d.Value != "" {
+		v, err := strconv.ParseBool(*d.Value)
+		if err != nil {
+			return fmt.Errorf("invalid value for field %s, err: %w", property, err)
+		} else {
+			*field = Bool(v)
+		}
+	}
+	return nil
+}
+
+func (v *procedures) DescribeDetails(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*ProcedureDetails, error) {
+	rows, err := v.Describe(ctx, id)
+	if err != nil {
+		return nil, err
+	}
+	return procedureDetailsFromRows(rows)
+}
+
+func (v *procedures) ShowParameters(ctx context.Context, id SchemaObjectIdentifierWithArguments) ([]*Parameter, error) {
+	return v.client.Parameters.ShowParameters(ctx, &ShowParametersOptions{
+		In: &ParametersIn{
+			Procedure: id,
+		},
+	})
+}
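The describe output arrives as property/value rows; `procedureDetailsFromRows` folds them into one struct, treating some properties as required and others as optional, and joins the per-field errors with `errors.Join`. A self-contained sketch of that aggregation pattern (simplified stand-in types, not the SDK's actual declarations):

```go
package main

import (
	"errors"
	"fmt"
)

// Simplified stand-ins for the SDK's row/details types (hypothetical, for illustration).
type detailRow struct {
	Property string
	Value    *string // nil when Snowflake returns NULL for the property
}

type details struct {
	Signature string  // required: a nil row value is an error
	Body      *string // optional: stays nil when hidden (e.g. SECURE)
}

func setStringOrError(row detailRow, field *string) error {
	if row.Value == nil {
		return fmt.Errorf("value expected for field %s", row.Property)
	}
	*field = *row.Value
	return nil
}

func detailsFromRows(rows []detailRow) (*details, error) {
	d := &details{}
	var errs []error
	for _, row := range rows {
		switch row.Property {
		case "signature":
			errs = append(errs, setStringOrError(row, &d.Signature))
		case "body":
			d.Body = row.Value // optional property: no error when nil
		}
	}
	// errors.Join drops nil entries and returns nil if none remain.
	return d, errors.Join(errs...)
}

func main() {
	sig := "(ID NUMBER)"
	d, err := detailsFromRows([]detailRow{
		{Property: "signature", Value: &sig},
		{Property: "body", Value: nil}, // e.g. hidden for SECURE procedures
	})
	fmt.Println(d.Signature, d.Body == nil, err == nil) // (ID NUMBER) true true
}
```

Collecting one error per required field and joining them at the end reports every missing property in a single pass instead of failing on the first one.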
+func (s *CreateForJavaProcedureRequest) WithProcedureDefinitionWrapped(procedureDefinition string) *CreateForJavaProcedureRequest {
+	s.ProcedureDefinition = String(fmt.Sprintf(`$$%s$$`, procedureDefinition))
+	return s
+}
+
+func (s *CreateForPythonProcedureRequest) WithProcedureDefinitionWrapped(procedureDefinition string) *CreateForPythonProcedureRequest {
+	s.ProcedureDefinition = String(fmt.Sprintf(`$$%s$$`, procedureDefinition))
+	return s
+}
+
+func (s *CreateForScalaProcedureRequest) WithProcedureDefinitionWrapped(procedureDefinition string) *CreateForScalaProcedureRequest {
+	s.ProcedureDefinition = String(fmt.Sprintf(`$$%s$$`, procedureDefinition))
+	return s
+}
+
+func NewCreateForSQLProcedureRequestDefinitionWrapped(
+	name SchemaObjectIdentifier,
+	returns ProcedureSQLReturnsRequest,
+	procedureDefinition string,
+) *CreateForSQLProcedureRequest {
+	s := CreateForSQLProcedureRequest{}
+	s.name = name
+	s.Returns = returns
+	s.ProcedureDefinition = fmt.Sprintf(`$$%s$$`, procedureDefinition)
+	return &s
+}
+
+func NewCreateForJavaScriptProcedureRequestDefinitionWrapped(
+	name SchemaObjectIdentifier,
+	resultDataType datatypes.DataType,
+	procedureDefinition string,
+) *CreateForJavaScriptProcedureRequest {
+	s := CreateForJavaScriptProcedureRequest{}
+	s.name = name
+	s.ResultDataType = resultDataType
+	s.ProcedureDefinition = fmt.Sprintf(`$$%s$$`, procedureDefinition)
+	return &s
+}
diff --git a/pkg/sdk/procedures_gen.go b/pkg/sdk/procedures_gen.go
index c65e95e94a..0dbbf8f5a1 100644
--- a/pkg/sdk/procedures_gen.go
+++ b/pkg/sdk/procedures_gen.go
@@ -25,6 +25,10 @@ type Procedures interface {
 	CreateAndCallForJavaScript(ctx context.Context, request *CreateAndCallForJavaScriptProcedureRequest) error
 	CreateAndCallForPython(ctx context.Context, request *CreateAndCallForPythonProcedureRequest) error
 	CreateAndCallForSQL(ctx context.Context, request *CreateAndCallForSQLProcedureRequest) error
+
+	// DescribeDetails is added manually; it returns aggregated describe results for the given procedure.
+	DescribeDetails(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*ProcedureDetails, error)
+	ShowParameters(ctx context.Context, id SchemaObjectIdentifierWithArguments) ([]*Parameter, error)
 }
 
 // CreateForJavaProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-procedure#java-handler.
@@ -38,6 +42,8 @@ type CreateForJavaProcedureOptions struct {
 	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
 	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
 	languageJava bool `ddl:"static" sql:"LANGUAGE JAVA"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
+	ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"`
 	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
 	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
 	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
@@ -45,14 +51,13 @@ type CreateForJavaProcedureOptions struct {
 	ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"`
 	Secrets []SecretReference `ddl:"parameter,parentheses" sql:"SECRETS"`
 	TargetPath *string `ddl:"parameter,single_quotes" sql:"TARGET_PATH"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
 	ExecuteAs *ExecuteAs `ddl:"keyword"`
-	ProcedureDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
+	ProcedureDefinition *string `ddl:"parameter,no_equals" sql:"AS"`
 }
 
 type ProcedureArgument struct {
-	ArgName string `ddl:"keyword,no_quotes"`
+	ArgName string `ddl:"keyword,double_quotes"`
 	ArgDataTypeOld DataType `ddl:"keyword,no_quotes"`
 	ArgDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"`
 	DefaultValue *string `ddl:"parameter,no_equals" sql:"DEFAULT"`
@@ -75,7 +80,7 @@ type ProcedureReturnsTable struct {
 }
 
 type ProcedureColumn struct {
-	ColumnName string `ddl:"keyword,no_quotes"`
+	ColumnName string `ddl:"keyword,double_quotes"`
 	ColumnDataTypeOld DataType `ddl:"keyword,no_quotes"`
 	ColumnDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"`
 }
@@ -90,22 +95,23 @@ type ProcedureImport struct {
 
 // CreateForJavaScriptProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-procedure#javascript-handler.
 type CreateForJavaScriptProcedureOptions struct {
-	create bool `ddl:"static" sql:"CREATE"`
-	OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"`
-	Secure *bool `ddl:"keyword" sql:"SECURE"`
-	procedure bool `ddl:"static" sql:"PROCEDURE"`
-	name SchemaObjectIdentifier `ddl:"identifier"`
-	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
-	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
-	returns bool `ddl:"static" sql:"RETURNS"`
-	ResultDataTypeOld DataType `ddl:"parameter,no_equals"`
-	ResultDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"`
-	NotNull *bool `ddl:"keyword" sql:"NOT NULL"`
-	languageJavascript bool `ddl:"static" sql:"LANGUAGE JAVASCRIPT"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
-	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
-	ExecuteAs *ExecuteAs `ddl:"keyword"`
-	ProcedureDefinition string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
+	create bool `ddl:"static" sql:"CREATE"`
+	OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"`
+	Secure *bool `ddl:"keyword" sql:"SECURE"`
+	procedure bool `ddl:"static" sql:"PROCEDURE"`
+	name SchemaObjectIdentifier `ddl:"identifier"`
+	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
+	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
+	returns bool `ddl:"static" sql:"RETURNS"`
+	ResultDataTypeOld DataType `ddl:"parameter,no_equals"`
+	ResultDataType datatypes.DataType `ddl:"parameter,no_quotes,no_equals"`
+	NotNull *bool `ddl:"keyword" sql:"NOT NULL"`
+	languageJavascript bool `ddl:"static" sql:"LANGUAGE JAVASCRIPT"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
+	ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"`
+	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
+	ExecuteAs *ExecuteAs `ddl:"keyword"`
+	ProcedureDefinition string `ddl:"parameter,no_equals" sql:"AS"`
 }
 
 // CreateForPythonProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-procedure#python-handler.
@@ -119,55 +125,60 @@ type CreateForPythonProcedureOptions struct {
 	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
 	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
 	languagePython bool `ddl:"static" sql:"LANGUAGE PYTHON"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
+	ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"`
 	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
 	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
 	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
 	Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"`
 	ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"`
 	Secrets []SecretReference `ddl:"parameter,parentheses" sql:"SECRETS"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
 	ExecuteAs *ExecuteAs `ddl:"keyword"`
-	ProcedureDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
+	ProcedureDefinition *string `ddl:"parameter,no_equals" sql:"AS"`
 }
 
 // CreateForScalaProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-procedure#scala-handler.
 type CreateForScalaProcedureOptions struct {
-	create bool `ddl:"static" sql:"CREATE"`
-	OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"`
-	Secure *bool `ddl:"keyword" sql:"SECURE"`
-	procedure bool `ddl:"static" sql:"PROCEDURE"`
-	name SchemaObjectIdentifier `ddl:"identifier"`
-	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
-	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
-	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
-	languageScala bool `ddl:"static" sql:"LANGUAGE SCALA"`
-	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
-	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
-	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
-	Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"`
-	TargetPath *string `ddl:"parameter,single_quotes" sql:"TARGET_PATH"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
-	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
-	ExecuteAs *ExecuteAs `ddl:"keyword"`
-	ProcedureDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
+	create bool `ddl:"static" sql:"CREATE"`
+	OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"`
+	Secure *bool `ddl:"keyword" sql:"SECURE"`
+	procedure bool `ddl:"static" sql:"PROCEDURE"`
+	name SchemaObjectIdentifier `ddl:"identifier"`
+	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
+	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
+	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
+	languageScala bool `ddl:"static" sql:"LANGUAGE SCALA"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
+	ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"`
+	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
+	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
+	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
+	Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"`
+	ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"`
+	Secrets []SecretReference `ddl:"parameter,parentheses" sql:"SECRETS"`
+	TargetPath *string `ddl:"parameter,single_quotes" sql:"TARGET_PATH"`
+	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
+	ExecuteAs *ExecuteAs `ddl:"keyword"`
+	ProcedureDefinition *string `ddl:"parameter,no_equals" sql:"AS"`
 }
 
 // CreateForSQLProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-procedure#snowflake-scripting-handler.
 type CreateForSQLProcedureOptions struct {
-	create bool `ddl:"static" sql:"CREATE"`
-	OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"`
-	Secure *bool `ddl:"keyword" sql:"SECURE"`
-	procedure bool `ddl:"static" sql:"PROCEDURE"`
-	name SchemaObjectIdentifier `ddl:"identifier"`
-	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
-	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
-	Returns ProcedureSQLReturns `ddl:"keyword" sql:"RETURNS"`
-	languageSql bool `ddl:"static" sql:"LANGUAGE SQL"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
-	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
-	ExecuteAs *ExecuteAs `ddl:"keyword"`
-	ProcedureDefinition string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
+	create bool `ddl:"static" sql:"CREATE"`
+	OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"`
+	Secure *bool `ddl:"keyword" sql:"SECURE"`
+	procedure bool `ddl:"static" sql:"PROCEDURE"`
+	name SchemaObjectIdentifier `ddl:"identifier"`
+	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
+	CopyGrants *bool `ddl:"keyword" sql:"COPY GRANTS"`
+	Returns ProcedureSQLReturns `ddl:"keyword" sql:"RETURNS"`
+	languageSql bool `ddl:"static" sql:"LANGUAGE SQL"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
+	ReturnResultsBehavior *ReturnResultsBehavior `ddl:"keyword"`
+	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
+	ExecuteAs *ExecuteAs `ddl:"keyword"`
+	ProcedureDefinition string `ddl:"parameter,no_equals" sql:"AS"`
 }
 
 type ProcedureSQLReturns struct {
@@ -178,18 +189,39 @@ type ProcedureSQLReturns struct {
 
 // AlterProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/alter-procedure.
 type AlterProcedureOptions struct {
-	alter bool `ddl:"static" sql:"ALTER"`
-	procedure bool `ddl:"static" sql:"PROCEDURE"`
-	IfExists *bool `ddl:"keyword" sql:"IF EXISTS"`
-	name SchemaObjectIdentifierWithArguments `ddl:"identifier"`
-	RenameTo *SchemaObjectIdentifier `ddl:"identifier" sql:"RENAME TO"`
-	SetComment *string `ddl:"parameter,single_quotes" sql:"SET COMMENT"`
-	SetLogLevel *string `ddl:"parameter,single_quotes" sql:"SET LOG_LEVEL"`
-	SetTraceLevel *string `ddl:"parameter,single_quotes" sql:"SET TRACE_LEVEL"`
-	UnsetComment *bool `ddl:"keyword" sql:"UNSET COMMENT"`
-	SetTags []TagAssociation `ddl:"keyword" sql:"SET TAG"`
-	UnsetTags []ObjectIdentifier `ddl:"keyword" sql:"UNSET TAG"`
-	ExecuteAs *ExecuteAs `ddl:"keyword"`
+	alter bool `ddl:"static" sql:"ALTER"`
+	procedure bool `ddl:"static" sql:"PROCEDURE"`
+	IfExists *bool `ddl:"keyword" sql:"IF EXISTS"`
+	name SchemaObjectIdentifierWithArguments `ddl:"identifier"`
+	RenameTo *SchemaObjectIdentifier `ddl:"identifier" sql:"RENAME TO"`
+	Set *ProcedureSet `ddl:"list" sql:"SET"`
+	Unset *ProcedureUnset `ddl:"list" sql:"UNSET"`
+	SetTags []TagAssociation `ddl:"keyword" sql:"SET TAG"`
+	UnsetTags []ObjectIdentifier `ddl:"keyword" sql:"UNSET TAG"`
+	ExecuteAs *ExecuteAs `ddl:"keyword"`
+}
+
+type ProcedureSet struct {
+	Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"`
+	ExternalAccessIntegrations []AccountObjectIdentifier `ddl:"parameter,parentheses" sql:"EXTERNAL_ACCESS_INTEGRATIONS"`
+	SecretsList *SecretsList `ddl:"parameter,parentheses" sql:"SECRETS"`
+	AutoEventLogging *AutoEventLogging `ddl:"parameter,single_quotes" sql:"AUTO_EVENT_LOGGING"`
+	EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"`
+	LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"`
+	MetricLevel *MetricLevel `ddl:"parameter,single_quotes" sql:"METRIC_LEVEL"`
+	TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"`
+}
+
+// SecretsList removed manually - redeclared in functions
+
+type ProcedureUnset struct {
+	Comment *bool `ddl:"keyword" sql:"COMMENT"`
+	ExternalAccessIntegrations *bool `ddl:"keyword" sql:"EXTERNAL_ACCESS_INTEGRATIONS"`
+	AutoEventLogging *bool `ddl:"keyword" sql:"AUTO_EVENT_LOGGING"`
+	EnableConsoleOutput *bool `ddl:"keyword" sql:"ENABLE_CONSOLE_OUTPUT"`
+	LogLevel *bool `ddl:"keyword" sql:"LOG_LEVEL"`
+	MetricLevel *bool `ddl:"keyword" sql:"METRIC_LEVEL"`
+	TraceLevel *bool `ddl:"keyword" sql:"TRACE_LEVEL"`
 }
 
 // DropProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/drop-procedure.
@@ -202,45 +234,49 @@ type DropProcedureOptions struct {
 
 // ShowProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/show-procedures.
 type ShowProcedureOptions struct {
-	show bool `ddl:"static" sql:"SHOW"`
-	procedures bool `ddl:"static" sql:"PROCEDURES"`
-	Like *Like `ddl:"keyword" sql:"LIKE"`
-	In *In `ddl:"keyword" sql:"IN"`
+	show bool `ddl:"static" sql:"SHOW"`
+	procedures bool `ddl:"static" sql:"PROCEDURES"`
+	Like *Like `ddl:"keyword" sql:"LIKE"`
+	In *ExtendedIn `ddl:"keyword" sql:"IN"`
 }
 
 type procedureRow struct {
-	CreatedOn string `db:"created_on"`
-	Name string `db:"name"`
-	SchemaName string `db:"schema_name"`
-	IsBuiltin string `db:"is_builtin"`
-	IsAggregate string `db:"is_aggregate"`
-	IsAnsi string `db:"is_ansi"`
-	MinNumArguments int `db:"min_num_arguments"`
-	MaxNumArguments int `db:"max_num_arguments"`
-	Arguments string `db:"arguments"`
-	Description string `db:"description"`
-	CatalogName string `db:"catalog_name"`
-	IsTableFunction string `db:"is_table_function"`
-	ValidForClustering string `db:"valid_for_clustering"`
-	IsSecure sql.NullString `db:"is_secure"`
+	CreatedOn string `db:"created_on"`
+	Name string `db:"name"`
+	SchemaName string `db:"schema_name"`
+	IsBuiltin string `db:"is_builtin"`
+	IsAggregate string `db:"is_aggregate"`
+	IsAnsi string `db:"is_ansi"`
+	MinNumArguments int `db:"min_num_arguments"`
+	MaxNumArguments int `db:"max_num_arguments"`
+	Arguments string `db:"arguments"`
+	Description string `db:"description"`
+	CatalogName string `db:"catalog_name"`
+	IsTableFunction string `db:"is_table_function"`
+	ValidForClustering string `db:"valid_for_clustering"`
+	IsSecure sql.NullString `db:"is_secure"`
+	Secrets sql.NullString `db:"secrets"`
+	ExternalAccessIntegrations sql.NullString `db:"external_access_integrations"`
 }
 
 type Procedure struct {
-	CreatedOn string
-	Name string
-	SchemaName string
-	IsBuiltin bool
-	IsAggregate bool
-	IsAnsi bool
-	MinNumArguments int
-	MaxNumArguments int
-	ArgumentsOld []DataType
-	ArgumentsRaw string
-	Description string
-	CatalogName string
-	IsTableFunction bool
-	ValidForClustering bool
-	IsSecure bool
+	CreatedOn string
+	Name string
+	SchemaName string
+	IsBuiltin bool
+	IsAggregate bool
+	IsAnsi bool
+	MinNumArguments int
+	MaxNumArguments int
+	ArgumentsOld []DataType
+	ArgumentsRaw string
+	Description string
+	CatalogName string
+	IsTableFunction bool
+	ValidForClustering bool
+	IsSecure bool
+	Secrets *string
+	ExternalAccessIntegrations *string
 }
 
 // DescribeProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/desc-procedure.
@@ -257,7 +293,7 @@ type procedureDetailRow struct {
 
 type ProcedureDetail struct {
 	Property string
-	Value string
+	Value *string
 }
 
 // CallProcedureOptions is based on https://docs.snowflake.com/en/sql-reference/sql/call.
@@ -276,11 +312,11 @@ type CreateAndCallForJavaProcedureOptions struct {
 	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
 	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
 	languageJava bool `ddl:"static" sql:"LANGUAGE JAVA"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
 	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
 	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
 	Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	ProcedureDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
 	WithClause *ProcedureWithClause `ddl:"keyword"`
 	call bool `ddl:"static" sql:"CALL"`
@@ -288,6 +324,7 @@ type CreateAndCallForJavaProcedureOptions struct {
 	CallArguments []string `ddl:"keyword,must_parentheses"`
 	ScriptingVariable *string `ddl:"parameter,no_quotes,no_equals" sql:"INTO"`
 }
+
 type ProcedureWithClause struct {
 	prefix bool `ddl:"static" sql:","`
 	CteName AccountObjectIdentifier `ddl:"identifier"`
@@ -303,11 +340,11 @@ type CreateAndCallForScalaProcedureOptions struct {
 	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
 	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
 	languageScala bool `ddl:"static" sql:"LANGUAGE SCALA"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
 	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
 	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
 	Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	ProcedureDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
 	WithClauses []ProcedureWithClause `ddl:"keyword"`
 	call bool `ddl:"static" sql:"CALL"`
@@ -344,11 +381,11 @@ type CreateAndCallForPythonProcedureOptions struct {
 	Arguments []ProcedureArgument `ddl:"list,must_parentheses"`
 	Returns ProcedureReturns `ddl:"keyword" sql:"RETURNS"`
 	languagePython bool `ddl:"static" sql:"LANGUAGE PYTHON"`
+	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	RuntimeVersion string `ddl:"parameter,single_quotes" sql:"RUNTIME_VERSION"`
 	Packages []ProcedurePackage `ddl:"parameter,parentheses" sql:"PACKAGES"`
 	Imports []ProcedureImport `ddl:"parameter,parentheses" sql:"IMPORTS"`
 	Handler string `ddl:"parameter,single_quotes" sql:"HANDLER"`
-	NullInputBehavior *NullInputBehavior `ddl:"keyword"`
 	ProcedureDefinition *string `ddl:"parameter,single_quotes,no_equals" sql:"AS"`
 	WithClauses []ProcedureWithClause `ddl:"keyword"`
 	call bool `ddl:"static" sql:"CALL"`
diff --git a/pkg/sdk/procedures_gen_test.go b/pkg/sdk/procedures_gen_test.go
index 994181b59b..f7e84503d2 100644
--- a/pkg/sdk/procedures_gen_test.go
+++ b/pkg/sdk/procedures_gen_test.go
@@ -6,6 +6,8 @@ import (
 
 func TestProcedures_CreateForJava(t *testing.T) {
 	id := randomSchemaObjectIdentifier()
+	secretId := randomSchemaObjectIdentifier()
+	secretId2 := randomSchemaObjectIdentifier()
 
 	defaultOpts := func() *CreateForJavaProcedureOptions {
 		return &CreateForJavaProcedureOptions{
@@ -25,12 +27,6 @@ func TestProcedures_CreateForJava(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions)
 	})
 
-	t.Run("validation: incorrect identifier", func(t *testing.T) {
-		opts := defaultOpts()
-		opts.name = emptySchemaObjectIdentifier
-		assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier)
-	})
-
 	t.Run("validation: [opts.RuntimeVersion] should be set", func(t *testing.T) {
 		opts := defaultOpts()
 		opts.RuntimeVersion = ""
@@ -49,6 +45,12 @@ func TestProcedures_CreateForJava(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForJavaProcedureOptions", "Handler"))
 	})
 
+	t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) {
+		opts := defaultOpts()
+		opts.name = emptySchemaObjectIdentifier
+		assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier)
+	})
+
 	t.Run("validation: exactly one field from [opts.Arguments.ArgDataTypeOld opts.Arguments.ArgDataType] should be present", func(t *testing.T) {
 		opts := defaultOpts()
 		opts.Arguments = []ProcedureArgument{
@@ -82,6 +84,12 @@ func TestProcedures_CreateForJava(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForSQLProcedureOptions.Returns.ResultDataType", "ResultDataTypeOld", "ResultDataType"))
 	})
 
+	t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) {
+		opts := defaultOpts()
+		opts.Returns = ProcedureReturns{}
+		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavaProcedureOptions.Returns", "ResultDataType", "Table"))
+	})
+
 	t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present - two present", func(t *testing.T) {
 		opts := defaultOpts()
 		opts.Returns = ProcedureReturns{
@@ -130,13 +138,7 @@ func TestProcedures_CreateForJava(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForSQLProcedureOptions.Returns.Table.Columns", "ColumnDataTypeOld", "ColumnDataType"))
 	})
 
-	t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) {
-		opts := defaultOpts()
-		opts.Returns = ProcedureReturns{}
-		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavaProcedureOptions.Returns", "ResultDataType", "Table"))
-	})
-
-	t.Run("validation: function definition", func(t *testing.T) {
+	t.Run("validation: procedure definition", func(t *testing.T) {
 		opts := defaultOpts()
 		opts.TargetPath = String("@~/testfunc.jar")
 		opts.Packages = []ProcedurePackage{
@@ -192,19 +194,20 @@ func TestProcedures_CreateForJava(t *testing.T) {
 		opts.Secrets = []SecretReference{
 			{
 				VariableName: "variable1",
-				Name: "name1",
+				Name: secretId,
 			},
 			{
 				VariableName: "variable2",
-				Name: "name2",
+				Name: secretId2,
 			},
 		}
 		opts.TargetPath = String("@~/testfunc.jar")
 		opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict)
+		opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable)
 		opts.Comment = String("test comment")
 		opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller)
 		opts.ProcedureDefinition = String("return id + name;")
-		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (id NUMBER, name VARCHAR DEFAULT 'test') COPY GRANTS RETURNS TABLE (country_code VARCHAR) LANGUAGE JAVA RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) TARGET_PATH = '@~/testfunc.jar' STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'return id + name;'`, id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("id" NUMBER, "name" VARCHAR DEFAULT 'test') COPY GRANTS RETURNS TABLE ("country_code" VARCHAR) LANGUAGE JAVA STRICT IMMUTABLE RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) TARGET_PATH = '@~/testfunc.jar' COMMENT = 'test comment' EXECUTE AS CALLER AS return id + name;`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName())
 	})
 
 	t.Run("all options", func(t *testing.T) {
@@ -251,19 +254,20 @@ func TestProcedures_CreateForJava(t *testing.T) {
 		opts.Secrets = []SecretReference{
 			{
 				VariableName: "variable1",
-				Name: "name1",
+				Name: secretId,
 			},
 			{
 				VariableName: "variable2",
-				Name: "name2",
+				Name: secretId2,
 			},
 		}
 		opts.TargetPath = String("@~/testfunc.jar")
 		opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict)
+		opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable)
 		opts.Comment = String("test comment")
 		opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller)
 		opts.ProcedureDefinition = String("return id + name;")
-		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (id NUMBER(36, 2), name VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS TABLE (country_code VARCHAR(100)) LANGUAGE JAVA RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) TARGET_PATH = '@~/testfunc.jar' STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'return id + name;'`, id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("id" NUMBER(36, 2), "name" VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS TABLE ("country_code" VARCHAR(100)) LANGUAGE JAVA STRICT IMMUTABLE RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) TARGET_PATH = '@~/testfunc.jar' COMMENT = 'test comment' EXECUTE AS CALLER AS return id + name;`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName())
 	})
 }
 
@@ -288,6 +292,12 @@ func TestProcedures_CreateForJavaScript(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForJavaScriptProcedureOptions", "ProcedureDefinition"))
 	})
 
+	t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) {
+		opts := defaultOpts()
+		opts.name = emptySchemaObjectIdentifier
+		assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier)
+	})
+
 	t.Run("validation: exactly one field from [opts.ResultDataTypeOld opts.ResultDataType] should be present", func(t *testing.T) {
 		opts := defaultOpts()
 		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavaScriptProcedureOptions", "ResultDataTypeOld", "ResultDataType"))
@@ -325,12 +335,6 @@ func TestProcedures_CreateForJavaScript(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForJavaScriptProcedureOptions.Arguments", "ArgDataTypeOld", "ArgDataType"))
 	})
 
-	t.Run("validation: incorrect identifier", func(t *testing.T) {
-		opts := defaultOpts()
-		opts.name = emptySchemaObjectIdentifier
-		assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier)
-	})
-
 	// TODO [SNOW-1348106]: remove with old procedure removal for V1
 	t.Run("all options - old data types", func(t *testing.T) {
 		opts := defaultOpts()
@@ -347,10 +351,11 @@ func TestProcedures_CreateForJavaScript(t *testing.T) {
 		opts.ResultDataTypeOld = "DOUBLE"
 		opts.NotNull = Bool(true)
 		opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict)
+		opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable)
 		opts.Comment = String("test comment")
 		opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller)
 		opts.ProcedureDefinition = "return 1;"
-		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (d DOUBLE DEFAULT 1.0) COPY GRANTS RETURNS DOUBLE NOT NULL LANGUAGE JAVASCRIPT STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'return 1;'`, id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("d" DOUBLE DEFAULT 1.0) COPY GRANTS RETURNS DOUBLE NOT NULL LANGUAGE JAVASCRIPT STRICT IMMUTABLE COMMENT = 'test comment' EXECUTE AS CALLER AS return 1;`, id.FullyQualifiedName())
 	})
 
 	t.Run("all options", func(t *testing.T) {
@@ -368,15 +373,18 @@ func TestProcedures_CreateForJavaScript(t *testing.T) {
 		opts.ResultDataType = dataTypeFloat
 		opts.NotNull = Bool(true)
 		opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict)
+		opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable)
 		opts.Comment = String("test comment")
 		opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller)
 		opts.ProcedureDefinition = "return 1;"
-		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (d FLOAT DEFAULT 1.0) COPY GRANTS RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'return 1;'`, id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("d" FLOAT DEFAULT 1.0) COPY GRANTS RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT STRICT IMMUTABLE COMMENT = 'test comment' EXECUTE AS CALLER AS return 1;`, id.FullyQualifiedName())
 	})
 }
 
 func TestProcedures_CreateForPython(t *testing.T) {
 	id := randomSchemaObjectIdentifier()
+	secretId := randomSchemaObjectIdentifier()
+	secretId2 := randomSchemaObjectIdentifier()
 
 	defaultOpts := func() *CreateForPythonProcedureOptions {
 		return &CreateForPythonProcedureOptions{
@@ -414,7 +422,7 @@ func TestProcedures_CreateForPython(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForPythonProcedureOptions", "Handler"))
 	})
 
-	t.Run("validation: incorrect identifier", func(t *testing.T) {
+	t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) {
 		opts := defaultOpts()
 		opts.name = emptySchemaObjectIdentifier
 		assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier)
@@ -445,6 +453,21 @@ func TestProcedures_CreateForPython(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonProcedureOptions.Arguments", "ArgDataTypeOld", "ArgDataType"))
 	})
 
+	t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) {
+		opts := defaultOpts()
+		opts.Returns = ProcedureReturns{}
+		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonProcedureOptions.Returns", "ResultDataType", "Table"))
+	})
+
+	t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - two present", func(t *testing.T) {
+		opts := defaultOpts()
+		opts.Returns = ProcedureReturns{
+			ResultDataType: &ProcedureReturnsResultDataType{},
+			Table: &ProcedureReturnsTable{},
+		}
+		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonProcedureOptions.Returns", "ResultDataType", "Table"))
+	})
+
 	t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present", func(t *testing.T) {
 		opts := defaultOpts()
 		opts.Returns = ProcedureReturns{
@@ -501,12 +524,6 @@ func TestProcedures_CreateForPython(t *testing.T) {
 		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForSQLProcedureOptions.Returns.Table.Columns", "ColumnDataTypeOld", "ColumnDataType"))
 	})
 
-	t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) {
-		opts := defaultOpts()
-		opts.Returns = ProcedureReturns{}
-		assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForPythonProcedureOptions.Returns", "ResultDataType", "Table"))
-	})
-
 	// TODO [SNOW-1348106]: remove with old procedure removal for V1
 	t.Run("all options - old data types", func(t *testing.T) {
 		opts := defaultOpts()
@@ -550,18 +567,19 @@ func TestProcedures_CreateForPython(t *testing.T) {
 		opts.Secrets = []SecretReference{
 			{
 				VariableName: "variable1",
-				Name: "name1",
+				Name: secretId,
 			},
 			{
 				VariableName: "variable2",
-				Name: "name2",
+				Name: secretId2,
 			},
 		}
 		opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict)
+		opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable)
 		opts.Comment = String("test comment")
 		opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller)
 		opts.ProcedureDefinition = String("import numpy as np")
-		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (i int DEFAULT 1) COPY GRANTS RETURNS VARIANT NULL LANGUAGE PYTHON RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'import numpy as np'`, id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("i" int DEFAULT 1) COPY GRANTS RETURNS VARIANT NULL LANGUAGE PYTHON STRICT IMMUTABLE RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) COMMENT = 'test comment' EXECUTE AS CALLER AS import numpy as np`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName())
 	})
 
 	t.Run("all options", func(t *testing.T) {
@@ -606,18 +624,19 @@ func TestProcedures_CreateForPython(t *testing.T) {
 		opts.Secrets = []SecretReference{
 			{
 				VariableName: "variable1",
-				Name: "name1",
+				Name: secretId,
 			},
 			{
 				VariableName: "variable2",
-				Name: "name2",
+				Name: secretId2,
 			},
 		}
 		opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict)
+		opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable)
 		opts.Comment = String("test comment")
 		opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller)
 		opts.ProcedureDefinition = String("import numpy as np")
-		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (i NUMBER(36, 2) DEFAULT 1) COPY GRANTS RETURNS VARIANT NULL LANGUAGE PYTHON RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = name1, 'variable2' = name2) STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'import numpy as np'`, id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("i" NUMBER(36, 2) DEFAULT 1) COPY GRANTS RETURNS VARIANT NULL LANGUAGE PYTHON STRICT IMMUTABLE
RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' EXTERNAL_ACCESS_INTEGRATIONS = ("ext_integration") SECRETS = ('variable1' = %s, 'variable2' = %s) COMMENT = 'test comment' EXECUTE AS CALLER AS import numpy as np`, id.FullyQualifiedName(), secretId.FullyQualifiedName(), secretId2.FullyQualifiedName()) }) } @@ -660,7 +679,7 @@ func TestProcedures_CreateForScala(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForScalaProcedureOptions", "Handler")) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) @@ -691,6 +710,21 @@ func TestProcedures_CreateForScala(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaProcedureOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = ProcedureReturns{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaProcedureOptions.Returns", "ResultDataType", "Table")) + }) + + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = ProcedureReturns{ + ResultDataType: &ProcedureReturnsResultDataType{}, + Table: &ProcedureReturnsTable{}, + } + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaProcedureOptions.Returns", "ResultDataType", "Table")) + }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = ProcedureReturns{ @@ 
-747,13 +781,7 @@ func TestProcedures_CreateForScala(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForSQLProcedureOptions.Returns.Table.Columns", "ColumnDataTypeOld", "ColumnDataType")) }) - t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = ProcedureReturns{} - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForScalaProcedureOptions.Returns", "ResultDataType", "Table")) - }) - - t.Run("validation: function definition", func(t *testing.T) { + t.Run("validation: procedure definition", func(t *testing.T) { opts := defaultOpts() opts.TargetPath = String("@~/testfunc.jar") opts.Packages = []ProcedurePackage{ @@ -797,10 +825,11 @@ func TestProcedures_CreateForScala(t *testing.T) { opts.Handler = "Echo.echoVarchar" opts.TargetPath = String("@~/testfunc.jar") opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict) + opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable) opts.Comment = String("test comment") opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller) opts.ProcedureDefinition = String("return x") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (x VARCHAR DEFAULT 'test') COPY GRANTS RETURNS VARCHAR NOT NULL LANGUAGE SCALA RUNTIME_VERSION = '2.0' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' TARGET_PATH = '@~/testfunc.jar' STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'return x'`, id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("x" VARCHAR DEFAULT 'test') COPY GRANTS RETURNS VARCHAR NOT NULL LANGUAGE SCALA STRICT IMMUTABLE RUNTIME_VERSION = '2.0' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' TARGET_PATH = '@~/testfunc.jar' COMMENT = 'test 
comment' EXECUTE AS CALLER AS return x`, id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -835,10 +864,11 @@ func TestProcedures_CreateForScala(t *testing.T) { opts.Handler = "Echo.echoVarchar" opts.TargetPath = String("@~/testfunc.jar") opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict) + opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable) opts.Comment = String("test comment") opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller) opts.ProcedureDefinition = String("return x") - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (x VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS VARCHAR(100) NOT NULL LANGUAGE SCALA RUNTIME_VERSION = '2.0' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' TARGET_PATH = '@~/testfunc.jar' STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS 'return x'`, id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("x" VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS VARCHAR(100) NOT NULL LANGUAGE SCALA STRICT IMMUTABLE RUNTIME_VERSION = '2.0' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('@udf_libs/echohandler.jar') HANDLER = 'Echo.echoVarchar' TARGET_PATH = '@~/testfunc.jar' COMMENT = 'test comment' EXECUTE AS CALLER AS return x`, id.FullyQualifiedName()) }) } @@ -868,23 +898,12 @@ func TestProcedures_CreateForSQL(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateForSQLProcedureOptions", "ProcedureDefinition")) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) - t.Run("create with no arguments", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = ProcedureSQLReturns{ - 
ResultDataType: &ProcedureReturnsResultDataType{ - ResultDataType: dataTypeFloat, - }, - } - opts.ProcedureDefinition = "3.141592654::FLOAT" - assertOptsValidAndSQLEquals(t, opts, `CREATE PROCEDURE %s () RETURNS FLOAT LANGUAGE SQL AS '3.141592654::FLOAT'`, id.FullyQualifiedName()) - }) - t.Run("validation: exactly one field from [opts.Arguments.ArgDataTypeOld opts.Arguments.ArgDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Arguments = []ProcedureArgument{ @@ -910,6 +929,21 @@ func TestProcedures_CreateForSQL(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLProcedureOptions.Arguments", "ArgDataTypeOld", "ArgDataType")) }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = ProcedureSQLReturns{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLProcedureOptions.Returns", "ResultDataType", "Table")) + }) + + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = ProcedureSQLReturns{ + ResultDataType: &ProcedureReturnsResultDataType{}, + Table: &ProcedureReturnsTable{}, + } + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLProcedureOptions.Returns", "ResultDataType", "Table")) + }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType.ResultDataTypeOld opts.Returns.ResultDataType.ResultDataType] should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = ProcedureSQLReturns{ @@ -966,12 +1000,6 @@ func TestProcedures_CreateForSQL(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLProcedureOptions.Returns.Table.Columns", "ColumnDataTypeOld", "ColumnDataType")) }) - t.Run("validation: exactly one field from [opts.Returns.ResultDataType 
opts.Returns.Table] should be present", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = ProcedureSQLReturns{} - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateForSQLProcedureOptions.Returns", "ResultDataType", "Table")) - }) - // TODO [SNOW-1348106]: remove with old procedure removal for V1 t.Run("all options - old data types", func(t *testing.T) { opts := defaultOpts() @@ -992,10 +1020,11 @@ func TestProcedures_CreateForSQL(t *testing.T) { NotNull: Bool(true), } opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict) + opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable) opts.Comment = String("test comment") opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller) opts.ProcedureDefinition = "3.141592654::FLOAT" - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (message VARCHAR DEFAULT 'test') COPY GRANTS RETURNS VARCHAR NOT NULL LANGUAGE SQL STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS '3.141592654::FLOAT'`, id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("message" VARCHAR DEFAULT 'test') COPY GRANTS RETURNS VARCHAR NOT NULL LANGUAGE SQL STRICT IMMUTABLE COMMENT = 'test comment' EXECUTE AS CALLER AS 3.141592654::FLOAT`, id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -1017,10 +1046,22 @@ func TestProcedures_CreateForSQL(t *testing.T) { NotNull: Bool(true), } opts.NullInputBehavior = NullInputBehaviorPointer(NullInputBehaviorStrict) + opts.ReturnResultsBehavior = Pointer(ReturnResultsBehaviorImmutable) opts.Comment = String("test comment") opts.ExecuteAs = ExecuteAsPointer(ExecuteAsCaller) opts.ProcedureDefinition = "3.141592654::FLOAT" - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s (message VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS VARCHAR(100) NOT NULL LANGUAGE SQL STRICT COMMENT = 'test comment' EXECUTE AS CALLER AS '3.141592654::FLOAT'`, 
id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE SECURE PROCEDURE %s ("message" VARCHAR(100) DEFAULT 'test') COPY GRANTS RETURNS VARCHAR(100) NOT NULL LANGUAGE SQL STRICT IMMUTABLE COMMENT = 'test comment' EXECUTE AS CALLER AS 3.141592654::FLOAT`, id.FullyQualifiedName()) + }) + + t.Run("create with no arguments", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = ProcedureSQLReturns{ + ResultDataType: &ProcedureReturnsResultDataType{ + ResultDataType: dataTypeFloat, + }, + } + opts.ProcedureDefinition = "3.141592654::FLOAT" + assertOptsValidAndSQLEquals(t, opts, `CREATE PROCEDURE %s () RETURNS FLOAT LANGUAGE SQL AS 3.141592654::FLOAT`, id.FullyQualifiedName()) }) } @@ -1038,7 +1079,7 @@ func TestProcedures_Drop(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifierWithArguments assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) @@ -1058,8 +1099,8 @@ func TestProcedures_Drop(t *testing.T) { } func TestProcedures_Alter(t *testing.T) { - noArgsId := randomSchemaObjectIdentifierWithArguments() id := randomSchemaObjectIdentifierWithArguments(DataTypeVARCHAR, DataTypeNumber) + secretId := randomSchemaObjectIdentifier() defaultOpts := func() *AlterProcedureOptions { return &AlterProcedureOptions{ @@ -1073,17 +1114,40 @@ func TestProcedures_Alter(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifierWithArguments assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) - t.Run("validation: exactly one field should be present", func(t *testing.T) { + 
t.Run("validation: valid identifier for [opts.RenameTo] if set", func(t *testing.T) { + opts := defaultOpts() + opts.RenameTo = Pointer(emptySchemaObjectIdentifier) + assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) + }) + + t.Run("validation: exactly one field from [opts.RenameTo opts.Set opts.Unset opts.SetTags opts.UnsetTags opts.ExecuteAs] should be present", func(t *testing.T) { + opts := defaultOpts() + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterProcedureOptions", "RenameTo", "Set", "Unset", "SetTags", "UnsetTags", "ExecuteAs")) + }) + + t.Run("validation: exactly one field from [opts.RenameTo opts.Set opts.Unset opts.SetTags opts.UnsetTags opts.ExecuteAs] should be present - two present", func(t *testing.T) { + opts := defaultOpts() + opts.Set = &ProcedureSet{} + opts.Unset = &ProcedureUnset{} + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterProcedureOptions", "RenameTo", "Set", "Unset", "SetTags", "UnsetTags", "ExecuteAs")) + }) + + t.Run("validation: at least one of the fields [opts.Set.Comment opts.Set.ExternalAccessIntegrations opts.Set.SecretsList opts.Set.AutoEventLogging opts.Set.EnableConsoleOutput opts.Set.LogLevel opts.Set.MetricLevel opts.Set.TraceLevel] should be set", func(t *testing.T) { opts := defaultOpts() - opts.SetLogLevel = String("DEBUG") - opts.UnsetComment = Bool(true) - assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("AlterProcedureOptions", "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "UnsetComment", "SetTags", "UnsetTags", "ExecuteAs")) + opts.Set = &ProcedureSet{} + assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf("AlterProcedureOptions.Set", "Comment", "ExternalAccessIntegrations", "SecretsList", "AutoEventLogging", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel")) + }) + + t.Run("validation: at least one of the fields [opts.Unset.Comment opts.Unset.ExternalAccessIntegrations opts.Unset.AutoEventLogging opts.Unset.EnableConsoleOutput 
opts.Unset.LogLevel opts.Unset.MetricLevel opts.Unset.TraceLevel] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Unset = &ProcedureUnset{} + assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf("AlterProcedureOptions.Unset", "Comment", "ExternalAccessIntegrations", "AutoEventLogging", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel")) }) t.Run("alter: rename to", func(t *testing.T) { @@ -1100,35 +1164,42 @@ func TestProcedures_Alter(t *testing.T) { assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s EXECUTE AS CALLER`, id.FullyQualifiedName()) }) - t.Run("alter: set log level", func(t *testing.T) { + t.Run("alter: set", func(t *testing.T) { opts := defaultOpts() - opts.SetLogLevel = String("DEBUG") - assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET LOG_LEVEL = 'DEBUG'`, id.FullyQualifiedName()) - }) - - t.Run("alter: set log level with no arguments", func(t *testing.T) { - opts := defaultOpts() - opts.name = noArgsId - opts.SetLogLevel = String("DEBUG") - assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET LOG_LEVEL = 'DEBUG'`, noArgsId.FullyQualifiedName()) + opts.Set = &ProcedureSet{ + Comment: String("comment"), + TraceLevel: Pointer(TraceLevelOff), + } + assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET COMMENT = 'comment', TRACE_LEVEL = 'OFF'`, id.FullyQualifiedName()) }) - t.Run("alter: set trace level", func(t *testing.T) { + t.Run("alter: set empty secrets", func(t *testing.T) { opts := defaultOpts() - opts.SetTraceLevel = String("DEBUG") - assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET TRACE_LEVEL = 'DEBUG'`, id.FullyQualifiedName()) + opts.Set = &ProcedureSet{ + SecretsList: &SecretsList{}, + } + assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET SECRETS = ()`, id.FullyQualifiedName()) }) - t.Run("alter: set comment", func(t *testing.T) { + t.Run("alter: set non-empty secrets", func(t *testing.T) { 
opts := defaultOpts() - opts.SetComment = String("comment") - assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET COMMENT = 'comment'`, id.FullyQualifiedName()) + opts.Set = &ProcedureSet{ + SecretsList: &SecretsList{ + []SecretReference{ + {VariableName: "abc", Name: secretId}, + }, + }, + } + assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s SET SECRETS = ('abc' = %s)`, id.FullyQualifiedName(), secretId.FullyQualifiedName()) }) - t.Run("alter: unset comment", func(t *testing.T) { + t.Run("alter: unset", func(t *testing.T) { opts := defaultOpts() - opts.UnsetComment = Bool(true) - assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s UNSET COMMENT`, id.FullyQualifiedName()) + opts.Unset = &ProcedureUnset{ + Comment: Bool(true), + TraceLevel: Bool(true), + } + assertOptsValidAndSQLEquals(t, opts, `ALTER PROCEDURE IF EXISTS %s UNSET COMMENT, TRACE_LEVEL`, id.FullyQualifiedName()) }) t.Run("alter: set tags", func(t *testing.T) { @@ -1177,15 +1248,16 @@ func TestProcedures_Show(t *testing.T) { t.Run("show with in", func(t *testing.T) { opts := defaultOpts() - opts.In = &In{ - Account: Bool(true), + opts.In = &ExtendedIn{ + In: In{ + Account: Bool(true), + }, } assertOptsValidAndSQLEquals(t, opts, `SHOW PROCEDURES IN ACCOUNT`) }) } func TestProcedures_Describe(t *testing.T) { - noArgsId := randomSchemaObjectIdentifierWithArguments() id := randomSchemaObjectIdentifierWithArguments(DataTypeVARCHAR, DataTypeNumber) defaultOpts := func() *DescribeProcedureOptions { @@ -1199,18 +1271,12 @@ func TestProcedures_Describe(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifierWithArguments assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) - t.Run("no arguments", func(t *testing.T) { - opts := 
defaultOpts() - opts.name = noArgsId - assertOptsValidAndSQLEquals(t, opts, `DESCRIBE PROCEDURE %s`, noArgsId.FullyQualifiedName()) - }) - t.Run("all options", func(t *testing.T) { opts := defaultOpts() assertOptsValidAndSQLEquals(t, opts, `DESCRIBE PROCEDURE %s`, id.FullyQualifiedName()) @@ -1231,7 +1297,7 @@ func TestProcedures_Call(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: valid identifier for [opts.name]", func(t *testing.T) { opts := defaultOpts() opts.name = emptySchemaObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) @@ -1261,7 +1327,14 @@ func TestProcedures_CreateAndCallForJava(t *testing.T) { defaultOpts := func() *CreateAndCallForJavaProcedureOptions { return &CreateAndCallForJavaProcedureOptions{ - Name: id, + Name: id, + Handler: "TestFunc.echoVarchar", + Packages: []ProcedurePackage{ + { + Package: "com.snowflake:snowpark:1.2.0", + }, + }, + RuntimeVersion: "1.8", } } @@ -1270,18 +1343,51 @@ func TestProcedures_CreateAndCallForJava(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, ErrNilOptions) }) - t.Run("validation: incorrect identifier", func(t *testing.T) { + t.Run("validation: [opts.RuntimeVersion] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.RuntimeVersion = "" + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateAndCallForJavaProcedureOptions", "RuntimeVersion")) + }) + + t.Run("validation: [opts.Packages] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Packages = nil + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateAndCallForJavaProcedureOptions", "Packages")) + }) + + t.Run("validation: [opts.Handler] should be set", func(t *testing.T) { + opts := defaultOpts() + opts.Handler = "" + assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateAndCallForJavaProcedureOptions", "Handler")) + }) + + t.Run("validation: valid identifier for 
[opts.ProcedureName]", func(t *testing.T) { + opts := defaultOpts() + opts.ProcedureName = emptyAccountObjectIdentifier + assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) + }) + + t.Run("validation: valid identifier for [opts.Name]", func(t *testing.T) { opts := defaultOpts() opts.Name = emptyAccountObjectIdentifier assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) }) - t.Run("validation: returns", func(t *testing.T) { + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = ProcedureReturns{} assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForJavaProcedureOptions.Returns", "ResultDataType", "Table")) }) + t.Run("validation: exactly one field from [opts.Returns.ResultDataType opts.Returns.Table] should be present - both present", func(t *testing.T) { + opts := defaultOpts() + opts.Returns = ProcedureReturns{ + ResultDataType: &ProcedureReturnsResultDataType{}, + Table: &ProcedureReturnsTable{}, + } + assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForJavaProcedureOptions.Returns", "ResultDataType", "Table")) + }) + t.Run("validation: exactly one field should be present", func(t *testing.T) { opts := defaultOpts() opts.Returns = ProcedureReturns{ @@ -1300,18 +1406,6 @@ func TestProcedures_CreateAndCallForJava(t *testing.T) { assertOptsInvalidJoinedErrors(t, opts, errExactlyOneOf("CreateAndCallForJavaProcedureOptions.Returns", "ResultDataType", "Table")) }) - t.Run("validation: options are missing", func(t *testing.T) { - opts := defaultOpts() - opts.Returns = ProcedureReturns{ - ResultDataType: &ProcedureReturnsResultDataType{ - ResultDataType: dataTypeVarchar, - }, - } - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateAndCallForJavaProcedureOptions", "Handler")) - assertOptsInvalidJoinedErrors(t, opts, errNotSet("CreateAndCallForJavaProcedureOptions", "RuntimeVersion")) 
- assertOptsInvalidJoinedErrors(t, opts, ErrInvalidObjectIdentifier) - }) - t.Run("no arguments", func(t *testing.T) { opts := defaultOpts() opts.Returns = ProcedureReturns{ @@ -1374,7 +1468,7 @@ func TestProcedures_CreateAndCallForJava(t *testing.T) { opts.ProcedureName = id opts.ScriptingVariable = String(":ret") opts.CallArguments = []string{"1", "rnd"} - assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (id NUMBER, name VARCHAR) RETURNS TABLE (country_code VARCHAR) LANGUAGE JAVA RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' STRICT AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("id" NUMBER, "name" VARCHAR) RETURNS TABLE ("country_code" VARCHAR) LANGUAGE JAVA STRICT RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -1422,7 +1516,7 @@ func TestProcedures_CreateAndCallForJava(t *testing.T) { opts.ProcedureName = id opts.ScriptingVariable = String(":ret") opts.CallArguments = []string{"1", "rnd"} - assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (id NUMBER(36, 2), name VARCHAR(100)) RETURNS TABLE (country_code VARCHAR(100)) LANGUAGE JAVA RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' STRICT AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, 
id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("id" NUMBER(36, 2), "name" VARCHAR(100)) RETURNS TABLE ("country_code" VARCHAR(100)) LANGUAGE JAVA STRICT RUNTIME_VERSION = '1.8' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName()) }) } @@ -1546,7 +1640,7 @@ func TestProcedures_CreateAndCallForScala(t *testing.T) { opts.ProcedureName = id opts.ScriptingVariable = String(":ret") opts.CallArguments = []string{"1", "rnd"} - assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (id NUMBER, name VARCHAR) RETURNS TABLE (country_code VARCHAR) LANGUAGE SCALA RUNTIME_VERSION = '2.12' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' STRICT AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("id" NUMBER, "name" VARCHAR) RETURNS TABLE ("country_code" VARCHAR) LANGUAGE SCALA STRICT RUNTIME_VERSION = '2.12' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName()) }) t.Run("all options", func(t *testing.T) { @@ -1596,7 +1690,7 @@ func TestProcedures_CreateAndCallForScala(t *testing.T) { opts.ProcedureName = id opts.ScriptingVariable = String(":ret") opts.CallArguments = []string{"1", "rnd"} - 
assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (id NUMBER(36, 2), name VARCHAR(100)) RETURNS TABLE (country_code VARCHAR(100)) LANGUAGE SCALA RUNTIME_VERSION = '2.12' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' STRICT AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("id" NUMBER(36, 2), "name" VARCHAR(100)) RETURNS TABLE ("country_code" VARCHAR(100)) LANGUAGE SCALA STRICT RUNTIME_VERSION = '2.12' PACKAGES = ('com.snowflake:snowpark:1.2.0') IMPORTS = ('test_jar.jar') HANDLER = 'TestFunc.echoVarchar' AS 'return id + name;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1, rnd) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 }
 
@@ -1719,7 +1813,7 @@ func TestProcedures_CreateAndCallForPython(t *testing.T) {
 		opts.ProcedureName = id
 		opts.ScriptingVariable = String(":ret")
 		opts.CallArguments = []string{"1"}
-		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (i int DEFAULT 1) RETURNS VARIANT NULL LANGUAGE PYTHON RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' STRICT AS 'import numpy as np' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("i" int DEFAULT 1) RETURNS VARIANT NULL LANGUAGE PYTHON STRICT RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' AS 'import numpy as np' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 
 	t.Run("all options", func(t *testing.T) {
@@ -1768,7 +1862,7 @@ func TestProcedures_CreateAndCallForPython(t *testing.T) {
 		opts.ProcedureName = id
 		opts.ScriptingVariable = String(":ret")
 		opts.CallArguments = []string{"1"}
-		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (i NUMBER(36, 2) DEFAULT 1) RETURNS VARIANT NULL LANGUAGE PYTHON RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' STRICT AS 'import numpy as np' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("i" NUMBER(36, 2) DEFAULT 1) RETURNS VARIANT NULL LANGUAGE PYTHON STRICT RUNTIME_VERSION = '3.8' PACKAGES = ('numpy', 'pandas') IMPORTS = ('numpy', 'pandas') HANDLER = 'udf' AS 'import numpy as np' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 }
 
@@ -1831,7 +1925,7 @@ func TestProcedures_CreateAndCallForJavaScript(t *testing.T) {
 		opts.ProcedureName = id
 		opts.ScriptingVariable = String(":ret")
 		opts.CallArguments = []string{"1"}
-		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (d DOUBLE DEFAULT 1.0) RETURNS DOUBLE NOT NULL LANGUAGE JAVASCRIPT STRICT AS 'return 1;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("d" DOUBLE DEFAULT 1.0) RETURNS DOUBLE NOT NULL LANGUAGE JAVASCRIPT STRICT AS 'return 1;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 
 	t.Run("all options", func(t *testing.T) {
@@ -1858,7 +1952,7 @@ func TestProcedures_CreateAndCallForJavaScript(t *testing.T) {
 		opts.ProcedureName = id
 		opts.ScriptingVariable = String(":ret")
 		opts.CallArguments = []string{"1"}
-		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (d FLOAT DEFAULT 1.0) RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT STRICT AS 'return 1;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("d" FLOAT DEFAULT 1.0) RETURNS FLOAT NOT NULL LANGUAGE JAVASCRIPT STRICT AS 'return 1;' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 }
 
@@ -1950,7 +2044,7 @@ func TestProcedures_CreateAndCallForSQL(t *testing.T) {
 		opts.ProcedureName = id
 		opts.ScriptingVariable = String(":ret")
 		opts.CallArguments = []string{"1"}
-		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (message VARCHAR DEFAULT 'test') RETURNS FLOAT LANGUAGE SQL STRICT AS '3.141592654::FLOAT' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("message" VARCHAR DEFAULT 'test') RETURNS FLOAT LANGUAGE SQL STRICT AS '3.141592654::FLOAT' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 
 	t.Run("all options", func(t *testing.T) {
@@ -1980,6 +2074,6 @@ func TestProcedures_CreateAndCallForSQL(t *testing.T) {
 		opts.ProcedureName = id
 		opts.ScriptingVariable = String(":ret")
 		opts.CallArguments = []string{"1"}
-		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE (message VARCHAR(100) DEFAULT 'test') RETURNS FLOAT LANGUAGE SQL STRICT AS '3.141592654::FLOAT' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
+		assertOptsValidAndSQLEquals(t, opts, `WITH %s AS PROCEDURE ("message" VARCHAR(100) DEFAULT 'test') RETURNS FLOAT LANGUAGE SQL STRICT AS '3.141592654::FLOAT' , %s (x, y) AS (select m.album_ID, m.album_name, b.band_name from music_albums) CALL %s (1) INTO :ret`, id.FullyQualifiedName(), cte.FullyQualifiedName(), id.FullyQualifiedName())
 	})
 }
diff --git a/pkg/sdk/procedures_impl_gen.go b/pkg/sdk/procedures_impl_gen.go
index e63cf1f386..5a7e1ce84e 100644
--- a/pkg/sdk/procedures_impl_gen.go
+++ b/pkg/sdk/procedures_impl_gen.go
@@ -60,7 +60,7 @@ func (v *procedures) Show(ctx context.Context, request *ShowProcedureRequest) ([
 }
 
 func (v *procedures) ShowByID(ctx context.Context, id SchemaObjectIdentifierWithArguments) (*Procedure, error) {
-	procedures, err := v.Show(ctx, NewShowProcedureRequest().WithIn(In{Schema: id.SchemaId()}).WithLike(Like{String(id.Name())}))
+	procedures, err := v.Show(ctx, NewShowProcedureRequest().WithIn(ExtendedIn{In: In{Schema: id.SchemaId()}}).WithLike(Like{String(id.Name())}))
 	if err != nil {
 		return nil, err
 	}
@@ -123,6 +123,7 @@ func (r *CreateForJavaProcedureRequest) toOpts() *CreateForJavaProcedureOptions
 		Secrets:           r.Secrets,
 		TargetPath:        r.TargetPath,
 		NullInputBehavior: r.NullInputBehavior,
+		ReturnResultsBehavior: r.ReturnResultsBehavior,
 		Comment:             r.Comment,
 		ExecuteAs:           r.ExecuteAs,
 		ProcedureDefinition: r.ProcedureDefinition,
@@ -130,7 +131,12 @@ func (r *CreateForJavaProcedureRequest) toOpts() *CreateForJavaProcedureOptions
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -148,7 +154,11 @@ func (r *CreateForJavaProcedureRequest) toOpts() *CreateForJavaProcedureOptions
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -156,14 +166,18 @@
 	if r.Packages != nil {
 		s := make([]ProcedurePackage, len(r.Packages))
 		for i, v := range r.Packages {
-			s[i] = ProcedurePackage(v)
+			s[i] = ProcedurePackage{
+				Package: v.Package,
+			}
 		}
 		opts.Packages = s
 	}
 	if r.Imports != nil {
 		s := make([]ProcedureImport, len(r.Imports))
 		for i, v := range r.Imports {
-			s[i] = ProcedureImport(v)
+			s[i] = ProcedureImport{
+				Import: v.Import,
+			}
 		}
 		opts.Imports = s
 	}
@@ -176,19 +190,25 @@ func (r *CreateForJavaScriptProcedureRequest) toOpts() *CreateForJavaScriptProce
 		Secure: r.Secure,
 		name:   r.name,
-		CopyGrants:          r.CopyGrants,
-		ResultDataTypeOld:   r.ResultDataTypeOld,
-		ResultDataType:      r.ResultDataType,
-		NotNull:             r.NotNull,
-		NullInputBehavior:   r.NullInputBehavior,
-		Comment:             r.Comment,
-		ExecuteAs:           r.ExecuteAs,
-		ProcedureDefinition: r.ProcedureDefinition,
+		CopyGrants:            r.CopyGrants,
+		ResultDataTypeOld:     r.ResultDataTypeOld,
+		ResultDataType:        r.ResultDataType,
+		NotNull:               r.NotNull,
+		NullInputBehavior:     r.NullInputBehavior,
+		ReturnResultsBehavior: r.ReturnResultsBehavior,
+		Comment:               r.Comment,
+		ExecuteAs:             r.ExecuteAs,
+		ProcedureDefinition:   r.ProcedureDefinition,
 	}
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -209,6 +229,7 @@ func (r *CreateForPythonProcedureRequest) toOpts() *CreateForPythonProcedureOpti
 		ExternalAccessIntegrations: r.ExternalAccessIntegrations,
 		Secrets:                    r.Secrets,
 		NullInputBehavior:          r.NullInputBehavior,
+		ReturnResultsBehavior:      r.ReturnResultsBehavior,
 		Comment:                    r.Comment,
 		ExecuteAs:                  r.ExecuteAs,
 		ProcedureDefinition:        r.ProcedureDefinition,
@@ -216,7 +237,12 @@ func (r *CreateForPythonProcedureRequest) toOpts() *CreateForPythonProcedureOpti
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -234,7 +260,11 @@ func (r *CreateForPythonProcedureRequest) toOpts() *CreateForPythonProcedureOpti
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -242,14 +272,18 @@ func (r *CreateForPythonProcedureRequest) toOpts() *CreateForPythonProcedureOpti
 	if r.Packages != nil {
 		s := make([]ProcedurePackage, len(r.Packages))
 		for i, v := range r.Packages {
-			s[i] = ProcedurePackage(v)
+			s[i] = ProcedurePackage{
+				Package: v.Package,
+			}
 		}
 		opts.Packages = s
 	}
 	if r.Imports != nil {
 		s := make([]ProcedureImport, len(r.Imports))
 		for i, v := range r.Imports {
-			s[i] = ProcedureImport(v)
+			s[i] = ProcedureImport{
+				Import: v.Import,
+			}
 		}
 		opts.Imports = s
 	}
@@ -266,17 +300,25 @@ func (r *CreateForScalaProcedureRequest) toOpts() *CreateForScalaProcedureOption
 		RuntimeVersion: r.RuntimeVersion,
-		Handler:             r.Handler,
-		TargetPath:          r.TargetPath,
-		NullInputBehavior:   r.NullInputBehavior,
-		Comment:             r.Comment,
-		ExecuteAs:           r.ExecuteAs,
-		ProcedureDefinition: r.ProcedureDefinition,
+		Handler:                    r.Handler,
+		ExternalAccessIntegrations: r.ExternalAccessIntegrations,
+		Secrets:                    r.Secrets,
+		TargetPath:                 r.TargetPath,
+		NullInputBehavior:          r.NullInputBehavior,
+		ReturnResultsBehavior:      r.ReturnResultsBehavior,
+		Comment:                    r.Comment,
+		ExecuteAs:                  r.ExecuteAs,
+		ProcedureDefinition:        r.ProcedureDefinition,
 	}
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -294,7 +336,11 @@ func (r *CreateForScalaProcedureRequest) toOpts() *CreateForScalaProcedureOption
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -302,14 +348,18 @@ func (r *CreateForScalaProcedureRequest) toOpts() *CreateForScalaProcedureOption
 	if r.Packages != nil {
 		s := make([]ProcedurePackage, len(r.Packages))
 		for i, v := range r.Packages {
-			s[i] = ProcedurePackage(v)
+			s[i] = ProcedurePackage{
+				Package: v.Package,
+			}
 		}
 		opts.Packages = s
 	}
 	if r.Imports != nil {
 		s := make([]ProcedureImport, len(r.Imports))
 		for i, v := range r.Imports {
-			s[i] = ProcedureImport(v)
+			s[i] = ProcedureImport{
+				Import: v.Import,
+			}
 		}
 		opts.Imports = s
 	}
@@ -324,15 +374,21 @@ func (r *CreateForSQLProcedureRequest) toOpts() *CreateForSQLProcedureOptions {
 		CopyGrants: r.CopyGrants,
-		NullInputBehavior:   r.NullInputBehavior,
-		Comment:             r.Comment,
-		ExecuteAs:           r.ExecuteAs,
-		ProcedureDefinition: r.ProcedureDefinition,
+		NullInputBehavior:     r.NullInputBehavior,
+		ReturnResultsBehavior: r.ReturnResultsBehavior,
+		Comment:               r.Comment,
+		ExecuteAs:             r.ExecuteAs,
+		ProcedureDefinition:   r.ProcedureDefinition,
 	}
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -343,6 +399,8 @@ func (r *CreateForSQLProcedureRequest) toOpts() *CreateForSQLProcedureOptions {
 		opts.Returns.ResultDataType = &ProcedureReturnsResultDataType{
 			ResultDataTypeOld: r.Returns.ResultDataType.ResultDataTypeOld,
 			ResultDataType:    r.Returns.ResultDataType.ResultDataType,
+			Null:              r.Returns.ResultDataType.Null,
+			NotNull:           r.Returns.ResultDataType.NotNull,
 		}
 	}
 	if r.Returns.Table != nil {
@@ -350,7 +408,11 @@ func (r *CreateForSQLProcedureRequest) toOpts() *CreateForSQLProcedureOptions {
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -360,16 +422,41 @@
 
 func (r *AlterProcedureRequest) toOpts() *AlterProcedureOptions {
 	opts := &AlterProcedureOptions{
-		IfExists:      r.IfExists,
-		name:          r.name,
-		RenameTo:      r.RenameTo,
-		SetComment:    r.SetComment,
-		SetLogLevel:   r.SetLogLevel,
-		SetTraceLevel: r.SetTraceLevel,
-		UnsetComment:  r.UnsetComment,
-		SetTags:       r.SetTags,
-		UnsetTags:     r.UnsetTags,
-		ExecuteAs:     r.ExecuteAs,
+		IfExists: r.IfExists,
+		name:     r.name,
+		RenameTo: r.RenameTo,
+
+		SetTags:   r.SetTags,
+		UnsetTags: r.UnsetTags,
+		ExecuteAs: r.ExecuteAs,
+	}
+	if r.Set != nil {
+		opts.Set = &ProcedureSet{
+			Comment:                    r.Set.Comment,
+			ExternalAccessIntegrations: r.Set.ExternalAccessIntegrations,
+
+			AutoEventLogging:    r.Set.AutoEventLogging,
+			EnableConsoleOutput: r.Set.EnableConsoleOutput,
+			LogLevel:            r.Set.LogLevel,
+			MetricLevel:         r.Set.MetricLevel,
+			TraceLevel:          r.Set.TraceLevel,
+		}
+		if r.Set.SecretsList != nil {
+			opts.Set.SecretsList = &SecretsList{
+				SecretsList: r.Set.SecretsList.SecretsList,
+			}
+		}
+	}
+	if r.Unset != nil {
+		opts.Unset = &ProcedureUnset{
+			Comment:                    r.Unset.Comment,
+			ExternalAccessIntegrations: r.Unset.ExternalAccessIntegrations,
+			AutoEventLogging:           r.Unset.AutoEventLogging,
+			EnableConsoleOutput:        r.Unset.EnableConsoleOutput,
+			LogLevel:                   r.Unset.LogLevel,
+			MetricLevel:                r.Unset.MetricLevel,
+			TraceLevel:                 r.Unset.TraceLevel,
+		}
 	}
 	return opts
 }
@@ -394,7 +481,7 @@ func (r procedureRow) convert() *Procedure {
 	e := &Procedure{
 		CreatedOn: r.CreatedOn,
 		Name:      r.Name,
-		SchemaName: r.SchemaName,
+		SchemaName: strings.Trim(r.SchemaName, `"`),
 		IsBuiltin:   r.IsBuiltin == "Y",
 		IsAggregate: r.IsAggregate == "Y",
 		IsAnsi:      r.IsAnsi == "Y",
@@ -402,7 +489,7 @@ func (r procedureRow) convert() *Procedure {
 		MaxNumArguments: r.MaxNumArguments,
 		ArgumentsRaw:    r.Arguments,
 		Description:     r.Description,
-		CatalogName: r.CatalogName,
+		CatalogName: strings.Trim(r.CatalogName, `"`),
 		IsTableFunction:    r.IsTableFunction == "Y",
 		ValidForClustering: r.ValidForClustering == "Y",
 	}
@@ -431,8 +518,8 @@ func (r procedureDetailRow) convert() *ProcedureDetail {
 	e := &ProcedureDetail{
 		Property: r.Property,
 	}
-	if r.Value.Valid {
-		e.Value = r.Value.String
+	if r.Value.Valid && r.Value.String != "null" {
+		e.Value = String(r.Value.String)
 	}
 	return e
 }
@@ -464,7 +551,12 @@ func (r *CreateAndCallForJavaProcedureRequest) toOpts() *CreateAndCallForJavaPro
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -482,7 +574,11 @@ func (r *CreateAndCallForJavaProcedureRequest) toOpts() *CreateAndCallForJavaPro
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -490,14 +586,18 @@ func (r *CreateAndCallForJavaProcedureRequest) toOpts() *CreateAndCallForJavaPro
 	if r.Packages != nil {
 		s := make([]ProcedurePackage, len(r.Packages))
 		for i, v := range r.Packages {
-			s[i] = ProcedurePackage(v)
+			s[i] = ProcedurePackage{
+				Package: v.Package,
+			}
 		}
 		opts.Packages = s
 	}
 	if r.Imports != nil {
 		s := make([]ProcedureImport, len(r.Imports))
 		for i, v := range r.Imports {
-			s[i] = ProcedureImport(v)
+			s[i] = ProcedureImport{
+				Import: v.Import,
+			}
 		}
 		opts.Imports = s
 	}
@@ -528,7 +628,12 @@ func (r *CreateAndCallForScalaProcedureRequest) toOpts() *CreateAndCallForScalaP
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -546,7 +651,11 @@ func (r *CreateAndCallForScalaProcedureRequest) toOpts() *CreateAndCallForScalaP
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -554,14 +663,18 @@ func (r *CreateAndCallForScalaProcedureRequest) toOpts() *CreateAndCallForScalaP
 	if r.Packages != nil {
 		s := make([]ProcedurePackage, len(r.Packages))
 		for i, v := range r.Packages {
-			s[i] = ProcedurePackage(v)
+			s[i] = ProcedurePackage{
+				Package: v.Package,
+			}
 		}
 		opts.Packages = s
 	}
 	if r.Imports != nil {
 		s := make([]ProcedureImport, len(r.Imports))
 		for i, v := range r.Imports {
-			s[i] = ProcedureImport(v)
+			s[i] = ProcedureImport{
+				Import: v.Import,
+			}
 		}
 		opts.Imports = s
 	}
@@ -596,7 +709,12 @@ func (r *CreateAndCallForJavaScriptProcedureRequest) toOpts() *CreateAndCallForJ
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -631,7 +749,12 @@ func (r *CreateAndCallForPythonProcedureRequest) toOpts() *CreateAndCallForPytho
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -649,7 +772,11 @@ func (r *CreateAndCallForPythonProcedureRequest) toOpts() *CreateAndCallForPytho
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
@@ -657,14 +784,18 @@ func (r *CreateAndCallForPythonProcedureRequest) toOpts() *CreateAndCallForPytho
 	if r.Packages != nil {
 		s := make([]ProcedurePackage, len(r.Packages))
 		for i, v := range r.Packages {
-			s[i] = ProcedurePackage(v)
+			s[i] = ProcedurePackage{
+				Package: v.Package,
+			}
 		}
 		opts.Packages = s
 	}
 	if r.Imports != nil {
 		s := make([]ProcedureImport, len(r.Imports))
 		for i, v := range r.Imports {
-			s[i] = ProcedureImport(v)
+			s[i] = ProcedureImport{
+				Import: v.Import,
+			}
 		}
 		opts.Imports = s
 	}
@@ -696,7 +827,12 @@ func (r *CreateAndCallForSQLProcedureRequest) toOpts() *CreateAndCallForSQLProce
 	if r.Arguments != nil {
 		s := make([]ProcedureArgument, len(r.Arguments))
 		for i, v := range r.Arguments {
-			s[i] = ProcedureArgument(v)
+			s[i] = ProcedureArgument{
+				ArgName:        v.ArgName,
+				ArgDataTypeOld: v.ArgDataTypeOld,
+				ArgDataType:    v.ArgDataType,
+				DefaultValue:   v.DefaultValue,
+			}
 		}
 		opts.Arguments = s
 	}
@@ -714,7 +850,11 @@ func (r *CreateAndCallForSQLProcedureRequest) toOpts() *CreateAndCallForSQLProce
 		if r.Returns.Table.Columns != nil {
 			s := make([]ProcedureColumn, len(r.Returns.Table.Columns))
 			for i, v := range r.Returns.Table.Columns {
-				s[i] = ProcedureColumn(v)
+				s[i] = ProcedureColumn{
+					ColumnName:        v.ColumnName,
+					ColumnDataTypeOld: v.ColumnDataTypeOld,
+					ColumnDataType:    v.ColumnDataType,
+				}
 			}
 			opts.Returns.Table.Columns = s
 		}
diff --git a/pkg/sdk/procedures_validations_gen.go b/pkg/sdk/procedures_validations_gen.go
index 5e7557176f..8298767264 100644
--- a/pkg/sdk/procedures_validations_gen.go
+++ b/pkg/sdk/procedures_validations_gen.go
@@ -248,8 +248,18 @@ func (opts *AlterProcedureOptions) validate() error {
 	if opts.RenameTo != nil && !ValidObjectIdentifier(opts.RenameTo) {
 		errs = append(errs, ErrInvalidObjectIdentifier)
 	}
-	if !exactlyOneValueSet(opts.RenameTo, opts.SetComment, opts.SetLogLevel, opts.SetTraceLevel, opts.UnsetComment, opts.SetTags, opts.UnsetTags, opts.ExecuteAs) {
-		errs = append(errs, errExactlyOneOf("AlterProcedureOptions", "RenameTo", "SetComment", "SetLogLevel", "SetTraceLevel", "UnsetComment", "SetTags", "UnsetTags", "ExecuteAs"))
+	if !exactlyOneValueSet(opts.RenameTo, opts.Set, opts.Unset, opts.SetTags, opts.UnsetTags, opts.ExecuteAs) {
+		errs = append(errs, errExactlyOneOf("AlterProcedureOptions", "RenameTo", "Set", "Unset", "SetTags", "UnsetTags", "ExecuteAs"))
+	}
+	if valueSet(opts.Set) {
+		if !anyValueSet(opts.Set.Comment, opts.Set.ExternalAccessIntegrations, opts.Set.SecretsList, opts.Set.AutoEventLogging, opts.Set.EnableConsoleOutput, opts.Set.LogLevel, opts.Set.MetricLevel, opts.Set.TraceLevel) {
+			errs = append(errs, errAtLeastOneOf("AlterProcedureOptions.Set", "Comment", "ExternalAccessIntegrations", "SecretsList", "AutoEventLogging", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel"))
+		}
+	}
+	if valueSet(opts.Unset) {
+		if !anyValueSet(opts.Unset.Comment, opts.Unset.ExternalAccessIntegrations, opts.Unset.AutoEventLogging, opts.Unset.EnableConsoleOutput, opts.Unset.LogLevel, opts.Unset.MetricLevel, opts.Unset.TraceLevel) {
+			errs = append(errs, errAtLeastOneOf("AlterProcedureOptions.Unset", "Comment", "ExternalAccessIntegrations", "AutoEventLogging", "EnableConsoleOutput", "LogLevel", "MetricLevel", "TraceLevel"))
+		}
 	}
 	return JoinErrors(errs...)
 }
diff --git a/pkg/sdk/testint/accounts_integration_test.go b/pkg/sdk/testint/accounts_integration_test.go
index c4c864e445..6ed6b1ac5d 100644
--- a/pkg/sdk/testint/accounts_integration_test.go
+++ b/pkg/sdk/testint/accounts_integration_test.go
@@ -37,7 +37,7 @@ func TestInt_Account(t *testing.T) {
 		assert.NotEmpty(t, *account.CreatedOn)
 		assert.Equal(t, "SNOWFLAKE", *account.Comment)
 		assert.NotEmpty(t, account.AccountLocator)
-		assert.NotEmpty(t, *account.AccountLocatorURL)
+		assert.NotEmpty(t, *account.AccountLocatorUrl)
 		assert.Zero(t, *account.ManagedAccounts)
 		assert.NotEmpty(t, *account.ConsumptionBillingEntityName)
 		assert.Nil(t, account.MarketplaceConsumerBillingEntityName)
@@ -65,7 +65,7 @@ func TestInt_Account(t *testing.T) {
 		assert.Nil(t, account.AccountURL)
 		assert.Nil(t, account.CreatedOn)
 		assert.Nil(t, account.Comment)
-		assert.Nil(t, account.AccountLocatorURL)
+		assert.Nil(t, account.AccountLocatorUrl)
 		assert.Nil(t, account.ManagedAccounts)
 		assert.Nil(t, account.ConsumptionBillingEntityName)
 		assert.Nil(t, account.MarketplaceConsumerBillingEntityName)
@@ -92,13 +92,27 @@ func TestInt_Account(t *testing.T) {
 		assert.Nil(t, account.OrganizationUrlExpirationOn)
 	}
 
+	assertCreateResponse := func(t *testing.T, response *sdk.AccountCreateResponse, account sdk.Account) {
+		t.Helper()
+		require.NotNil(t, response)
+		assert.Equal(t, account.AccountLocator, response.AccountLocator)
+		assert.Equal(t, *account.AccountLocatorUrl, response.AccountLocatorUrl)
+		assert.Equal(t, account.AccountName, response.AccountName)
+		assert.Equal(t, *account.AccountURL, response.Url)
+		assert.Equal(t, account.OrganizationName, response.OrganizationName)
+		assert.Equal(t, *account.Edition, response.Edition)
+		assert.NotEmpty(t, response.RegionGroup)
+		assert.NotEmpty(t, response.Cloud)
+		assert.NotEmpty(t, response.Region)
+	}
+
 	t.Run("create: minimal", func(t *testing.T) {
 		id := testClientHelper().Ids.RandomAccountObjectIdentifier()
 		name := testClientHelper().Ids.Alpha()
 		password := random.Password()
 		email := random.Email()
 
-		err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
+		createResponse, err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
 			AdminName:     name,
 			AdminPassword: sdk.String(password),
 			Email:         email,
@@ -110,6 +124,7 @@ func TestInt_Account(t *testing.T) {
 		acc, err := client.Accounts.ShowByID(ctx, id)
 		require.NoError(t, err)
 		require.Equal(t, id, acc.ID())
+		assertCreateResponse(t, createResponse, *acc)
 	})
 
 	t.Run("create: user type service", func(t *testing.T) {
@@ -118,7 +133,7 @@ func TestInt_Account(t *testing.T) {
 		key, _ := random.GenerateRSAPublicKey(t)
 		email := random.Email()
 
-		err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
+		createResponse, err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
 			AdminName:         name,
 			AdminRSAPublicKey: sdk.String(key),
 			AdminUserType:     sdk.Pointer(sdk.UserTypeService),
@@ -131,6 +146,7 @@ func TestInt_Account(t *testing.T) {
 		acc, err := client.Accounts.ShowByID(ctx, id)
 		require.NoError(t, err)
 		require.Equal(t, id, acc.ID())
+		assertCreateResponse(t, createResponse, *acc)
 	})
 
 	t.Run("create: user type legacy service", func(t *testing.T) {
@@ -139,7 +155,7 @@ func TestInt_Account(t *testing.T) {
 		password := random.Password()
 		email := random.Email()
 
-		err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
+		createResponse, err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
 			AdminName:     name,
 			AdminPassword: sdk.String(password),
 			AdminUserType: sdk.Pointer(sdk.UserTypeLegacyService),
@@ -152,6 +168,7 @@ func TestInt_Account(t *testing.T) {
 		acc, err := client.Accounts.ShowByID(ctx, id)
 		require.NoError(t, err)
 		require.Equal(t, id, acc.ID())
+		assertCreateResponse(t, createResponse, *acc)
 	})
 
 	t.Run("create: complete", func(t *testing.T) {
@@ -167,7 +184,7 @@ func TestInt_Account(t *testing.T) {
 		require.NoError(t, err)
 		comment := random.Comment()
 
-		err = client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
+		createResponse, err := client.Accounts.Create(ctx, id, &sdk.CreateAccountOptions{
 			AdminName:     name,
 			AdminPassword: sdk.String(password),
 			FirstName:     sdk.String("firstName"),
@@ -187,6 +204,7 @@ func TestInt_Account(t *testing.T) {
 		acc, err := client.Accounts.ShowByID(ctx, id)
 		require.NoError(t, err)
 		require.Equal(t, id, acc.ID())
+		assertCreateResponse(t, createResponse, *acc)
 	})
 
 	t.Run("alter: set / unset is org admin", func(t *testing.T) {
diff --git a/pkg/sdk/testint/client_integration_test.go b/pkg/sdk/testint/client_integration_test.go
index 2522155196..cbf8a52391 100644
--- a/pkg/sdk/testint/client_integration_test.go
+++ b/pkg/sdk/testint/client_integration_test.go
@@ -4,6 +4,7 @@ import (
 	"context"
 	"database/sql"
 	"os"
+	"strings"
 	"testing"
 
 	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testprofiles"
@@ -54,7 +55,9 @@ func TestInt_Client_NewClient(t *testing.T) {
 		require.NotNil(t, config)
 
 		account := config.Account
-		t.Setenv(snowflakeenvs.Account, account)
+		parts := strings.Split(account, "-")
+		t.Setenv(snowflakeenvs.OrganizationName, parts[0])
+		t.Setenv(snowflakeenvs.AccountName, parts[1])
 
 		dir, err := os.UserHomeDir()
 		require.NoError(t, err)
diff --git a/pkg/sdk/testint/functions_integration_test.go b/pkg/sdk/testint/functions_integration_test.go
index 5c19d66af4..bb292cd627 100644
--- a/pkg/sdk/testint/functions_integration_test.go
+++ b/pkg/sdk/testint/functions_integration_test.go
@@ -2,11 +2,17 @@ package testint
 
 import (
 	"context"
-	"errors"
 	"fmt"
+	"strings"
 	"testing"
 	"time"
 
+	assertions "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert"
+
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert/objectassert"
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert/objectparametersassert"
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random"
+	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testdatatypes"
 	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections"
 	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk"
 	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes"
@@ -14,264 +20,1359 @@ import (
 	"github.com/stretchr/testify/require"
 )
 
-/*
-todo: add tests for:
-  - creating functions with different languages (java, javascript, python, scala, sql) from stages using [ TARGET_PATH = '' ]
-  - execute and execute-immediate for scripting https://docs.snowflake.com/en/sql-reference/sql/execute-immediate
-*/
-
-func TestInt_CreateFunctions(t *testing.T) {
+// TODO [SNOW-1348103]: schemaName and catalog name are quoted (because we use lowercase)
+// TODO [SNOW-1850370]: HasArgumentsRawFrom(functionId, arguments, return)
+// TODO [SNOW-1850370]: extract show assertions with commons fields
+// TODO [SNOW-1850370]: test confirming that runtime version is required for Scala function
+// TODO [SNOW-1348103 or SNOW-1850370]: test create or replace with name change, args change
+// TODO [SNOW-1348103]: test rename more (arg stays, can't change arg, rename to different schema)
+// TODO [SNOW-1348103]: test weird names for arg name - lower/upper if used with double quotes, to upper without quotes, dots, spaces, and both quotes not permitted
+// TODO [SNOW-1850370]: add test documenting that UNSET SECRETS does not work
+// TODO [SNOW-1850370]: add test documenting [JAVA]: 391516 (42601): SQL compilation error: Cannot specify TARGET_PATH without a function BODY.
+// TODO [SNOW-1348103 or SNOW-1850370]: test secure
+// TODO [SNOW-1348103]: python aggregate func (100357 (P0000): Could not find accumulate method in function CVVEMHIT_06547800_08D6_DBCA_1AC7_5E422AFF8B39 with handler dump)
+// TODO [SNOW-1348103]: add a test documenting that we can't set parameters in create (and revert adding these parameters directly in object...)
+// TODO [SNOW-1850370]: active warehouse vs validations
+// TODO [SNOW-1348103]: add a test documenting STRICT behavior
+func TestInt_Functions(t *testing.T) {
 	client := testClient(t)
 	ctx := context.Background()
+	secretId := testClientHelper().Ids.RandomSchemaObjectIdentifier()
 
-	cleanupFunctionHandle := func(id sdk.SchemaObjectIdentifierWithArguments) func() {
-		return func() {
-			err := client.Functions.Drop(ctx, sdk.NewDropFunctionRequest(id))
-			if errors.Is(err, sdk.ErrObjectNotExistOrAuthorized) {
-				return
-			}
-			require.NoError(t, err)
-		}
+	networkRule, networkRuleCleanup := testClientHelper().NetworkRule.Create(t)
+	t.Cleanup(networkRuleCleanup)
+
+	secret, secretCleanup := testClientHelper().Secret.CreateWithGenericString(t, secretId, "test_secret_string")
+	t.Cleanup(secretCleanup)
+
+	externalAccessIntegration, externalAccessIntegrationCleanup := testClientHelper().ExternalAccessIntegration.CreateExternalAccessIntegrationWithNetworkRuleAndSecret(t, networkRule.ID(), secret.ID())
+	t.Cleanup(externalAccessIntegrationCleanup)
+
+	tmpJavaFunction := testClientHelper().CreateSampleJavaFunctionAndJar(t)
+	tmpPythonFunction := testClientHelper().CreateSamplePythonFunctionAndModule(t)
+
+	assertParametersSet := func(t *testing.T, functionParametersAssert *objectparametersassert.FunctionParametersAssert) {
+		t.Helper()
+		assertions.AssertThatObject(t, functionParametersAssert.
+			HasEnableConsoleOutput(true).
+			HasLogLevel(sdk.LogLevelWarn).
+			HasMetricLevel(sdk.MetricLevelAll).
+			HasTraceLevel(sdk.TraceLevelAlways),
+		)
 	}
 
-	t.Run("create function for Java", func(t *testing.T) {
-		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeVARCHAR)
+	t.Run("create function for Java - inline minimal", func(t *testing.T) {
+		className := "TestFunc"
+		funcName := "echoVarchar"
+		argName := "x"
+		dataType := testdatatypes.DataTypeVarchar_100
 
-		definition := `
-	class TestFunc {
-		public static String echoVarchar(String x) {
-			return x;
-		}
-	}`
-		target := fmt.Sprintf("@~/tf-%d.jar", time.Now().Unix())
-		dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR)
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
+		handler := fmt.Sprintf("%s.%s", className, funcName)
+		definition := testClientHelper().Function.SampleJavaDefinition(t, className, funcName, argName)
+
+		request := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, handler).
+			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithFunctionDefinitionWrapped(definition)
+
+		err := client.Functions.CreateForJava(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultFunctionComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("JAVA").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("JAVA").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(`[]`).
+			HasHandler(handler).
+			HasRuntimeVersionNil().
+			HasPackages(`[]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create function for Java - inline full", func(t *testing.T) {
+		className := "TestFunc"
+		funcName := "echoVarchar"
+		argName := "x"
+		dataType := testdatatypes.DataTypeVarchar_100
+
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
 		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
-		argument := sdk.NewFunctionArgumentRequest("x", nil).WithDefaultValue("'abc'").WithArgDataTypeOld(sdk.DataTypeVARCHAR)
-		request := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, "TestFunc.echoVarchar").
+		handler := fmt.Sprintf("%s.%s", className, funcName)
+		definition := testClientHelper().Function.SampleJavaDefinition(t, className, funcName, argName)
+		jarName := fmt.Sprintf("tf-%d-%s.jar", time.Now().Unix(), random.AlphaN(5))
+		targetPath := fmt.Sprintf("@~/%s", jarName)
+
+		request := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, handler).
 			WithOrReplace(true).
 			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
-			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorCalledOnNullInput)).
-			WithTargetPath(target).
-			WithFunctionDefinition(definition)
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithReturnNullValues(sdk.ReturnNullValuesNotNull).
+			WithRuntimeVersion("11").
+			WithComment("comment").
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpJavaFunction.JarLocation())}).
+			WithPackages([]sdk.FunctionPackageRequest{
+				*sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:snowpark:1.14.0"),
+				*sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:telemetry:0.1.0"),
+			}).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithTargetPath(targetPath).
+			WithEnableConsoleOutput(true).
+			WithLogLevel(sdk.LogLevelWarn).
+			WithMetricLevel(sdk.MetricLevelAll).
+			WithTraceLevel(sdk.TraceLevelAlways).
+			WithFunctionDefinitionWrapped(definition)
+
 		err := client.Functions.CreateForJava(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupFunctionHandle(id))
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+		t.Cleanup(testClientHelper().Stage.RemoveFromUserStageFunc(t, jarName))
 
 		function, err := client.Functions.ShowByID(ctx, id)
 		require.NoError(t, err)
-		require.Equal(t, id.Name(), function.Name)
-		require.Equal(t, "JAVA", function.Language)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasIsExternalFunction(false).
+			HasLanguage("JAVA").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("JAVA").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			// TODO [SNOW-1348103]: parse to identifier list
+			// TODO [SNOW-1348103]: check multiple secrets (to know how to parse)
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpJavaFunction.JarLocation())).
+			HasHandler(handler).
+			HasRuntimeVersion("11").
+			HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`).
+			HasTargetPath(targetPath).
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
 	})

-	t.Run("create function for Javascript", func(t *testing.T) {
-		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeFloat)
+	t.Run("create function for Java - staged minimal", func(t *testing.T) {
+		dataType := tmpJavaFunction.ArgType
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
-		definition := `
-	if (D <= 0) {
-		return 1;
-	} else {
-		var result = 1;
-		for (var i = 2; i <= D; i++) {
-			result = result * i;
-		}
-		return result;
-	}`
+		argName := "x"
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
+		handler := tmpJavaFunction.JavaHandler()
+		importPath := tmpJavaFunction.JarLocation()
-		dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeFloat)
+		requestStaged := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, handler).
+			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(importPath)})
+
+		err := client.Functions.CreateForJava(ctx, requestStaged)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultFunctionComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("JAVA").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("JAVA").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(fmt.Sprintf(`[%s]`, importPath)).
+			HasHandler(handler).
+			HasRuntimeVersionNil().
+			HasPackages(`[]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create function for Java - staged full", func(t *testing.T) {
+		dataType := tmpJavaFunction.ArgType
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "x"
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
 		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
-		argument := sdk.NewFunctionArgumentRequest("d", nil).WithArgDataTypeOld(sdk.DataTypeFloat)
-		request := sdk.NewCreateForJavascriptFunctionRequest(id.SchemaObjectId(), *returns, definition).
+		handler := tmpJavaFunction.JavaHandler()
+
+		requestStaged := sdk.NewCreateForJavaFunctionRequest(id.SchemaObjectId(), *returns, handler).
 			WithOrReplace(true).
 			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
-			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorCalledOnNullInput))
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithReturnNullValues(sdk.ReturnNullValuesNotNull).
+			WithRuntimeVersion("11").
+			WithComment("comment").
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpJavaFunction.JarLocation())}).
+			WithPackages([]sdk.FunctionPackageRequest{
+				*sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:snowpark:1.14.0"),
+				*sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:telemetry:0.1.0"),
+			}).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}})
+
+		err := client.Functions.CreateForJava(ctx, requestStaged)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasIsExternalFunction(false).
+			HasLanguage("JAVA").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("JAVA").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpJavaFunction.JarLocation())).
+			HasHandler(handler).
+			HasRuntimeVersion("11").
+			HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create function for Javascript - inline minimal", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeFloat
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "d"
+		definition := testClientHelper().Function.SampleJavascriptDefinition(t, argName)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+
+		request := sdk.NewCreateForJavascriptFunctionRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition).
+			WithArguments([]sdk.FunctionArgumentRequest{*argument})
+
 		err := client.Functions.CreateForJavascript(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupFunctionHandle(id))
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))

 		function, err := client.Functions.ShowByID(ctx, id)
 		require.NoError(t, err)
-		require.Equal(t, id.Name(), function.Name)
-		require.Equal(t, "JAVASCRIPT", function.Language)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultFunctionComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("JAVASCRIPT").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("JAVASCRIPT").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImportsNil().
+			HasHandlerNil().
+			HasRuntimeVersionNil().
+			HasPackagesNil().
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
 	})

-	t.Run("create function for Python", func(t *testing.T) {
-		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeNumber)
+	t.Run("create function for Javascript - inline full", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeFloat
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
-		definition := `
-def dump(i):
-	print("Hello World!")`
-		dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVariant)
+		argName := "d"
+		definition := testClientHelper().Function.SampleJavascriptDefinition(t, argName)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
 		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
-		argument := sdk.NewFunctionArgumentRequest("i", nil).WithArgDataTypeOld(sdk.DataTypeNumber)
-		request := sdk.NewCreateForPythonFunctionRequest(id.SchemaObjectId(), *returns, "3.8", "dump").
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForJavascriptFunctionRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition).
 			WithOrReplace(true).
 			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
-			WithFunctionDefinition(definition)
-		err := client.Functions.CreateForPython(ctx, request)
+			WithCopyGrants(true).
+			WithReturnNullValues(sdk.ReturnNullValuesNotNull).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithComment("comment")
+
+		err := client.Functions.CreateForJavascript(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupFunctionHandle(id))
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))

 		function, err := client.Functions.ShowByID(ctx, id)
 		require.NoError(t, err)
-		require.Equal(t, id.Name(), function.Name)
-		require.Equal(t, "JAVASCRIPT", function.Language)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("JAVASCRIPT").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("JAVASCRIPT").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImportsNil().
+			HasHandlerNil().
+			HasRuntimeVersionNil().
+			HasPackagesNil().
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
 	})

-	t.Run("create function for Scala", func(t *testing.T) {
-		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeVARCHAR)
+	t.Run("create function for Python - inline minimal", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeNumber_36_2
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
-		definition := `
-	class Echo {
-		def echoVarchar(x : String): String = {
-			return x
-		}
-	}`
+		argName := "i"
+		funcName := "dump"
+		definition := testClientHelper().Function.SamplePythonDefinition(t, funcName, argName)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForPythonFunctionRequest(id.SchemaObjectId(), *returns, "3.8", funcName).
+			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithFunctionDefinitionWrapped(definition)
-		argument := sdk.NewFunctionArgumentRequest("x", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR)
-		request := sdk.NewCreateForScalaFunctionRequest(id.SchemaObjectId(), nil, "Echo.echoVarchar").
-			WithResultDataTypeOld(sdk.DataTypeVARCHAR).
+		err := client.Functions.CreateForPython(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultFunctionComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("PYTHON").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")). // TODO [SNOW-1348103]: do we care about this whitespace?
+			HasLanguage("PYTHON").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(`[]`).
+			HasHandler(funcName).
+			HasRuntimeVersion("3.8").
+			HasPackages(`[]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasIsAggregate(false),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create function for Python - inline full", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeNumber_36_2
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "i"
+		funcName := "dump"
+		definition := testClientHelper().Function.SamplePythonDefinition(t, funcName, argName)
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForPythonFunctionRequest(id.SchemaObjectId(), *returns, "3.8", funcName).
 			WithOrReplace(true).
 			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
-			WithRuntimeVersion("2.12").
-			WithFunctionDefinition(definition)
-		err := client.Functions.CreateForScala(ctx, request)
+			WithCopyGrants(true).
+			WithReturnNullValues(sdk.ReturnNullValuesNotNull).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithComment("comment").
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpPythonFunction.PythonModuleLocation())}).
+			WithPackages([]sdk.FunctionPackageRequest{
+				*sdk.NewFunctionPackageRequest().WithPackage("absl-py==0.10.0"),
+				*sdk.NewFunctionPackageRequest().WithPackage("about-time==4.2.1"),
+			}).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithFunctionDefinitionWrapped(definition)
+
+		err := client.Functions.CreateForPython(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupFunctionHandle(id))
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))

 		function, err := client.Functions.ShowByID(ctx, id)
 		require.NoError(t, err)
-		require.Equal(t, id.Name(), function.Name)
-		require.Equal(t, "SCALA", function.Language)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasIsExternalFunction(false).
+			HasLanguage("PYTHON").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")+" NOT NULL"). // TODO [SNOW-1348103]: do we care about this whitespace?
+			HasLanguage("PYTHON").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpPythonFunction.PythonModuleLocation())).
+			HasHandler(funcName).
+			HasRuntimeVersion("3.8").
+			HasPackages(`['absl-py==0.10.0','about-time==4.2.1']`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasIsAggregate(false),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
 	})

-	t.Run("create function for SQL", func(t *testing.T) {
-		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeFloat)
+	t.Run("create function for Python - staged minimal", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeVarchar_100
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "i"
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForPythonFunctionRequest(id.SchemaObjectId(), *returns, "3.8", tmpPythonFunction.PythonHandler()).
+			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpPythonFunction.PythonModuleLocation())})
+
+		err := client.Functions.CreateForPython(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultFunctionComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("PYTHON").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")).
+			HasLanguage("PYTHON").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(fmt.Sprintf(`[%s]`, tmpPythonFunction.PythonModuleLocation())).
+			HasHandler(tmpPythonFunction.PythonHandler()).
+			HasRuntimeVersion("3.8").
+			HasPackages(`[]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasIsAggregate(false),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
-		definition := "3.141592654::FLOAT"
+	t.Run("create function for Python - staged full", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeVarchar_100
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
-		dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeFloat)
+		argName := "i"
+		dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType)
 		returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt)
-		argument := sdk.NewFunctionArgumentRequest("x", nil).WithArgDataTypeOld(sdk.DataTypeFloat)
-		request := sdk.NewCreateForSQLFunctionRequest(id.SchemaObjectId(), *returns, definition).
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForPythonFunctionRequest(id.SchemaObjectId(), *returns, "3.8", tmpPythonFunction.PythonHandler()).
+			WithOrReplace(true).
+			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithCopyGrants(true).
+			WithReturnNullValues(sdk.ReturnNullValuesNotNull).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithComment("comment").
+			WithPackages([]sdk.FunctionPackageRequest{
+				*sdk.NewFunctionPackageRequest().WithPackage("absl-py==0.10.0"),
+				*sdk.NewFunctionPackageRequest().WithPackage("about-time==4.2.1"),
+			}).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpPythonFunction.PythonModuleLocation())})
+
+		err := client.Functions.CreateForPython(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasIsExternalFunction(false).
+			HasLanguage("PYTHON").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")+" NOT NULL").
+			HasLanguage("PYTHON").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpPythonFunction.PythonModuleLocation())).
+			HasHandler(tmpPythonFunction.PythonHandler()).
+			HasRuntimeVersion("3.8").
+			HasPackages(`['absl-py==0.10.0','about-time==4.2.1']`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasIsAggregate(false),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create function for Scala - inline minimal", func(t *testing.T) {
+		className := "TestFunc"
+		funcName := "echoVarchar"
+		argName := "x"
+		dataType := testdatatypes.DataTypeVarchar_100
+
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		definition := testClientHelper().Function.SampleScalaDefinition(t, className, funcName, argName)
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		handler := fmt.Sprintf("%s.%s", className, funcName)
+		request := sdk.NewCreateForScalaFunctionRequest(id.SchemaObjectId(), dataType, handler, "2.12").
 			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithFunctionDefinitionWrapped(definition)
+
+		err := client.Functions.CreateForScala(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+
+		function, err := client.Functions.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultFunctionComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrations("").
+			HasSecrets("").
+			HasIsExternalFunction(false).
+			HasLanguage("SCALA").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("SCALA").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(`[]`).
+			HasHandler(handler).
+			HasRuntimeVersion("2.12").
+			HasPackages(`[]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasIsAggregateNil(),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create function for Scala - inline full", func(t *testing.T) {
+		className := "TestFunc"
+		funcName := "echoVarchar"
+		argName := "x"
+		dataType := testdatatypes.DataTypeVarchar_100
+
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		definition := testClientHelper().Function.SampleScalaDefinition(t, className, funcName, argName)
+		argument := sdk.NewFunctionArgumentRequest(argName, dataType)
+		handler := fmt.Sprintf("%s.%s", className, funcName)
+		jarName := fmt.Sprintf("tf-%d-%s.jar", time.Now().Unix(), random.AlphaN(5))
+		targetPath := fmt.Sprintf("@~/%s", jarName)
+		request := sdk.NewCreateForScalaFunctionRequest(id.SchemaObjectId(), dataType, handler, "2.12").
 			WithOrReplace(true).
-			WithComment("comment")
-		err := client.Functions.CreateForSQL(ctx, request)
+			WithArguments([]sdk.FunctionArgumentRequest{*argument}).
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithReturnNullValues(sdk.ReturnNullValuesNotNull).
+			WithComment("comment").
+			WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpJavaFunction.JarLocation())}).
+			WithPackages([]sdk.FunctionPackageRequest{
+				*sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:snowpark:1.14.0"),
+				*sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:telemetry:0.1.0"),
+			}).
+			WithTargetPath(targetPath).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithEnableConsoleOutput(true).
+			WithLogLevel(sdk.LogLevelWarn).
+			WithMetricLevel(sdk.MetricLevelAll).
+			WithTraceLevel(sdk.TraceLevelAlways).
+			WithFunctionDefinitionWrapped(definition)
+
+		err := client.Functions.CreateForScala(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupFunctionHandle(id))
+		t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id))
+		t.Cleanup(testClientHelper().Stage.RemoveFromUserStageFunc(t, jarName))

 		function, err := client.Functions.ShowByID(ctx, id)
 		require.NoError(t, err)
-		require.Equal(t, id.Name(), function.Name)
-		require.Equal(t, "SQL", function.Language)
+
+		assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasIsExternalFunction(false).
+			HasLanguage("SCALA").
+			HasIsMemoizable(false).
+			HasIsDataMetric(false),
+		)
+
+		assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("SCALA").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpJavaFunction.JarLocation())).
+			HasHandler(handler).
+			HasRuntimeVersion("2.12").
+ HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`). + HasTargetPath(targetPath). + HasInstalledPackagesNil(). + HasIsAggregateNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) }) - t.Run("create function for SQL with no arguments", func(t *testing.T) { - id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments() + t.Run("create function for Scala - staged minimal", func(t *testing.T) { + dataType := tmpJavaFunction.ArgType + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) - definition := "3.141592654::FLOAT" + argName := "x" + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + handler := tmpJavaFunction.JavaHandler() + importPath := tmpJavaFunction.JarLocation() - dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeFloat) - returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) - request := sdk.NewCreateForSQLFunctionRequest(id.SchemaObjectId(), *returns, definition). + requestStaged := sdk.NewCreateForScalaFunctionRequest(id.SchemaObjectId(), dataType, handler, "2.12"). + WithArguments([]sdk.FunctionArgumentRequest{*argument}). + WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(importPath)}) + + err := client.Functions.CreateForScala(ctx, requestStaged) + require.NoError(t, err) + t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id)) + + function, err := client.Functions.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). 
+ HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultFunctionComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrations(""). + HasSecrets(""). + HasIsExternalFunction(false). + HasLanguage("SCALA"). + HasIsMemoizable(false). + HasIsDataMetric(false), + ) + + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(dataType.ToSql()). + HasLanguage("SCALA"). + HasBodyNil(). + HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImports(fmt.Sprintf(`[%s]`, importPath)). + HasHandler(handler). + HasRuntimeVersion("2.12"). + HasPackages(`[]`). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasIsAggregateNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create function for Scala - staged full", func(t *testing.T) { + dataType := tmpJavaFunction.ArgType + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "x" + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + handler := tmpJavaFunction.JavaHandler() + + requestStaged := sdk.NewCreateForScalaFunctionRequest(id.SchemaObjectId(), dataType, handler, "2.12"). WithOrReplace(true). - WithComment("comment") + WithArguments([]sdk.FunctionArgumentRequest{*argument}). + WithCopyGrants(true). + WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)). + WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable). 
+ WithReturnNullValues(sdk.ReturnNullValuesNotNull). + WithComment("comment"). + WithPackages([]sdk.FunctionPackageRequest{ + *sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:snowpark:1.14.0"), + *sdk.NewFunctionPackageRequest().WithPackage("com.snowflake:telemetry:0.1.0"), + }). + WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}). + WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}). + WithImports([]sdk.FunctionImportRequest{*sdk.NewFunctionImportRequest().WithImport(tmpJavaFunction.JarLocation())}) + + err := client.Functions.CreateForScala(ctx, requestStaged) + require.NoError(t, err) + t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id)) + + function, err := client.Functions.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription("comment"). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasIsExternalFunction(false). + HasExactlyExternalAccessIntegrations(externalAccessIntegration). + HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}). + HasLanguage("SCALA"). + HasIsMemoizable(false). + HasIsDataMetric(false), + ) + + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())). + HasLanguage("SCALA"). + HasBodyNil(). 
+ HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)). + HasExactlyExternalAccessIntegrations(externalAccessIntegration). + HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}). + HasImports(fmt.Sprintf(`[%s]`, tmpJavaFunction.JarLocation())). + HasHandler(handler). + HasRuntimeVersion("2.12"). + HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasIsAggregateNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create function for SQL - inline minimal", func(t *testing.T) { + argName := "x" + dataType := testdatatypes.DataTypeFloat + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + definition := testClientHelper().Function.SampleSqlDefinition(t) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + request := sdk.NewCreateForSQLFunctionRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition). + WithArguments([]sdk.FunctionArgumentRequest{*argument}) + err := client.Functions.CreateForSQL(ctx, request) require.NoError(t, err) - t.Cleanup(cleanupFunctionHandle(id)) + t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id)) function, err := client.Functions.ShowByID(ctx, id) require.NoError(t, err) - require.Equal(t, id.Name(), function.Name) - require.Equal(t, "SQL", function.Language) + + assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). 
+ HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultFunctionComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrations(""). + HasSecrets(""). + HasIsExternalFunction(false). + HasLanguage("SQL"). + HasIsMemoizable(false). + HasIsDataMetric(false), + ) + + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(dataType.ToSql()). + HasLanguage("SQL"). + HasBody(definition). + HasNullHandlingNil(). + HasVolatilityNil(). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImportsNil(). + HasHandlerNil(). + HasRuntimeVersionNil(). + HasPackagesNil(). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasIsAggregateNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) }) -} -func TestInt_OtherFunctions(t *testing.T) { - client := testClient(t) - ctx := testContext(t) + t.Run("create function for SQL - inline full", func(t *testing.T) { + argName := "x" + dataType := testdatatypes.DataTypeFloat + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) - assertFunction := func(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments, secure bool, withArguments bool) { - t.Helper() + definition := testClientHelper().Function.SampleSqlDefinition(t) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) + returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) + argument := sdk.NewFunctionArgumentRequest(argName, dataType) + request := sdk.NewCreateForSQLFunctionRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition). 
+ WithOrReplace(true). + WithArguments([]sdk.FunctionArgumentRequest{*argument}). + WithCopyGrants(true). + WithReturnNullValues(sdk.ReturnNullValuesNotNull). + WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable). + WithMemoizable(true). + WithComment("comment") + + err := client.Functions.CreateForSQL(ctx, request) + require.NoError(t, err) + t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id)) function, err := client.Functions.ShowByID(ctx, id) require.NoError(t, err) - assert.NotEmpty(t, function.CreatedOn) - assert.Equal(t, id.Name(), function.Name) - assert.Equal(t, false, function.IsBuiltin) - assert.Equal(t, false, function.IsAggregate) - assert.Equal(t, false, function.IsAnsi) - if withArguments { - assert.Equal(t, 1, function.MinNumArguments) - assert.Equal(t, 1, function.MaxNumArguments) - } else { - assert.Equal(t, 0, function.MinNumArguments) - assert.Equal(t, 0, function.MaxNumArguments) - } - assert.NotEmpty(t, function.ArgumentsRaw) - assert.NotEmpty(t, function.ArgumentsOld) - assert.NotEmpty(t, function.Description) - assert.NotEmpty(t, function.CatalogName) - assert.Equal(t, false, function.IsTableFunction) - assert.Equal(t, false, function.ValidForClustering) - assert.Equal(t, secure, function.IsSecure) - assert.Equal(t, false, function.IsExternalFunction) - assert.Equal(t, "SQL", function.Language) - assert.Equal(t, false, function.IsMemoizable) - } + assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription("comment"). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). 
+ HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrations(""). + HasSecrets(""). + HasIsExternalFunction(false). + HasLanguage("SQL"). + HasIsMemoizable(true). + HasIsDataMetric(false), + ) - cleanupFunctionHandle := func(id sdk.SchemaObjectIdentifierWithArguments) func() { - return func() { - err := client.Functions.Drop(ctx, sdk.NewDropFunctionRequest(id)) - if errors.Is(err, sdk.ErrObjectNotExistOrAuthorized) { - return - } - require.NoError(t, err) - } - } + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())). + HasLanguage("SQL"). + HasBody(definition). + HasNullHandlingNil(). + // TODO [SNOW-1348103]: volatility is not returned and is present in create syntax + // HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)). + HasVolatilityNil(). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImportsNil(). + HasHandlerNil(). + HasRuntimeVersionNil(). + HasPackagesNil(). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasIsAggregateNil(), + ) - createFunctionForSQLHandle := func(t *testing.T, cleanup bool, withArguments bool) *sdk.Function { - t.Helper() - var id sdk.SchemaObjectIdentifierWithArguments - if withArguments { - id = testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeFloat) - } else { - id = testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments() - } + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). 
+ HasAllDefaultsExplicit(), + ) + }) - definition := "3.141592654::FLOAT" + t.Run("create function for SQL - no arguments", func(t *testing.T) { + dataType := testdatatypes.DataTypeFloat + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments() - dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeFloat) + definition := testClientHelper().Function.SampleSqlDefinition(t) + dt := sdk.NewFunctionReturnsResultDataTypeRequest(dataType) returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) - request := sdk.NewCreateForSQLFunctionRequest(id.SchemaObjectId(), *returns, definition). - WithOrReplace(true) - if withArguments { - argument := sdk.NewFunctionArgumentRequest("x", nil).WithArgDataTypeOld(sdk.DataTypeFloat) - request = request.WithArguments([]sdk.FunctionArgumentRequest{*argument}) - } + request := sdk.NewCreateForSQLFunctionRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition) + err := client.Functions.CreateForSQL(ctx, request) require.NoError(t, err) - if cleanup { - t.Cleanup(cleanupFunctionHandle(id)) - } + t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, id)) + function, err := client.Functions.ShowByID(ctx, id) require.NoError(t, err) - return function - } - t.Run("alter function: rename", func(t *testing.T) { - f := createFunctionForSQLHandle(t, false, true) + assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(0). + HasMaxNumArguments(0). + HasArgumentsOld([]sdk.DataType{}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s() RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultFunctionComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). 
+ HasExternalAccessIntegrations(""). + HasSecrets(""). + HasIsExternalFunction(false). + HasLanguage("SQL"). + HasIsMemoizable(false). + HasIsDataMetric(false), + ) + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, function.ID()). + HasSignature("()"). + HasReturns(dataType.ToSql()). + HasLanguage("SQL"). + HasBody(definition). + HasNullHandlingNil(). + HasVolatilityNil(). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImportsNil(). + HasHandlerNil(). + HasRuntimeVersionNil(). + HasPackagesNil(). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasIsAggregateNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("show parameters", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) + id := f.ID() + + param, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterLogLevel, sdk.Object{ObjectType: sdk.ObjectTypeFunction, Name: id}) + require.NoError(t, err) + assert.Equal(t, string(sdk.LogLevelOff), param.Value) + + parameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Function: id, + }, + }) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParametersPrefetched(t, id, parameters). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + + // check that ShowParameters on function level works too + parameters, err = client.Functions.ShowParameters(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParametersPrefetched(t, id, parameters). + HasAllDefaults(). 
+ HasAllDefaultsExplicit(), + ) + }) + + t.Run("alter function: rename", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) id := f.ID() - nid := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeFloat) + + nid := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(id.ArgumentDataTypes()...) err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithRenameTo(nid.SchemaObjectId())) - if err != nil { - t.Cleanup(cleanupFunctionHandle(id)) - } else { - t.Cleanup(cleanupFunctionHandle(nid)) - } require.NoError(t, err) + t.Cleanup(testClientHelper().Function.DropFunctionFunc(t, nid)) _, err = client.Functions.ShowByID(ctx, id) assert.ErrorIs(t, err, collections.ErrObjectNotFound) @@ -281,89 +1382,175 @@ func TestInt_OtherFunctions(t *testing.T) { require.Equal(t, nid.Name(), e.Name) }) - t.Run("alter function: set log level", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) - + t.Run("alter function: set and unset all for Java", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateJava(t) + t.Cleanup(fCleanup) id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSetLogLevel(string(sdk.LogLevelDebug))) - require.NoError(t, err) - assertFunction(t, id, false, true) - }) - t.Run("alter function: unset log level", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasName(id.Name()). + HasDescription(sdk.DefaultFunctionComment), + ) - id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnsetLogLevel(true)) - require.NoError(t, err) - assertFunction(t, id, false, true) - }) + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, id). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). 
+ HasAllDefaults(). + HasAllDefaultsExplicit(), + ) - t.Run("alter function: set trace level", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + request := sdk.NewAlterFunctionRequest(id).WithSet(*sdk.NewFunctionSetRequest(). + WithEnableConsoleOutput(true). + WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}). + WithSecretsList(*sdk.NewSecretsListRequest([]sdk.SecretReference{{VariableName: "abc", Name: secretId}})). + WithLogLevel(sdk.LogLevelWarn). + WithMetricLevel(sdk.MetricLevelAll). + WithTraceLevel(sdk.TraceLevelAlways). + WithComment("new comment"), + ) - id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSetTraceLevel(string(sdk.TraceLevelAlways))) + err := client.Functions.Alter(ctx, request) require.NoError(t, err) - assertFunction(t, id, false, true) - }) - t.Run("alter function: unset trace level", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasName(id.Name()). + HasDescription("new comment"), + ) - id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnsetTraceLevel(true)) + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, id). + HasExactlyExternalAccessIntegrations(externalAccessIntegration). + HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}), + ) + + assertParametersSet(t, objectparametersassert.FunctionParameters(t, id)) + + unsetRequest := sdk.NewAlterFunctionRequest(id).WithUnset(*sdk.NewFunctionUnsetRequest(). + WithEnableConsoleOutput(true). + WithExternalAccessIntegrations(true). + WithLogLevel(true). + WithMetricLevel(true). + WithTraceLevel(true). 
+ WithComment(true), + ) + + err = client.Functions.Alter(ctx, unsetRequest) require.NoError(t, err) - assertFunction(t, id, false, true) - }) - t.Run("alter function: set comment", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasName(id.Name()). + HasDescription(sdk.DefaultFunctionComment). + HasExactlyExternalAccessIntegrations(). + HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}), + ) - id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSetComment("test comment")) + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, id). + HasExternalAccessIntegrationsNil(). + // TODO [SNOW-1850370]: apparently UNSET external access integrations cleans out secrets in the describe but leaves it in SHOW + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + + unsetSecretsRequest := sdk.NewAlterFunctionRequest(id).WithSet(*sdk.NewFunctionSetRequest(). + WithSecretsList(*sdk.NewSecretsListRequest([]sdk.SecretReference{})), + ) + + err = client.Functions.Alter(ctx, unsetSecretsRequest) require.NoError(t, err) - assertFunction(t, id, false, true) - }) - t.Run("alter function: unset comment", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + assertions.AssertThatObject(t, objectassert.FunctionDetails(t, id). + HasSecretsNil(), + ) + }) + t.Run("alter function: set and unset all for SQL", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnsetComment(true)) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). 
+ HasAllDefaultsExplicit(), + ) + + request := sdk.NewAlterFunctionRequest(id).WithSet(*sdk.NewFunctionSetRequest(). + WithEnableConsoleOutput(true). + WithLogLevel(sdk.LogLevelWarn). + WithMetricLevel(sdk.MetricLevelAll). + WithTraceLevel(sdk.TraceLevelAlways). + WithComment("new comment"), + ) + + err := client.Functions.Alter(ctx, request) require.NoError(t, err) - assertFunction(t, id, false, true) - }) - t.Run("alter function: set secure", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasName(id.Name()). + HasDescription("new comment"), + ) - id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSetSecure(true)) + assertParametersSet(t, objectparametersassert.FunctionParameters(t, id)) + + unsetRequest := sdk.NewAlterFunctionRequest(id).WithUnset(*sdk.NewFunctionUnsetRequest(). + WithEnableConsoleOutput(true). + WithLogLevel(true). + WithMetricLevel(true). + WithTraceLevel(true). + WithComment(true), + ) + + err = client.Functions.Alter(ctx, unsetRequest) require.NoError(t, err) - assertFunction(t, id, true, true) + + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasDescription(sdk.DefaultFunctionComment), + ) + + assertions.AssertThatObject(t, objectparametersassert.FunctionParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) }) - t.Run("alter function: set secure with no arguments", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + t.Run("alter function: set and unset secure", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) id := f.ID() + + assertions.AssertThatObject(t, objectassert.FunctionFromObject(t, f). 
+ HasIsSecure(false), + ) + err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithSetSecure(true)) require.NoError(t, err) - assertFunction(t, id, true, true) - }) - t.Run("alter function: unset secure", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasIsSecure(true), + ) - id := f.ID() - err := client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnsetSecure(true)) + err = client.Functions.Alter(ctx, sdk.NewAlterFunctionRequest(id).WithUnsetSecure(true)) require.NoError(t, err) - assertFunction(t, id, false, true) + + assertions.AssertThatObject(t, objectassert.Function(t, id). + HasIsSecure(false), + ) }) - t.Run("show function for SQL: without like", func(t *testing.T) { - f1 := createFunctionForSQLHandle(t, true, true) - f2 := createFunctionForSQLHandle(t, true, true) + t.Run("show function: without like", func(t *testing.T) { + f1, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) + + f2, fCleanup2 := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup2) functions, err := client.Functions.Show(ctx, sdk.NewShowFunctionRequest()) require.NoError(t, err) @@ -372,9 +1559,12 @@ func TestInt_OtherFunctions(t *testing.T) { require.Contains(t, functions, *f2) }) - t.Run("show function for SQL: with like", func(t *testing.T) { - f1 := createFunctionForSQLHandle(t, true, true) - f2 := createFunctionForSQLHandle(t, true, true) + t.Run("show function: with like", func(t *testing.T) { + f1, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) + + f2, fCleanup2 := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup2) functions, err := client.Functions.Show(ctx, sdk.NewShowFunctionRequest().WithLike(sdk.Like{Pattern: &f1.Name})) require.NoError(t, err) @@ -384,81 +1574,90 @@ func TestInt_OtherFunctions(t *testing.T) { require.NotContains(t, functions, *f2) }) - t.Run("show function for SQL: no 
matches", func(t *testing.T) { - functions, err := client.Functions.Show(ctx, sdk.NewShowFunctionRequest().WithLike(sdk.Like{Pattern: sdk.String("non-existing-id-pattern")})) + t.Run("show function: no matches", func(t *testing.T) { + functions, err := client.Functions.Show(ctx, sdk.NewShowFunctionRequest(). + WithIn(sdk.ExtendedIn{In: sdk.In{Schema: testClientHelper().Ids.SchemaId()}}). + WithLike(sdk.Like{Pattern: sdk.String(NonExistingSchemaObjectIdentifier.Name())})) require.NoError(t, err) require.Equal(t, 0, len(functions)) }) - t.Run("describe function for SQL", func(t *testing.T) { - f := createFunctionForSQLHandle(t, true, true) + t.Run("describe function: for Java - minimal", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateJava(t) + t.Cleanup(fCleanup) + id := f.ID() - details, err := client.Functions.Describe(ctx, f.ID()) + details, err := client.Functions.Describe(ctx, id) require.NoError(t, err) - pairs := make(map[string]string) + assert.Len(t, details, 11) + + pairs := make(map[string]*string) for _, detail := range details { pairs[detail.Property] = detail.Value } - require.Equal(t, "SQL", pairs["language"]) - require.Equal(t, "FLOAT", pairs["returns"]) - require.Equal(t, "3.141592654::FLOAT", pairs["body"]) - require.Equal(t, "(X FLOAT)", pairs["signature"]) + assert.Equal(t, "(x VARCHAR)", *pairs["signature"]) + assert.Equal(t, "VARCHAR(100)", *pairs["returns"]) + assert.Equal(t, "JAVA", *pairs["language"]) + assert.NotEmpty(t, *pairs["body"]) + assert.Equal(t, string(sdk.NullInputBehaviorCalledOnNullInput), *pairs["null handling"]) + assert.Equal(t, string(sdk.VolatileTableKind), *pairs["volatility"]) + assert.Nil(t, pairs["external_access_integration"]) + assert.Nil(t, pairs["secrets"]) + assert.Equal(t, "[]", *pairs["imports"]) + assert.Equal(t, "TestFunc.echoVarchar", *pairs["handler"]) + assert.Nil(t, pairs["runtime_version"]) }) - t.Run("describe function for SQL: no arguments", func(t *testing.T) { - f := 
createFunctionForSQLHandle(t, true, false) + t.Run("describe function: for SQL - with arguments", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateSql(t) + t.Cleanup(fCleanup) + id := f.ID() - details, err := client.Functions.Describe(ctx, f.ID()) + details, err := client.Functions.Describe(ctx, id) require.NoError(t, err) + assert.Len(t, details, 4) + pairs := make(map[string]string) for _, detail := range details { - pairs[detail.Property] = detail.Value + pairs[detail.Property] = *detail.Value } - require.Equal(t, "SQL", pairs["language"]) - require.Equal(t, "FLOAT", pairs["returns"]) - require.Equal(t, "3.141592654::FLOAT", pairs["body"]) - require.Equal(t, "()", pairs["signature"]) + assert.Equal(t, "(x FLOAT)", pairs["signature"]) + assert.Equal(t, "FLOAT", pairs["returns"]) + assert.Equal(t, "SQL", pairs["language"]) + assert.Equal(t, "3.141592654::FLOAT", pairs["body"]) }) -} -func TestInt_FunctionsShowByID(t *testing.T) { - client := testClient(t) - ctx := testContext(t) - - cleanupFunctionHandle := func(id sdk.SchemaObjectIdentifierWithArguments) func() { - return func() { - err := client.Functions.Drop(ctx, sdk.NewDropFunctionRequest(id)) - if errors.Is(err, sdk.ErrObjectNotExistOrAuthorized) { - return - } - require.NoError(t, err) - } - } - - createFunctionForSQLHandle := func(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) { - t.Helper() - - definition := "3.141592654::FLOAT" - dt := sdk.NewFunctionReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeFloat) - returns := sdk.NewFunctionReturnsRequest().WithResultDataType(*dt) - request := sdk.NewCreateForSQLFunctionRequest(id.SchemaObjectId(), *returns, definition).WithOrReplace(true) + t.Run("describe function: for SQL - no arguments", func(t *testing.T) { + f, fCleanup := testClientHelper().Function.CreateSqlNoArgs(t) + t.Cleanup(fCleanup) + id := f.ID() - argument := sdk.NewFunctionArgumentRequest("x", nil).WithArgDataTypeOld(sdk.DataTypeFloat) - 
request = request.WithArguments([]sdk.FunctionArgumentRequest{*argument}) - err := client.Functions.CreateForSQL(ctx, request) + details, err := client.Functions.Describe(ctx, id) require.NoError(t, err) - t.Cleanup(cleanupFunctionHandle(id)) - } + assert.Len(t, details, 4) + + pairs := make(map[string]string) + for _, detail := range details { + pairs[detail.Property] = *detail.Value + } + assert.Equal(t, "()", pairs["signature"]) + assert.Equal(t, "FLOAT", pairs["returns"]) + assert.Equal(t, "SQL", pairs["language"]) + assert.Equal(t, "3.141592654::FLOAT", pairs["body"]) + }) t.Run("show by id - same name in different schemas", func(t *testing.T) { schema, schemaCleanup := testClientHelper().Schema.CreateSchema(t) t.Cleanup(schemaCleanup) - id1 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeFloat) - id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(id1.Name(), schema.ID(), sdk.DataTypeFloat) + dataType := testdatatypes.DataTypeFloat + id1 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(id1.Name(), schema.ID(), sdk.LegacyDataTypeFrom(dataType)) - createFunctionForSQLHandle(t, id1) - createFunctionForSQLHandle(t, id2) + _, fCleanup1 := testClientHelper().Function.CreateSqlWithIdentifierAndArgument(t, id1.SchemaObjectId(), dataType) + t.Cleanup(fCleanup1) + _, fCleanup2 := testClientHelper().Function.CreateSqlWithIdentifierAndArgument(t, id2.SchemaObjectId(), dataType) + t.Cleanup(fCleanup2) e1, err := client.Functions.ShowByID(ctx, id1) require.NoError(t, err) @@ -475,21 +1674,13 @@ func TestInt_FunctionsShowByID(t *testing.T) { require.Equal(t, id2, e2Id) }) - t.Run("show function by id - different name, same arguments", func(t *testing.T) { - id1 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeInt, sdk.DataTypeFloat, sdk.DataTypeVARCHAR) - 
id2 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeInt, sdk.DataTypeFloat, sdk.DataTypeVARCHAR) - e := testClientHelper().Function.CreateWithIdentifier(t, id1) - testClientHelper().Function.CreateWithIdentifier(t, id2) - - es, err := client.Functions.ShowByID(ctx, id1) - require.NoError(t, err) - require.Equal(t, *e, *es) - }) - t.Run("show function by id - same name, different arguments", func(t *testing.T) { + dataType := testdatatypes.DataTypeFloat name := testClientHelper().Ids.Alpha() - id1 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.DataTypeInt, sdk.DataTypeFloat, sdk.DataTypeVARCHAR) + + id1 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.LegacyDataTypeFrom(dataType)) id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.DataTypeInt, sdk.DataTypeVARCHAR) + e := testClientHelper().Function.CreateWithIdentifier(t, id1) testClientHelper().Function.CreateWithIdentifier(t, id2) @@ -538,7 +1729,7 @@ func TestInt_FunctionsShowByID(t *testing.T) { "add", ). WithArguments(args). 
- WithFunctionDefinition("def add(A, B, C, D, E, F, G, H, I, J, K, L, M, N, O, R, S, T, U, V, W, X, Y, Z): A + A"), + WithFunctionDefinitionWrapped("def add(A, B, C, D, E, F, G, H, I, J, K, L, M, N, O, R, S, T, U, V, W, X, Y, Z): A + A"), ) require.NoError(t, err) @@ -593,6 +1784,7 @@ func TestInt_FunctionsShowByID(t *testing.T) { t.Run(fmt.Sprintf("function returns non detailed data types of arguments for %s", tc), func(t *testing.T) { id := testClientHelper().Ids.RandomSchemaObjectIdentifier() argName := "A" + funcName := "identity" dataType, err := datatypes.ParseDataType(tc) require.NoError(t, err) args := []sdk.FunctionArgumentRequest{ @@ -603,10 +1795,10 @@ func TestInt_FunctionsShowByID(t *testing.T) { id, *sdk.NewFunctionReturnsRequest().WithResultDataType(*sdk.NewFunctionReturnsResultDataTypeRequest(dataType)), "3.8", - "add", + funcName, ). WithArguments(args). - WithFunctionDefinition(fmt.Sprintf("def add(%[1]s): %[1]s", argName)), + WithFunctionDefinitionWrapped(testClientHelper().Function.PythonIdentityDefinition(t, funcName, argName)), ) require.NoError(t, err) @@ -622,7 +1814,7 @@ func TestInt_FunctionsShowByID(t *testing.T) { require.NoError(t, err) pairs := make(map[string]string) for _, detail := range details { - pairs[detail.Property] = detail.Value + pairs[detail.Property] = *detail.Value } assert.Equal(t, fmt.Sprintf("(%s %s)", argName, oldDataType), pairs["signature"]) assert.Equal(t, dataType.Canonical(), pairs["returns"]) diff --git a/pkg/sdk/testint/procedures_integration_test.go b/pkg/sdk/testint/procedures_integration_test.go index 4543791c4d..2a69ef42c2 100644 --- a/pkg/sdk/testint/procedures_integration_test.go +++ b/pkg/sdk/testint/procedures_integration_test.go @@ -3,8 +3,16 @@ package testint import ( "errors" "fmt" + "strings" "testing" + "time" + assertions "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert" + + 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert/objectassert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/bettertestspoc/assert/objectparametersassert" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testdatatypes" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/datatypes" @@ -12,59 +20,1203 @@ import ( "github.com/stretchr/testify/require" ) -// todo: add tests for: -// - creating procedure with different languages from stages - -func TestInt_CreateProcedures(t *testing.T) { +// TODO [SNOW-1850370]: 'ExtendedIn' struct for procedures not support keyword "CLASS" now +// TODO [SNOW-1850370]: Call/CreateAndCall methods were not updated before V1 because we are not using them +func TestInt_Procedures(t *testing.T) { client := testClient(t) ctx := testContext(t) - cleanupProcedureHandle := func(id sdk.SchemaObjectIdentifierWithArguments) func() { - return func() { - err := client.Procedures.Drop(ctx, sdk.NewDropProcedureRequest(id)) - if errors.Is(err, sdk.ErrObjectNotExistOrAuthorized) { - return - } - require.NoError(t, err) - } + secretId := testClientHelper().Ids.RandomSchemaObjectIdentifier() + + networkRule, networkRuleCleanup := testClientHelper().NetworkRule.Create(t) + t.Cleanup(networkRuleCleanup) + + secret, secretCleanup := testClientHelper().Secret.CreateWithGenericString(t, secretId, "test_secret_string") + t.Cleanup(secretCleanup) + + externalAccessIntegration, externalAccessIntegrationCleanup := testClientHelper().ExternalAccessIntegration.CreateExternalAccessIntegrationWithNetworkRuleAndSecret(t, networkRule.ID(), secret.ID()) + t.Cleanup(externalAccessIntegrationCleanup) + + tmpJavaProcedure := 
testClientHelper().CreateSampleJavaProcedureAndJar(t) + tmpPythonFunction := testClientHelper().CreateSamplePythonFunctionAndModule(t) + + assertParametersSet := func(t *testing.T, procedureParametersAssert *objectparametersassert.ProcedureParametersAssert) { + t.Helper() + assertions.AssertThatObject(t, procedureParametersAssert. + // TODO [SNOW-1850370]: every value ends with invalid value [OFF] for parameter 'AUTO_EVENT_LOGGING' + // HasAutoEventLogging(sdk.AutoEventLoggingTracing). + HasEnableConsoleOutput(true). + HasLogLevel(sdk.LogLevelWarn). + HasMetricLevel(sdk.MetricLevelAll). + HasTraceLevel(sdk.TraceLevelAlways), + ) } - t.Run("create procedure for Java: returns result data type", func(t *testing.T) { - // https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-java#reading-a-dynamically-specified-file-with-inputstream - name := "file_reader_java_proc_snowflakefile" - id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR) + t.Run("create procedure for Java - inline minimal", func(t *testing.T) { + className := "TestFunc" + funcName := "echoVarchar" + argName := "x" + dataType := testdatatypes.DataTypeVarchar_100 - definition := ` - import java.io.InputStream; - import java.io.IOException; - import java.nio.charset.StandardCharsets; - import com.snowflake.snowpark_java.types.SnowflakeFile; - import com.snowflake.snowpark_java.Session; - class FileReader { - public String execute(Session session, String fileName) throws IOException { - InputStream input = SnowflakeFile.newInstance(fileName).getInputStream(); - return new String(input.readAllBytes(), StandardCharsets.UTF_8); - } - }` + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + handler
:= fmt.Sprintf("%s.%s", className, funcName) + definition := testClientHelper().Procedure.SampleJavaDefinition(t, className, funcName, argName) + packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0")} - dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) + request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, handler). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithProcedureDefinitionWrapped(definition) + + err := client.Procedures.CreateForJava(ctx, request) + require.NoError(t, err) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + + procedure, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, procedure). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, procedure.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultProcedureComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, procedure.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(dataType.ToSql()). + HasLanguage("JAVA"). + HasBody(definition). + HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImports(`[]`). + HasHandler(handler). + HasRuntimeVersion("11"). 
+ HasPackages(`[com.snowflake:snowpark:1.14.0]`). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasExecuteAs("OWNER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create procedure for Java - inline full", func(t *testing.T) { + className := "TestFunc" + funcName := "echoVarchar" + argName := "x" + dataType := testdatatypes.DataTypeVarchar_100 + + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType). + WithNotNull(true) returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) - argument := sdk.NewProcedureArgumentRequest("input", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:latest")} - request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, "FileReader.execute"). + handler := fmt.Sprintf("%s.%s", className, funcName) + definition := testClientHelper().Procedure.SampleJavaDefinition(t, className, funcName, argName) + jarName := fmt.Sprintf("tf-%d-%s.jar", time.Now().Unix(), random.AlphaN(5)) + targetPath := fmt.Sprintf("@~/%s", jarName) + packages := []sdk.ProcedurePackageRequest{ + *sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0"), + *sdk.NewProcedurePackageRequest("com.snowflake:telemetry:0.1.0"), + } + + request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, handler). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*argument}). - WithProcedureDefinition(definition) + WithCopyGrants(true). + WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)). + WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable). 
+ WithComment("comment"). + WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpJavaProcedure.JarLocation())}). + WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}). + WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}). + WithTargetPath(targetPath). + WithProcedureDefinitionWrapped(definition) + err := client.Procedures.CreateForJava(ctx, request) require.NoError(t, err) - t.Cleanup(cleanupProcedureHandle(id)) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + t.Cleanup(testClientHelper().Stage.RemoveFromUserStageFunc(t, jarName)) + + function, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription("comment"). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + // TODO [SNOW-1850370]: apparently external access integrations and secrets are not filled out correctly for procedures + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())). + HasLanguage("JAVA"). + HasBody(definition). + HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)). + HasExactlyExternalAccessIntegrations(externalAccessIntegration). 
+ HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}). + HasImports(fmt.Sprintf(`[%s]`, tmpJavaProcedure.JarLocation())). + HasHandler(handler). + HasRuntimeVersion("11"). + HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`). + HasTargetPath(targetPath). + HasInstalledPackagesNil(). + HasExecuteAs("OWNER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) - procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest()) + t.Run("create procedure for Java - staged minimal", func(t *testing.T) { + dataType := tmpJavaProcedure.ArgType + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "x" + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + handler := tmpJavaProcedure.JavaHandler() + importPath := tmpJavaProcedure.JarLocation() + packages := []sdk.ProcedurePackageRequest{ + *sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0"), + *sdk.NewProcedurePackageRequest("com.snowflake:telemetry:0.1.0"), + } + + requestStaged := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, handler). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(importPath)}) + + err := client.Procedures.CreateForJava(ctx, requestStaged) + require.NoError(t, err) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + + function, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). 
+ HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultProcedureComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(dataType.ToSql()). + HasLanguage("JAVA"). + HasBodyNil(). + HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImports(fmt.Sprintf(`[%s]`, importPath)). + HasHandler(handler). + HasRuntimeVersion("11"). + HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasExecuteAs("OWNER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create procedure for Java - staged full", func(t *testing.T) { + dataType := tmpJavaProcedure.ArgType + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "x" + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType). 
+ WithNotNull(true) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + handler := tmpJavaProcedure.JavaHandler() + packages := []sdk.ProcedurePackageRequest{ + *sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0"), + *sdk.NewProcedurePackageRequest("com.snowflake:telemetry:0.1.0"), + } + + requestStaged := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, handler). + WithOrReplace(true). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithCopyGrants(true). + WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)). + WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable). + WithComment("comment"). + WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpJavaProcedure.JarLocation())}). + WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}). + WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}) + + err := client.Procedures.CreateForJava(ctx, requestStaged) + require.NoError(t, err) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + + function, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription("comment"). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrationsNil(). 
+ HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())). + HasLanguage("JAVA"). + HasBodyNil(). + HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)). + HasExactlyExternalAccessIntegrations(externalAccessIntegration). + HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}). + HasImports(fmt.Sprintf(`[%s]`, tmpJavaProcedure.JarLocation())). + HasHandler(handler). + HasRuntimeVersion("11"). + HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasExecuteAs("OWNER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create procedure for Javascript - inline minimal", func(t *testing.T) { + dataType := testdatatypes.DataTypeFloat + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "d" + definition := testClientHelper().Procedure.SampleJavascriptDefinition(t, argName) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + + request := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), dataType, definition). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}) + + err := client.Procedures.CreateForJavaScript(ctx, request) require.NoError(t, err) - require.GreaterOrEqual(t, len(procedures), 1) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + + function, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). 
+ HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultProcedureComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(dataType.ToSql()). + HasLanguage("JAVASCRIPT"). + HasBody(definition). + HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImportsNil(). + HasHandlerNil(). + HasRuntimeVersionNil(). + HasPackagesNil(). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasExecuteAs("OWNER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create procedure for Javascript - inline full", func(t *testing.T) { + dataType := testdatatypes.DataTypeFloat + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "d" + definition := testClientHelper().Procedure.SampleJavascriptDefinition(t, argName) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + request := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), dataType, definition). + WithOrReplace(true). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithCopyGrants(true). + WithNotNull(true). 
+ WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)). + WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable). + WithExecuteAs(sdk.ExecuteAsCaller). + WithComment("comment") + + err := client.Procedures.CreateForJavaScript(ctx, request) + require.NoError(t, err) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + + function, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription("comment"). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()). + HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())). + HasLanguage("JAVASCRIPT"). + HasBody(definition). + HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImportsNil(). + HasHandlerNil(). + HasRuntimeVersionNil(). + HasPackagesNil(). + HasTargetPathNil(). + HasInstalledPackagesNil(). + HasExecuteAs("CALLER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). 
+ HasAllDefaultsExplicit(), + ) + }) + + t.Run("create procedure for Python - inline minimal", func(t *testing.T) { + dataType := testdatatypes.DataTypeNumber_36_2 + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "i" + funcName := "dump" + definition := testClientHelper().Procedure.SamplePythonDefinition(t, funcName, argName) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + packages := []sdk.ProcedurePackageRequest{ + *sdk.NewProcedurePackageRequest("snowflake-snowpark-python==1.14.0"), + } + request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, funcName). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithProcedureDefinitionWrapped(definition) + + err := client.Procedures.CreateForPython(ctx, request) + require.NoError(t, err) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) + + function, err := client.Procedures.ShowByID(ctx, id) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasSchemaName(id.SchemaName()). + HasIsBuiltin(false). + HasIsAggregate(false). + HasIsAnsi(false). + HasMinNumArguments(1). + HasMaxNumArguments(1). + HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}). + HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())). + HasDescription(sdk.DefaultProcedureComment). + HasCatalogName(id.DatabaseName()). + HasIsTableFunction(false). + HasValidForClustering(false). + HasIsSecure(false). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()). 
+ HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())). + HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")). + HasLanguage("PYTHON"). + HasBody(definition). + HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)). + HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)). + HasExternalAccessIntegrationsNil(). + HasSecretsNil(). + HasImports(`[]`). + HasHandler(funcName). + HasRuntimeVersion("3.8"). + HasPackages(`['snowflake-snowpark-python==1.14.0']`). + HasTargetPathNil(). + HasInstalledPackagesNotEmpty(). + HasExecuteAs("OWNER"), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + }) + + t.Run("create procedure for Python - inline full", func(t *testing.T) { + dataType := testdatatypes.DataTypeNumber_36_2 + id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + + argName := "i" + funcName := "dump" + definition := testClientHelper().Procedure.SamplePythonDefinition(t, funcName, argName) + dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType). + WithNotNull(true) + returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) + argument := sdk.NewProcedureArgumentRequest(argName, dataType) + packages := []sdk.ProcedurePackageRequest{ + *sdk.NewProcedurePackageRequest("snowflake-snowpark-python==1.14.0"), + *sdk.NewProcedurePackageRequest("absl-py==0.10.0"), + } + + request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, funcName). + WithOrReplace(true). + WithArguments([]sdk.ProcedureArgumentRequest{*argument}). + WithCopyGrants(true). + WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)). + WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable). + WithComment("comment"). 
+			WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpPythonFunction.PythonModuleLocation())}).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithExecuteAs(sdk.ExecuteAsCaller).
+			WithProcedureDefinitionWrapped(definition)
+
+		err := client.Procedures.CreateForPython(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")+" NOT NULL").
+			HasLanguage("PYTHON").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpPythonFunction.PythonModuleLocation())).
+			HasHandler(funcName).
+			HasRuntimeVersion("3.8").
+			HasPackages(`['snowflake-snowpark-python==1.14.0','absl-py==0.10.0']`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasExecuteAs("CALLER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for Python - staged minimal", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeVarchar_100
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "i"
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		packages := []sdk.ProcedurePackageRequest{
+			*sdk.NewProcedurePackageRequest("snowflake-snowpark-python==1.14.0"),
+		}
+		request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, tmpPythonFunction.PythonHandler()).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpPythonFunction.PythonModuleLocation())})
+
+		err := client.Procedures.CreateForPython(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultProcedureComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")).
+			HasLanguage("PYTHON").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(fmt.Sprintf(`[%s]`, tmpPythonFunction.PythonModuleLocation())).
+			HasHandler(tmpPythonFunction.PythonHandler()).
+			HasRuntimeVersion("3.8").
+			HasPackages(`['snowflake-snowpark-python==1.14.0']`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasExecuteAs("OWNER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for Python - staged full", func(t *testing.T) {
+		dataType := testdatatypes.DataTypeVarchar_100
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "i"
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType).
+			WithNotNull(true)
+		returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		packages := []sdk.ProcedurePackageRequest{
+			*sdk.NewProcedurePackageRequest("snowflake-snowpark-python==1.14.0"),
+			*sdk.NewProcedurePackageRequest("absl-py==0.10.0"),
+		}
+
+		request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, tmpPythonFunction.PythonHandler()).
+			WithOrReplace(true).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithComment("comment").
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpPythonFunction.PythonModuleLocation())}).
+			WithExecuteAs(sdk.ExecuteAsCaller)
+
+		err := client.Procedures.CreateForPython(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(strings.ReplaceAll(dataType.ToSql(), " ", "")+" NOT NULL").
+			HasLanguage("PYTHON").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpPythonFunction.PythonModuleLocation())).
+			HasHandler(tmpPythonFunction.PythonHandler()).
+			HasRuntimeVersion("3.8").
+			HasPackages(`['snowflake-snowpark-python==1.14.0','absl-py==0.10.0']`).
+			HasTargetPathNil().
+			HasInstalledPackagesNotEmpty().
+			HasExecuteAs("CALLER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for Scala - inline minimal", func(t *testing.T) {
+		className := "TestFunc"
+		funcName := "echoVarchar"
+		argName := "x"
+		dataType := testdatatypes.DataTypeVarchar_100
+
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt)
+		definition := testClientHelper().Procedure.SampleScalaDefinition(t, className, funcName, argName)
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		handler := fmt.Sprintf("%s.%s", className, funcName)
+		packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0")}
+
+		request := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, handler).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithProcedureDefinitionWrapped(definition)
+
+		err := client.Procedures.CreateForScala(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultProcedureComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("SCALA").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(`[]`).
+			HasHandler(handler).
+			HasRuntimeVersion("2.12").
+			HasPackages(`[com.snowflake:snowpark:1.14.0]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasExecuteAs("OWNER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for Scala - inline full", func(t *testing.T) {
+		className := "TestFunc"
+		funcName := "echoVarchar"
+		argName := "x"
+		dataType := testdatatypes.DataTypeVarchar_100
+
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType).
+			WithNotNull(true)
+		returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt)
+		definition := testClientHelper().Procedure.SampleScalaDefinition(t, className, funcName, argName)
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		handler := fmt.Sprintf("%s.%s", className, funcName)
+		jarName := fmt.Sprintf("tf-%d-%s.jar", time.Now().Unix(), random.AlphaN(5))
+		targetPath := fmt.Sprintf("@~/%s", jarName)
+		packages := []sdk.ProcedurePackageRequest{
+			*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0"),
+			*sdk.NewProcedurePackageRequest("com.snowflake:telemetry:0.1.0"),
+		}
+
+		request := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, handler).
+			WithOrReplace(true).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithComment("comment").
+			WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpJavaProcedure.JarLocation())}).
+			WithTargetPath(targetPath).
+			WithExecuteAs(sdk.ExecuteAsCaller).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithProcedureDefinitionWrapped(definition)
+
+		err := client.Procedures.CreateForScala(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+		t.Cleanup(testClientHelper().Stage.RemoveFromUserStageFunc(t, jarName))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("SCALA").
+			HasBody(definition).
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpJavaProcedure.JarLocation())).
+			HasHandler(handler).
+			HasRuntimeVersion("2.12").
+			HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`).
+			HasTargetPath(targetPath).
+			HasInstalledPackagesNil().
+			HasExecuteAs("CALLER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for Scala - staged minimal", func(t *testing.T) {
+		dataType := tmpJavaProcedure.ArgType
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "x"
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt)
+		handler := tmpJavaProcedure.JavaHandler()
+		importPath := tmpJavaProcedure.JarLocation()
+		packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0")}
+
+		requestStaged := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, handler).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(importPath)})
+
+		err := client.Procedures.CreateForScala(ctx, requestStaged)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultProcedureComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("SCALA").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorCalledOnNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorVolatile)).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImports(fmt.Sprintf(`[%s]`, importPath)).
+			HasHandler(handler).
+			HasRuntimeVersion("2.12").
+			HasPackages(`[com.snowflake:snowpark:1.14.0]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasExecuteAs("OWNER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for Scala - staged full", func(t *testing.T) {
+		dataType := tmpJavaProcedure.ArgType
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		argName := "x"
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		handler := tmpJavaProcedure.JavaHandler()
+
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType).
+			WithNotNull(true)
+		returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt)
+		packages := []sdk.ProcedurePackageRequest{
+			*sdk.NewProcedurePackageRequest("com.snowflake:snowpark:1.14.0"),
+			*sdk.NewProcedurePackageRequest("com.snowflake:telemetry:0.1.0"),
+		}
+
+		requestStaged := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, handler).
+			WithOrReplace(true).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithComment("comment").
+			WithExecuteAs(sdk.ExecuteAsCaller).
+			WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}).
+			WithSecrets([]sdk.SecretReference{{VariableName: "abc", Name: secretId}}).
+			WithImports([]sdk.ProcedureImportRequest{*sdk.NewProcedureImportRequest(tmpJavaProcedure.JarLocation())})
+
+		err := client.Procedures.CreateForScala(ctx, requestStaged)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("SCALA").
+			HasBodyNil().
+			HasNullHandling(string(sdk.NullInputBehaviorReturnsNullInput)).
+			HasVolatility(string(sdk.ReturnResultsBehaviorImmutable)).
+			HasExactlyExternalAccessIntegrations(externalAccessIntegration).
+			HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}).
+			HasImports(fmt.Sprintf(`[%s]`, tmpJavaProcedure.JarLocation())).
+			HasHandler(handler).
+			HasRuntimeVersion("2.12").
+			HasPackages(`[com.snowflake:snowpark:1.14.0,com.snowflake:telemetry:0.1.0]`).
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasExecuteAs("CALLER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for SQL - inline minimal", func(t *testing.T) {
+		argName := "x"
+		dataType := testdatatypes.DataTypeFloat
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		definition := testClientHelper().Procedure.SampleSqlDefinition(t)
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType)
+		returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument})
+
+		err := client.Procedures.CreateForSQL(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription(sdk.DefaultProcedureComment).
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(dataType.ToSql()).
+			HasLanguage("SQL").
+			HasBody(definition).
+			HasNullHandlingNil().
+			HasVolatilityNil().
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImportsNil().
+			HasHandlerNil().
+			HasRuntimeVersionNil().
+			HasPackagesNil().
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasExecuteAs("OWNER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
+	})
+
+	t.Run("create procedure for SQL - inline full", func(t *testing.T) {
+		argName := "x"
+		dataType := testdatatypes.DataTypeFloat
+		id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType))
+
+		definition := testClientHelper().Procedure.SampleSqlDefinition(t)
+		dt := sdk.NewProcedureReturnsResultDataTypeRequest(dataType).
+			WithNotNull(true)
+		returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt)
+		argument := sdk.NewProcedureArgumentRequest(argName, dataType)
+		request := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition).
+			WithOrReplace(true).
+			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
+			WithCopyGrants(true).
+			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
+			WithReturnResultsBehavior(sdk.ReturnResultsBehaviorImmutable).
+			WithExecuteAs(sdk.ExecuteAsCaller).
+			WithComment("comment")
+
+		err := client.Procedures.CreateForSQL(ctx, request)
+		require.NoError(t, err)
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
+
+		function, err := client.Procedures.ShowByID(ctx, id)
+		require.NoError(t, err)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureFromObject(t, function).
+			HasCreatedOnNotEmpty().
+			HasName(id.Name()).
+			HasSchemaName(id.SchemaName()).
+			HasIsBuiltin(false).
+			HasIsAggregate(false).
+			HasIsAnsi(false).
+			HasMinNumArguments(1).
+			HasMaxNumArguments(1).
+			HasArgumentsOld([]sdk.DataType{sdk.LegacyDataTypeFrom(dataType)}).
+			HasArgumentsRaw(fmt.Sprintf(`%[1]s(%[2]s) RETURN %[2]s`, function.ID().Name(), dataType.ToLegacyDataTypeSql())).
+			HasDescription("comment").
+			HasCatalogName(id.DatabaseName()).
+			HasIsTableFunction(false).
+			HasValidForClustering(false).
+			HasIsSecure(false).
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil(),
+		)
+
+		assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, function.ID()).
+			HasSignature(fmt.Sprintf(`(%s %s)`, argName, dataType.ToLegacyDataTypeSql())).
+			HasReturns(fmt.Sprintf(`%s NOT NULL`, dataType.ToSql())).
+			HasLanguage("SQL").
+			HasBody(definition).
+			// TODO [SNOW-1348103]: null handling and volatility are not returned but are present in the create syntax
+			HasNullHandlingNil().
+			HasVolatilityNil().
+			HasExternalAccessIntegrationsNil().
+			HasSecretsNil().
+			HasImportsNil().
+			HasHandlerNil().
+			HasRuntimeVersionNil().
+			HasPackagesNil().
+			HasTargetPathNil().
+			HasInstalledPackagesNil().
+			HasExecuteAs("CALLER"),
+		)
+
+		assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id).
+			HasAllDefaults().
+			HasAllDefaultsExplicit(),
+		)
 	})
 
+	// TODO [SNOW-1348103]: adjust or remove
 	t.Run("create procedure for Java: returns table", func(t *testing.T) {
+		t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103")
+		// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-java#specifying-return-column-names-and-types
 		name := "filter_by_role"
 		id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR, sdk.DataTypeVARCHAR)
@@ -89,17 +1241,20 @@ func TestInt_CreateProcedures(t *testing.T) {
 		request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, "Filter.filterByRole").
 			WithOrReplace(true).
 			WithArguments([]sdk.ProcedureArgumentRequest{*arg1, *arg2}).
-			WithProcedureDefinition(definition)
+			WithProcedureDefinitionWrapped(definition)
 
 		err := client.Procedures.CreateForJava(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupProcedureHandle(id))
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
 
 		procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest())
 		require.NoError(t, err)
 		require.GreaterOrEqual(t, len(procedures), 1)
 	})
 
+	// TODO [SNOW-1348103]: adjust or remove
 	t.Run("create procedure for Javascript", func(t *testing.T) {
+		t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103")
+		// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-javascript#basic-examples
 		name := "stproc1"
 		id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeFloat)
@@ -116,37 +1271,43 @@ func TestInt_CreateProcedures(t *testing.T) {
 		return "Failed: " + err; // Return a success/error indicator.
 		}`
 		argument := sdk.NewProcedureArgumentRequest("FLOAT_PARAM1", nil).WithArgDataTypeOld(sdk.DataTypeFloat)
-		request := sdk.NewCreateForJavaScriptProcedureRequest(id.SchemaObjectId(), nil, definition).
+		request := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), nil, definition).
 			WithResultDataTypeOld(sdk.DataTypeString).
 			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
 			WithNullInputBehavior(*sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorStrict)).
 			WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsCaller))
 
 		err := client.Procedures.CreateForJavaScript(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupProcedureHandle(id))
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
 
 		procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest())
 		require.NoError(t, err)
 		require.GreaterOrEqual(t, len(procedures), 1)
 	})
 
+	// TODO [SNOW-1348103]: adjust or remove
 	t.Run("create procedure for Javascript: no arguments", func(t *testing.T) {
+		t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103")
+		// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-javascript#basic-examples
 		name := "sp_pi"
 		id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name)
 
 		definition := `return 3.1415926;`
-		request := sdk.NewCreateForJavaScriptProcedureRequest(id.SchemaObjectId(), nil, definition).WithResultDataTypeOld(sdk.DataTypeFloat).WithNotNull(true).WithOrReplace(true)
+		request := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), nil, definition).WithResultDataTypeOld(sdk.DataTypeFloat).WithNotNull(true).WithOrReplace(true)
 
 		err := client.Procedures.CreateForJavaScript(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupProcedureHandle(id))
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
 
 		procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest())
 		require.NoError(t, err)
 		require.GreaterOrEqual(t, len(procedures), 1)
 	})
 
+	// TODO [SNOW-1348103]: adjust or remove
 	t.Run("create procedure for Scala: returns result data type", func(t *testing.T) {
+		t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103")
+		// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-scala#reading-a-dynamically-specified-file-with-snowflakefile
 		name := "file_reader_scala_proc_snowflakefile"
 		id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR)
@@ -169,17 +1330,20 @@ func TestInt_CreateProcedures(t *testing.T) {
 		request := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, "FileReader.execute").
 			WithOrReplace(true).
 			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
-			WithProcedureDefinition(definition)
+			WithProcedureDefinitionWrapped(definition)
 
 		err := client.Procedures.CreateForScala(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupProcedureHandle(id))
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
 
 		procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest())
 		require.NoError(t, err)
 		require.GreaterOrEqual(t, len(procedures), 1)
 	})
 
+	// TODO [SNOW-1348103]: adjust or remove
 	t.Run("create procedure for Scala: returns table", func(t *testing.T) {
+		t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103")
+		// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-scala#specifying-return-column-names-and-types
 		name := "filter_by_role"
 		id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR, sdk.DataTypeVARCHAR)
@@ -205,17 +1369,20 @@ func TestInt_CreateProcedures(t *testing.T) {
 		request := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, "Filter.filterByRole").
 			WithOrReplace(true).
 			WithArguments([]sdk.ProcedureArgumentRequest{*arg1, *arg2}).
-			WithProcedureDefinition(definition)
+			WithProcedureDefinitionWrapped(definition)
 
 		err := client.Procedures.CreateForScala(ctx, request)
 		require.NoError(t, err)
-		t.Cleanup(cleanupProcedureHandle(id))
+		t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id))
 
 		procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest())
 		require.NoError(t, err)
 		require.GreaterOrEqual(t, len(procedures), 1)
 	})
 
+	// TODO [SNOW-1348103]: adjust or remove
 	t.Run("create procedure for Python: returns result data type", func(t *testing.T) {
+		t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103")
+		// https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-python#running-concurrent-tasks-with-worker-processes
 		name := "joblib_multiprocessing_proc"
 		id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeInt)
@@ -237,17 +1404,20 @@ def joblib_multiprocessing(session, i):
 		request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, "joblib_multiprocessing").
 			WithOrReplace(true).
 			WithArguments([]sdk.ProcedureArgumentRequest{*argument}).
- WithProcedureDefinition(definition) + WithProcedureDefinitionWrapped(definition) err := client.Procedures.CreateForPython(ctx, request) require.NoError(t, err) - t.Cleanup(cleanupProcedureHandle(id)) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest()) require.NoError(t, err) require.GreaterOrEqual(t, len(procedures), 1) }) + // TODO [SNOW-1348103]: adjust or remove t.Run("create procedure for Python: returns table", func(t *testing.T) { + t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103") + // https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-python#specifying-return-column-names-and-types name := "filterByRole" id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR, sdk.DataTypeVARCHAR) @@ -268,17 +1438,20 @@ def filter_by_role(session, table_name, role): request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, "filter_by_role"). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*arg1, *arg2}). 
- WithProcedureDefinition(definition) + WithProcedureDefinitionWrapped(definition) err := client.Procedures.CreateForPython(ctx, request) require.NoError(t, err) - t.Cleanup(cleanupProcedureHandle(id)) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest()) require.NoError(t, err) require.GreaterOrEqual(t, len(procedures), 1) }) + // TODO [SNOW-1348103]: adjust or remove t.Run("create procedure for SQL: returns result data type", func(t *testing.T) { + t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103") + // https://docs.snowflake.com/en/developer-guide/stored-procedure/stored-procedures-snowflake-scripting name := "output_message" id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR) @@ -291,7 +1464,7 @@ def filter_by_role(session, table_name, role): dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt).WithNotNull(true) argument := sdk.NewProcedureArgumentRequest("message", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - request := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, definition). + request := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition). WithOrReplace(true). // Suddenly this is erroring out, when it used to not have a problem. Must be an error with the Snowflake API. // Created issue in docs-discuss channel. https://snowflake.slack.com/archives/C6380540P/p1707511734666249 @@ -299,18 +1472,21 @@ def filter_by_role(session, table_name, role): // 001003 (42000): SQL compilation error: // syntax error line 1 at position 210 unexpected 'NULL'. // syntax error line 1 at position 215 unexpected 'ON'. - // WithNullInputBehavior(sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnNullInput)).
+ // WithNullInputBehavior(sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)). WithArguments([]sdk.ProcedureArgumentRequest{*argument}) err := client.Procedures.CreateForSQL(ctx, request) require.NoError(t, err) - t.Cleanup(cleanupProcedureHandle(id)) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest()) require.NoError(t, err) require.GreaterOrEqual(t, len(procedures), 1) }) + // TODO [SNOW-1348103]: adjust or remove t.Run("create procedure for SQL: returns table", func(t *testing.T) { + t.Skipf("Skipped for now; left as inspiration for resource rework as part of SNOW-1348103") + name := "find_invoice_by_id" id := testClientHelper().Ids.NewSchemaObjectIdentifierWithArguments(name, sdk.DataTypeVARCHAR) @@ -325,216 +1501,443 @@ def filter_by_role(session, table_name, role): returnsTable := sdk.NewProcedureReturnsTableRequest().WithColumns([]sdk.ProcedureColumnRequest{*column1, *column2}) returns := sdk.NewProcedureSQLReturnsRequest().WithTable(*returnsTable) argument := sdk.NewProcedureArgumentRequest("id", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - request := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, definition). + request := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition). WithOrReplace(true). // SNOW-1051627 todo: uncomment once null input behavior is working again - // WithNullInputBehavior(sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnNullInput)). + // WithNullInputBehavior(sdk.NullInputBehaviorPointer(sdk.NullInputBehaviorReturnsNullInput)).
WithArguments([]sdk.ProcedureArgumentRequest{*argument}) err := client.Procedures.CreateForSQL(ctx, request) require.NoError(t, err) - t.Cleanup(cleanupProcedureHandle(id)) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, id)) procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest()) require.NoError(t, err) require.GreaterOrEqual(t, len(procedures), 1) }) -} - -func TestInt_OtherProcedureFunctions(t *testing.T) { - client := testClient(t) - ctx := testContext(t) - assertProcedure := func(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments, secure bool) { - t.Helper() + t.Run("show parameters", func(t *testing.T) { + p, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + id := p.ID() - procedure, err := client.Procedures.ShowByID(ctx, id) + param, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterLogLevel, sdk.Object{ObjectType: sdk.ObjectTypeProcedure, Name: id}) require.NoError(t, err) + assert.Equal(t, string(sdk.LogLevelOff), param.Value) - assert.NotEmpty(t, procedure.CreatedOn) - assert.Equal(t, id.Name(), procedure.Name) - assert.Equal(t, false, procedure.IsBuiltin) - assert.Equal(t, false, procedure.IsAggregate) - assert.Equal(t, false, procedure.IsAnsi) - assert.Equal(t, 1, procedure.MinNumArguments) - assert.Equal(t, 1, procedure.MaxNumArguments) - assert.NotEmpty(t, procedure.ArgumentsOld) - assert.NotEmpty(t, procedure.ArgumentsRaw) - assert.NotEmpty(t, procedure.Description) - assert.NotEmpty(t, procedure.CatalogName) - assert.Equal(t, false, procedure.IsTableFunction) - assert.Equal(t, false, procedure.ValidForClustering) - assert.Equal(t, secure, procedure.IsSecure) - } - - cleanupProcedureHandle := func(id sdk.SchemaObjectIdentifierWithArguments) func() { - return func() { - err := client.Procedures.Drop(ctx, sdk.NewDropProcedureRequest(id)) - if errors.Is(err, sdk.ErrObjectNotExistOrAuthorized) { - return - } - require.NoError(t, err) - } - } + parameters, err := 
client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Procedure: id, + }, + }) + require.NoError(t, err) - createProcedureForSQLHandle := func(t *testing.T, cleanup bool) *sdk.Procedure { - t.Helper() + assertions.AssertThatObject(t, objectparametersassert.ProcedureParametersPrefetched(t, id, parameters). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) - definition := ` - BEGIN - RETURN message; - END;` - id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeVARCHAR) - dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) - returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt).WithNotNull(true) - argument := sdk.NewProcedureArgumentRequest("message", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - request := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, definition). - WithSecure(true). - WithOrReplace(true). - WithArguments([]sdk.ProcedureArgumentRequest{*argument}). - WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsCaller)) - err := client.Procedures.CreateForSQL(ctx, request) - require.NoError(t, err) - if cleanup { - t.Cleanup(cleanupProcedureHandle(id)) - } - procedure, err := client.Procedures.ShowByID(ctx, id) + // check that ShowParameters on procedure level works too + parameters, err = client.Procedures.ShowParameters(ctx, id) require.NoError(t, err) - return procedure - } + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParametersPrefetched(t, id, parameters). + HasAllDefaults(). 
+ HasAllDefaultsExplicit(), + ) + }) t.Run("alter procedure: rename", func(t *testing.T) { - f := createProcedureForSQLHandle(t, false) + p, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + id := p.ID() - id := f.ID() - nid := testClientHelper().Ids.RandomSchemaObjectIdentifier() - nidWithArguments := sdk.NewSchemaObjectIdentifierWithArguments(nid.DatabaseName(), nid.SchemaName(), nid.Name(), id.ArgumentDataTypes()...) + nid := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(id.ArgumentDataTypes()...) - err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithRenameTo(nid)) - if err != nil { - t.Cleanup(cleanupProcedureHandle(id)) - } else { - t.Cleanup(cleanupProcedureHandle(nidWithArguments)) - } + err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithRenameTo(nid.SchemaObjectId())) require.NoError(t, err) + t.Cleanup(testClientHelper().Procedure.DropProcedureFunc(t, nid)) _, err = client.Procedures.ShowByID(ctx, id) assert.ErrorIs(t, err, collections.ErrObjectNotFound) - e, err := client.Procedures.ShowByID(ctx, nidWithArguments) + e, err := client.Procedures.ShowByID(ctx, nid) require.NoError(t, err) require.Equal(t, nid.Name(), e.Name) }) - t.Run("alter procedure: set log level", func(t *testing.T) { - f := createProcedureForSQLHandle(t, true) - - id := f.ID() - err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithSetLogLevel("DEBUG")) - require.NoError(t, err) - assertProcedure(t, id, true) + t.Run("alter procedure: set and unset all for Java", func(t *testing.T) { + p, pCleanup := testClientHelper().Procedure.CreateJava(t) + t.Cleanup(pCleanup) + id := p.ID() + + assertions.AssertThatObject(t, objectassert.Procedure(t, id). + HasName(id.Name()). + HasDescription(sdk.DefaultProcedureComment), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id). + HasExternalAccessIntegrationsNil(). 
+ HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + + request := sdk.NewAlterProcedureRequest(id).WithSet(*sdk.NewProcedureSetRequest(). + WithExternalAccessIntegrations([]sdk.AccountObjectIdentifier{externalAccessIntegration}). + WithSecretsList(*sdk.NewSecretsListRequest([]sdk.SecretReference{{VariableName: "abc", Name: secretId}})). + // TODO [SNOW-1850370]: every value ends with invalid value [OFF] for parameter 'AUTO_EVENT_LOGGING' + // WithAutoEventLogging(sdk.AutoEventLoggingAll). + WithEnableConsoleOutput(true). + WithLogLevel(sdk.LogLevelWarn). + WithMetricLevel(sdk.MetricLevelAll). + WithTraceLevel(sdk.TraceLevelAlways). + WithComment("new comment"), + ) + + err := client.Procedures.Alter(ctx, request) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.Procedure(t, id). + HasName(id.Name()). + HasDescription("new comment"), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id). + HasExactlyExternalAccessIntegrations(externalAccessIntegration). + HasExactlySecrets(map[string]sdk.SchemaObjectIdentifier{"abc": secretId}), + ) + + assertParametersSet(t, objectparametersassert.ProcedureParameters(t, id)) + + unsetRequest := sdk.NewAlterProcedureRequest(id).WithUnset(*sdk.NewProcedureUnsetRequest(). + WithExternalAccessIntegrations(true). + // WithAutoEventLogging(true). + WithEnableConsoleOutput(true). + WithLogLevel(true). + WithMetricLevel(true). + WithTraceLevel(true). + WithComment(true), + ) + + err = client.Procedures.Alter(ctx, unsetRequest) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.Procedure(t, id). + HasName(id.Name()). + HasDescription(sdk.DefaultProcedureComment). + // both nil, because they are always nil in SHOW for procedures + HasExternalAccessIntegrationsNil(). + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id).
+ HasExternalAccessIntegrationsNil(). + // TODO [SNOW-1850370]: apparently UNSET external access integrations cleans out secrets in the describe but leaves it in SHOW + HasSecretsNil(), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + + unsetSecretsRequest := sdk.NewAlterProcedureRequest(id).WithSet(*sdk.NewProcedureSetRequest(). + WithSecretsList(*sdk.NewSecretsListRequest([]sdk.SecretReference{})), + ) + + err = client.Procedures.Alter(ctx, unsetSecretsRequest) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id). + HasSecretsNil(), + ) }) - t.Run("alter procedure: set trace level", func(t *testing.T) { - f := createProcedureForSQLHandle(t, true) - - id := f.ID() - err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithSetTraceLevel("ALWAYS")) - require.NoError(t, err) - assertProcedure(t, id, true) + t.Run("alter procedure: set and unset all for SQL", func(t *testing.T) { + p, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + id := p.ID() + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) + + request := sdk.NewAlterProcedureRequest(id).WithSet(*sdk.NewProcedureSetRequest(). + // WithAutoEventLogging(sdk.AutoEventLoggingTracing). + WithEnableConsoleOutput(true). + WithLogLevel(sdk.LogLevelWarn). + WithMetricLevel(sdk.MetricLevelAll). + WithTraceLevel(sdk.TraceLevelAlways). + WithComment("new comment"), + ) + + err := client.Procedures.Alter(ctx, request) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.Procedure(t, id). + HasName(id.Name()). + HasDescription("new comment"), + ) + + assertParametersSet(t, objectparametersassert.ProcedureParameters(t, id)) + + unsetRequest := sdk.NewAlterProcedureRequest(id).WithUnset(*sdk.NewProcedureUnsetRequest(). 
+ // WithAutoEventLogging(true). + WithEnableConsoleOutput(true). + WithLogLevel(true). + WithMetricLevel(true). + WithTraceLevel(true). + WithComment(true), + ) + + err = client.Procedures.Alter(ctx, unsetRequest) + require.NoError(t, err) + + assertions.AssertThatObject(t, objectassert.Procedure(t, id). + HasCreatedOnNotEmpty(). + HasName(id.Name()). + HasDescription(sdk.DefaultProcedureComment), + ) + + assertions.AssertThatObject(t, objectparametersassert.ProcedureParameters(t, id). + HasAllDefaults(). + HasAllDefaultsExplicit(), + ) }) - t.Run("alter procedure: set comment", func(t *testing.T) { - f := createProcedureForSQLHandle(t, true) - - id := f.ID() - err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithSetComment("comment")) - require.NoError(t, err) - assertProcedure(t, id, true) - }) + t.Run("alter procedure: set execute as", func(t *testing.T) { + p, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + id := p.ID() - t.Run("alter procedure: unset comment", func(t *testing.T) { - f := createProcedureForSQLHandle(t, true) + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id). + HasExecuteAs("OWNER"), + ) - id := f.ID() - err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithUnsetComment(true)) + err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsCaller))) require.NoError(t, err) - assertProcedure(t, id, true) - }) - t.Run("alter procedure: set execute as", func(t *testing.T) { - f := createProcedureForSQLHandle(t, true) + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id). 
+ HasExecuteAs("CALLER"), + ) - id := f.ID() - err := client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsOwner))) + err = client.Procedures.Alter(ctx, sdk.NewAlterProcedureRequest(id).WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsOwner))) require.NoError(t, err) - assertProcedure(t, id, true) + + assertions.AssertThatObject(t, objectassert.ProcedureDetails(t, id). + HasExecuteAs("OWNER"), + ) }) - t.Run("show procedure for SQL: without like", func(t *testing.T) { - f1 := createProcedureForSQLHandle(t, true) - f2 := createProcedureForSQLHandle(t, true) + t.Run("show procedure: without like", func(t *testing.T) { + p1, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + p2, pCleanup2 := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup2) procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest()) require.NoError(t, err) require.GreaterOrEqual(t, len(procedures), 1) - require.Contains(t, procedures, *f1) - require.Contains(t, procedures, *f2) + require.Contains(t, procedures, *p1) + require.Contains(t, procedures, *p2) }) - t.Run("show procedure for SQL: with like", func(t *testing.T) { - f1 := createProcedureForSQLHandle(t, true) - f2 := createProcedureForSQLHandle(t, true) + t.Run("show procedure: with like", func(t *testing.T) { + p1, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + p2, pCleanup2 := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup2) - procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest().WithLike(sdk.Like{Pattern: &f1.Name})) + procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest().WithLike(sdk.Like{Pattern: &p1.Name})) require.NoError(t, err) require.Equal(t, 1, len(procedures)) - require.Contains(t, procedures, *f1) - require.NotContains(t, procedures, *f2) + require.Contains(t, procedures, *p1) + require.NotContains(t, procedures, *p2) }) - 
t.Run("show procedure for SQL: no matches", func(t *testing.T) { - procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest().WithLike(sdk.Like{Pattern: sdk.String("non-existing-id-pattern")})) + t.Run("show procedure: no matches", func(t *testing.T) { + procedures, err := client.Procedures.Show(ctx, sdk.NewShowProcedureRequest(). + WithIn(sdk.ExtendedIn{In: sdk.In{Schema: testClientHelper().Ids.SchemaId()}}). + WithLike(sdk.Like{Pattern: sdk.String(NonExistingSchemaObjectIdentifier.Name())})) require.NoError(t, err) require.Equal(t, 0, len(procedures)) }) - t.Run("describe procedure for SQL", func(t *testing.T) { - f := createProcedureForSQLHandle(t, true) - id := f.ID() + t.Run("describe procedure: for SQL", func(t *testing.T) { + p, pCleanup := testClientHelper().Procedure.CreateSql(t) + t.Cleanup(pCleanup) + id := p.ID() details, err := client.Procedures.Describe(ctx, id) require.NoError(t, err) - pairs := make(map[string]string) + assert.Len(t, details, 5) + + pairs := make(map[string]*string) for _, detail := range details { pairs[detail.Property] = detail.Value } - require.Equal(t, "SQL", pairs["language"]) - require.Equal(t, "CALLER", pairs["execute as"]) - require.Equal(t, "(MESSAGE VARCHAR)", pairs["signature"]) - require.Equal(t, "\n\tBEGIN\n\t\tRETURN message;\n\tEND;", pairs["body"]) + assert.Equal(t, "(x FLOAT)", *pairs["signature"]) + assert.Equal(t, "FLOAT", *pairs["returns"]) + assert.Equal(t, "SQL", *pairs["language"]) + assert.Equal(t, "\nBEGIN\n\tRETURN 3.141592654::FLOAT;\nEND;\n", *pairs["body"]) + assert.Equal(t, "OWNER", *pairs["execute as"]) + }) + + t.Run("describe procedure: for Java", func(t *testing.T) { + p, pCleanup := testClientHelper().Procedure.CreateJava(t) + t.Cleanup(pCleanup) + id := p.ID() + + details, err := client.Procedures.Describe(ctx, id) + require.NoError(t, err) + assert.Len(t, details, 12) + + pairs := make(map[string]*string) + for _, detail := range details { + pairs[detail.Property] = detail.Value 
+ } + assert.Equal(t, "(x VARCHAR)", *pairs["signature"]) + assert.Equal(t, "VARCHAR(100)", *pairs["returns"]) + assert.Equal(t, "JAVA", *pairs["language"]) + assert.NotEmpty(t, *pairs["body"]) + assert.Equal(t, string(sdk.NullInputBehaviorCalledOnNullInput), *pairs["null handling"]) + assert.Equal(t, string(sdk.VolatileTableKind), *pairs["volatility"]) + assert.Nil(t, pairs["external_access_integration"]) + assert.Nil(t, pairs["secrets"]) + assert.Equal(t, "[]", *pairs["imports"]) + assert.Equal(t, "TestFunc.echoVarchar", *pairs["handler"]) + assert.Equal(t, "11", *pairs["runtime_version"]) + assert.Equal(t, "OWNER", *pairs["execute as"]) }) t.Run("drop procedure for SQL", func(t *testing.T) { - definition := ` - BEGIN - RETURN message; - END;` - id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeVARCHAR) - dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) - returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt).WithNotNull(true) - argument := sdk.NewProcedureArgumentRequest("message", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - request := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, definition). - WithOrReplace(true). - WithArguments([]sdk.ProcedureArgumentRequest{*argument}). 
- WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsCaller)) - err := client.Procedures.CreateForSQL(ctx, request) + p, pCleanup := testClientHelper().Procedure.CreateJava(t) + t.Cleanup(pCleanup) + id := p.ID() + + err := client.Procedures.Drop(ctx, sdk.NewDropProcedureRequest(id)) + require.NoError(t, err) + }) + + t.Run("show by id - same name in different schemas", func(t *testing.T) { + schema, schemaCleanup := testClientHelper().Schema.CreateSchema(t) + t.Cleanup(schemaCleanup) + + dataType := testdatatypes.DataTypeFloat + id1 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.LegacyDataTypeFrom(dataType)) + id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(id1.Name(), schema.ID(), sdk.LegacyDataTypeFrom(dataType)) + + _, pCleanup1 := testClientHelper().Procedure.CreateSqlWithIdentifierAndArgument(t, id1.SchemaObjectId(), dataType, testClientHelper().Procedure.SampleSqlDefinition(t)) + t.Cleanup(pCleanup1) + _, pCleanup2 := testClientHelper().Procedure.CreateSqlWithIdentifierAndArgument(t, id2.SchemaObjectId(), dataType, testClientHelper().Procedure.SampleSqlDefinition(t)) + t.Cleanup(pCleanup2) + + e1, err := client.Procedures.ShowByID(ctx, id1) require.NoError(t, err) + require.Equal(t, id1, e1.ID()) + + e2, err := client.Procedures.ShowByID(ctx, id2) + require.NoError(t, err) + require.Equal(t, id2, e2.ID()) + }) + + t.Run("show procedure by id - same name, different arguments", func(t *testing.T) { + dataType := testdatatypes.DataTypeFloat + name := testClientHelper().Ids.Alpha() + + id1 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.LegacyDataTypeFrom(dataType)) + id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.DataTypeInt, sdk.DataTypeVARCHAR) + + e := testClientHelper().Procedure.CreateWithIdentifier(t, id1) + testClientHelper().Procedure.CreateWithIdentifier(t, 
id2) - err = client.Procedures.Drop(ctx, sdk.NewDropProcedureRequest(id)) + es, err := client.Procedures.ShowByID(ctx, id1) require.NoError(t, err) + require.Equal(t, *e, *es) }) + + // This test shows behavior of detailed types (e.g. VARCHAR(20) and NUMBER(10, 0)) on Snowflake side for procedures. + // For SHOW, data type is generalized both for argument and return type (to e.g. VARCHAR and NUMBER). + // FOR DESCRIBE, data type is generalized for argument and works weirdly for the return type: type is generalized to the canonical one, but we also get the attributes. + for _, tc := range []string{ + "NUMBER(36, 5)", + "NUMBER(36)", + "NUMBER", + "DECIMAL", + "INTEGER", + "FLOAT", + "DOUBLE", + "VARCHAR", + "VARCHAR(20)", + "CHAR", + "CHAR(10)", + "TEXT", + "BINARY", + "BINARY(1000)", + "VARBINARY", + "BOOLEAN", + "DATE", + "DATETIME", + "TIME", + "TIMESTAMP_LTZ", + "TIMESTAMP_NTZ", + "TIMESTAMP_TZ", + "VARIANT", + "OBJECT", + "ARRAY", + "GEOGRAPHY", + "GEOMETRY", + "VECTOR(INT, 16)", + "VECTOR(FLOAT, 8)", + } { + tc := tc + t.Run(fmt.Sprintf("procedure returns non detailed data types of arguments for %s", tc), func(t *testing.T) { + procName := "add" + argName := "A" + dataType, err := datatypes.ParseDataType(tc) + require.NoError(t, err) + args := []sdk.ProcedureArgumentRequest{ + *sdk.NewProcedureArgumentRequest(argName, dataType), + } + oldDataType := sdk.LegacyDataTypeFrom(dataType) + idWithArguments := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(oldDataType) + + packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("snowflake-snowpark-python")} + definition := fmt.Sprintf("def add(%[1]s): %[1]s", argName) + + err = client.Procedures.CreateForPython(ctx, sdk.NewCreateForPythonProcedureRequest( + idWithArguments.SchemaObjectId(), + *sdk.NewProcedureReturnsRequest().WithResultDataType(*sdk.NewProcedureReturnsResultDataTypeRequest(dataType)), + "3.8", + packages, + procName, + ). + WithArguments(args). 
+ WithProcedureDefinitionWrapped(definition), + ) + require.NoError(t, err) + + procedure, err := client.Procedures.ShowByID(ctx, idWithArguments) + require.NoError(t, err) + assert.Equal(t, []sdk.DataType{oldDataType}, procedure.ArgumentsOld) + assert.Equal(t, fmt.Sprintf("%[1]s(%[2]s) RETURN %[2]s", idWithArguments.Name(), oldDataType), procedure.ArgumentsRaw) + + details, err := client.Procedures.Describe(ctx, idWithArguments) + require.NoError(t, err) + pairs := make(map[string]string) + for _, detail := range details { + pairs[detail.Property] = *detail.Value + } + assert.Equal(t, fmt.Sprintf("(%s %s)", argName, oldDataType), pairs["signature"]) + assert.Equal(t, dataType.Canonical(), pairs["returns"]) + }) + } } func TestInt_CallProcedure(t *testing.T) { @@ -574,13 +1977,13 @@ func TestInt_CallProcedure(t *testing.T) { definition := ` BEGIN - RETURN message; + RETURN MESSAGE; END;` id := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeVARCHAR) dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt).WithNotNull(true) - argument := sdk.NewProcedureArgumentRequest("message", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - request := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, definition). + argument := sdk.NewProcedureArgumentRequest("MESSAGE", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) + request := sdk.NewCreateForSQLProcedureRequestDefinitionWrapped(id.SchemaObjectId(), *returns, definition). WithSecure(true). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*argument}). 
@@ -603,7 +2006,7 @@ func TestInt_CallProcedure(t *testing.T) { t.Run("call procedure for SQL: argument names", func(t *testing.T) { f := createProcedureForSQLHandle(t, true) - err := client.Procedures.Call(ctx, sdk.NewCallProcedureRequest(f.ID().SchemaObjectId()).WithCallArguments([]string{"message => 'hi'"})) + err := client.Procedures.Call(ctx, sdk.NewCallProcedureRequest(f.ID().SchemaObjectId()).WithCallArguments([]string{"MESSAGE => 'hi'"})) require.NoError(t, err) }) @@ -632,7 +2035,7 @@ func TestInt_CallProcedure(t *testing.T) { request := sdk.NewCreateForJavaProcedureRequest(id.SchemaObjectId(), *returns, "11", packages, "Filter.filterByRole"). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*arg1, *arg2}). - WithProcedureDefinition(definition) + WithProcedureDefinitionWrapped(definition) err := client.Procedures.CreateForJava(ctx, request) require.NoError(t, err) t.Cleanup(cleanupProcedureHandle(id)) @@ -666,7 +2069,7 @@ func TestInt_CallProcedure(t *testing.T) { request := sdk.NewCreateForScalaProcedureRequest(id.SchemaObjectId(), *returns, "2.12", packages, "Filter.filterByRole"). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*arg1, *arg2}). - WithProcedureDefinition(definition) + WithProcedureDefinitionWrapped(definition) err := client.Procedures.CreateForScala(ctx, request) require.NoError(t, err) t.Cleanup(cleanupProcedureHandle(id)) @@ -693,7 +2096,7 @@ func TestInt_CallProcedure(t *testing.T) { return "Failed: " + err; // Return a success/error indicator. }` arg := sdk.NewProcedureArgumentRequest("FLOAT_PARAM1", nil).WithArgDataTypeOld(sdk.DataTypeFloat) - request := sdk.NewCreateForJavaScriptProcedureRequest(id.SchemaObjectId(), nil, definition). + request := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), nil, definition). WithResultDataTypeOld(sdk.DataTypeString). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*arg}). 
@@ -713,7 +2116,7 @@ func TestInt_CallProcedure(t *testing.T) { id := sdk.NewSchemaObjectIdentifierWithArguments(databaseId.Name(), schemaId.Name(), name) definition := `return 3.1415926;` - request := sdk.NewCreateForJavaScriptProcedureRequest(id.SchemaObjectId(), nil, definition).WithResultDataTypeOld(sdk.DataTypeFloat).WithNotNull(true).WithOrReplace(true) + request := sdk.NewCreateForJavaScriptProcedureRequestDefinitionWrapped(id.SchemaObjectId(), nil, definition).WithResultDataTypeOld(sdk.DataTypeFloat).WithNotNull(true).WithOrReplace(true) err := client.Procedures.CreateForJavaScript(ctx, request) require.NoError(t, err) t.Cleanup(cleanupProcedureHandle(id)) @@ -739,7 +2142,7 @@ def filter_by_role(session, name, role): request := sdk.NewCreateForPythonProcedureRequest(id.SchemaObjectId(), *returns, "3.8", packages, "filter_by_role"). WithOrReplace(true). WithArguments([]sdk.ProcedureArgumentRequest{*arg1, *arg2}). - WithProcedureDefinition(definition) + WithProcedureDefinitionWrapped(definition) err := client.Procedures.CreateForPython(ctx, request) require.NoError(t, err) t.Cleanup(cleanupProcedureHandle(id)) @@ -876,13 +2279,13 @@ func TestInt_CreateAndCallProcedures(t *testing.T) { t.Run("create and call procedure for SQL: argument positions", func(t *testing.T) { definition := ` BEGIN - RETURN message; + RETURN MESSAGE; END;` name := testClientHelper().Ids.RandomAccountObjectIdentifier() dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) returns := sdk.NewProcedureReturnsRequest().WithResultDataType(*dt) - argument := sdk.NewProcedureArgumentRequest("message", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) + argument := sdk.NewProcedureArgumentRequest("MESSAGE", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) request := sdk.NewCreateAndCallForSQLProcedureRequest(name, *returns, definition, name). WithArguments([]sdk.ProcedureArgumentRequest{*argument}). 
WithCallArguments([]string{"message => 'hi'"}) @@ -949,155 +2352,3 @@ def filter_by_role(session, name, role): require.NoError(t, err) }) } - -func TestInt_ProceduresShowByID(t *testing.T) { - client := testClient(t) - ctx := testContext(t) - - cleanupProcedureHandle := func(id sdk.SchemaObjectIdentifierWithArguments) func() { - return func() { - err := client.Procedures.Drop(ctx, sdk.NewDropProcedureRequest(id)) - if errors.Is(err, sdk.ErrObjectNotExistOrAuthorized) { - return - } - require.NoError(t, err) - } - } - - createProcedureForSQLHandle := func(t *testing.T, id sdk.SchemaObjectIdentifierWithArguments) { - t.Helper() - - definition := ` - BEGIN - RETURN message; - END;` - dt := sdk.NewProcedureReturnsResultDataTypeRequest(nil).WithResultDataTypeOld(sdk.DataTypeVARCHAR) - returns := sdk.NewProcedureSQLReturnsRequest().WithResultDataType(*dt).WithNotNull(true) - argument := sdk.NewProcedureArgumentRequest("message", nil).WithArgDataTypeOld(sdk.DataTypeVARCHAR) - request := sdk.NewCreateForSQLProcedureRequest(id.SchemaObjectId(), *returns, definition). - WithArguments([]sdk.ProcedureArgumentRequest{*argument}). 
- WithExecuteAs(*sdk.ExecuteAsPointer(sdk.ExecuteAsCaller)) - err := client.Procedures.CreateForSQL(ctx, request) - require.NoError(t, err) - t.Cleanup(cleanupProcedureHandle(id)) - } - - t.Run("show by id - same name in different schemas", func(t *testing.T) { - schema, schemaCleanup := testClientHelper().Schema.CreateSchema(t) - t.Cleanup(schemaCleanup) - - id1 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeVARCHAR) - id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(id1.Name(), schema.ID(), sdk.DataTypeVARCHAR) - - createProcedureForSQLHandle(t, id1) - createProcedureForSQLHandle(t, id2) - - e1, err := client.Procedures.ShowByID(ctx, id1) - require.NoError(t, err) - require.Equal(t, id1, e1.ID()) - - e2, err := client.Procedures.ShowByID(ctx, id2) - require.NoError(t, err) - require.Equal(t, id2, e2.ID()) - }) - - t.Run("show procedure by id - different name, same arguments", func(t *testing.T) { - id1 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeInt, sdk.DataTypeFloat, sdk.DataTypeVARCHAR) - id2 := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(sdk.DataTypeInt, sdk.DataTypeFloat, sdk.DataTypeVARCHAR) - e := testClientHelper().Procedure.CreateWithIdentifier(t, id1) - testClientHelper().Procedure.CreateWithIdentifier(t, id2) - - es, err := client.Procedures.ShowByID(ctx, id1) - require.NoError(t, err) - require.Equal(t, *e, *es) - }) - - t.Run("show procedure by id - same name, different arguments", func(t *testing.T) { - name := testClientHelper().Ids.Alpha() - id1 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.DataTypeInt, sdk.DataTypeFloat, sdk.DataTypeVARCHAR) - id2 := testClientHelper().Ids.NewSchemaObjectIdentifierWithArgumentsInSchema(name, testClientHelper().Ids.SchemaId(), sdk.DataTypeInt, sdk.DataTypeVARCHAR) - e := testClientHelper().Procedure.CreateWithIdentifier(t, id1) 
- testClientHelper().Procedure.CreateWithIdentifier(t, id2) - - es, err := client.Procedures.ShowByID(ctx, id1) - require.NoError(t, err) - require.Equal(t, *e, *es) - }) - - // This test shows behavior of detailed types (e.g. VARCHAR(20) and NUMBER(10, 0)) on Snowflake side for procedures. - // For SHOW, data type is generalized both for argument and return type (to e.g. VARCHAR and NUMBER). - // FOR DESCRIBE, data type is generalized for argument and works weirdly for the return type: type is generalized to the canonical one, but we also get the attributes. - for _, tc := range []string{ - "NUMBER(36, 5)", - "NUMBER(36)", - "NUMBER", - "DECIMAL", - "INTEGER", - "FLOAT", - "DOUBLE", - "VARCHAR", - "VARCHAR(20)", - "CHAR", - "CHAR(10)", - "TEXT", - "BINARY", - "BINARY(1000)", - "VARBINARY", - "BOOLEAN", - "DATE", - "DATETIME", - "TIME", - "TIMESTAMP_LTZ", - "TIMESTAMP_NTZ", - "TIMESTAMP_TZ", - "VARIANT", - "OBJECT", - "ARRAY", - "GEOGRAPHY", - "GEOMETRY", - "VECTOR(INT, 16)", - "VECTOR(FLOAT, 8)", - } { - tc := tc - t.Run(fmt.Sprintf("procedure returns non detailed data types of arguments for %s", tc), func(t *testing.T) { - procName := "add" - argName := "A" - dataType, err := datatypes.ParseDataType(tc) - require.NoError(t, err) - args := []sdk.ProcedureArgumentRequest{ - *sdk.NewProcedureArgumentRequest(argName, dataType), - } - oldDataType := sdk.LegacyDataTypeFrom(dataType) - idWithArguments := testClientHelper().Ids.RandomSchemaObjectIdentifierWithArguments(oldDataType) - - packages := []sdk.ProcedurePackageRequest{*sdk.NewProcedurePackageRequest("snowflake-snowpark-python")} - definition := fmt.Sprintf("def add(%[1]s): %[1]s", argName) - - err = client.Procedures.CreateForPython(ctx, sdk.NewCreateForPythonProcedureRequest( - idWithArguments.SchemaObjectId(), - *sdk.NewProcedureReturnsRequest().WithResultDataType(*sdk.NewProcedureReturnsResultDataTypeRequest(dataType)), - "3.8", - packages, - procName, - ). - WithArguments(args). 
- WithProcedureDefinition(definition), - ) - require.NoError(t, err) - - procedure, err := client.Procedures.ShowByID(ctx, idWithArguments) - require.NoError(t, err) - assert.Equal(t, []sdk.DataType{oldDataType}, procedure.ArgumentsOld) - assert.Equal(t, fmt.Sprintf("%[1]s(%[2]s) RETURN %[2]s", idWithArguments.Name(), oldDataType), procedure.ArgumentsRaw) - - details, err := client.Procedures.Describe(ctx, idWithArguments) - require.NoError(t, err) - pairs := make(map[string]string) - for _, detail := range details { - pairs[detail.Property] = detail.Value - } - assert.Equal(t, fmt.Sprintf("(%s %s)", argName, oldDataType), pairs["signature"]) - assert.Equal(t, dataType.Canonical(), pairs["returns"]) - }) - } -} diff --git a/pkg/sdk/testint/security_integrations_gen_integration_test.go b/pkg/sdk/testint/security_integrations_gen_integration_test.go index 3f6ee32ddf..0ae5905195 100644 --- a/pkg/sdk/testint/security_integrations_gen_integration_test.go +++ b/pkg/sdk/testint/security_integrations_gen_integration_test.go @@ -530,6 +530,18 @@ func TestInt_SecurityIntegrations(t *testing.T) { assertSecurityIntegration(t, integration, id, "OAUTH - CUSTOM", true, "a") }) + // Prove that creating a security integration with a specified network policy id with lower case characters fails. This is a bug in Snowflake. 
+ // https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3229 + t.Run("CreateOauthCustom_issue3229", func(t *testing.T) { + id := testClientHelper().Ids.RandomAccountObjectIdentifierWithPrefix("test") + networkPolicy, networkPolicyCleanup := testClientHelper().NetworkPolicy.CreateNetworkPolicyWithRequest(t, sdk.NewCreateNetworkPolicyRequest(id)) + t.Cleanup(networkPolicyCleanup) + + req := sdk.NewCreateOauthForCustomClientsSecurityIntegrationRequest(id, sdk.OauthSecurityIntegrationClientTypePublic, "https://example.com").WithNetworkPolicy(networkPolicy.ID()) + err := client.SecurityIntegrations.CreateOauthForCustomClients(ctx, req) + require.ErrorContains(t, err, "object does not exist or not authorized") + }) + t.Run("CreateSaml2", func(t *testing.T) { _, id, issuer := createSAML2Integration(t, func(r *sdk.CreateSaml2SecurityIntegrationRequest) { r.WithAllowedEmailPatterns([]sdk.EmailPattern{{Pattern: "^(.+dev)@example.com$"}}). @@ -924,6 +936,24 @@ func TestInt_SecurityIntegrations(t *testing.T) { assert.Contains(t, details, sdk.SecurityIntegrationProperty{Name: "OAUTH_CLIENT_RSA_PUBLIC_KEY_2_FP", Type: "String", Value: "", Default: ""}) }) + // Prove that altering a security integration with a specified network policy id with lower case characters fails. This is a bug in Snowflake. + // https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3229 + t.Run("AlterOauthCustom_issue3229", func(t *testing.T) { + networkPolicyId := testClientHelper().Ids.RandomAccountObjectIdentifierWithPrefix("test") + networkPolicy, networkPolicyCleanup := testClientHelper().NetworkPolicy.CreateNetworkPolicyWithRequest(t, sdk.NewCreateNetworkPolicyRequest(networkPolicyId)) + t.Cleanup(networkPolicyCleanup) + + _, id := createOauthCustom(t, nil) + + setRequest := sdk.NewAlterOauthForCustomClientsSecurityIntegrationRequest(id). + WithSet( + *sdk.NewOauthForCustomClientsIntegrationSetRequest(). 
+ WithNetworkPolicy(networkPolicy.ID()), + ) + err := client.SecurityIntegrations.AlterOauthForCustomClients(ctx, setRequest) + require.ErrorContains(t, err, "object does not exist or not authorized") + }) + t.Run("AlterSAML2Integration", func(t *testing.T) { _, id, issuer := createSAML2Integration(t, nil) diff --git a/pkg/sdk/testint/setup_test.go b/pkg/sdk/testint/setup_test.go index ee949d3d67..c5b60802a4 100644 --- a/pkg/sdk/testint/setup_test.go +++ b/pkg/sdk/testint/setup_test.go @@ -160,65 +160,69 @@ func (itc *integrationTestContext) initialize() error { } itc.warehouse = wh - config, err := sdk.ProfileConfig(testprofiles.Secondary) - if err != nil { - return err - } + itc.testClient = helpers.NewTestClient(c, TestDatabaseName, TestSchemaName, TestWarehouseName, random.IntegrationTestsSuffix) - if config.Account == defaultConfig.Account { - log.Println("[WARN] default and secondary configs are set to the same account; it may cause problems in tests requiring multiple accounts") - } + // TODO [SNOW-1763603]: improve setup; this is a quick workaround for faster local testing + if os.Getenv(string(testenvs.SimplifiedIntegrationTestsSetup)) == "" { + config, err := sdk.ProfileConfig(testprofiles.Secondary) + if err != nil { + return err + } - secondaryClient, err := sdk.NewClient(config) - if err != nil { - return err - } - itc.secondaryClient = secondaryClient - itc.secondaryCtx = context.Background() + if config.Account == defaultConfig.Account { + log.Println("[WARN] default and secondary configs are set to the same account; it may cause problems in tests requiring multiple accounts") + } - secondaryDb, secondaryDbCleanup, err := createDb(itc.secondaryClient, itc.secondaryCtx, true) - itc.secondaryDatabaseCleanup = secondaryDbCleanup - if err != nil { - return err - } - itc.secondaryDatabase = secondaryDb + secondaryClient, err := sdk.NewClient(config) + if err != nil { + return err + } + itc.secondaryClient = secondaryClient + itc.secondaryCtx = 
context.Background() - secondarySchema, secondarySchemaCleanup, err := createSc(itc.secondaryClient, itc.secondaryCtx, itc.database, true) - itc.secondarySchemaCleanup = secondarySchemaCleanup - if err != nil { - return err - } - itc.secondarySchema = secondarySchema + secondaryDb, secondaryDbCleanup, err := createDb(itc.secondaryClient, itc.secondaryCtx, true) + itc.secondaryDatabaseCleanup = secondaryDbCleanup + if err != nil { + return err + } + itc.secondaryDatabase = secondaryDb - secondaryWarehouse, secondaryWarehouseCleanup, err := createWh(itc.secondaryClient, itc.secondaryCtx, true) - itc.secondaryWarehouseCleanup = secondaryWarehouseCleanup - if err != nil { - return err - } - itc.secondaryWarehouse = secondaryWarehouse + secondarySchema, secondarySchemaCleanup, err := createSc(itc.secondaryClient, itc.secondaryCtx, itc.database, true) + itc.secondarySchemaCleanup = secondarySchemaCleanup + if err != nil { + return err + } + itc.secondarySchema = secondarySchema - itc.testClient = helpers.NewTestClient(c, TestDatabaseName, TestSchemaName, TestWarehouseName, random.IntegrationTestsSuffix) - itc.secondaryTestClient = helpers.NewTestClient(secondaryClient, TestDatabaseName, TestSchemaName, TestWarehouseName, random.IntegrationTestsSuffix) + secondaryWarehouse, secondaryWarehouseCleanup, err := createWh(itc.secondaryClient, itc.secondaryCtx, true) + itc.secondaryWarehouseCleanup = secondaryWarehouseCleanup + if err != nil { + return err + } + itc.secondaryWarehouse = secondaryWarehouse - err = helpers.EnsureQuotedIdentifiersIgnoreCaseIsSetToFalse(itc.client, itc.ctx) - if err != nil { - return err - } - err = helpers.EnsureQuotedIdentifiersIgnoreCaseIsSetToFalse(itc.secondaryClient, itc.secondaryCtx) - if err != nil { - return err - } + itc.secondaryTestClient = helpers.NewTestClient(secondaryClient, TestDatabaseName, TestSchemaName, TestWarehouseName, random.IntegrationTestsSuffix) - // TODO(SNOW-1842271): Adjust test setup to work properly with Accountadmin 
role for object tests and Orgadmin for account tests - if os.Getenv(string(testenvs.TestAccountCreate)) == "" { - err = helpers.EnsureScimProvisionerRolesExist(itc.client, itc.ctx) + err = helpers.EnsureQuotedIdentifiersIgnoreCaseIsSetToFalse(itc.client, itc.ctx) if err != nil { return err } - err = helpers.EnsureScimProvisionerRolesExist(itc.secondaryClient, itc.secondaryCtx) + err = helpers.EnsureQuotedIdentifiersIgnoreCaseIsSetToFalse(itc.secondaryClient, itc.secondaryCtx) if err != nil { return err } + + // TODO(SNOW-1842271): Adjust test setup to work properly with Accountadmin role for object tests and Orgadmin for account tests + if os.Getenv(string(testenvs.TestAccountCreate)) == "" { + err = helpers.EnsureScimProvisionerRolesExist(itc.client, itc.ctx) + if err != nil { + return err + } + err = helpers.EnsureScimProvisionerRolesExist(itc.secondaryClient, itc.secondaryCtx) + if err != nil { + return err + } + } } return nil diff --git a/templates/data-sources/roles.md.tmpl b/templates/data-sources/roles.md.tmpl index 60acfefd96..da95cdd5c0 100644 --- a/templates/data-sources/roles.md.tmpl +++ b/templates/data-sources/roles.md.tmpl @@ -11,6 +11,9 @@ description: |- !> **V1 release candidate** This datasource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. + +-> **Note** Fields `STARTS WITH` and `LIMIT` are currently missing. They will be added in the future. 
+ # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/data-sources/schemas.md.tmpl b/templates/data-sources/schemas.md.tmpl index 0b004f8501..67da95dca4 100644 --- a/templates/data-sources/schemas.md.tmpl +++ b/templates/data-sources/schemas.md.tmpl @@ -11,6 +11,9 @@ description: |- !> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +-> **Note** Field `WITH PRIVILEGES` is currently missing. It will be added in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/guides/unassigning_policies.md.tmpl b/templates/guides/unassigning_policies.md.tmpl new file mode 100644 index 0000000000..de5de63e86 --- /dev/null +++ b/templates/guides/unassigning_policies.md.tmpl @@ -0,0 +1,65 @@ +--- +page_title: "Unassigning policies" +subcategory: "" +description: |- + +--- +# Unassigning policies + +For some objects, such as network policies, Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-network-policy#usage-notes) state that the policy cannot be dropped successfully while it is still assigned to another object. Currently, the provider does not unassign such objects automatically. 
+ +Before dropping the resource: +- if the objects the policy is assigned to are managed in Terraform, follow the example below +- if they are not managed in Terraform, list them with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` + +## Example + +When you have a configuration like +```terraform +resource "snowflake_network_policy" "example" { + name = "network_policy_name" +} + +resource "snowflake_oauth_integration_for_custom_clients" "example" { + name = "integration" + oauth_client_type = "CONFIDENTIAL" + oauth_redirect_uri = "https://example.com" + blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"] + network_policy = snowflake_network_policy.example.fully_qualified_name +} +``` + +and try removing the network policy, Terraform fails with +``` +│ Error deleting network policy EXAMPLE, err = 001492 (42601): SQL compilation error: +│ Cannot perform Drop operation on network policy EXAMPLE. The policy is attached to INTEGRATION with name EXAMPLE. Unset the network policy from INTEGRATION and try the +│ Drop operation again. +``` + +To remove the policy correctly, first adjust the configuration to +```terraform +resource "snowflake_network_policy" "example" { + name = "network_policy_name" +} + +resource "snowflake_oauth_integration_for_custom_clients" "example" { + name = "integration" + oauth_client_type = "CONFIDENTIAL" + oauth_redirect_uri = "https://example.com" + blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"] +} +``` + +Note that the `network_policy` field has been removed from the integration. Now, run `terraform apply`; this unassigns the policy. 
Now, adjust the configuration once again to +```terraform +resource "snowflake_oauth_integration_for_custom_clients" "example" { + name = "integration" + oauth_client_type = "CONFIDENTIAL" + oauth_redirect_uri = "https://example.com" + blocked_roles_list = ["ACCOUNTADMIN", "SECURITYADMIN"] +} +``` + +Now the network policy should be removed successfully. + +This behavior will be fixed in the provider in the future. diff --git a/templates/index.md.tmpl b/templates/index.md.tmpl index 11f3ba84bb..99bf6b0369 100644 --- a/templates/index.md.tmpl +++ b/templates/index.md.tmpl @@ -9,7 +9,7 @@ description: Manage SnowflakeDB with Terraform. ~> **Note** Please check the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md) when changing the version of the provider. --> **Note** the current roadmap is available in our GitHub repository: [ROADMAP.md](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md). +-> **Note** The current roadmap is available in our GitHub repository: [ROADMAP.md](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md). This is a terraform provider plugin for managing [Snowflake](https://www.snowflake.com/) accounts. Coverage is focused on part of Snowflake related to access control. @@ -55,7 +55,7 @@ To export the variables into your provider: ```shell export SNOWFLAKE_USER="..." -export SNOWFLAKE_PRIVATE_KEY_PATH="~/.ssh/snowflake_key" +export SNOWFLAKE_PRIVATE_KEY="~/.ssh/snowflake_key" ``` ### Keypair Authentication Passphrase @@ -77,7 +77,7 @@ To export the variables into your provider: ```shell export SNOWFLAKE_USER="..." -export SNOWFLAKE_PRIVATE_KEY_PATH="~/.ssh/snowflake_key.p8" +export SNOWFLAKE_PRIVATE_KEY="~/.ssh/snowflake_key.p8" export SNOWFLAKE_PRIVATE_KEY_PASSPHRASE="..." ``` @@ -87,7 +87,7 @@ If you have an OAuth access token, export these credentials as environment varia ```shell export SNOWFLAKE_USER='...' 
-export SNOWFLAKE_OAUTH_ACCESS_TOKEN='...' +export SNOWFLAKE_TOKEN='...' ``` Note that once this access token expires, you'll need to request a new one through an external application. @@ -97,11 +97,11 @@ Note that once this access token expires, you'll need to request a new one throu If you have an OAuth Refresh token, export these credentials as environment variables: ```shell -export SNOWFLAKE_OAUTH_REFRESH_TOKEN='...' -export SNOWFLAKE_OAUTH_CLIENT_ID='...' -export SNOWFLAKE_OAUTH_CLIENT_SECRET='...' -export SNOWFLAKE_OAUTH_ENDPOINT='...' -export SNOWFLAKE_OAUTH_REDIRECT_URL='https://localhost.com' +export SNOWFLAKE_TOKEN_ACCESSOR_REFRESH_TOKEN='...' +export SNOWFLAKE_TOKEN_ACCESSOR_CLIENT_ID='...' +export SNOWFLAKE_TOKEN_ACCESSOR_CLIENT_SECRET='...' +export SNOWFLAKE_TOKEN_ACCESSOR_TOKEN_ENDPOINT='...' +export SNOWFLAKE_TOKEN_ACCESSOR_REDIRECT_URI='https://localhost.com' ``` Note because access token have a short life; typically 10 minutes, by passing refresh token new access token will be generated. @@ -136,7 +136,7 @@ provider "snowflake" { ```bash export SNOWFLAKE_USER="..." -export SNOWFLAKE_PRIVATE_KEY_PATH="~/.ssh/snowflake_key" +export SNOWFLAKE_PRIVATE_KEY="~/.ssh/snowflake_key" ``` 3. In a TOML file (default in ~/.snowflake/config). Notice the use of different profiles. The profile name needs to be specified in the Terraform configuration file in `profile` field. When this is not specified, `default` profile is loaded. diff --git a/templates/resources/account.md.tmpl b/templates/resources/account.md.tmpl index 973e844784..c05e6ff4bc 100644 --- a/templates/resources/account.md.tmpl +++ b/templates/resources/account.md.tmpl @@ -9,26 +9,29 @@ description: |- {{- end }} --- +!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. 
We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0990--v01000) to use it. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} -!> **Warning** This resource cannot be destroyed!!! The only way to delete accounts is to go through [Snowflake Support](https://docs.snowflake.com/en/user-guide/organizations-manage-accounts.html#deleting-an-account) - -~> **Note** ORGADMIN priviliges are required for this resource +~> **Note** To use this resource, you must use an account with the privilege to use the ORGADMIN role. +{{ if .HasExample -}} ## Example Usage {{ tffile (printf "examples/resources/%s/resource.tf" .Name)}} -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). +{{- end }} + {{ .SchemaMarkdown | trimspace }} +{{- if .HasImport }} ## Import Import is supported using the following syntax: -```shell -terraform import snowflake_account.account -``` +{{ codefile "shell" (printf "examples/resources/%s/import.sh" .Name)}} +{{- end }} diff --git a/templates/resources/api_authentication_integration_with_authorization_code_grant.md.tmpl b/templates/resources/api_authentication_integration_with_authorization_code_grant.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/api_authentication_integration_with_authorization_code_grant.md.tmpl +++ b/templates/resources/api_authentication_integration_with_authorization_code_grant.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. 
We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/api_authentication_integration_with_client_credentials.md.tmpl b/templates/resources/api_authentication_integration_with_client_credentials.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/api_authentication_integration_with_client_credentials.md.tmpl +++ b/templates/resources/api_authentication_integration_with_client_credentials.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. 
+ # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/api_authentication_integration_with_jwt_bearer.md.tmpl b/templates/resources/api_authentication_integration_with_jwt_bearer.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/api_authentication_integration_with_jwt_bearer.md.tmpl +++ b/templates/resources/api_authentication_integration_with_jwt_bearer.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/authentication_policy.md.tmpl b/templates/resources/authentication_policy.md.tmpl index 93b46362ee..ca835354ca 100644 --- a/templates/resources/authentication_policy.md.tmpl +++ b/templates/resources/authentication_policy.md.tmpl @@ -9,8 +9,7 @@ description: |- {{- end }} --- -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-authentication-policy#usage-notes), an authentication policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. 
Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-authentication-policy#usage-notes), an authentication policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/database.md.tmpl b/templates/resources/database.md.tmpl index 28e2af568d..719a06f49e 100644 --- a/templates/resources/database.md.tmpl +++ b/templates/resources/database.md.tmpl @@ -11,6 +11,11 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on database type. In this case, remove the database of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + +!> **Note** A database cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098507 (2BP01): Cannot drop database DATABASE as it includes network rule - policy associations. +`. 
Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/external_oauth_integration.md.tmpl b/templates/resources/external_oauth_integration.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/external_oauth_integration.md.tmpl +++ b/templates/resources/external_oauth_integration.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on security integration type. In this case, remove the integration of wrong type manually with `terraform destroy` and recreate the resource. It will be addressed in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/masking_policy.md.tmpl b/templates/resources/masking_policy.md.tmpl index 8c42e823de..c516e8e1b1 100644 --- a/templates/resources/masking_policy.md.tmpl +++ b/templates/resources/masking_policy.md.tmpl @@ -11,8 +11,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. 
Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it. -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-masking-policy#usage-notes), a masking policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-masking-policy#usage-notes), a masking policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/network_policy.md.tmpl b/templates/resources/network_policy.md.tmpl index c509e6a3e9..1432fcbee1 100644 --- a/templates/resources/network_policy.md.tmpl +++ b/templates/resources/network_policy.md.tmpl @@ -11,8 +11,9 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. 
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-network-policy#usage-notes), a network policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-network-policy#usage-notes), a network policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + +!> **Note** Due to technical limitations in Terraform SDK, changes in `allowed_network_rule_list` and `blocked_network_rule_list` do not cause diff for `show_output` and `describe_output`. 
# {{.Name}} ({{.Type}}) diff --git a/templates/resources/network_rule.md.tmpl b/templates/resources/network_rule.md.tmpl new file mode 100644 index 0000000000..c96f3e8a41 --- /dev/null +++ b/templates/resources/network_rule.md.tmpl @@ -0,0 +1,35 @@ +--- +page_title: "{{.Name}} {{.Type}} - {{.ProviderName}}" +subcategory: "" +description: |- +{{ if gt (len (split .Description "")) 1 -}} +{{ index (split .Description "") 1 | plainmarkdown | trimspace | prefixlines " " }} +{{- else -}} +{{ .Description | plainmarkdown | trimspace | prefixlines " " }} +{{- end }} +--- + +!> **Note** A network rule cannot be dropped successfully if it is currently assigned to a network policy. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + +# {{.Name}} ({{.Type}}) + +{{ .Description | trimspace }} + +{{ if .HasExample -}} +## Example Usage + +{{ tffile (printf "examples/resources/%s/resource.tf" .Name)}} +-> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). 
+ + +{{- end }} + +{{ .SchemaMarkdown | trimspace }} +{{- if .HasImport }} + +## Import + +Import is supported using the following syntax: + +{{ codefile "shell" (printf "examples/resources/%s/import.sh" .Name)}} +{{- end }} diff --git a/templates/resources/oauth_integration_for_custom_clients.md.tmpl b/templates/resources/oauth_integration_for_custom_clients.md.tmpl index 28e2af568d..dc107a14ad 100644 --- a/templates/resources/oauth_integration_for_custom_clients.md.tmpl +++ b/templates/resources/oauth_integration_for_custom_clients.md.tmpl @@ -11,6 +11,10 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** Setting a network policy with lowercase letters does not work correctly in Snowflake (see [issue](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3229)). As a workaround, set the network policy with uppercase letters only, or use unsafe_execute with the network policy ID wrapped in `'`. + +!> **Note** The provider does not detect external changes on the security integration type. If the type was changed externally, manually remove the integration of the wrong type with `terraform destroy` and recreate the resource. This will be addressed in the future.
+ # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/oauth_integration_for_partner_applications.md.tmpl b/templates/resources/oauth_integration_for_partner_applications.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/oauth_integration_for_partner_applications.md.tmpl +++ b/templates/resources/oauth_integration_for_partner_applications.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on the security integration type. If the type was changed externally, manually remove the integration of the wrong type with `terraform destroy` and recreate the resource. This will be addressed in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/password_policy.md.tmpl b/templates/resources/password_policy.md.tmpl index 28771e2c07..2dbed59233 100644 --- a/templates/resources/password_policy.md.tmpl +++ b/templates/resources/password_policy.md.tmpl @@ -9,8 +9,7 @@ description: |- {{- end }} --- -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-password-policy#usage-notes), a password policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically.
Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-password-policy#usage-notes), a password policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/primary_connection.md.tmpl b/templates/resources/primary_connection.md.tmpl index a4e271811a..76cb507a90 100644 --- a/templates/resources/primary_connection.md.tmpl +++ b/templates/resources/primary_connection.md.tmpl @@ -22,7 +22,7 @@ description: |- -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](../docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources). --> **Note** To demote `snowflake_primary_connection` to [`snowflake_secondary_connection`](./secondary_connection), resources need to be migrated manually. For guidance on removing and importing resources into the state check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state, then recreate it in manually using: +-> **Note** To demote `snowflake_primary_connection` to [`snowflake_secondary_connection`](./secondary_connection), resources need to be migrated manually. 
For guidance on removing and importing resources into the state, check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state with [terraform state rm](https://developer.hashicorp.com/terraform/cli/commands/state/rm), then recreate it manually using: ``` CREATE CONNECTION AS REPLICA OF ..; ``` diff --git a/templates/resources/row_access_policy.md.tmpl b/templates/resources/row_access_policy.md.tmpl index eed337762b..eeff47ba51 100644 --- a/templates/resources/row_access_policy.md.tmpl +++ b/templates/resources/row_access_policy.md.tmpl @@ -11,8 +11,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0950--v0960) to use it. -> [!WARNING] -> According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-row-access-policy#usage-notes), a row access policy cannot be dropped successfully if it is currently assigned to another object. Currently, the provider does not unassign such objects automatically. Before dropping the resource, list the assigned objects with `SELECT * from table(information_schema.policy_references(policy_name=>''));` and unassign them manually with `ALTER ...` or with updated Terraform configuration, if possible. +!> **Note** According to Snowflake [docs](https://docs.snowflake.com/en/sql-reference/sql/drop-row-access-policy#usage-notes), a row access policy cannot be dropped successfully if it is currently assigned to another object.
Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the policy from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/saml2_integration.md.tmpl b/templates/resources/saml2_integration.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/saml2_integration.md.tmpl +++ b/templates/resources/saml2_integration.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on the security integration type. If the type was changed externally, manually remove the integration of the wrong type with `terraform destroy` and recreate the resource. This will be addressed in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/schema.md.tmpl b/templates/resources/schema.md.tmpl index d1045d341c..dea0e19916 100644 --- a/templates/resources/schema.md.tmpl +++ b/templates/resources/schema.md.tmpl @@ -11,6 +11,12 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release.
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +-> **Note** Field `CLASSIFICATION_ROLE` is currently missing. It will be added in the future. + +!> **Note** A schema cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098508 (2BP01): Cannot drop schema SCHEMA as it includes network rule - policy associations. +`. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/scim_integration.md.tmpl b/templates/resources/scim_integration.md.tmpl index 28e2af568d..fe7454c7f2 100644 --- a/templates/resources/scim_integration.md.tmpl +++ b/templates/resources/scim_integration.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on the security integration type. If the type was changed externally, manually remove the integration of the wrong type with `terraform destroy` and recreate the resource. This will be addressed in the future.
+ # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/secondary_connection.md.tmpl b/templates/resources/secondary_connection.md.tmpl index b6955369ea..d54554f7d1 100644 --- a/templates/resources/secondary_connection.md.tmpl +++ b/templates/resources/secondary_connection.md.tmpl @@ -22,7 +22,7 @@ description: |- -> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID, consult [identifiers guide](../guides/identifiers#new-computed-fully-qualified-name-field-in-resources). --> **Note** To promote `snowflake_secondary_connection` to [`snowflake_primary_connection`](./primary_connection), resources need to be migrated manually. For guidance on removing and importing resources into the state check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state, then promote it manually using: +-> **Note** To promote `snowflake_secondary_connection` to [`snowflake_primary_connection`](./primary_connection), resources need to be migrated manually. For guidance on removing and importing resources into the state, check [resource migration](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md). Remove the resource from the state with [terraform state rm](https://developer.hashicorp.com/terraform/cli/commands/state/rm), then promote it manually using: ``` ALTER CONNECTION PRIMARY; ``` diff --git a/templates/resources/secondary_database.md.tmpl b/templates/resources/secondary_database.md.tmpl index 35c607137b..acb3bfb61c 100644 --- a/templates/resources/secondary_database.md.tmpl +++ b/templates/resources/secondary_database.md.tmpl @@ -12,6 +12,11 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1.
We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on the database type. If the type was changed externally, manually remove the database of the wrong type with `terraform destroy` and recreate the resource. This will be addressed in the future. + +!> **Note** A database cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098507 (2BP01): Cannot drop database DATABASE as it includes network rule - policy associations. +`. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # {{.Name}} ({{.Type}}) ~> **Note** The snowflake_secondary_database resource doesn't refresh itself, as the best practice is to use tasks scheduled for a certain interval. Check out the examples to see how to set up the refresh task. For SQL-based replication guide, see the [official documentation](https://docs.snowflake.com/en/user-guide/db-replication-config#replicating-a-database-to-another-account). diff --git a/templates/resources/shared_database.md.tmpl b/templates/resources/shared_database.md.tmpl index 28e2af568d..719a06f49e 100644 --- a/templates/resources/shared_database.md.tmpl +++ b/templates/resources/shared_database.md.tmpl @@ -11,6 +11,11 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1.
We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **Note** The provider does not detect external changes on the database type. If the type was changed externally, manually remove the database of the wrong type with `terraform destroy` and recreate the resource. This will be addressed in the future. + +!> **Note** A database cannot be dropped successfully if it contains network rule-network policy associations. The error looks like `098507 (2BP01): Cannot drop database DATABASE as it includes network rule - policy associations. +`. Currently, the provider does not unassign such objects automatically. Before dropping the resource, first unassign the network rule from the relevant objects. See [guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/unassigning_policies) for more details. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/stream_on_directory_table.md.tmpl b/templates/resources/stream_on_directory_table.md.tmpl index be9cb3fb69..6a7aa75378 100644 --- a/templates/resources/stream_on_directory_table.md.tmpl +++ b/templates/resources/stream_on_directory_table.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it.
+~> **Note about copy_grants** Fields like `stage` and `stale` cannot be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource. ForceNew cannot be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/stream_on_external_table.md.tmpl b/templates/resources/stream_on_external_table.md.tmpl index c82f23be97..8a062a52e7 100644 --- a/templates/resources/stream_on_external_table.md.tmpl +++ b/templates/resources/stream_on_external_table.md.tmpl @@ -11,7 +11,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0960--v0970) to use it. -!> **Note about copy_grants** Fields like `external_table`, `insert_only`, `at`, `before` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. +~> **Note about copy_grants** Fields like `external_table`, `insert_only`, `at`, `before` and `stale` cannot be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource.
ForceNew cannot be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/stream_on_table.md.tmpl b/templates/resources/stream_on_table.md.tmpl index 53dd2b9daf..270789e8c5 100644 --- a/templates/resources/stream_on_table.md.tmpl +++ b/templates/resources/stream_on_table.md.tmpl @@ -11,7 +11,7 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0960--v0970) to use it. -!> **Note about copy_grants** Fields like `table`, `append_only`, `at`, `before`, `show_initial_rows` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. +~> **Note about copy_grants** Fields like `table`, `append_only`, `at`, `before`, `show_initial_rows` and `stale` cannot be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource. ForceNew cannot be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated.
# {{.Name}} ({{.Type}}) diff --git a/templates/resources/stream_on_view.md.tmpl b/templates/resources/stream_on_view.md.tmpl index be9cb3fb69..31e3a88a68 100644 --- a/templates/resources/stream_on_view.md.tmpl +++ b/templates/resources/stream_on_view.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it. +~> **Note about copy_grants** Fields like `view`, `append_only`, `at`, `before`, `show_initial_rows` and `stale` cannot be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-stream)), and a change on these fields means recreation of the resource. ForceNew cannot be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/streamlit.md.tmpl b/templates/resources/streamlit.md.tmpl index d1045d341c..dfcd9aef7f 100644 --- a/templates/resources/streamlit.md.tmpl +++ b/templates/resources/streamlit.md.tmpl @@ -11,6 +11,12 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release.
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +!> **Note** Setting a query warehouse with lowercase letters does not work correctly in Snowflake. As a workaround, set the query warehouse with uppercase letters only, or use unsafe_execute with the query warehouse ID wrapped in `'`. + + +-> **Note** Field `IMPORTS` is currently missing. It will be added in the future. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/tag.md.tmpl b/templates/resources/tag.md.tmpl index 7a876a0017..d040e9d5b0 100644 --- a/templates/resources/tag.md.tmpl +++ b/templates/resources/tag.md.tmpl @@ -11,6 +11,8 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0980--v0990) to use it. +~> **Required warehouse** For this resource, the provider now uses [tag references](https://docs.snowflake.com/en/sql-reference/functions/tag_references) to get information about masking policies attached to tags. This function requires a warehouse in the connection. Please make sure you have either set a `DEFAULT_WAREHOUSE` for the user or specified a warehouse in the provider configuration.
+ # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/templates/resources/view.md.tmpl b/templates/resources/view.md.tmpl index 636271fb96..f9d69cd5de 100644 --- a/templates/resources/view.md.tmpl +++ b/templates/resources/view.md.tmpl @@ -11,10 +11,10 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v094x--v0950) to use it. -!> **Note about copy_grants** Fields like `is_recursive`, `is_temporary`, `copy_grants` and `statement` can not be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-view)), and a change means recreation of the resource. ForceNew can not be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated. - !> Due to Snowflake limitations, to properly compute diff on `statement` field, the provider parses a `text` field which contains the whole CREATE query used to create the resource. We recommend not using special characters, especially `(`, `,`, `)` in any of the fields, if possible. +~> **Note about copy_grants** Fields like `is_recursive`, `is_temporary`, `copy_grants` and `statement` cannot be ALTERed on Snowflake side (check [docs](https://docs.snowflake.com/en/sql-reference/sql/alter-view)), and a change on these fields means recreation of the resource. ForceNew cannot be used because it does not preserve grants from `copy_grants`. Beware that even though a change is marked as update, the resource is recreated.
+ ~> **Required warehouse** For this resource, the provider uses [policy references](https://docs.snowflake.com/en/sql-reference/functions/policy_references), which requires a warehouse in the connection. Please make sure you have either set a DEFAULT_WAREHOUSE for the user or specified a warehouse in the provider configuration. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/warehouse.md.tmpl b/templates/resources/warehouse.md.tmpl index 28e2af568d..21df1ad83c 100644 --- a/templates/resources/warehouse.md.tmpl +++ b/templates/resources/warehouse.md.tmpl @@ -11,6 +11,12 @@ description: |- !> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. + +-> **Note** Field `RESOURCE_CONSTRAINT` is currently missing. It will be added in the future. + + +-> **Note** Assigning resource monitors to warehouses requires the ACCOUNTADMIN role. To do this, either manage the warehouse resource with the ACCOUNTADMIN role, or use [unsafe_execute](./unsafe_execute) instead. See [this issue](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3019) for more details. + # {{.Name}} ({{.Type}}) {{ .Description | trimspace }} diff --git a/v1-preparations/ESSENTIAL_GA_OBJECTS.MD b/v1-preparations/ESSENTIAL_GA_OBJECTS.MD index 2f69537f57..c020bce7b3 100644 --- a/v1-preparations/ESSENTIAL_GA_OBJECTS.MD +++ b/v1-preparations/ESSENTIAL_GA_OBJECTS.MD @@ -33,7 +33,7 @@ newer provider versions.
We will address these while working on the given object | STREAMLIT | 🚀 | - | | TABLE | 👨‍💻 | [#2997](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2997), [#2844](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2844), [#2839](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2839), [#2735](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2735), [#2733](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2733), [#2683](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2683), [#2676](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2676), [#2674](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2674), [#2629](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2629), [#2418](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2418), [#2415](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2415), [#2406](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2406), [#2236](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2236), [#2035](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2035), [#1823](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1823), [#1799](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1799), [#1764](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1764), [#1600](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1600), [#1387](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1387), [#1272](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1272), [#1271](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1271), [#1248](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1248), 
 [#1241](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1241), [#1146](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1146), [#1032](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1032), [#420](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/420) |
 | TAG | 👨‍💻 | [#2943](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2943), [#2598](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2598), [#1910](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1910), [#1909](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1909), [#1862](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1862), [#1806](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1806), [#1657](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1657), [#1496](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1496), [#1443](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1443), [#1394](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1394), [#1372](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1372), [#1074](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1074) |
-| TASK | 👨‍💻 | [#3136](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3136), [#1419](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1419), [#1250](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1250), [#1194](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1194), [#1088](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1088) |
+| TASK | 🚀 | [#3136](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/3136), [#1419](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1419), [#1250](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1250), [#1194](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1194), [#1088](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1088) |
 | VIEW | 🚀 | issues in the older versions: [resources](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues?q=label%3Aresource%3Aview+) and [datasources](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues?q=label%3Adata_source%3Aviews+) |
 | snowflake_unsafe_execute | 👨‍💻 | [#2934](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2934) |
diff --git a/v1-preparations/LIST_OF_PREVIEW_FEATURES_FOR_V1.md b/v1-preparations/LIST_OF_PREVIEW_FEATURES_FOR_V1.md
index dcb77aa33d..8eb3d34669 100644
--- a/v1-preparations/LIST_OF_PREVIEW_FEATURES_FOR_V1.md
+++ b/v1-preparations/LIST_OF_PREVIEW_FEATURES_FOR_V1.md
@@ -45,6 +45,8 @@
 * [snowflake_system_get_aws_sns_iam_policy](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/data-sources/system_get_aws_sns_iam_policy) (datasource)
 * [snowflake_system_get_privatelink_config](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/data-sources/system_get_privatelink_config) (datasource)
 * [snowflake_system_get_snowflake_platform_info](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/data-sources/system_get_snowflake_platform_info) (datasource)
+* [snowflake_table](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/table)
+* [snowflake_tables](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/tables) (datasource)
 * [snowflake_table_column_masking_policy_application](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/table_column_masking_policy_application)
 * [snowflake_table_constraint](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/table_constraint) (undecided - may be deleted instead)
 * [snowflake_user_public_keys](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.97.0/docs/resources/user_public_keys)
diff --git a/v1-preparations/LIST_OF_STABLE_RESOURCES_FOR_V1.md b/v1-preparations/LIST_OF_STABLE_RESOURCES_FOR_V1.md
index b196a5a418..bb7a9d5426 100644
--- a/v1-preparations/LIST_OF_STABLE_RESOURCES_FOR_V1.md
+++ b/v1-preparations/LIST_OF_STABLE_RESOURCES_FOR_V1.md
@@ -1,6 +1,6 @@
 We estimate the given list to be accurate, but it may be subject to small changes:
 
-* Account (in progress)
+* Account
   * [snowflake_account](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/account)
   * [snowflake_accounts](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/accounts) (datasource)
 * Connection
@@ -37,7 +37,7 @@ We estimate the given list to be accurate, but it may be subject to small change
 * Network Policy
   * [snowflake_network_policy](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/network_policy)
   * [snowflake_network_policies](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/network_policies) (datasource)
-* Procedure (in progress)
+* Procedure (in progress)
   * snowflake_procedure_java
   * snowflake_procedure_javascript
   * snowflake_procedure_python
@@ -85,14 +85,11 @@ We estimate the given list to be accurate, but it may be subject to small change
 * Streamlit
   * [snowflake_streamlit](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/streamlit)
   * [snowflake_streamlits](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/streamlits) (datasource)
-* Table (in progress)
-  * [snowflake_table](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/table)
-  * [snowflake_tables](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/tables) (datasource)
-* Tag (in progress)
+* Tag
   * [snowflake_tag](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/tag)
   * [snowflake_tag_association](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/tag_association)
   * [snowflake_tags](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/tags) (datasource)
-* Task (in progress)
+* Task
   * [snowflake_task](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/resources/task)
   * [snowflake_tasks](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/0.98.0/docs/data-sources/tasks) (datasource)
 * User