From 41a868db4e600b0ab2f5a73f6b08614fd4f1a79a Mon Sep 17 00:00:00 2001
From: Przemek Denkiewicz
Date: Tue, 22 Jul 2025 14:05:56 +0200
Subject: [PATCH 1/2] Add SF_AWS_ENDPOINT_URL config var to Snowflake docs

---
 src/content/docs/snowflake/capabilities/configuration.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/src/content/docs/snowflake/capabilities/configuration.md b/src/content/docs/snowflake/capabilities/configuration.md
index 7c52c564..323ecb45 100644
--- a/src/content/docs/snowflake/capabilities/configuration.md
+++ b/src/content/docs/snowflake/capabilities/configuration.md
@@ -23,6 +23,7 @@ Options that affect the core Snowflake emulator functionality.
 | `DEBUG` | `0` (default) \| `1` | Flag to increase log level and print more verbose logs (useful for troubleshooting issues) |
 | `SF_LOG` | `trace` | Specify the log level. Currently overrides the `DEBUG` configuration. `trace` for detailed request/response |
 | `SF_S3_ENDPOINT` | `s3.localhost.localstack.cloud:4566` (default) | Specify the S3 endpoint to use for the Snowflake emulator. |
+| `SF_AWS_ENDPOINT_URL` | `localhost:4566` (default) | AWS services endpoint to connect to for other AWS services (SQS, SNS, etc.) from Snowflake
 | `DNS_NAME_PATTERNS_TO_RESOLVE_UPSTREAM` | `*.s3.amazonaws.com` (example) | List of domain names that should NOT be resolved to the LocalStack container, but instead always forwarded to the upstream resolver (S3 for example). this would be required when importing data into a stage from an external S3 bucket on the real AWS cloud. Comma-separated list of Python-flavored regex patterns. |
 | `SF_HOSTNAME_REGEX` | `snowflake\..+` (default) | Allows you to customize the hostname used for matching the Snowflake API routes in the HTTP router. If not set, then it matches on any hostnames that contain a `snowflake.*` subdomain (e.g., `snowflake.localhost.localstack.cloud`). |
 | `SF_CSV_IMPORT_MAX_ROWS` | `50000` (default) | Maximum number of rows to import from CSV files into tables |

From e6a0151f77ca43a8dc2a45f24cbad25420b89d3f Mon Sep 17 00:00:00 2001
From: Harsh Mishra
Date: Wed, 23 Jul 2025 00:51:02 +0530
Subject: [PATCH 2/2] Update configuration.md

---
 src/content/docs/snowflake/capabilities/configuration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/content/docs/snowflake/capabilities/configuration.md b/src/content/docs/snowflake/capabilities/configuration.md
index 323ecb45..7dc2c000 100644
--- a/src/content/docs/snowflake/capabilities/configuration.md
+++ b/src/content/docs/snowflake/capabilities/configuration.md
@@ -23,7 +23,7 @@ Options that affect the core Snowflake emulator functionality.
 | `DEBUG` | `0` (default) \| `1` | Flag to increase log level and print more verbose logs (useful for troubleshooting issues) |
 | `SF_LOG` | `trace` | Specify the log level. Currently overrides the `DEBUG` configuration. `trace` for detailed request/response |
 | `SF_S3_ENDPOINT` | `s3.localhost.localstack.cloud:4566` (default) | Specify the S3 endpoint to use for the Snowflake emulator. |
-| `SF_AWS_ENDPOINT_URL` | `localhost:4566` (default) | AWS services endpoint to connect to for other AWS services (SQS, SNS, etc.) from Snowflake
+| `SF_AWS_ENDPOINT_URL` | `localhost:4566` (default) | AWS services endpoint for connecting to other AWS services (SQS, SNS, etc.) from the Snowflake emulator. |
 | `DNS_NAME_PATTERNS_TO_RESOLVE_UPSTREAM` | `*.s3.amazonaws.com` (example) | List of domain names that should NOT be resolved to the LocalStack container, but instead always forwarded to the upstream resolver (S3 for example). this would be required when importing data into a stage from an external S3 bucket on the real AWS cloud. Comma-separated list of Python-flavored regex patterns. |
 | `SF_HOSTNAME_REGEX` | `snowflake\..+` (default) | Allows you to customize the hostname used for matching the Snowflake API routes in the HTTP router. If not set, then it matches on any hostnames that contain a `snowflake.*` subdomain (e.g., `snowflake.localhost.localstack.cloud`). |
 | `SF_CSV_IMPORT_MAX_ROWS` | `50000` (default) | Maximum number of rows to import from CSV files into tables |
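
For context (not part of the patch), a minimal sketch of how the new `SF_AWS_ENDPOINT_URL` variable might be passed when starting the emulator. The `localstack/snowflake` image name, the edge port `4566`, and the `LOCALSTACK_AUTH_TOKEN` variable are assumptions based on a standard LocalStack setup; adjust them to your environment.

```sh
# Sketch only: run the Snowflake emulator with SF_AWS_ENDPOINT_URL pointing at the
# LocalStack AWS endpoint. Image name, port, and auth-token variable are assumptions.
docker run --rm -it \
  -p 4566:4566 \
  -e LOCALSTACK_AUTH_TOKEN="${LOCALSTACK_AUTH_TOKEN:?export your auth token first}" \
  -e SF_AWS_ENDPOINT_URL=localhost:4566 \
  -e SF_S3_ENDPOINT=s3.localhost.localstack.cloud:4566 \
  localstack/snowflake
```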