Commit a75229c
Add missing region to MinIO getting-started example (#2411)
The example was missing an AWS region, causing Spark to fail with:

```
spark-sql ()> create table ns.t1 as select 'abc';
25/08/20 16:25:06 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
software.amazon.awssdk.core.exception.SdkClientException: Unable to load region from any of the providers in the chain software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain@47578c86: [software.amazon.awssdk.regions.providers.SystemSettingsRegionProvider@1656f847: Unable to load region from system settings. Region must be specified either via environment variable (AWS_REGION) or system property (aws.region)., software.amazon.awssdk.regions.providers.AwsProfileRegionProvider@2bbaabe3: No region provided in profile: default, software.amazon.awssdk.regions.providers.InstanceProfileRegionProvider@54b1cfd8: Unable to contact EC2 metadata service.]
...
  at org.apache.iceberg.aws.AwsClientFactories$DefaultAwsClientFactory.s3(AwsClientFactories.java:119)
  at org.apache.iceberg.aws.s3.S3FileIO.client(S3FileIO.java:391)
  at org.apache.iceberg.aws.s3.S3FileIO.newOutputFile(S3FileIO.java:193)
```
1 parent 9c455ed commit a75229c

File tree

1 file changed: +5 −1
getting-started/minio/README.md

Lines changed: 5 additions & 1 deletion
````diff
@@ -56,11 +56,15 @@ bin/spark-sql \
   --conf spark.sql.catalog.polaris.warehouse=quickstart_catalog \
   --conf spark.sql.catalog.polaris.scope=PRINCIPAL_ROLE:ALL \
   --conf spark.sql.catalog.polaris.header.X-Iceberg-Access-Delegation=vended-credentials \
-  --conf spark.sql.catalog.polaris.credential=root:s3cr3t
+  --conf spark.sql.catalog.polaris.credential=root:s3cr3t \
+  --conf spark.sql.catalog.polaris.client.region=irrelevant
 ```
 
 Note: `s3cr3t` is defined as the password for the `root` user in the `docker-compose.yml` file.
 
+Note: The `client.region` configuration is required for the AWS S3 client to work, but it is not used in this example
+since MinIO does not require a specific region.
+
 ## Running Queries
 
 Run inside the Spark SQL shell:
````
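For context, the patched connection flags from the quickstart would look roughly like the sketch below. This is a fragment, not the complete command: the README's earlier flags (packages, catalog implementation, S3 endpoint, and so on) sit outside this diff hunk and are elided here.

```shell
# Sketch of the patched portion of the quickstart spark-sql command.
# Flags above the diff hunk are elided; only lines visible in the diff are shown.
bin/spark-sql \
  ... \
  --conf spark.sql.catalog.polaris.warehouse=quickstart_catalog \
  --conf spark.sql.catalog.polaris.scope=PRINCIPAL_ROLE:ALL \
  --conf spark.sql.catalog.polaris.header.X-Iceberg-Access-Delegation=vended-credentials \
  --conf spark.sql.catalog.polaris.credential=root:s3cr3t \
  --conf spark.sql.catalog.polaris.client.region=irrelevant   # the fix: any placeholder works, MinIO ignores it
```

As the stack trace above shows, without `client.region` the Iceberg AWS client factory falls back to the SDK's `DefaultAwsRegionProviderChain` (environment variable, profile, EC2 metadata), all of which come up empty in this Docker setup; pinning the catalog property short-circuits that chain.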

Comments (0)