Support for S3 Alternatives when using KafkaConnect S3 Sink #9177
-
I would like to test the Kafka Connect S3 Sink locally, so I want to set up the connector to write to an S3 alternative (MinIO).

Questions:
Setup:
Running everything locally via docker-compose based on this documentation. The images are: …
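Regardless of the exact images, one step that is easy to miss when pointing a sink at MinIO: the target bucket generally has to exist before the connector starts. A minimal sketch using boto3, where the localhost endpoint, the `test`/`test` credentials, and the bucket name are assumptions mirroring the config below:

```python
import boto3

# Point an S3 client at the local MinIO endpoint instead of AWS.
# Host-side port 9000 and the "test"/"test" credentials are assumptions
# matching the connector config below; adjust for your setup.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)

# The sink generally expects the target bucket to already exist.
s3.create_bucket(Bucket="my-test-bucket")
```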
Connector configuration:

```json
{
  "aws.access.key.id": "test",
  "aws.s3.bucket.name": "my-test-bucket",
  "aws.secret.access.key": "test",
  "connector.class": "com.redpanda.kafka.connect.s3.S3SinkConnector",
  "format.output.type": "json",
  "name": "s3-test-connector",
  "store.url": "http://minio:9000",
  "topics": "my-topic"
}
```

Error traceback:
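For context on how a config like the one above reaches Connect: it is normally submitted to the Kafka Connect REST API wrapped as `{"name": ..., "config": {...}}`. A minimal sketch in Python with `requests`, assuming Connect's default REST port 8083 is mapped to the host:

```python
import json
import requests

# The REST API expects the settings wrapped in {"name": ..., "config": {...}};
# the values below mirror the configuration posted above.
connector = {
    "name": "s3-test-connector",
    "config": {
        "connector.class": "com.redpanda.kafka.connect.s3.S3SinkConnector",
        "store.url": "http://minio:9000",  # MinIO as seen from inside the compose network
        "aws.access.key.id": "test",
        "aws.secret.access.key": "test",
        "aws.s3.bucket.name": "my-test-bucket",
        "format.output.type": "json",
        "topics": "my-topic",
    },
}

# localhost:8083 assumes the Connect REST port is published by docker-compose.
resp = requests.post(
    "http://localhost:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
print(resp.status_code, resp.text)
```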
Replies: 2 comments
-
Perhaps way too late, but this is what works fine for me with MinIO:

```json
{
  "aws.access.key.id": "key",
  "aws.secret.access.key": "access",
  "connector.class": "io.confluent.connect.s3.S3SinkConnector",
  "flush.size": "1",
  "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter.schemas.enable": "false",
  "locale": "US",
  "name": "s3-minio-sink",
  "partitioner.class": "io.confluent.connect.storage.partitioner.DailyPartitioner",
  "path.format": "'year'=YYYY/'month'=MM/'day'=dd",
  "s3.bucket.name": "warehouse",
  "s3.compression.type": "gzip",
  "s3.part.size": "5242880",
  ...
}
```

(`flush.size` is set to 1 just to see results after each message you produce.)

I am using the following components: …

After this I got files in the following path: `warehouse/topics/events/year=2024/month=10/day=11`
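To check what the sink actually wrote, any S3 client pointed at MinIO works; a minimal sketch with boto3, where the localhost endpoint and the `key`/`access` credentials are assumptions mirroring the config above:

```python
import boto3

# Same credentials and bucket as in the connector config above; the
# localhost endpoint assumes MinIO's port 9000 is mapped to the host.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="key",
    aws_secret_access_key="access",
)

# List the daily-partitioned objects the sink produced.
resp = s3.list_objects_v2(Bucket="warehouse", Prefix="topics/events/")
for obj in resp.get("Contents", []):
    print(obj["Key"])  # e.g. topics/events/year=2024/month=10/day=11/...
```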
-
Also check out Redpanda Connect if you want a more modern replacement: https://www.redpanda.com/connect