
Support for S3 Alternatives when using KafkaConnect S3 Sink #9177

Answered by dynamike2010
JimFawkes asked this question in Q&A

Perhaps way too late, but this is what works fine for me with MinIO (flush.size is set to "1" just so you see results after each message you produce):

{
"aws.access.key.id": "key",
"aws.secret.access.key": "access",
"connector.class": "io.confluent.connect.s3.S3SinkConnector",
"flush.size": "1", // just to see results after each message you produce
"format.class": "io.confluent.connect.s3.format.json.JsonFormat",
"key.converter": "org.apache.kafka.connect.json.JsonConverter",
"key.converter.schemas.enable": "false",
"locale": "US",
"name": "s3-minio-sink",
"partitioner.class": "io.confluent.connect.storage.partitioner.DailyPartitioner",
"path.format": "'year'=YYYY/'month'=MM/'day'=dd",
"s3.bucket.name": "warehouse",
"s3.compression.type": "gzip",
"s3.part.size": "5242880",

Answer selected by rockwotj