This project is an S3 sink plugin for Apache Flume. Running it requires AWS credentials (stored in the ~/.aws/credentials file).
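A typical ~/.aws/credentials file looks like this (the key values below are placeholders):

```
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```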
This custom S3 sink supports the following roll conditions (a configuration excerpt follows the list):
- Batch Size
- Roll Interval
- Compression
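These conditions map to sink properties. A short excerpt, with property names taken from the full configuration example later in this README and illustrative values:

```
# roll a file after this many events
a1.sinks.k1.s3.batchSize = 500
# roll a file after this many seconds
a1.sinks.k1.s3.rollInterval = 60
# whether to compress files before uploading
a1.sinks.k1.s3.compress = false
```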
When writing to S3, the sink can automatically partition output by day and hour. For example, the file path will look like this:
s3://customers/dt=2019-08-01/hr=22/log-SEH-NHT-1.txt
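The dt=/hr= prefix is derived from the write time. Below is a minimal Java sketch of that idea; it is illustrative only, not the plugin's actual code, and the file name is copied from the example above:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class PartitionPathExample {
    public static void main(String[] args) {
        // Build a dt=<day>/hr=<hour> partition prefix from the current time.
        SimpleDateFormat dayFmt = new SimpleDateFormat("yyyy-MM-dd");
        SimpleDateFormat hourFmt = new SimpleDateFormat("HH");
        Date now = new Date();
        String key = String.format("customers/dt=%s/hr=%s/log-SEH-NHT-1.txt",
                dayFmt.format(now), hourFmt.format(now));
        System.out.println("s3://" + key);
        // e.g. s3://customers/dt=2019-08-01/hr=22/log-SEH-NHT-1.txt
    }
}
```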
The sink's default parameter values are:

defaultRollInterval = 300;
defaultBatchSize = 500;
defaultAvroSchema = "";
defaultAvroSchemaRegistryURL = "";
defaultFilePrefix = "flumeS3Sink";
defaultFileSufix = ".data";
defaultTempFile = "/tmp/flumes3sink/data/file";
defaultCompress = "false";
- Clone the git repo
- Run the Maven command (a combined example follows):
mvn package
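Put together, the build looks roughly like this. The repository URL is an assumption based on the com.rab4u.flume package name, and the JAR name depends on the project's pom.xml:

```
git clone https://github.com/rab4u/apache-flume-to-s3.git
cd apache-flume-to-s3
mvn package
# the built plugin JAR ends up under target/
```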
Create a plugins.d/flumes3sink/lib directory under the Flume installation directory and copy the packaged S3 sink JAR into it.
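For example (run from the Flume installation directory; the JAR name below is a placeholder, use whatever `mvn package` produced under target/):

```
mkdir -p plugins.d/flumes3sink/lib
cp /path/to/apache-flume-to-s3/target/<built-s3-sink>.jar plugins.d/flumes3sink/lib/
```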
# Notes

An example Flume sink configuration:
a1.sinks.k1.type = com.rab4u.flume.FlumeS3Sink
a1.sinks.k1.s3.bucketName = dp-cts-stream-test/after-sales/tracking
a1.sinks.k1.s3.awsRegion = eu-central-1
a1.sinks.k1.s3.filePrefix = after-sales
a1.sinks.k1.s3.FileSufix = avro
a1.sinks.k1.s3.rollInterval = 60
a1.sinks.k1.s3.tempFile = /home/ravindrachellubani/Documents/code/git/apache-flume-to-s3/after-sales-temp-file
a1.sinks.k1.s3.AvroSchema = /home/ravindrachellubani/Documents/code/git/apache-flume-to-s3/tracking.avsc
a1.sinks.k1.s3.batchSize = 500
a1.sinks.k1.s3.compress = false
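The lines above configure only the sink. A complete agent definition also needs a source and a channel, with the sink bound to the channel. A minimal sketch, where the netcat source and memory channel are just illustrative stand-ins:

```
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# illustrative source; replace with whatever actually feeds your events
a1.sources.r1.type = netcat
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# wire source -> channel -> sink
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```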
AWS Credentials Not Found:
Check whether the ~/.aws/credentials file exists. If it does not, create it by running the following command:
aws configure
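The command prompts interactively for the key pair and defaults; the values below are placeholders:

```
$ aws configure
AWS Access Key ID [None]: <your access key id>
AWS Secret Access Key [None]: <your secret access key>
Default region name [None]: eu-central-1
Default output format [None]: json
```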