Generic shell script to upload / synchronize log files (Ubuntu) to Amazon S3 buckets for centralized log management, based on an Nginx example.
Most approaches, at least all I found, had the following limitations:
- The sharedscripts directive of logrotate executes pre- and postrotate scripts only once. Solution: Relocate scripts to lastaction, execute the upload script in prerotate, and deactivate sharedscripts (nosharedscripts is the default)
- Logs in a directory (wildcard usage) are not handled dynamically. Solution: Hand over the rotated file via the $1 variable
- The file handed over in the $1 variable contains the fully qualified log path. Solution: Process the file path and extract the file name
- No dynamic retrieval of the instance ID. Solution: Replace the server's IP with the generic address "instance-data"
- Logs from different servers with the same name overwrite each other, e.g. when leveraging AWS Auto Scaling
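The points above can be sketched as a minimal logrotate stanza. The directives and paths here are illustrative only, not the exact contents of the nginx file shipped with this project:

```
/var/log/nginx/*.log {
        daily
        rotate 14
        compress
        nosharedscripts
        prerotate
                # with nosharedscripts active, logrotate runs this once
                # per matched log and passes the file as $1
                /path/to/upload-log-s3.sh $1
        endscript
        lastaction
                # runs only once after all logs were rotated,
                # e.g. to make nginx reopen its log files
                invoke-rc.d nginx rotate >/dev/null 2>&1 || true
        endscript
}
```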
- Redundancy: Log storage in S3 is a highly redundant, infinitely scalable fallback
- Access: Centralized S3 log storage allows granting access to third parties
- Analysis: Saving logs in a centralized location enables easy retrieval, e.g. through the KNIME Amazon S3 Connector, for advanced analysis
- Create an S3 bucket to store the logs in. Pay attention to the region you create it in.
- Create an AWS IAM policy granting access to the bucket created above
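A minimal policy sketch could look like this. The bucket name is a placeholder, and the exact set of actions you need depends on how s3cmd is used:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::YOUR_BUCKET_NAME",
        "arn:aws:s3:::YOUR_BUCKET_NAME/*"
      ]
    }
  ]
}
```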
- Create an AWS IAM user (programmatic access), save the access key plus secret key, and attach the policy to the user
- Install s3cmd
sudo apt-get install s3cmd
or from GitHub: s3tools/s3cmd
- Configure
s3cmd --configure
and enter the access key as well as the secret key from step three
- Test upload and access rights by
s3cmd put YOUR_FILE s3://YOUR_BUCKET_NAME/
- Download synchronization script upload-log-s3.sh
wget https://raw.githubusercontent.com/mikeg-de/nginx-logrotate/master/upload-log-s3.sh
- Make downloaded file upload-log-s3.sh executable
sudo chmod +x upload-log-s3.sh
- Modify the script in accordance with the bucket you created
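The script's core steps can be sketched like this. Variable names and the S3 key layout are assumptions, not necessarily those used in the actual upload-log-s3.sh; the upload lines are commented out so the sketch runs without AWS access:

```shell
#!/bin/sh
# Sketch of how a logrotate upload script processes the file handed over.

S3_BUCKET="s3://YOUR_BUCKET_NAME"              # adjust to the bucket created earlier

# logrotate passes the rotated file including its full path as $1
LOG_PATH="${1:-/var/log/nginx/access.log.1}"   # fallback value only for demonstration

# strip the directory part, keeping just the file name
LOG_FILE="${LOG_PATH##*/}"

# resolve the instance ID via the generic "instance-data" address and
# prefix the S3 key with it, so identically named logs from different
# servers (e.g. in an Auto Scaling group) do not overwrite each other:
# INSTANCE_ID=$(curl -s http://instance-data/latest/meta-data/instance-id)
# s3cmd put "$LOG_PATH" "$S3_BUCKET/$INSTANCE_ID/$LOG_FILE"

echo "$LOG_FILE"
```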
- Download nginx file for logrotate
sudo wget https://raw.githubusercontent.com/mikeg-de/nginx-logrotate/master/nginx -P /etc/logrotate.d/
- Modify the file path to upload-log-s3.sh based on where you downloaded it
- Test upload during logrotate by
sudo logrotate -f /etc/logrotate.d/nginx
and check the file(s) in your S3 bucket
0.2 Current Release
- Removed seconds from the S3 sub-folder for a better overview
- Updated readme
0.1 Initialization
Suggestions welcome!
Mike Wiegand - atMedia Online Marketing
See also the list of Acknowledgments; their work greatly contributed to this project.
This project is licensed under the MIT License - see the LICENSE.md file for details
- Logrotate Manual for details about logrotate commands
- Yasith Fernando
- Dowd and Associates