This project demonstrates a real-world FinOps solution using shell scripting to optimize cloud costs for CI/CD logging infrastructure. The implementation addresses the common problem of high ELK stack costs by intelligently offloading Jenkins logs to Amazon S3 storage.
- Practical Shell Scripting: Applied bash scripting to solve real business problems
- Cloud Cost Optimization (FinOps): Implemented cost-reduction strategies for cloud infrastructure
- AWS Integration: Utilized AWS CLI for automated cloud operations
- CI/CD Log Management: Optimized logging workflows for development teams
- High ELK Stack Costs: Self-hosted Elasticsearch, Logstash, and Kibana cluster consuming significant compute and storage resources
- Jenkins Log Volume: 100+ developers generating thousands of daily build logs from commits and pull requests
- Inefficient Storage: Jenkins logs stored for backup purposes only, not active analysis
- Cost Impact: ELK infrastructure costs becoming unsustainable for the organization
- Jenkins logs were primary contributors to high log volume
- Logs were primarily used for backup/safety, not active debugging
- Developers relied on email/Slack notifications for build failures
- ELK stack was over-engineered for Jenkins log requirements
Offload Jenkins logs from expensive ELK stack → Store in cost-effective Amazon S3
- Cost-Effective: Significantly cheaper per gigabyte than retaining logs in a self-hosted ELK cluster
- Lifecycle Management: Automatic tiering to Glacier/Deep Archive
- Scalability: Handles large volumes without infrastructure management
- Integration: Seamless AWS CLI integration
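The lifecycle tiering mentioned above can be configured once per bucket. A minimal sketch of such a policy (the bucket name, prefixes, and day thresholds are illustrative placeholders, not values from this project):

```shell
#!/bin/bash
# Sketch: tier Jenkins logs to Glacier after 30 days, Deep Archive after 90,
# and expire them after a year. Thresholds here are placeholder assumptions.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-jenkins-logs",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [
        {"Days": 30, "StorageClass": "GLACIER"},
        {"Days": 90, "StorageClass": "DEEP_ARCHIVE"}
      ],
      "Expiration": {"Days": 365}
    }
  ]
}
EOF
# Apply with the AWS CLI (requires credentials; shown commented for reference):
# aws s3api put-bucket-lifecycle-configuration \
#   --bucket your-s3-bucket-name --lifecycle-configuration file://lifecycle.json
echo "Lifecycle policy written to lifecycle.json"
```

Once applied, S3 moves objects between storage classes automatically, with no changes to the upload script.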
```bash
#!/bin/bash

# Variables
JENKINS_HOME="/var/lib/jenkins"
S3_BUCKET="s3://your-s3-bucket-name"
DATE=$(date +%Y-%m-%d)

# AWS CLI validation
if ! command -v aws &> /dev/null; then
    echo "AWS CLI is not installed. Please install it to proceed."
    exit 1
fi

# Process Jenkins jobs and builds
for job_dir in "$JENKINS_HOME/jobs/"*/; do
    job_name=$(basename "$job_dir")
    for build_dir in "$job_dir/builds/"*/; do
        build_number=$(basename "$build_dir")
        log_file="$build_dir/log"

        # Upload only today's logs
        if [ -f "$log_file" ] && [ "$(date -r "$log_file" +%Y-%m-%d)" == "$DATE" ]; then
            if aws s3 cp "$log_file" "$S3_BUCKET/$job_name-$build_number.log" --only-show-errors; then
                echo "Uploaded: $job_name/$build_number to $S3_BUCKET/$job_name-$build_number.log"
            else
                echo "Failed to upload: $job_name/$build_number"
            fi
        fi
    done
done
```

- Date-Based Filtering: Only processes logs created today
- Nested Loop Structure: Efficiently iterates through jobs and builds
- Error Handling: Validates AWS CLI installation and upload success
- Organized Storage: Names each uploaded object by job and build number for easy retrieval
- Automated Execution: Designed for cron job scheduling
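The loop-and-filter logic can be exercised without AWS by pointing it at a mock Jenkins tree and recording what would be uploaded instead of calling `aws s3 cp`. A dry-run sketch (the job name and build numbers are made up for the demo):

```shell
#!/bin/bash
# Dry-run sketch of the nested loop and date filter, using a throwaway
# directory tree instead of a real Jenkins home (no AWS calls).
JENKINS_HOME=$(mktemp -d)
DATE=$(date +%Y-%m-%d)

# Mock job/build layout: build 42's log is from today, build 41's is backdated.
mkdir -p "$JENKINS_HOME/jobs/app-build/builds/41" "$JENKINS_HOME/jobs/app-build/builds/42"
echo "old output" > "$JENKINS_HOME/jobs/app-build/builds/41/log"
touch -d "yesterday" "$JENKINS_HOME/jobs/app-build/builds/41/log"
echo "new output" > "$JENKINS_HOME/jobs/app-build/builds/42/log"

: > uploads.txt
for job_dir in "$JENKINS_HOME/jobs/"*/; do
    job_name=$(basename "$job_dir")
    for build_dir in "$job_dir/builds/"*/; do
        build_number=$(basename "$build_dir")
        log_file="$build_dir/log"
        if [ -f "$log_file" ] && [ "$(date -r "$log_file" +%Y-%m-%d)" == "$DATE" ]; then
            echo "Would upload: $job_name-$build_number.log" >> uploads.txt  # aws s3 cp goes here
        fi
    done
done
cat uploads.txt
```

Only the log touched today passes the filter, confirming the date check excludes older builds. Note that `date -r` and `touch -d` as used here are GNU coreutils options.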
- 50%+ Cost Reduction: Significant savings on ELK infrastructure
- Compute Savings: Reduced ELK cluster resource requirements
- Storage Optimization: Leveraged S3's cost-effective storage tiers
- Operational Efficiency: Automated daily log management
- Lifecycle Management: Automatic archival to cheaper storage classes
- Scalability: No infrastructure scaling concerns
- Maintenance: Reduced operational overhead
- EC2 instance with Jenkins installed
- S3 bucket for log storage
- AWS CLI installed on Jenkins server
- AWS credentials configured (`aws configure`)
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/jenkins-log-optimization.git
  cd jenkins-log-optimization
  ```

- Configure script variables:

  ```bash
  # Edit the script variables
  JENKINS_HOME="/var/lib/jenkins"   # Your Jenkins home directory
  S3_BUCKET="s3://your-bucket-name" # Your S3 bucket
  ```

- Set up AWS credentials:

  ```bash
  aws configure
  ```

- Make the script executable:

  ```bash
  chmod +x jenkins-log-uploader.sh
  ```

- Schedule with cron (optional):

  ```bash
  # Run daily at 2 AM
  0 2 * * * /path/to/jenkins-log-uploader.sh
  ```
This implementation is based on the YouTube video: "FinOps | Reduce cloud cost for CI/CD logging" by Abhishek.Veeramalla
The video provided a comprehensive real-world use case demonstrating how shell scripting can be applied to solve practical business problems in cloud cost optimization.
- Shell Scripting: Advanced bash scripting with loops, conditionals, and error handling
- AWS CLI: Command-line interface for cloud operations
- System Administration: Jenkins directory structure and log management
- Automation: Cron job scheduling and unattended execution
- FinOps: Cloud financial operations and cost optimization
- Problem Analysis: Identifying cost drivers and optimization opportunities
- Solution Architecture: Designing efficient, scalable solutions
- Multi-part Upload: Handle very large log files
- Compression: Reduce storage costs further with log compression
- Monitoring: Add CloudWatch metrics for upload success rates
- Retention Policies: Implement automated log cleanup
- Notification System: Add alerts for failed uploads
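Of these enhancements, compression is the easiest to prototype. A sketch assuming `gzip` is available on the Jenkins server (the file name is illustrative, and the upload line is commented out since it needs AWS credentials):

```shell
#!/bin/bash
# Sketch: gzip a build log before upload; repetitive text logs compress well,
# which directly reduces S3 storage costs.
log_file="build.log"
for i in $(seq 1 200); do echo "[INFO] build step $i completed" >> "$log_file"; done

original_size=$(wc -c < "$log_file")
gzip -9 "$log_file"                     # produces build.log.gz, removes the original
compressed_size=$(wc -c < "$log_file.gz")

echo "Original: $original_size bytes, compressed: $compressed_size bytes"
# Upload step (commented; requires AWS credentials and the script's variables):
# aws s3 cp "$log_file.gz" "$S3_BUCKET/$job_name-$build_number.log.gz" --only-show-errors
```

Pairing compression with the lifecycle tiering above compounds the savings, since Glacier pricing is also per gigabyte stored.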
Feel free to submit issues, feature requests, or pull requests to improve this solution.
This project is licensed under the MIT License - see the LICENSE file for details.
Note: This project demonstrates practical application of shell scripting for real-world business problems, making it valuable for DevOps and cloud engineering portfolios.