Using Jenkins with AWS EC2 and S3: A Practical Guide

Updated: February 4, 2024 By: Guest Contributor

Introduction

The integration of Jenkins, Amazon Web Services’ EC2, and S3 offers a powerful solution for continuous integration and delivery. In this guide, we will explore how to use Jenkins with AWS EC2 and S3, providing the steps you need to set up a fully functional automation pipeline. Whether you’re new to automation or looking to enhance your existing setup, this guide aims to equip you with practical knowledge and examples.

Setting Up AWS EC2 Instance for Jenkins

First, you’ll need to set up an EC2 instance where you can install Jenkins. Follow these steps to get started:

  1. Create an EC2 Instance: Log in to the AWS Management Console, navigate to the EC2 Dashboard, and click ‘Launch Instance’. Select an Amazon Machine Image (AMI) with Jenkins pre-installed, or choose a basic Amazon Linux AMI and install Jenkins manually.
  2. Configure Security Groups: Ensure your security group allows inbound traffic on port 8080 (the default port for the Jenkins web UI) and on port 22 so you can administer the instance over SSH.
  3. Access Jenkins: Once your instance is running, open its Public DNS name or IP address followed by :8080 in your browser. You’ll be prompted to complete the initial Jenkins setup, including entering the initial admin password (for package installs this is typically found at /var/lib/jenkins/secrets/initialAdminPassword).

Integrating Jenkins with AWS S3

Next, we’ll integrate Jenkins with AWS S3 to store build artifacts. This requires the S3 publisher plugin (often listed simply as the S3 plugin). Here’s how to set it up:

  1. Install the S3 Plugin in Jenkins: Navigate to Jenkins Dashboard > Manage Jenkins > Manage Plugins, search for ‘S3 publisher’, and install the plugin.
  2. Configure S3 in Jenkins: Go to Manage Jenkins > Configure System and find the Amazon S3 profiles section. Add a profile with your AWS credentials; the destination bucket itself is specified per job.
  3. Upload Artifacts to S3: In your job configuration, add the post-build action ‘Publish artifacts to S3 Bucket’ and specify the bucket and the artifacts you want to upload.

For Pipeline jobs, the s3Upload step provided by the Pipeline: AWS Steps plugin (a separate plugin from the S3 publisher) lets you automate the upload in a Jenkinsfile:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Replace with your actual build steps
                echo 'Building the project...'
            }
        }
        stage('Upload to S3') {
            steps {
                // s3Upload comes from the Pipeline: AWS Steps plugin; adjust
                // the bucket name and file path to match your project
                s3Upload(bucket: 'your-bucket-name', file: 'path/to/artifact.zip')
            }
        }
    }
}
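
If your Jenkins controller does not run under an IAM role with access to the bucket, the same plugin’s withAWS step can supply a region and stored credentials explicitly. A minimal sketch, assuming AWS credentials have been saved in Jenkins under the hypothetical ID 'aws-jenkins-creds':

stage('Upload to S3') {
    steps {
        // 'aws-jenkins-creds' is a placeholder ID for AWS credentials stored in Jenkins
        withAWS(region: 'us-east-1', credentials: 'aws-jenkins-creds') {
            s3Upload(bucket: 'your-bucket-name', file: 'path/to/artifact.zip')
        }
    }
}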

Advanced Configuration: Using EC2 Instances as Jenkins Agents

To scale your Jenkins setup, you can use EC2 instances as build agents (historically called slaves). Jenkins launches these instances on demand as part of the build process and terminates them when they are idle, and the Amazon EC2 plugin handles this lifecycle for you.

  1. Install the EC2 Plugin: As with the S3 plugin, install the Amazon EC2 plugin from the Jenkins plugin manager.
  2. Configure an EC2 Cloud in Jenkins: Go to Manage Jenkins and open the Clouds configuration (found under Configure System in older Jenkins versions). Add a new cloud of type Amazon EC2 and configure it with your AWS credentials, region, AMI, and instance details.
  3. Dynamically Launch Agents: In your job configuration, set ‘Restrict where this project can be run’ to the label you assigned to the EC2 instance template, and Jenkins will launch EC2 instances on demand to run those builds (see the pipeline sketch after the configuration example below).

Example EC2 cloud configuration (the values below are placeholders to adapt to your environment):

Cloud:
  Name: 'AWS-EC2'
  Credentials: 'your-aws-credentials-id'
  Instance Type: 't2.micro'
  AMI ID: 'ami-xxxxxxxx'
  Region: 'us-east-1'
  Number of Executors: 1
  Remote FS Root: '/jenkins'
  Usage: 'Only build jobs with label expressions matching this node'
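
Once the cloud is configured, a Pipeline job can target these agents simply by requesting the label assigned to the instance template; Jenkins then provisions an EC2 instance for the build and terminates it after the configured idle timeout. A minimal sketch, assuming the template was given the hypothetical label 'ec2-agent':

pipeline {
    // 'ec2-agent' is a placeholder for the label assigned to your EC2 instance template
    agent { label 'ec2-agent' }
    stages {
        stage('Test') {
            steps {
                // This stage runs on a dynamically provisioned EC2 instance
                echo "Running on ${env.NODE_NAME}"
            }
        }
    }
}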

Conclusion

Using Jenkins with AWS EC2 and S3 can significantly streamline your CI/CD pipeline, offering scalability and a high degree of automation. This guide has outlined the steps to integrate these services, from basic setup and configuration to more advanced techniques. With Jenkins at the helm of your deployment process, combined with the power and flexibility of AWS services, you’re well-equipped to handle the demands of modern software development.