Building a DevSecOps CI/CD Pipeline with Jenkins, SonarQube, and Snyk Using Terraform
Introduction
Incorporating security into Continuous Integration/Continuous Deployment (CI/CD) pipelines is a core DevSecOps practice. By leveraging tools like Jenkins, SonarQube, and Snyk, you can automate static and dependency vulnerability scans within your pipeline. This post will guide you through deploying and configuring Jenkins and SonarQube using Terraform, setting up the pipeline in Jenkins, and integrating SonarQube for Static Application Security Testing (SAST) and Snyk for Software Composition Analysis (SCA).
Prerequisites
- Terraform installed locally.
- A DigitalOcean account and API token.
- A Git repository containing the application code.
- A Snyk API token.
Step 1: Deploying Infrastructure with Terraform
To automate the setup, use Terraform to define and deploy the infrastructure required for Jenkins and SonarQube. This setup includes:
- A project server for application builds.
- A SonarQube server (running as a Docker container) for static code analysis.
- A Jenkins server to orchestrate the CI/CD pipeline.
Sample main.tf for the Terraform Configuration
The Terraform script provisions DigitalOcean droplets and configures each server to install Docker, Jenkins, and SonarQube.
Here’s the core main.tf configuration.
Put your DigitalOcean API key in terraform.tfvars:
digitalocean_token = "<your key>"
This file provisions the DigitalOcean droplets for Jenkins, Project Server and SonarQube, uploads the provisioning scripts, and executes them remotely.
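The full configuration depends on your environment, but a minimal sketch of a main.tf matching the description above might look like the following. The region, droplet size, image, and the SSH key variables are assumptions — adjust them to your account:

```hcl
variable "digitalocean_token" {}
variable "ssh_key_fingerprint" {}
variable "ssh_private_key_path" {}

terraform {
  required_providers {
    digitalocean = {
      source = "digitalocean/digitalocean"
    }
  }
}

provider "digitalocean" {
  token = var.digitalocean_token
}

resource "digitalocean_droplet" "project_server" {
  name     = "project-server"
  image    = "ubuntu-22-04-x64"
  region   = "nyc3"
  size     = "s-2vcpu-4gb"
  ssh_keys = [var.ssh_key_fingerprint]

  connection {
    type        = "ssh"
    user        = "root"
    private_key = file(var.ssh_private_key_path)
    host        = self.ipv4_address
  }

  # Upload the provisioning script...
  provisioner "file" {
    source      = "setup_docker.sh"
    destination = "/tmp/setup_docker.sh"
  }

  # ...and execute it remotely.
  provisioner "remote-exec" {
    inline = [
      "chmod +x /tmp/setup_docker.sh",
      "/tmp/setup_docker.sh",
    ]
  }
}

# jenkins_server and sonarqube_server follow the same pattern,
# uploading setup_jenkins.sh and setup_sonar.sh respectively.
```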
Step 2: Provisioning Scripts for Jenkins and SonarQube
After Terraform provisions the infrastructure, each server executes its specific provisioning script to install and configure Jenkins, SonarQube, and Docker.
Project Server Setup (setup_docker.sh)
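The script itself isn't reproduced here; a plausible minimal version for an Ubuntu droplet might be (the /opt/1on1App path is an assumption matching the deploy directory used later in the pipeline):

```shell
#!/bin/bash
# Provision the project server: install Docker and Docker Compose.
set -euo pipefail

apt-get update -y
apt-get install -y docker.io docker-compose

# Start Docker now and on every boot
systemctl enable --now docker

# Directory the Jenkins pipeline deploys into
mkdir -p /opt/1on1App
```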
Jenkins Server Setup (setup_jenkins.sh)
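Again, the exact script isn't shown; a minimal sketch that runs Jenkins LTS as a Docker container on port 8080 could look like this:

```shell
#!/bin/bash
# Provision the Jenkins server: run Jenkins LTS in Docker on port 8080.
set -euo pipefail

apt-get update -y
apt-get install -y docker.io
systemctl enable --now docker

docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  --restart unless-stopped \
  jenkins/jenkins:lts

# Print the initial admin password once Jenkins has had time to boot
sleep 60
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword || true
```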
SonarQube Setup (setup_sonar.sh)
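A minimal sketch for the SonarQube host follows. Note that SonarQube's embedded Elasticsearch requires vm.max_map_count to be raised, or the container will fail to start:

```shell
#!/bin/bash
# Provision the SonarQube server: raise kernel limits and run SonarQube on port 9000.
set -euo pipefail

apt-get update -y
apt-get install -y docker.io
systemctl enable --now docker

# Elasticsearch (embedded in SonarQube) requires a higher mmap count
sysctl -w vm.max_map_count=262144
echo "vm.max_map_count=262144" >> /etc/sysctl.conf

docker run -d --name sonarqube \
  -p 9000:9000 \
  --restart unless-stopped \
  sonarqube:lts-community
```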
After these scripts run, SonarQube will be accessible on port 9000, the project server on port 5001, and Jenkins on port 8080 of their respective IP addresses.
Step 3: Configure SonarQube in Jenkins
- Install the SonarQube Scanner Plugin:
- In Jenkins, go to Manage Jenkins > Manage Plugins > Available.
- Search for SonarQube Scanner and install it.
- Restart Jenkins if required.
- Configure SonarQube in Jenkins:
- Go to Manage Jenkins > System.
- Scroll down to the SonarQube Servers section and click Add SonarQube.
- Provide the following details:
- Name: Give it a name like “SonarQube Server”.
- Server URL: Set this to your SonarQube server’s URL (e.g., http://<SonarQube_Server_IP>:9000).
- Server Authentication Token:
- In SonarQube, go to My Account > Security > Tokens and create a new token.
- Add this token in Jenkins by clicking on Add next to Server Authentication Token and pasting the token text.
- Save the configuration.
- Set up SonarQube Scanner:
- Go to Manage Jenkins > Tools.
- Scroll to the SonarQube Scanner section.
- Click Add SonarQube Scanner and configure:
- Name: Give it a name like “SonarQube Scanner”.
- Install Automatically: Check this box if you want Jenkins to manage the installation.
- Save the configuration.
Step 4: Configure the Publish Over SSH Plugin in Jenkins
- Install the Publish Over SSH Plugin:
- In Jenkins, go to Manage Jenkins > Manage Plugins > Available.
- Search for Publish Over SSH and install it.
- Restart Jenkins if prompted.
- Set Up SSH Credentials:
- Go to Manage Jenkins > Configure System.
- Scroll to the Publish over SSH section.
- Click Add to configure a new SSH server and provide:
- Name: Name of the remote server configuration (e.g., “Project Server”).
- Hostname: IP address or hostname of the server.
- Username: SSH user on the remote server.
- Remote Directory: Directory on the remote server to upload files or run commands.
- SSH Key or Password: Provide the private SSH key or password if required.
- Test the connection to ensure it’s working.
- Save the configuration.
Step 5: Configuring Jenkins Pipeline with SonarQube and Snyk
With Jenkins and SonarQube set up, the next step is to configure a Jenkins pipeline that includes:
- SonarQube Analysis: Runs SAST to check for vulnerabilities in the source code.
- Snyk Security Scan: Checks for vulnerabilities in open-source dependencies.
Creating the Jenkins Pipeline Script
In Jenkins, create a new pipeline job, and use the following script:
pipeline {
    agent any

    environment {
        SCANNER_HOME = tool 'SonarQube Scanner'
        SNYK_API_TOKEN = credentials('snyk-api-token')
    }

    stages {
        stage('Clone Repository') {
            steps {
                git url: 'https://github.com/mattclemons/docker.git', branch: 'master'
            }
        }

        stage('Build and Deploy') {
            steps {
                sshPublisher(publishers: [
                    sshPublisherDesc(configName: 'Project Server', transfers: [
                        sshTransfer(sourceFiles: 'Dockerfile', execCommand: 'docker-compose up -d', remoteDirectory: '/opt/1on1App')
                    ])
                ])
            }
        }

        stage('SonarQube Analysis') {
            steps {
                withSonarQubeEnv('SonarQube Server') {
                    sh "${SCANNER_HOME}/bin/sonar-scanner -Dsonar.projectKey=1on1App -Dsonar.sources=."
                }
            }
        }

        stage('Snyk Security Scan') {
            steps {
                sshPublisher(publishers: [
                    sshPublisherDesc(configName: 'Project Server', transfers: [
                        sshTransfer(execCommand: "docker run --rm -e SNYK_TOKEN='${SNYK_API_TOKEN}' -v /var/run/docker.sock:/var/run/docker.sock snyk/snyk snyk test --docker 1on1app_web")
                    ])
                ])
            }
        }
    }

    post {
        always {
            archiveArtifacts artifacts: '**/ZAP_Report.html', allowEmptyArchive: true
            sh "docker system prune -f"
        }
    }
}
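If you also want the build to fail when the SonarQube quality gate fails, the SonarQube Scanner plugin provides a waitForQualityGate step. A stage like the following could be added after the SonarQube Analysis stage; this assumes a webhook is configured in SonarQube pointing back at your Jenkins instance:

```groovy
stage('Quality Gate') {
    steps {
        // Abort the build if SonarQube reports a failed quality gate.
        // Requires a SonarQube webhook targeting <Jenkins_URL>/sonarqube-webhook/
        timeout(time: 5, unit: 'MINUTES') {
            waitForQualityGate abortPipeline: true
        }
    }
}
```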
This pipeline script automates the following steps:
- Clone Repository: Pulls the application code from GitHub.
- Build and Deploy: Deploys the application Docker container on the project server.
- SonarQube Analysis: Runs SonarQube to detect vulnerabilities in the code.
- Snyk Security Scan: Runs Snyk to check for vulnerabilities in the application dependencies.
Troubleshooting
If provisioning fails, debug-build the servers one at a time:
TF_LOG=DEBUG terraform apply -target=digitalocean_droplet.project_server
TF_LOG=DEBUG terraform apply -target=digitalocean_droplet.jenkins_server
TF_LOG=DEBUG terraform apply -target=digitalocean_droplet.sonarqube_server
Results
Commit a change to the repository and, once the pipeline's polling interval (about 5 minutes) elapses, Jenkins triggers the pipeline again: the application is redeployed, the code is re-scanned by SonarQube, and the dependencies are rechecked by Snyk.
Conclusion
This setup creates an effective CI/CD pipeline with integrated security scans. By running SAST and SCA scans early in the pipeline, developers can catch vulnerabilities before the code reaches production. The next steps might involve adding Dynamic Application Security Testing (DAST) using OWASP ZAP or setting up Infrastructure as Code (IaC) scanning for more robust security coverage. This foundational setup, however, provides a solid base for implementing DevSecOps principles in your CI/CD pipeline.