A Whole Day with Kubeconfig: Why CI/CD Pipelines in Jenkins Need AWS EKS or Other Kubernetes Cloud Tools for Success

In the world of DevOps, Continuous Integration and Continuous Deployment (CI/CD) pipelines are the backbone of modern software development, ensuring rapid and reliable delivery of applications. Jenkins, a popular open-source automation server, is often at the heart of these pipelines. While it's possible to use local Kubernetes configurations for CI/CD, there are significant challenges that can prevent these pipelines from succeeding. This article explores why cloud-based Kubernetes tools like AWS EKS are essential for robust and efficient CI/CD pipelines and how to set up Jenkins to work with Kubernetes.

The Importance of Kubernetes in CI/CD

Kubernetes is an open-source platform designed to automate deploying, scaling, and operating application containers. It is known for its ability to manage complex, distributed applications across clusters of hosts. In a CI/CD pipeline, Kubernetes provides:

  • Container Orchestration: Efficiently managing the deployment and scaling of containers.

  • Self-Healing: Automatically restarting failed containers and rescheduling them on healthy nodes.

  • Service Discovery and Load Balancing: Distributing network traffic to ensure stable deployments.

However, managing Kubernetes clusters locally can introduce several challenges that cloud-based solutions address more effectively.

Challenges of Local Kubernetes Configurations

1. Scalability Issues

Local Kubernetes clusters, such as those created with Minikube or Kind (Kubernetes in Docker), are limited by the resources of the host machine. As your application grows and the demand on the CI/CD pipeline increases, these local resources can quickly become insufficient. This limitation can lead to:

  • Failed Builds: Insufficient resources to compile and build large applications.

  • Deployment Failures: Inability to deploy multiple replicas or scale applications effectively.

Cloud-based Kubernetes solutions such as AWS EKS are not tied to a single host: worker capacity can be added or removed on demand, allowing the cluster to scale dynamically with workload requirements.
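
To make the constraint concrete, consider a hypothetical Deployment that requests realistic resources for each replica. Three replicas asking for 2 CPUs and 2 GiB of memory each will not all schedule on a laptop-sized Minikube or Kind cluster, while the same manifest fits comfortably on an EKS managed node group (the workload name is illustrative; the image is the one used later in this article):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: build-heavy-app              # hypothetical workload name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: build-heavy-app
  template:
    metadata:
      labels:
        app: build-heavy-app
    spec:
      containers:
        - name: app
          image: rudrakshladdha/my-node-app
          resources:
            requests:
              cpu: "2"               # 3 replicas x 2 CPUs exceeds most local VMs
              memory: 2Gi

On a local cluster the extra replicas simply sit in Pending; on EKS, growing the node group (or enabling the Cluster Autoscaler) lets them schedule.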

2. High Availability

In a local setup, the Kubernetes control plane (which manages the cluster) and worker nodes are all running on the same physical or virtual machine. This setup lacks redundancy, making it vulnerable to:

  • Downtime: If the host machine fails, the entire cluster goes down.

  • Maintenance Interruptions: Upgrading or maintaining the local machine can disrupt the CI/CD pipeline.

AWS EKS and similar services offer a managed control plane with built-in high availability. This ensures that the control plane is always operational, reducing downtime and maintenance interruptions.

3. Resource Management

Managing resources and workloads manually in a local Kubernetes environment can be tedious and error-prone. Common issues include:

  • Resource Allocation: Difficulty in managing CPU, memory, and storage resources efficiently.

  • Node Management: Manual intervention required to add or remove nodes based on workload.

Cloud-based solutions automate these tasks, providing features like auto-scaling and load balancing, which optimize resource usage and enhance the efficiency of CI/CD pipelines.
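
As a sketch of what that automation looks like, a HorizontalPodAutoscaler tells Kubernetes to add or remove replicas based on observed CPU usage. The Deployment name below is assumed to match the application deployed later in this article, and the autoscaler needs a metrics source such as the Kubernetes Metrics Server:

apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-node-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-node-app                # assumed Deployment name
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70     # scale out when average CPU exceeds 70%

Paired with an autoscaling node group on EKS, this removes most of the manual node management described above.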

4. Integrated Services

Local Kubernetes setups require extensive configuration to integrate with other services, such as:

  • Authentication: Managing user access and credentials.

  • Logging and Monitoring: Setting up and maintaining logging and monitoring tools.

Cloud providers like AWS offer seamless integration with their ecosystem of services. For example, AWS EKS integrates with IAM for authentication, CloudWatch for logging and monitoring, and other AWS services, simplifying the management and enhancing the functionality of CI/CD pipelines.
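
As a concrete example of the IAM integration, EKS maps IAM identities to Kubernetes users and groups through the aws-auth ConfigMap in the kube-system namespace. The sketch below grants a hypothetical Jenkins deployment role access to the cluster; the account ID and role name are placeholders:

apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::111122223333:role/JenkinsDeployRole   # placeholder role ARN
      username: jenkins
      groups:
        - system:masters             # full access for illustration only; scope this down in practice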

5. Security Concerns

Security is a critical aspect of any CI/CD pipeline. Local environments are more vulnerable to security risks due to:

  • Manual Patching: The need for manual updates and patches, which can be overlooked or delayed.

  • Limited Security Features: Lack of advanced security features and compliance checks.

AWS EKS and other cloud-based Kubernetes services provide automated patching, compliance with industry standards, and advanced security features, such as network policies and encrypted communication, ensuring a more secure CI/CD pipeline.
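
As an illustration of the network policies mentioned above, the manifest below restricts inbound traffic so that only pods labeled app: frontend can reach the application's pods. The labels and port are hypothetical, and the policy is only enforced when the cluster runs a network-policy-capable CNI (for example Calico, or the Amazon VPC CNI with network policy support enabled):

apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-frontend-only          # hypothetical policy name
spec:
  podSelector:
    matchLabels:
      app: my-node-app               # pods this policy protects
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: frontend          # only pods with this label may connect
      ports:
        - protocol: TCP
          port: 3000                 # assumed application port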

Configuring Kubernetes in Jenkins for Local Development

Despite the challenges, setting up Kubernetes in Jenkins for local development is a valuable exercise. Here’s a practical example:

Example Jenkins Pipeline Script

pipeline {
    agent any

    environment {
        REPO = 'https://github.com/Rudraksh2003/vite-react-typescript-template-for-devops-practice'
        DATA_BACKEND_IMAGE = 'rudrakshladdha/my-node-app'
        DOCKER_USERNAME = 'rudrakshladdha'
        DOCKER_HUB_CREDENTIALS = 'b28-xyz-6ff' // Jenkins ID for Docker Hub credentials
        GITHUB_CREDENTIALS = '4d2-xyz-49b' // Jenkins ID for GitHub credentials
        KUBECONFIG = '8d9-xyz-210' // Jenkins ID for Kubeconfig
    }

    stages {
        stage('Clone Repository') {
            steps {
                script {
                    echo "Cloning repository..."
                    git branch: 'main', credentialsId: GITHUB_CREDENTIALS, url: REPO
                    echo "Repository cloned successfully."
                }
            }
        }

        stage('Build Data Backend Docker Image') {
            steps {
                script {
                    echo "Building Data Backend Docker Image..."
                    docker.build(DATA_BACKEND_IMAGE)
                    echo "Data Backend Docker Image built successfully."
                }
            }
        }

        stage('Push Data Backend Docker Image') {
            steps {
                script {
                    echo "Pushing Data Backend Docker Image..."
                    docker.withRegistry('https://index.docker.io/v1/', DOCKER_HUB_CREDENTIALS) {
                        def image = docker.image(DATA_BACKEND_IMAGE)
                        image.push()
                    }
                    echo "Data Backend Docker Image pushed successfully."
                }
            }
        }

        stage('Deploy to Kubernetes') {
            steps {
                withCredentials([file(credentialsId: KUBECONFIG, variable: 'KUBECONFIG_FILE')]) {
                    script {
                        echo "Deploying to Kubernetes..."
                        withEnv(["KUBECONFIG=$KUBECONFIG_FILE"]) {
                            sh 'kubectl apply -f deployment.yaml --validate=false'
                            sh 'kubectl apply -f service.yaml --validate=false'
                        }
                        echo "Deployment to Kubernetes completed."
                    }
                }
            }
        }
    }

    post {
        always {
            cleanWs()
        }
    }
}
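
The pipeline above applies a deployment.yaml and a service.yaml from the repository root, which are not shown in this article. The sketch below is an assumed minimal pair for the rudrakshladdha/my-node-app image, with the container port guessed as 3000; the two manifests are shown in one block separated by --- but would live as separate files in the repository:

# deployment.yaml (illustrative)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-node-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-node-app
  template:
    metadata:
      labels:
        app: my-node-app
    spec:
      containers:
        - name: my-node-app
          image: rudrakshladdha/my-node-app:latest
          ports:
            - containerPort: 3000    # assumed application port
---
# service.yaml (illustrative)
apiVersion: v1
kind: Service
metadata:
  name: my-node-app
spec:
  selector:
    app: my-node-app
  ports:
    - port: 80                       # port exposed inside the cluster
      targetPort: 3000               # forwarded to the container port above
  type: ClusterIP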

Steps to Add Credentials for Local Kubeconfig

  1. Store Kubeconfig File:

    • Save your local kubeconfig file (typically located at ~/.kube/config) to a secure location that Jenkins can access.
  2. Add Kubeconfig to Jenkins Credentials:

    • Go to Jenkins Dashboard > Manage Jenkins > Manage Credentials.

    • Select the appropriate domain (e.g., Global).

    • Click on "Add Credentials."

    • Choose "Secret file" for the "Kind" field.

    • Upload your kubeconfig file and give it an ID (e.g., 8d9-xyz-210).

    • Save the credential.

Running the Jenkins Pipeline

  1. Access Jenkins:

    • Open your Jenkins dashboard in a web browser.
  2. Create a New Pipeline Job:

    • Click on "New Item."

    • Select "Pipeline" and give it a name.

    • Click "OK."

  3. Configure the Pipeline:

    • In the pipeline configuration page, scroll down to the "Pipeline" section.

    • Choose "Pipeline script" and paste the Jenkinsfile script provided above.

    • Click "Save."

  4. Run the Pipeline:

    • Go to the pipeline job page.

    • Click "Build Now" to trigger the pipeline.

    • Monitor the pipeline execution in the "Build History" section and view logs by clicking on the build number.

Configuring Jenkins with AWS EKS

To overcome the challenges of local Kubernetes configurations, let's configure Jenkins to use AWS EKS:

Steps to Set Up EKS Cluster

  1. Create an EKS Cluster:

    • Use the AWS Management Console, the AWS CLI, or eksctl to create an EKS cluster (an example eksctl configuration is shown after these steps).

    • Ensure you have the necessary IAM roles and permissions to create and manage EKS resources.

  2. Configure the Kubeconfig File:

    • Generate the kubeconfig file for your EKS cluster (for example, with the AWS CLI command aws eks update-kubeconfig).

    • Store the file securely and add it to Jenkins as described above.
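
If you prefer a declarative setup, eksctl accepts the cluster definition as YAML. The cluster name, region, and instance sizes below are examples only and should be adjusted to your account; the region matches the AWS_DEFAULT_REGION used in the pipeline that follows:

apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig
metadata:
  name: jenkins-demo                 # example cluster name
  region: us-west-2
managedNodeGroups:
  - name: ng-default
    instanceType: t3.medium
    desiredCapacity: 2
    minSize: 2
    maxSize: 4                       # lets the node group grow with pipeline load

Running eksctl create cluster -f cluster.yaml provisions the cluster; afterwards, the aws eks update-kubeconfig command from step 2 writes the kubeconfig entry that you upload to Jenkins as a secret file.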

Example Jenkins Pipeline Script for EKS

pipeline {
    agent any

    environment {
        REPO = 'https://github.com/Rudraksh2003/vite-react-typescript-template-for-devops-practice'
        DATA_BACKEND_IMAGE = 'rudrakshladdha/my-node-app'
        DOCKER_USERNAME = 'rudrakshladdha'
        DOCKER_HUB_CREDENTIALS = '563-xyz-356' // Jenkins ID for Docker Hub credentials
        GITHUB_CREDENTIALS = '012-xyz-456' // Jenkins ID for GitHub credentials
        KUBECONFIG = '98-xyz-7654' // Jenkins ID for Kubeconfig
        AWS_DEFAULT_REGION = 'us-west-2'
    }

    stages {
        stage('Clone Repository') {
            steps {
                script {
                    echo "Cloning repository..."
                    git branch: 'main', credentialsId: GITHUB_CREDENTIALS, url: REPO
                    echo "Repository cloned successfully."
                }
            }
        }

        stage('Build Data Backend Docker Image') {
            steps {
                script {
                    echo "Building Data Backend Docker Image..."
                    docker.build(DATA_BACKEND_IMAGE)
                    echo "Data Backend Docker Image built successfully."
                }
            }
        }

        stage('Push Data Backend Docker Image') {
            steps {
                script {
                    echo "Pushing Data Backend Docker Image..."
                    docker.withRegistry('https://index.docker.io/v1/', DOCKER_HUB_CREDENTIALS) {
                        def image = docker.image(DATA_BACKEND_IMAGE)
                        image.push()
                    }
                    echo "Data Backend Docker Image pushed successfully."
                }
            }
        }

        stage('Deploy to EKS') {
            steps {
                withCredentials([file(credentialsId: KUBECONFIG, variable: 'KUBECONFIG_FILE')]) {
                    script {
                        echo "Deploying to EKS..."
                        withEnv(["KUBECONFIG=$KUBECONFIG_FILE"]) {
                            sh 'kubectl apply -f deployment.yaml --validate=false'
                            sh 'kubectl apply -f service.yaml --validate=false'
                        }
                        echo "Deployment to EKS completed."
                    }
                }
            }
        }
    }

    post {
        always {
            cleanWs()
        }
    }
}

Running the Pipeline on EKS

  1. Follow the same steps as for local Kubernetes:

    • Access Jenkins.

    • Create a new pipeline job.

    • Configure the pipeline with the provided Jenkinsfile.

    • Run the pipeline and monitor the execution.

Conclusion

While local Kubernetes configurations can be valuable for learning and small-scale projects, they often fall short in handling the demands of robust CI/CD pipelines. Cloud-based Kubernetes tools like AWS EKS offer scalability, high availability, resource management, integrated services, and enhanced security, making them essential for successful CI/CD pipelines in Jenkins. By following the steps outlined in this article, you can set up Jenkins to work with both local and cloud-based Kubernetes environments, ensuring your CI/CD pipelines are reliable and efficient.
