Containerization and DevOps with Docker and Kubernetes

07 Feb 2025

Drupal is a great development system, but it needs extra solutions for certain tasks. Docker and Kubernetes are powerful automation tools that you can use for testing, development, deployment, and managing web applications in production.
If you want to improve your DevOps practice through automation, these two tools will take you a long way toward that goal. The world of DevOps is changing fast, which is why automation and integration are crucial for efficiency and scalability.
Today, the Golems Drupal development team will look into DevOps automation and deployment using Docker and Kubernetes.

Using Docker for Automation: Features

This solution opens up great opportunities for development and testing. With Docker, you can package your application and all its dependencies into a single container image, ensuring consistency between development and production environments.
To automate the creation of Docker images, use a Dockerfile. This makes it much easier to reproduce the same environment on different machines.
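As a rough sketch, a Dockerfile for a PHP and Apache based application such as a Drupal site might look like this (the base image, extensions, and paths are assumptions, not a prescription):

  # Base image and extensions are assumptions; adjust them to your project.
  FROM php:8.2-apache

  # Install PHP extensions a typical Drupal site relies on.
  RUN docker-php-ext-install pdo_mysql opcache

  # Copy the application code into Apache's web root.
  COPY . /var/www/html

  # Apache in the base image listens on port 80.
  EXPOSE 80

Building the image with docker build then reproduces the same environment on any machine that has Docker installed.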
Docker also supports continuous integration and continuous deployment. Run your build, test, and deployment steps inside Docker containers in your CI/CD pipelines, and you get an isolated, fully consistent environment for building, testing, and deploying your web applications.
Your CI/CD workflow can also automate building images and pushing them to a registry.
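The build-and-push step of such a pipeline typically boils down to a couple of Docker commands (the image name and tag below are placeholders):

  # Build the image from the Dockerfile in the current directory and push it
  # to the registry (log in first with docker login if required).
  docker build -t myorg/myapp:1.0 .
  docker push myorg/myapp:1.0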
Docker containers are also a great way to automate the deployment of testing or development environments. Using Docker Compose, you can define multi-container applications and their dependencies, and then bring up an entire environment with a single command.
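A minimal docker-compose.yml for a web application with a database dependency might look like this (service names, images, and credentials are illustrative):

  services:
    web:
      build: .
      ports:
        - "8080:80"
      depends_on:
        - db
    db:
      image: mariadb:10.11
      environment:
        MARIADB_ROOT_PASSWORD: example
        MARIADB_DATABASE: app

Running docker compose up -d then brings the whole environment up with a single command.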

Using Kubernetes for Automation: Features

Kubernetes is an excellent tool for container orchestration. It is a powerful, multifunctional platform that automates deploying, scaling, and managing containerized applications.
You can use manifests to define the desired state of the application. In this case, Kubernetes will ensure that the actual state fully matches the desired one.
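As a minimal sketch, a Deployment manifest describing such a desired state could look like this (the name, image, and replica count are assumptions):

  apiVersion: apps/v1
  kind: Deployment
  metadata:
    name: myapp
  spec:
    replicas: 3                      # desired state: three identical Pods
    selector:
      matchLabels:
        app: myapp
    template:
      metadata:
        labels:
          app: myapp
      spec:
        containers:
          - name: myapp
            image: myorg/myapp:1.0   # hypothetical image built earlier with Docker
            ports:
              - containerPort: 80

Applying it with kubectl apply -f deployment.yaml declares the desired state; the Kubernetes control plane then creates, replaces, or removes Pods until the actual state matches it.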
You can automate the application's scaling based on resource usage and other metrics. Kubernetes also provides excellent built-in load balancing: traffic is distributed evenly across however many instances of your application are running.
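For example, a HorizontalPodAutoscaler can scale the Deployment above based on CPU usage, while a Service load-balances traffic across whatever replicas currently exist (the thresholds and ports are illustrative):

  apiVersion: autoscaling/v2
  kind: HorizontalPodAutoscaler
  metadata:
    name: myapp
  spec:
    scaleTargetRef:
      apiVersion: apps/v1
      kind: Deployment
      name: myapp
    minReplicas: 2
    maxReplicas: 10
    metrics:
      - type: Resource
        resource:
          name: cpu
          target:
            type: Utilization
            averageUtilization: 70
  ---
  apiVersion: v1
  kind: Service
  metadata:
    name: myapp
  spec:
    selector:
      app: myapp
    ports:
      - port: 80
        targetPort: 80

Note that CPU-based autoscaling only works if the containers declare CPU resource requests.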

Docker Containers and Kubernetes Fundamentals

To integrate these solutions, you will use Docker images as the building blocks for the containers that run in Kubernetes Pods. With Kubernetes Deployments, you manage the rollout and scaling of those containers.
For Kubernetes configuration, you will need ConfigMaps and Secrets. With their help, you can manage configuration data separately from the application code, and the configuration can be deployed automatically along with the application.
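As a sketch, configuration and credentials can be declared like this (the keys and values are placeholders) and then injected into the Deployment's containers with envFrom:

  apiVersion: v1
  kind: ConfigMap
  metadata:
    name: myapp-config
  data:
    APP_ENV: production
  ---
  apiVersion: v1
  kind: Secret
  metadata:
    name: myapp-secrets
  type: Opaque
  stringData:
    DB_PASSWORD: change-me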
Using Kubernetes Jobs, you can automate one-off tasks, while CronJobs handle recurring ones. You define these tasks in Kubernetes manifests, after which Kubernetes takes care of scheduling and running them.
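A periodic task, such as running the site's cron, can be expressed as a CronJob manifest along these lines (the schedule, image, and command are assumptions; the command presumes Drush is available in the image):

  apiVersion: batch/v1
  kind: CronJob
  metadata:
    name: myapp-cron
  spec:
    schedule: "*/15 * * * *"        # every 15 minutes
    jobTemplate:
      spec:
        template:
          spec:
            restartPolicy: OnFailure
            containers:
              - name: cron
                image: myorg/myapp:1.0
                command: ["vendor/bin/drush", "cron"]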

Example of a complex project that uses Jenkins

For a clearer, more illustrative example, we decided to use Jenkins as the basis for a seamless workflow that integrates Docker and Kubernetes. Jenkins lets you automate the build, testing, and deployment of Docker containers in a Kubernetes environment.
In this case, the infrastructure can be divided into three key nodes.

  1. Image building node. This node automatically builds a Docker image on every change and pushes it to the registry. GitHub hosts the source code, and we configured a webhook so that each push triggers the node, ensuring seamless integration with the version control system;
  2. Testing node. After the Docker image is built, it is automatically pulled onto this node for testing. Automated tests run here to verify the functionality and integrity of the whole containerized application. If the tests pass, the image is considered ready for deployment;
  3. Deployment node. This is the final stage, which deploys the Docker image to the Kubernetes cluster. The peculiarity of this setup is that Jenkins does not interact with the cluster directly. Instead, deployment goes through a separate node that talks to the Kubernetes cluster running in another zone, which improves security and gives a clean separation of concerns.

On the deployment node, you only need to install kubectl and copy the admin.conf file (the cluster's kubeconfig) from the primary node. This gives the node everything it needs to interact with your cluster.
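In practice, that setup amounts to something like the following (the host name and paths are placeholders; the admin.conf location assumes a kubeadm-provisioned cluster):

  # On the deployment node: install kubectl, then copy the kubeconfig
  # from the cluster's primary (control-plane) node.
  mkdir -p ~/.kube
  scp root@k8s-primary:/etc/kubernetes/admin.conf ~/.kube/config

  # Verify that the node can talk to the cluster.
  kubectl get nodes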
Thanks to the presented solutions, you can achieve excellent results with Drupal.

How the workflow is carried out with the Jenkins pipeline

In our example, the Jenkins pipeline will perform three interrelated tasks.
Image creation
A webhook starts this task every time changes are pushed to the repository. The pipeline automatically builds a Docker image and pushes it to Docker Hub, which handles centralized storage and distribution.
Image testing
This stage is launched after a successful build. The task pulls the image from Docker Hub onto the test node, where automated tests check the application's functionality and other aspects inside a container. If the tests pass, the next task, deployment, can start.
Deployment task
This is the final stage, where your Docker image is deployed to the Kubernetes cluster. The task pulls the Kubernetes manifest from the repository, which defines the final configuration of your deployment. Based on that configuration, the pipeline creates the Kubernetes Deployment and exposes it to external traffic, so the solution becomes available to your end users.
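Put together, a declarative Jenkinsfile covering the three tasks might look roughly like this. It is only a sketch: the credentials ID, image name, test command, and manifest path are assumptions, not the exact pipeline from this example, and in the setup described above each stage would run on its own node via agent labels.

  pipeline {
      agent any
      environment {
          // Hypothetical image name; the build number keeps tags unique.
          IMAGE = "myorg/myapp:${env.BUILD_NUMBER}"
      }
      stages {
          stage('Build image') {
              steps {
                  sh 'docker build -t "$IMAGE" .'
                  // 'dockerhub-creds' is an assumed Jenkins credentials ID.
                  withCredentials([usernamePassword(credentialsId: 'dockerhub-creds',
                          usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
                      sh 'echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin'
                      sh 'docker push "$IMAGE"'
                  }
              }
          }
          stage('Test image') {
              steps {
                  // Run the automated test suite inside the freshly built image
                  // (the test command is an assumption about the project).
                  sh 'docker run --rm "$IMAGE" vendor/bin/phpunit'
              }
          }
          stage('Deploy') {
              steps {
                  // The manifest lives in the repository; kubectl uses the
                  // admin.conf copied to the deployment node earlier.
                  sh 'kubectl apply -f k8s/deployment.yaml'
              }
          }
      }
  }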

With Drupal and modern, effective automation, optimization, and deployment tools, you can achieve outstanding results and guarantee excellent application quality.

In conclusion

By integrating Jenkins, Docker, and Kubernetes, our example project built a reliable, high-quality DevOps pipeline around a Drupal application. It automates the complete web development life cycle and helps ensure consistently high software quality.
Docker and Kubernetes will become indispensable assistants and open up new opportunities. The main thing is to learn how to work with them correctly and get the most out of such solutions.
Optimization and automation let the workflow integrate changes almost instantly and deliver excellent software consistently. With automated integration, development processes reach new levels of flexibility, quality, and efficiency.
If you use Drupal as your foundation, you will see how much easier and faster the main stages of development become. Drupal is truly one of the most powerful and multifunctional systems for creating websites and web applications. However, you need to learn how to use its full potential and implement additional solutions, special tools, functional modules, and so on. You can do a considerable part of the work with the free core of Drupal alone, but some additional and paid modules will significantly expand your capabilities and help you build higher-quality web projects. The choice is always yours.
