
Why are organizations moving towards containerized deployments with Docker?


Docker containers compared with virtual machines

Docker uses the host OS instead of a guest OS. On top of the host OS runs the Docker engine, which creates and runs Docker containers. Each container has an application running inside it, and the required binaries and libraries are packaged into the same container.
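As an illustration, starting a container from a public image takes a single command; the image already bundles the application with its binaries and libraries (the `nginx` image is just an example here, not something the article depends on):

```shell
# Start a container from the nginx image in the background;
# the image bundles the web server with all its libraries.
docker run -d --name web -p 8080:80 nginx
```

The container shares the host OS kernel, so no guest OS has to boot; the Docker engine simply starts the packaged process.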

Let us understand why we need Docker.

The most common problem:

The most common problem while developing an application is that it works on the developer's machine but fails in the testing and production environments, because each environment is configured differently.

Let’s look at the next problem:

Before proceeding to the next problem, let's understand what microservices are.

Suppose we have a large application that is split into small, independent services that communicate with each other over the network. These services are called microservices.

Problems in adopting microservices:

Imagine multiple virtual machines running on top of a host machine, where each virtual machine contains the dependencies of one microservice.

Disadvantage:

Resources such as RAM, processor, and disk space are not fully utilized in this architecture. In the example there are only five microservices, but what if there are more? Consider an application containing 50 microservices: in that case it doesn't make sense to deploy each microservice on its own virtual machine.

How does Docker solve this problem?

There is a host machine; on top of the host machine there is a virtual machine, and on that virtual machine run Docker containers, each containing the dependencies of one microservice.

So what’s the difference?

Docker containers are a lightweight alternative to virtual machines: with containers we don't need to pre-allocate disk space or RAM. A container takes resources from the host as and when the application requires them.
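This difference is visible on the command line: a container starts with no fixed allocation, and resource limits are opt-in flags (the image name `myorg/my-service` is an assumption for illustration):

```shell
# By default, no disk or RAM is pre-allocated; the container
# consumes resources only as the application needs them.
docker run -d --name svc myorg/my-service

# Limits, when wanted, are set per container at run time.
docker run -d --name svc-limited --memory=256m --cpus=0.5 myorg/my-service
```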

Solution:

For the first problem:

Docker containers can be used throughout the SDLC to provide a consistent computing environment, so the same environment is present in dev, test, and prod.

Docker in detail:

Here the developer writes a Dockerfile that contains instructions for the application's requirements and dependencies. Building this Dockerfile produces a Docker image, and Docker containers are instances of that image. The images are then uploaded to Docker Hub. Docker Hub is to images roughly what GitHub is to code: a place to store them so they can later be pulled for further use, thus giving the same environment for dev, testing, and production.
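As a sketch, a minimal Dockerfile for a small Python service could look like the following (the file names and the image tag are illustrative assumptions, not from the article):

```dockerfile
# Base image supplies the OS layer and the Python runtime.
FROM python:3.12-slim
WORKDIR /app
# Install the application's dependencies.
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy the application code into the image.
COPY . .
CMD ["python", "app.py"]
```

Building and publishing the image to Docker Hub then follows the flow described above:

```shell
docker build -t myorg/my-service:1.0 .   # Dockerfile -> image
docker push myorg/my-service:1.0         # image -> Docker Hub
docker run myorg/my-service:1.0          # image -> container
```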

An improved Docker workflow

In the previous example, the images were stored on Docker Hub. These images can be huge, and the dev, testing, and production teams were all pulling them from the hub, which requires a lot of network bandwidth. To save that bandwidth, we use the following workflow.

In this workflow, the developer writes the Dockerfile and pushes the code to a Git repository. A CI server such as Jenkins pulls the code from the repository and builds the images, which are then deployed to the dev, testing, and production environments. This solves the first problem.
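The CI steps can be sketched as the shell commands a Jenkins job might run (the repository URL, image name, and `BUILD_NUMBER` variable are assumptions for illustration):

```shell
# Jenkins pulls the code, including the Dockerfile, from the Git repo...
git clone https://github.com/myorg/my-service.git
cd my-service

# ...builds the image once on the CI server...
docker build -t myorg/my-service:"$BUILD_NUMBER" .

# ...and pushes it to a registry from which dev, test,
# and prod all pull the same image.
docker push myorg/my-service:"$BUILD_NUMBER"
```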


Kaustubh

I look after Technology at Thinkitive. Interested in Machine Learning, Deep Learning, IoT, TinyML and many more areas of application of machine learning.
