Why is DOCKER needed?
If you work in software engineering, application development or any computer/technology related industry, you have probably heard these buzzwords: Docker, virtual machines, compartmentalization, containerization, micro-service architecture, etc.
Not so long ago, a software product’s technology stack and architecture were relatively (much!) simpler. Today, software products are becoming highly complex in terms of the technologies they use and the hardware they require!
Any modern enterprise application, or even a user-intensive mobile/web application, will be a hot & sour soup of ingredients: web APIs, databases (SQL/NoSQL), cloud services (AWS/Azure/Google Cloud), analytics DBs, logging tools, background services, API-based hooks & endpoints, funky frameworks, task queues, a web frontend for interactive dashboards, complicated reporting, notifiers; and, to support everything, a dozen programming/scripting languages! And that is just the software: it then has to be continuously integrated with code repositories and deployed on a large set of diverse devices/servers with complex configurations and different operating systems. On top of that, it will also have to be deployed over virtual machines and cloud platforms with different environments.
Everyone in the industry is busy revamping their technology stacks; monolith architecture is doomed and deemed legacy! Like startups, even MNCs are keen (and often forced) to take maximum advantage of the cloud services offered by the big tech giants and of the new toolsets invented by small, innovative startups. The drivers are the increasing demands and complexity in both software and hardware requirements, and the need to make the entire product more scalable, robust, secure and manageable.
In a software development life cycle, a product feature goes through development, testing, staging and production phases. Generally, dev systems are different from the testing & staging environments, which in turn are different from the actual production environment. This requires an intensive level of code, environment and deployment related settings. Moreover, in a typical micro-service architecture each service or application is deployed on a different machine (an Azure VM or AWS EC2 instance). You would have to create new VM instances every time you set up new services, which is far from optimal: it wastes memory, resources and money.
In simple terms, what we need is:
1) a clean way of providing separation of concerns;
2) an efficient, cheaper & automated way to ship production-quality code to different machines/environments for an enterprise application;
3) a less tedious, easily manageable and more scalable way of running a micro-services architecture.
What is DOCKER?
Docker is the binding glue and the missing piece between Dev and DevOps!
Docker provides a way to ship the code, database, configuration, APIs and other important building blocks of a service or an application in a secure container which can communicate over the network with other similar containers. It is a more efficient, secure and hassle-free way of splitting a complex product into mutually independent, lightweight, portable, self-sufficient and horizontally scalable small units.
Docker is an open-source project designed to make it easy to package any existing application or service into small containers. To understand how Docker works, it’s important to understand these key concepts:
Docker Engine is a client-server application comprising a daemon process, a REST API and a CLI. You communicate with the daemon process through the CLI, which internally uses the REST API to control the daemon.
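Assuming Docker is already installed, you can verify that the CLI can talk to the daemon with a couple of commands (a sketch; the exact output varies by version):

```shell
# Show client and server (daemon) versions; if the daemon is not
# running, the "Server" section will report an error.
docker version

# Show system-wide information reported by the daemon:
# number of containers, images, storage driver, etc.
docker info
```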
A Docker image contains the binaries, libraries, source code and configuration that make up your application. It can be thought of as a blueprint for your application.
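An image is usually built from a Dockerfile. Here is a minimal sketch, assuming a hypothetical Python app consisting of an `app.py` and a `requirements.txt` (adjust for your own stack):

```dockerfile
# Start from an official base image.
FROM python:3.11-slim

# Copy the application code and its dependency list into the image.
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app.py .

# Command to run when a container is started from this image.
CMD ["python", "app.py"]
```

Running `docker build -t myapp .` in the directory containing this Dockerfile would produce an image tagged `myapp`.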
A Docker registry is an image repository where Docker images are stored; think of it as the GitHub of Docker images. Docker Hub is a public registry: you can pull images from it or push your own images to it. You can also create your own private registries.
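The typical pull/push workflow looks like this (a sketch; `myapp` is a hypothetical locally built image and `myuser/myapp` a hypothetical Docker Hub repository):

```shell
# Download an official image from Docker Hub.
docker pull nginx:latest

# Tag a locally built image with your Docker Hub namespace...
docker tag myapp myuser/myapp:1.0

# ...log in, and push it so other machines can pull it.
docker login
docker push myuser/myapp:1.0
```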
Containers are the fundamental building blocks of the Docker toolkit. A container is nothing but a process: a running instance of an image. One or more containers can run off the same image.
Containers are image executables running on the host VM, spawned as processes or daemons. Each container runs independently and can also communicate with other container processes on the same host. Communication with external applications can be configured through the host’s network by setting up the appropriate protocol/port mappings between the host VM and the running container process.
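For example, the port mapping described above is done with the `-p` flag of `docker run` (a sketch using the official nginx image; the name `web` is arbitrary):

```shell
# Start a container from the nginx image as a background daemon (-d),
# mapping port 8080 on the host to port 80 inside the container.
docker run -d --name web -p 8080:80 nginx

# The web server inside the container is now reachable from the host:
curl http://localhost:8080
```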
A Docker container is not a miniature VM! A VM has its own kernel space, whereas a container shares its kernel space with the host machine.
Docker is application specific while a VM isn’t! A lot of tools use containers as smaller servers, which is nothing but running a tiny VM inside the parent VM. Docker does things differently: a container is like a process, not a VM. Internally, Docker uses namespaces, union file systems and control groups to sandbox a process from other processes. So, logically, Docker provides a way to run two or more versions of a piece of software with conflicting dependencies, or even two or more entirely different, conflicting software products!
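To see this sandboxing in action, you could run two conflicting language versions side by side on the same host (a sketch using the official Python images):

```shell
# Each container gets its own isolated filesystem and dependencies,
# so both Python versions coexist on one machine without conflict.
docker run --rm python:3.8 python --version
docker run --rm python:3.12 python --version
```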
If a system or a VM has Docker installed, then one or more images can be built and executed as containers on it, irrespective of the host machine’s OS or internal configuration, as long as sufficient RAM and SSD/HDD space are available on the host machine.
This much theory is more than enough to get us started with Docker’s initial setup and basic commands. There are advanced concepts like Docker Compose and Docker Swarm, but they are for another day! I will cover them in depth in my next blog post.
How to set up DOCKER?
Docker was originally built for Linux, so it supports almost all the major flavours of Linux, such as Ubuntu, RHEL, CentOS and Fedora. I am not so sure about Amazon Linux.
Docker also supports macOS, and it has recently made huge progress on Windows platforms such as Windows Server 2016 and Windows 10 (Professional & Enterprise editions only).
Docker comes in two variants: 1) Docker CE and 2) Docker EE. CE stands for Community Edition; it is free and good enough for small to mid-size projects. EE stands for Enterprise Edition; it is a paid version.
Please refer to – https://docs.docker.com/install/linux/docker-ce/ubuntu/
The reason I am not providing download/install commands here is that Docker moves very fast in terms of the updates and fixes it pushes, so it’s always best to refer to the official Docker website and repository to get the latest stable version for your system.
Explore DOCKER commands
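As a starting point, these are some of the everyday commands you will find yourself using (a sketch; `web` is a hypothetical container name and `nginx:latest` a sample image):

```shell
docker images              # list images available locally
docker ps                  # list running containers
docker ps -a               # list all containers, including stopped ones
docker logs web            # show the logs of the container named "web"
docker exec -it web sh     # open an interactive shell inside "web"
docker stop web            # stop the container gracefully
docker rm web              # remove the stopped container
docker rmi nginx:latest    # remove an image you no longer need
```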
I hope this post helped you understand all the important concepts about Docker and got you kickstarted on building your first Docker container.
Subscribe and follow this blog for more insights.