What Is Docker?

Mikayel Dadayan
8 min read · Feb 11, 2021



When an application gets bigger and bigger, more code is added and the application becomes monolithic. It becomes one big monster, and we need to make sure all of its parts are working, which makes things harder to manage and monitor. At the same time, we need to make sure our project works not only on our laptop, where we have a certain version of the OS, a certain version of Node, and many other things, but on other machines too.

Most often, when people try to run their project on a different machine, they encounter errors, because the fact that it runs on one computer doesn't necessarily mean it will run on another. This is what we call the environment. We need a way to run our programs and apps in all possible environments, and this is where Docker comes in. Docker can package an application and its dependencies in a virtual container that can run on any server.

What exactly is Docker?

Well, Docker, in the end, is a container technology: it's a tool for creating and managing containers.

What exactly does that mean? What is a container in software development, and why might we want to use one?

Well, a container in software development is a standardized unit of software, which basically means it's a package of code together with the dependencies and tools required to run that code.

So for example, if you're building a Node.js application (Node.js is a JavaScript runtime that can be used to execute JavaScript code on a server) with a container built with Docker, you could have your application source code in that container as well as the Node.js runtime and any other tools that might be needed to run that code.

[Image: docker container]

The advantage is that the same container, with the same Node.js code and the same Node.js runtime at always the same version, will always (and that's the key thing) give you the exact same behavior and result. There are no surprises, because it's all baked into the container; it's always the same.
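For instance, here is a minimal sketch of what the recipe for such a container, a Dockerfile, could look like for a Node.js app (the file names, version, and port below are assumptions for illustration, not taken from a real project):

```dockerfile
# Minimal sketch of a Dockerfile for a Node.js app.
# server.js, package.json, and port 3000 are assumed names.

# Start from an official Node.js base image with a pinned version,
# so every container gets exactly the same runtime.
FROM node:14.8

# Run all following commands inside this directory in the image.
WORKDIR /app

# Copy the dependency manifest first and install dependencies;
# Docker can then cache this layer when only source code changes.
COPY package*.json ./
RUN npm install

# Copy the application source code into the image.
COPY . .

# Document the port the app listens on.
EXPOSE 3000

# The command that runs when a container is started from this image.
CMD ["node", "server.js"]
```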

Why would we want independent, standardized application packages in software development?

Well, here is one very good and typical example, and one of the main use cases of Docker: we often have different development and production environments. Let's say you created a Node.js application and wrote some code that requires Node.js version 14.8 to run successfully (Node.js code that uses a feature called top-level await), and this is a feature that does not work in older versions of Node.js. We need Node.js 14.8 or higher to execute this code successfully.

The problem is that we might have that version installed in our local (development) environment, but if we then take this application and deploy it onto some remote machine (a server) where it should be hosted so that the entire world can reach it, that remote machine might have an older version of Node.js installed, maybe 14.1 or 12. All of a sudden, the code that worked locally on our machine doesn't work there anymore, and depending on what goes wrong, it can take quite some time to figure out what the problem was.

So having the exact same environment in development as in production can be worth a lot, and that is something Docker and containers can help you with. You can lock a specific Node version into your Docker container and thereby ensure that your code is always executed with that exact version. All of a sudden, that potential problem is gone and can't occur anymore, because your application runs in a container that brings its own Node.js version.
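As a sketch, locking the version in and running the app could look like this (my-node-app is a placeholder image name, and the Dockerfile is assumed to pin node:14.8 as above):

```sh
# Build an image from the Dockerfile in the current directory.
docker build -t my-node-app .

# Run a container from that image; it brings its own Node.js 14.8,
# regardless of which Node version (if any) the host has installed.
docker run -d -p 3000:3000 --name web my-node-app

# Verify the Node version inside the container, not on the host.
docker exec web node --version
```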

Another example: different development environments within a team or company.

Let's say we're on a big team. You're working there, another developer is working there, and the two of you are working on the same project, for example that same Node.js application. Because you haven't worked with Node.js for some time, or because you didn't need to update, you still have an older version of Node.js installed on your system, while your colleague has the latest version. He wrote this code with top-level await, and when he shares the code with you, it obviously doesn't work for you. A container solves this too, as sketched below.
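A sketch of that fix: instead of relying on the locally installed Node.js, both of you run the file through the same pinned Node.js image (app.mjs is a hypothetical file name for the code using top-level await):

```sh
# Mount the project into a container and run it with Node.js 14.8,
# so the result is identical on every teammate's machine.
docker run --rm -v "$PWD":/app -w /app node:14.8 node app.mjs
```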

Docker vs. Virtual Machines

If you've been in the software development area for some time, you might also think to yourself: why Docker and why containers? Isn't that reproducible-environment problem something we can solve with virtual machines?

Virtual machines are machines with virtual operating systems, encapsulated in their own shell, independent from our host operating system. Isn't that a solution? Well, kind of. With virtual machines, we have our host operating system (Windows, macOS, or Linux), and on top of that we install the virtual machine: a computer inside of our computer, so to say. This virtual machine has its own operating system, the virtual operating system, which runs inside of that virtual machine, let's say Windows. Since the virtual machine is like a computer, just virtually emulated, we can install extra tools inside of it. We can install whatever we want in there, because it is just another machine, even though it only exists virtually. So we can install all the libraries, dependencies, and tools we need, and then also move our source code there. And since it's an encapsulated virtual machine with everything our program needs and all the tools installed, we kind of get the same result as with Docker and containers.

[Image: virtual machines vs. containers]

We have an encapsulated environment where everything is locked in, we can have multiple such environments for different projects, and we can share our virtual machine configuration with a colleague to ensure that we're working in the same environment.

This works, but there are a couple of problems. One of the biggest is the virtual operating system and, in general, the overhead we have with multiple virtual machines. Every virtual machine is really like a stand-alone computer, a stand-alone machine running on top of our machine, and therefore, if we have multiple such machines, we waste a lot of space and resources, because every time a brand-new computer has to be set up inside our machine. That, of course, eats up memory, CPU, and space on our hard drive. And it can really become a problem as you add more virtual machines to your system, because a lot of things that are always the same are still duplicated, especially the operating system.

You might be using Windows on all your virtual machines, and still it's installed separately in every machine, which of course wastes a lot of space. In addition, you might have a lot of other tools installed in every virtual machine that your application doesn't need directly but that are set up by default, and that can be a problem. To sum it up: virtual machines give us isolation, but at the cost of duplicated operating systems, wasted resources, and slow setup. Containers give us the isolation without each one shipping a full operating system of its own.

What are Docker Desktop, Docker Hub, and Docker Compose, and what do we do with these tools?

In the end, we installed the Docker Engine. We installed it no matter whether we installed Docker Desktop, installed Docker directly on Linux, or used the Docker Toolbox. On Windows and macOS, this Docker Engine is simply set up inside a virtual machine that hosts Linux, which is required to run Docker. That virtual machine is really only there because your operating system doesn't natively support Docker; if it did, we wouldn't need it. The whole idea was to avoid virtual machines, but even so, this one is just there to run Docker, and your containers then run inside that virtual machine. So we are still working with containers. In any case, we installed the Docker Engine.

In the end, Docker Desktop is really just a tool that makes sure the Docker Engine is installed and working. It includes a so-called daemon, a process that keeps on running and ensures that Docker works (the heart of Docker, so to say), and it contains a command-line interface, which you also got with the Docker Toolbox. The command-line interface is the tool we will use to run commands, to create images and containers.
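For a feel of that workflow, here are a few of the standard commands you will run through the CLI (myimage is just an example name):

```sh
docker --version           # confirm the CLI is installed and working
docker build -t myimage .  # create an image from a Dockerfile
docker run myimage         # create and start a container from that image
docker ps                  # list running containers
docker images              # list images stored locally
```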

Docker Hub is a service that allows us to host our images in the cloud, on the web, so that we can easily share them across our systems or with other people.
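Sharing an image through Docker Hub typically looks like this (yourname/my-node-app is a placeholder repository name):

```sh
docker login                                 # authenticate with Docker Hub
docker tag my-node-app yourname/my-node-app  # name the image for the registry
docker push yourname/my-node-app             # upload it
docker pull yourname/my-node-app             # download it on another machine
```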

Docker Compose is a tool that builds upon Docker and makes managing more complex, multi-container projects easier.
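As a rough sketch, a docker-compose.yml for a two-container project could look like this (the service names, images, and ports here are assumptions):

```yaml
version: "3.8"
services:
  web:
    build: .          # build the Node.js app from the local Dockerfile
    ports:
      - "3000:3000"
  db:
    image: mongo:4.4  # a second container, for example a database
```

With that file in place, a single docker-compose up starts both containers together.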

Images and Containers

What is the difference and why do we need both?

Containers, in the end, are small packages that contain both your application (a website, a Node server) and the entire environment needed to run that application.

[Image: docker image]

When we are working with Docker, we also need images, because images are the templates, or blueprints, for containers. It's actually the image that contains the code and the required tools to execute it, and it's the container that then runs and executes the code. We can create an image with all the setup instructions and all our code once, and then use this image to create multiple containers based on it. So for example, if we talk about a Node.js web server application, we can define it once but run it multiple times on different machines and different servers, as the sketch below shows. The image is a shareable package with all the setup instructions and all the code, and the container is the concrete running instance of such an image.
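To make that concrete, here is a sketch of defining an image once and running several containers from it (node-server and the container names are example names):

```sh
# Build the image once from the Dockerfile in the current directory.
docker build -t node-server .

# Start three independent containers from that single image.
docker run -d --name server1 node-server
docker run -d --name server2 node-server
docker run -d --name server3 node-server

docker ps   # shows three running containers, all based on one image
```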

The core, fundamental concept of Docker is that images are the blueprints (templates) that contain the code and the application, and containers are then the running instances of that application.
