In recent years, the technology industry has witnessed a remarkable surge in the adoption of containers, with an increasing number of applications and services being ‘containerized.’ Containers, and Docker in particular, have been lauded for the isolation and security they bring to applications. However, while containers offer substantial benefits, they are not a silver bullet solution to all security challenges.

Security considerations

It is crucial to understand that the security of a container hinges heavily on its configuration. If misconfigured, a container could potentially provide an avenue for attackers to escape its confines and wreak havoc on the host system or other containers. Articles on Docker escapes provide valuable insights into such vulnerabilities and underscore the importance of proper container configuration and management.
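
For illustration only, and not as an exhaustive list: running a container with the --privileged flag, or bind-mounting the Docker socket into it, effectively hands the container control over the host's Docker daemon and is a classic ingredient of such escapes. The image name below is just a placeholder:

docker run -d --privileged -v /var/run/docker.sock:/var/run/docker.sock some-image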

Benefits of containerization

However, if configured properly, Docker containers are a great option for running various services and daemons in isolation. Docker offers obvious benefits, including:

  • Isolation: Each Docker container runs in its own isolated environment, so it does not interfere with other containers or with the host system. This makes it easier to run different applications with different dependencies on the same machine (a quick illustration follows this list).

  • Portability: Docker containers can run on any machine that has Docker installed. This makes it easy to move applications between different environments (e.g., from a developer’s laptop to a testing environment to a production server).

  • Version Control and Component Reuse: Docker can track versions of images, roll back to previous versions, and reuse images for other projects.

  • Scalability: Docker containers can be easily scaled up or down based on application requirements, and they work seamlessly with orchestration tools like Docker Swarm and Kubernetes.

  • Speed: Docker containers are lightweight and start quickly, since they don’t require a full operating system to run.

  • Consistency: Docker ensures consistency across multiple development and release cycles by standardizing the environment. This makes debugging and troubleshooting easier since the environment is the same everywhere.

  • Developer Productivity: Docker simplifies setup and tear down of development environments, making developers more productive.

  • Continuous Deployment and Testing: Docker provides a consistent environment throughout the development lifecycle, which makes it perfect for a Continuous Integration / Continuous Deployment (CI/CD) setup.

  • Microservices Architecture: Docker fits very well into a microservices architecture pattern, as each microservice can be run in its own container and scaled independently.

  • Security: Though it’s not a security tool, Docker does offer some security benefits. For example, Docker containers provide some degree of isolation from each other and from the host system, which can limit the impact of a security vulnerability. However, Docker security is complex and needs to be managed carefully.
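
Jumping slightly ahead to commands covered later in this post, here is a minimal sketch of the isolation point above: two containers with different Python versions running side by side on the same host. The container names and image tags are arbitrary examples:

docker run -d --name app-python310 python:3.10-slim sleep infinity
docker run -d --name app-python312 python:3.12-slim sleep infinity
docker exec app-python310 python --version
docker exec app-python312 python --version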

Installing Docker

To play around with Docker, I spun up a virtual machine using a DigitalOcean droplet. I am using Ubuntu 22.04, as it appears to be the most popular distribution for running Docker containers; however, Docker should run just as well on any other distribution. Ubuntu is a popular choice due to its ease of use, wide community support and extensive documentation. My Ubuntu 22.04 VM will be dedicated solely to testing and playing around with Docker.

The official Docker installation guide also provides detailed instructions for installing Docker on Ubuntu. However, I took an easier path and installed the docker packages provided by Ubuntu:

sudo apt-get install docker.io docker-compose
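
On Ubuntu the docker.io package normally enables and starts the Docker service for you; to double-check (assuming a systemd-based system, which Ubuntu 22.04 is), you can run:

sudo systemctl enable --now docker
sudo systemctl status docker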

After this you should be able to run docker commands using sudo:

sudo docker version

Post-install considerations

To run Docker commands without sudo, it is a good idea to add your user to the docker group (keep in mind that membership in this group is effectively root-equivalent on the host):

sudo adduser $USER docker
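
The new group membership only applies to new login sessions, so either log out and back in or start a shell with the updated group, and then verify that Docker works without sudo:

newgrp docker
docker version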

Basic commands

To pull an image from a Docker registry (Docker Hub by default), we can use docker pull. Let us pull the CentOS image:

docker pull centos
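
By default this pulls the image tagged latest; a specific tag can be requested explicitly, for example:

docker pull centos:7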

To list locally available images you can run:

docker images

To start a CentOS container in detached mode (-d) with a pseudo-TTY allocated (-t), and give it the name centos:

docker run -d -t --name centos centos

NB! One thing to note: if you have not pulled the image previously, Docker will pull it from the registry automatically at this stage.

To list running docker containers:

docker ps

We should see the CentOS container running:

CONTAINER ID   IMAGE                                  COMMAND       CREATED             STATUS              PORTS                                           NAMES
dcd7a3fcc6f3   centos                                 "/bin/bash"   2 weeks ago         Up About a minute                                                   centos

To enter the shell of our CentOS container simply use:

docker exec -it centos bash

Now you can run commands inside your centos container shell:

[root@dcd7a3fcc6f3 /]# uname -a
Linux dcd7a3fcc6f3 5.15.0-78-generic #85-Ubuntu SMP Fri Jul 7 15:25:09 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
[root@dcd7a3fcc6f3 /]# id
uid=0(root) gid=0(root) groups=0(root)
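
When you are done, simply exit the shell; since it was started with docker exec, leaving it does not stop the container:

exit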

To stop the container use:

docker stop centos
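
A stopped container still exists (it shows up in docker ps -a) until it is removed. To clean up the container and, if it is no longer needed, the image as well:

docker ps -a
docker rm centos
docker rmi centos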

With this post I have barely scratched the surface of Docker containers. The subject is so broad that there are even Docker certifications.

In the upcoming posts, I’ll delve deeper into Docker-related topics, with a focus on practical applications you can use for personal needs. This might include setting up services such as a WireGuard or OpenVPN server for a VPN connection. Or maybe something else? To tell you the truth, at this point I am not the biggest fan of running everything in containers and still prefer VMs for my servers, though I might run some lightweight services in containers.

The possibilities with Docker are nearly limitless: you can run almost anything in a container. You can even run Ubuntu inside the Ubuntu VM. But is there really a point in doing so?
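
For instance, a throwaway Ubuntu container on the Ubuntu host is one command away (the --rm flag removes the container as soon as you exit its shell):

docker run -it --rm ubuntu:22.04 bash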

Stay tuned to explore these and more exciting Docker possibilities.