I once worked at a job where we built a complex web app using Docker containers and Kubernetes. The containers crashed frequently and almost never left behind logs. On the rare occasions when you could still access a container, none of the usual terminal commands would work!* The premise of Docker sounded great, but in practice it seemed like more effort than it was worth.
*Pro tip: Many Docker container images use Alpine Linux, a minimal distro with very few preinstalled utilities. You can install the packages you need after logging into the container with the apk command!
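For example, something like this is usually enough to get basic tools back (the container name here is just a placeholder):

```sh
# Open a shell in a running container (sh, since Alpine images usually don't ship bash)
docker exec -it my-container sh

# Inside the container, pull down whatever utilities you're missing
apk update
apk add curl vim procps
```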
Recently I bought a tiny computer to host WordPress, and I complained to my friend about how many separate software packages and configurations were needed just to get it up and running. They mentioned that it would be easier if I dockerized it. I was reluctant, being reminded of my previous experience, but I eventually decided to tear everything down and start over. I’m glad I did, because sitting down and finally learning how to use Docker properly turned an annoying chore into an amazing new tool.
Now I’m obsessed with shoving everything I make into containers for better or for worse…
What is Docker and how does it work?
Docker relies on the paradigm of “containers”, as in shipping containers.
A container in Docker parlance roughly corresponds to a filesystem. When you make a Docker container, you can add and remove files and install whatever programs you want into the container’s filesystem. Then the Docker software running on your computer uses the container to execute the commands you specify, using the data and programs installed in it.
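If you want to see this for yourself, a quick way (using the public alpine image as an example) is to drop into a throwaway container and poke around:

```sh
# Start a temporary Alpine container with an interactive shell
docker run -it --rm alpine sh

# Inside the container:
ls /             # this is the container's own root filesystem, not your host's
apk add curl     # install a program into the container's filesystem
curl --version
exit             # --rm removes the container once you leave
```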
Think of an 18-wheeler truck: the driver can attach and detach different shipping containers to the back of the cab. This modular design lets you use the same truck engine to haul different types of payloads.
Ok… But what’s great about Docker?
Where Docker shines is that you can have multiple containers running in parallel, nicely organized into isolated, sandboxed environments. This is great for making sure everything behaves in a standardized, reproducible way.
You can even bring up multiple copies of the same container, or, even cooler, reproduce the setup you have in one container on a completely different computer in just a few minutes. If your computer dies and you want to recreate your WordPress setup on another machine, just install Docker on it, copy the container image over, and start it back up!
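As a rough sketch (the image name is made up), moving a setup could look something like this:

```sh
# On the original machine: export the image (and keep your compose.yaml in git)
docker save -o my-wordpress.tar my-wordpress-image

# On the new machine, after installing Docker:
docker load -i my-wordpress.tar
docker compose up -d    # bring the whole setup back up from compose.yaml
```

If the image lives on a registry like Docker Hub, you can skip the save/load step and just pull it on the new machine.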
Why are you so excited about this, you freak?
There are two unintended benefits of containerizing your projects:
- The Dockerfile and/or compose.yaml you write to create a container is somewhat self-documenting, which is great for storing in a git repo and makes it easier to remember what you did to get your program up and running.
- Container environments are very easy to bring up and tear down, which encourages experimentation and makes it easy to test new changes without disturbing running instances.
I think these points, especially the second one, are huge for me. The ease of experimentation makes development faster and more pleasant. If something is temporary or too much of a pain, I’m definitely not going to muster the motivation to get it done.
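In practice, that whole experiment loop is just a couple of commands from the folder with your compose.yaml:

```sh
docker compose up -d     # bring the stack up in the background
docker compose logs -f   # watch what it's doing
docker compose down      # tear it all down (add -v to also remove named volumes)
```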
For my home server, I want to host lots of small, unrelated web projects that are little experiments, and Docker gives me an obvious path forward for managing that mess. Docker will also help with the times when I set something up and then completely forget what I did by the time I come back to it a year or so later.
Docker and WordPress
WordPress requires you to install a web server (Apache is the most popular choice). I decided to go with nginx because I’d never used it before and wanted to learn. I also needed to install PHP, a MySQL database, and PHP-FPM so that nginx and PHP can communicate. After that I had to create a user account in MySQL and make a database called “wordpress”.
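For comparison, here’s a rough sketch of what that same stack looks like as a compose file. This isn’t my exact config: the passwords are placeholders, and the nginx.conf it mounts (which forwards PHP requests to the wordpress container on port 9000) isn’t shown.

```yaml
# compose.yaml (sketch)
services:
  db:
    image: mysql:8.0
    environment:
      MYSQL_DATABASE: wordpress          # the "wordpress" database
      MYSQL_USER: wp_user                # the MySQL user account
      MYSQL_PASSWORD: change-me
      MYSQL_ROOT_PASSWORD: change-me-too
    volumes:
      - db_data:/var/lib/mysql

  wordpress:
    image: wordpress:fpm                 # WordPress + PHP + PHP-FPM in one image
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_NAME: wordpress
      WORDPRESS_DB_USER: wp_user
      WORDPRESS_DB_PASSWORD: change-me
    volumes:
      - wp_files:/var/www/html
    depends_on:
      - db

  nginx:
    image: nginx:latest
    ports:
      - "80:80"
    volumes:
      - wp_files:/var/www/html                          # nginx serves the same files
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro  # forwards .php requests to wordpress:9000
    depends_on:
      - wordpress

volumes:
  db_data:
  wp_files:
```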
Imagine doing all that, and then you decide you want to try out a different version of WordPress, so you have to install new versions of everything and redo all those steps. And then you decide you liked the older version better, so you have to undo everything and redo the old configuration!
Let’s say I have a running instance of WordPress and I want to modify the setup and test things out. I want to add phpMyAdmin, Certbot, and fail2ban containers for database administration, SSL certificates, and blocking spam bots by IP, respectively. I can create a second instance of my WordPress docker compose project that has these new containers. Now I can experiment with this new setup without disturbing the original instance. This also helps minimize downtime when applying the new changes to the original.
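Compose even has a flag for this: giving the second copy its own project name keeps its containers, network, and volumes separate from the original (the file name here is just an example):

```sh
# Bring up a second, isolated copy of the stack for testing
docker compose -p wordpress-test -f compose.test.yaml up -d

# The original project keeps running untouched; clean up the test copy when done
docker compose -p wordpress-test -f compose.test.yaml down -v
```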
The Cons of Docker
Not everything is sunshine and rainbows. Docker is a layer of abstraction on top of installing programs on your computer the “regular” way, where everything works as you’ve already become accustomed to. With this extra layer of indirection, you now have to figure out what happens when things get weird.
For starters, what happens if you want two containers to talk to each other or share files? The first is simple: containers on the same Docker network (which all the services in a compose project get by default) can reach each other using the container name as a host name. For the second, you can mount volumes, or bind-mount files and folders from the host machine, into different areas of a container’s filesystem to share data.
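A minimal sketch of both ideas, with made-up names and paths:

```sh
# Containers on the same user-defined network can reach each other by name
docker network create demo-net
docker run -d --name web --network demo-net nginx
docker run --rm --network demo-net alpine ping -c 1 web

# Bind-mount a host folder into a container to share files with it
docker run --rm -d -v /home/me/site:/usr/share/nginx/html:ro nginx
```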
However, things only get weirder from here. What if you want to run a cron job? The cron job won’t run if the container isn’t running, so you need to make sure something keeps the container alive. When a tutorial mentions using systemctl or a System V init script to restart a background service, how would you do that with Docker? And so on…
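I don’t have a one-size-fits-all answer, but the patterns I’ve landed on look roughly like this (container and service names are made up):

```sh
# Keep a cron container alive by running the scheduler in the foreground
# (crond -f is the Alpine/BusyBox flavor; Debian-based images use `cron -f`;
#  you'd bake your crontab into the image for this to do anything useful)
docker run -d --name my-cron --restart unless-stopped alpine crond -f

# The rough Docker equivalent of `systemctl restart some-service`
docker restart my-cron
docker compose restart wordpress   # for a service defined in a compose project
```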
Overall, I still think Docker is worth learning, but there are many new things to trip over and discover as you get to the more esoteric features.
And this is to go even further… beyond!
After I got WordPress up and running, I decided I wanted this blog to be on its own subdomain, instead of on the main page.
I was able to find something called Nginx Proxy Manager (NPM) to use as a reverse proxy and route requests from blog.slackerparadise.com to my WordPress container. The NPM container is someone else’s creation that I was able to download and get running in less than an hour. The best kind of work is work you didn’t have to do yourself!
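If I remember right, NPM’s own docs ship a quick-start compose file along these lines (the host paths are up to you):

```yaml
services:
  npm:
    image: jc21/nginx-proxy-manager:latest
    restart: unless-stopped
    ports:
      - "80:80"    # HTTP traffic to be proxied
      - "443:443"  # HTTPS traffic to be proxied
      - "81:81"    # NPM's admin web UI
    volumes:
      - ./data:/data
      - ./letsencrypt:/etc/letsencrypt
```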
With NPM installed, that leaves my main URL and other subdomains free for future ideas. And because of Docker, I have it contained in its own compose project, separate from the WordPress containers.
I’ve got the Docker brainrot now. I’m only like three months into this rabbit hole with no signs of stopping any time soon.