Why use Docker?

Published by marco

Use Case

Let’s imagine we’re working on a PHP website together, using PostgreSQL as the database.

Without Docker

Without something like Docker, I’ll write a readme.md that tells you which PostgreSQL version to install (maybe latest, whatever), how to configure the Apache server (or Nginx, whatever), and how to make sure the document root, extensions, modules, etc. are all lined up for this project.

In order to write this readme, I had to set it up on my machine and carefully write down instructions matching what I did, leaving nothing out.

I probably won’t get it right. Or maybe one of those settings conflicts with settings required by other projects I’m working on. Maybe I get lost in a configuration/setup rabbit hole for days. Maybe you do.

Your local development environment can get pretty overloaded with dependencies and configuration, some of it for projects you’re no longer even working on. A hack you made to your environment for one project may subtly affect other projects.

Use a VM?

What’s a solution? Run all of the stuff you need for your project in a VM. One VM per project.

That’s a lot of work and maintenance, too. We either have to pass around giant VM images every time one of us makes an update, or keep updating that readme.md and applying the changes ourselves.

Scripting to the rescue!

But what if we could turn that readme.md into a script that you just run to create the environment needed for that project? Instead of maintaining the software by hand, you run the script and it installs and launches everything you need (web server, extensions, database, file-mappings, etc.) on the fly.

That’s basically what Docker does.

Instead of writing instructions for you to execute, I write instructions for Docker to execute. Instead of hoping that my instructions also work for you, I’m running the exact same script you are. If you add new functionality (e.g. that depends on a new PHP extension), your pull request includes not only your software changes, but the configuration changes for the environment to ensure that I also have that extension enabled when I test or run your code.
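Those “instructions for Docker” live in a Dockerfile checked into the project. A minimal sketch for a project like this one might look as follows; the PHP version, extension, and paths are illustrative, not taken from any actual project:

```dockerfile
# Illustrative Dockerfile for a PHP + PostgreSQL project.
FROM php:8.2-apache

# The pdo_pgsql extension needs the PostgreSQL client library.
RUN apt-get update && apt-get install -y libpq-dev \
    && docker-php-ext-install pdo_pgsql

# Put the project code in Apache's document root.
COPY . /var/www/html/
```

A pull request that needs a new extension just adds another `docker-php-ext-install` line here, and everyone gets it on their next build.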

When to use Docker?

You work alone? Doesn’t matter. It’s still better to work this way because you’re not messing up your machine for each project. Also, what if you need to work on a different machine? What if your drive dies and you need to restore? All of this is super-easy with a script-based environment like Docker.

Final Example

I have a PHP project that I only very occasionally work on. It used to be a nightmare to get everything set up and running again after 6 months or a year, just for a small bug fix. WHY ISN’T PHP-DEBUGGING WORKING?

Now? Now I have a Docker Compose file that pulls up a database server, a web server with all required extensions, and a content container. Debugging is configured right in the scripts, so I never have to fight with that again. When I need to work on this project, I just docker-compose up and I’m off and running. Anyone who downloads my code can do the same thing.[1]
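For the curious, such a Compose file might look like the following sketch; the service names, versions, ports, and paths are hypothetical, not the actual project’s configuration:

```yaml
# Hypothetical docker-compose.yml for a PHP + PostgreSQL project.
services:
  db:
    image: postgres:15              # pinned version, not "latest"
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - db-data:/var/lib/postgresql/data
  web:
    build: .                        # the Dockerfile with PHP + extensions
    ports:
      - "8080:80"
    volumes:
      - ./src:/var/www/html         # map source into the document root
    depends_on:
      - db

volumes:
  db-data:
```

One `docker-compose up` then builds the web image if needed, starts both containers, and wires them together on a shared network.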


[1] One small caveat from a lesson I learned recently: do not pull “latest” images. Instead, pin the image to the specific version you want to use.
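Concretely, pinning is a one-line difference in the Compose file or Dockerfile; the version number here is just an example:

```yaml
# Risky: "latest" silently changes under you on the next pull.
image: postgres:latest

# Better: pin an explicit version; upgrades happen when you choose.
image: postgres:15.4
```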