Setting up a development environment with Docker


This summer I finally decided to learn Docker and integrate it into my development process. I had to use a bunch of different tools, such as Postgres, InfluxDB, Go and Node.js, and the last thing I wanted was to install all of that locally.

I won’t go into yet another “Docker tutorial”: there are hundreds of them already (if you’re into Go, this one looks good, although a bit outdated). Instead I’ll explain some issues I faced while trying to move all the dependencies into containers and how I solved them, with a focus on installing vendor libraries within a container and sharing them with the host through a volume.

After reading some (mostly enthusiastic) guides and reviews, I had rather high expectations for a development environment based on Docker containers:

  1. Easy deployment with all necessary dependencies bundled together, not conflicting with any other project.
    This is where Docker excels, which is not surprising, as this is its primary use case.
  2. Minimal requirements for the host machine setup. Because of point 1, I certainly don’t want to install a compiler or a database on the host.
    This might be more difficult; for example, IntelliJ IDEA wants a Go compiler on the host. But that’s not really a fault of Docker itself.
  3. Libraries, such as node_modules, should be downloaded and compiled within the container, but stay accessible from the host to help my IDE see definitions and provide autocomplete.
    This is related to the previous point and seems obvious to me, but I’ve seen numerous tutorials on the internet that advise installing node_modules on the host and copying it into the Docker container - which kind of eliminates all the benefits of using Docker for isolated environments.
  4. The ability to debug those libraries by modifying them and/or inserting print statements into their source.
    This also seems like a natural thing to have, but we’ll see it’s not that easy to achieve with Docker if we still want point 3.

While points 1 and 2 are solved pretty trivially with Docker and docker-compose (and I can live with a Go compiler on the host for my IDE, as it does not interfere with running real projects), I quickly faced the problem of sharing libraries, such as glide dependencies for Go and node_modules for Node.js, between the host and the container.
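For illustration, here is a minimal sketch of the kind of docker-compose file I mean; the service names, images and ports are made up for this example, not taken from my actual project:

version: '3'
services:
  server:
    build: ./            # the app itself, built from the project's Dockerfile
    ports:
      - "8080:8080"
    depends_on:
      - postgres
      - influxdb
  postgres:
    image: postgres:9.6  # the database runs in its own container, nothing installed on the host
  influxdb:
    image: influxdb:1.3
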

The thing is that if you choose the noble way of managing libraries from within a container, it will break code synchronization with the host in the following scenario, which seems rather common to me:

  • Assume a fresh Node.js project which you have just cloned, so there’s no node_modules directory on the host, but there is a Dockerfile that will run npm install (a minimal sketch of such a Dockerfile is shown after this list). The project directory is mapped as a volume, so code will be synchronized and updated immediately when you change it on the host.
  • After running docker-compose, you will unfortunately discover that those dependencies have vanished from the container, because they were overwritten by the volume from the host machine, which does not contain node_modules. Everything is broken.
  • The (non-obvious) solution suggested in some articles and on Stack Overflow is to declare another volume in docker-compose specifically for the node_modules directory, like this:
    version: '3'
    services:
      server:
        build: ./
        volumes:
          - .:/app # syncs all your code, but removes node_modules
          - node_modules:/app/node_modules # brings node_modules back
    volumes:
      node_modules:
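
For reference, here is a minimal sketch of the Dockerfile assumed in the first bullet; the base image and commands are illustrative, not prescriptive:

FROM node:8
WORKDIR /app
# install dependencies inside the image rather than on the host
COPY package.json ./
RUN npm install
# copy the rest of the code; at runtime it is shadowed by the mounted volume anyway
COPY . .
CMD ["npm", "start"]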

The extra named volume for node_modules looks suspicious, but after rebuilding the project with docker-compose, the dependencies will be present in the container, and your app will (hopefully) work.

However, this solves the problem only partially: there is still no node_modules directory on the host! I’m surprised there was no mention of this problem in any article on the subject that I read. Even now I have a hard time believing people develop like that, because I feel blind without being able to see the sources of vendor libraries. While writing this post, I had to verify again that the problem really does exist and it’s not just in my head.

Anyway, the workaround I came up with is, besides declaring the additional volume, to manually copy the directory with dependencies from the container to the host, like this:

# get the ID of the (running) container you will copy dependencies from;
# replace `server` with the service name you use in docker-compose.
docker-compose ps -q server
# copy node_modules from that container to the current directory.
docker cp CONTAINER_ID:/app/node_modules .
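
If you don’t want to copy the container ID by hand, the two steps can be combined with shell command substitution (a sketch assuming a POSIX-ish shell and the same `server` service name):

docker cp "$(docker-compose ps -q server)":/app/node_modules .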

Finally, the IDE is happy, and you can navigate to library sources.

Still, we’re not done. Every time any dependency changes, this directory needs to be copied again. And what if we need to modify some library? Its files on the host are not synced with the container, so editing them there just wouldn’t work. The obvious hack, which I’m not happy about, is to connect to the container (with something like docker-compose exec server bash) and edit library files from within, using your favorite console editor.
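
For completeness, here is what that hack looks like; it assumes the `server` service name from above, that bash and vi are available in the image, and the edited path is just a hypothetical example:

# open a shell inside the running service container
docker-compose exec server bash
# inside the container, edit the offending library in place
vi node_modules/some-library/index.js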


To conclude, while Docker indeed helps to solve the most crucial problems of environment isolation on a local machine, it comes at a cost. Out of the 4 expected features, I only managed to get 3 - and let’s not forget that all of this was just to set up a local environment. I haven’t even touched a serious production-ready deployment yet, but I already know that when I do, I’ll face a whole new world of problems with orchestration and the like.

I believe it depends on each specific case whether it’s worth jumping on the Docker hype train, going with traditional VMs, or just managing everything on the host: every solution comes with tradeoffs. Personally, I think that although Docker is not yet mature enough from a developer’s perspective, it is still worth trying to work with - at least to understand what the cool developers on the internet are mostly talking about these days.
