How to deal with node_modules in a Dockerized Node application?


There are tens of thousands of blog posts, articles and forum threads out there on containerizing a Node.js application, ranging from outright bad advice to informed and tested opinions. This short post is about a very specific aspect of Dockerizing a Node app, one that is usually not addressed, or treated as an afterthought, in those articles.

Once you have a Node app up and running in Docker, things go well until you – like all good developers – start adding or removing NPM packages. Then nasty things happen, such as:

  • packages get installed on your local (host) filesystem but are not available inside the container
  • packages get installed on your container’s filesystem but are not available on your local machine
  • there’s a version mismatch of the same package between host and container
  • permission issues occur while installing or removing packages

All of these can be solved by using Docker volumes correctly. A volume is essentially a way to make parts of a container’s filesystem available to the host and vice versa. Not understanding volumes properly is also what leads to the issues listed above in the first place.
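As a minimal sketch of what that looks like (the service name, image and paths here are hypothetical, not from the project above), a bind-mount volume in a docker-compose.yml makes the host’s project directory visible inside the container:

```yaml
# docker-compose.yml (hypothetical example)
version: "3"
services:
  app:
    image: node:18-alpine
    working_dir: /usr/src/app
    volumes:
      # Bind-mount the project directory on the host into the container,
      # so a change on either side is immediately visible on the other.
      - .:/usr/src/app
    command: yarn start
```

With this in place, editing a file on the host edits it in the container too – which is exactly why node_modules needs special care, as described next.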

I had a similar experience in one of my projects. See the fix:
https://github.com/universalnative/un-website/commit/0eaaca98adac46bf9b1553e66d6acd9d731db43d

As noted in the Stack Overflow answer cited in the commit:

When docker builds the image, the node_modules directory is created within the worker app directory, and all the dependencies are installed there. Then on runtime the worker app directory from outside docker is mounted into the docker instance (which does not have the installed node_modules), hiding the node_modules you just installed. You can verify this by removing the mounted volume from your docker-compose.yml.

https://stackoverflow.com/a/32785014/1775160
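One common fix for the problem that answer describes – sketched below with hypothetical paths – is to layer an anonymous volume over the container’s node_modules, so the bind mount of the source tree cannot hide the packages installed at image build time:

```yaml
# docker-compose.yml (hypothetical example)
services:
  app:
    build: .
    volumes:
      # Bind-mount the source tree from the host...
      - .:/usr/src/app
      # ...but shadow node_modules with an anonymous volume, so the
      # packages installed during `docker build` stay visible instead of
      # being hidden by the (possibly empty) host directory.
      - /usr/src/app/node_modules
```

The trade-off is that the anonymous volume persists across restarts, so after changing dependencies you may need to rebuild the image and recreate the volume for the container to pick them up.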

This is a powerful thing to remember. Once you know how volumes work, you’ll be able to troubleshoot your Node app better when things go wrong. For example, when I recently installed a new NPM package on my local machine, I knew I had to do something like this to make it available inside the container as well:

https://github.com/universalnative/un-website/commit/3fa9c9a79c35a3ddbf39079d16b7dbca12105c0d#diff-4e5e90c6228fd48698d074241c2ba760

Why didn’t I just mount my host node_modules into the container’s node_modules? Here’s the explanation:

An alternative approach is to mount the host’s node_modules as a volume in the container, but that would override the container’s own node_modules folder with the host’s. Keeping the two independent allows for cleaner and easier troubleshooting of installed or missing packages.

This is the approach I prefer, which of course is not certified gold. It works well for me. Besides, when yarn installs packages each time the container comes up, it does so incrementally (only new packages are fetched and installed). A win-win 🙂
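One way to get that incremental install on startup – again a sketch with hypothetical names, not the exact setup of the project above – is to run yarn install before the app’s start command:

```yaml
# docker-compose.yml (hypothetical example)
services:
  app:
    image: node:18-alpine
    working_dir: /usr/src/app
    volumes:
      - .:/usr/src/app
    # Install any packages missing from node_modules, then start the app.
    # yarn skips dependencies that are already present, so repeat runs
    # after the first install are fast.
    command: sh -c "yarn install && yarn start"
```

Since node_modules lives in the mounted project directory here, host and container share one set of packages, and the startup install keeps it in sync with package.json on every `docker-compose up`.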
