All posts by Steven Uray

Monolith to Dockerlith: Learnings from Migrating a Monolith to Docker




Like every monolith, Earnest’s is complex and was once a bit out of control. It is involved in all major aspects of the business: it accepts loan applications from clients, makes a decision on each of those applications, and even supports clients after their loans have been signed. These wide-ranging responsibilities mean the monolith is really five different applications in one body of code.

Over 100 developers have contributed to its codebase since inception, and the complexity of so many revisions and updates made it difficult to set up and maintain. Beyond standard npm libraries, nearly a dozen external dependencies, from a database to a mock SQS queue to a HashiCorp Vault instance, had to be configured correctly for the application to work completely on a developer’s computer. Engineering teams had come to expect that setting it up on a new computer would take at least a week and would require the assistance of multiple developers who had been at the company long enough to acquire the necessary tribal knowledge.

As an engineering team, Earnest needed a way to ensure everyone had a consistent local environment. It needed to be quick and easy to set up, and it needed greater parity with the CI, staging, and production environments. To accomplish these objectives, I turned to Docker, Docker Compose, and the Go Script pattern.

Dockerlith Architecture

I started by addressing the node_modules folder shared by all five applications. The application containers share a single node_modules folder inside a Docker volume, and any of them can be started in any order and update the npm dependencies. It therefore became necessary to ensure that only one container could write to node_modules at a time.

While there are many ways to control the startup order of Docker containers, I chose to write a bash script that acquires a lock on a file descriptor at runtime, and made that script the entrypoint of each container. Once the script has run, it invokes the application’s process and the application container is usable by the developer. An application container’s Docker Compose file looks like this:

Dockerlith Docker Compose 1
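A sketch of what such a service definition might look like; the service name, image, and paths here are assumptions, not the original file:

```yaml
# Hypothetical sketch of one application container's compose service.
version: "2"

services:
  loan-application:                 # one of the five applications (name assumed)
    build: .
    entrypoint: /usr/src/app/entrypoint.sh      # the locking script
    volumes:
      - .:/usr/src/app                          # source code
      - node_modules:/usr/src/app/node_modules  # shared dependency volume
    depends_on:
      - postgres

  postgres:
    image: postgres

volumes:
  node_modules:                     # named volume shared by every application container
```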

Here is the entrypoint script for each application container.
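A minimal sketch of such an entrypoint, assuming flock(1) for the file-descriptor lock and `npm ls` as the dependency-validity check (both are assumptions, not the original script):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the entrypoint: serialize npm install across
# containers that share one node_modules volume.
set -euo pipefail

LOCK_FILE="${LOCK_FILE:-./node_modules/.install.lock}"

mkdir -p "$(dirname "$LOCK_FILE")"

(
  # Block until this container holds an exclusive lock on fd 200. The
  # kernel releases the lock automatically if the process or its
  # container dies, so a crash never leaves the lock stuck.
  flock -x 200

  # Only the lock holder installs; containers that acquire the lock
  # later see a valid node_modules and fall straight through.
  if [ -f package.json ] && ! npm ls --depth=0 >/dev/null 2>&1; then
    npm install
  fi
) 200>"$LOCK_FILE"

# Hand control to the container's command (the grunt startup tasks in
# the original setup).
exec "$@"
```

With an entrypoint like this, all five application containers can be started at once: whichever wins the lock runs the install, and the rest block on `flock` until it finishes.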

So: the first time a user starts the application containers, one of them grabs the lock and installs the dependencies. The rest of the containers wait for it to finish and verify that the dependencies are valid before turning control over to the grunt startup tasks. Dependencies are automatically checked and updated, but subsequent startups happen quickly and without calls to “npm install” until the dependencies change.

In the event of a container shutdown, networking failure, or Docker daemon shutdown, the lock on the file descriptor is released automatically. Developers can restart the Docker containers and continue with their workflow to recover from this unexpected failure.

In addition to the container startup synchronization, there is a base Docker image that contains the correct versions of Node, npm, and other programs. Docker Compose links the application containers with a Postgres container, a mock Amazon SQS queue, and other supporting containers.

As the last piece of the puzzle, I implemented the Go Script pattern to make setting up the application, starting it, and running the tests one-step commands. This pattern is used by almost every project at Earnest, and its implementation here brings the monolith in line with the rest of Earnest’s tooling. Developers new to the project can become productive quickly, and all developers can keep their focus on high-level goals instead of low-level implementation details.
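A go script is typically a small task dispatcher at the repository root; a sketch of what one might look like here, with the task names, service name, and compose commands all assumed:

```shell
#!/usr/bin/env bash
# Hypothetical "go" script: one-step commands for setup, startup, and tests.
set -euo pipefail

go() {
  case "${1:-help}" in
    setup) docker-compose build ;;                    # build the application images
    start) docker-compose up -d ;;                    # start the monolith and its dependencies
    test)  docker-compose run --rm app grunt test ;;  # run the suite in a container
    *)     echo "usage: ./go {setup|start|test}" ;;
  esac
}

go "$@"
```

A new developer then runs `./go setup` once and `./go start` each morning, without needing to know which containers, volumes, or grunt tasks sit underneath.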

Accomplishing these goals was time-consuming and difficult, but worth it. Team morale improved as a longstanding pain point in the daily life of Earnest software developers was eliminated. Earnest’s software developers reported that this tool increased their efficiency by an average of 32.5% when working on the monolith. On an average workday, this tool is used around 200 times by the engineering team.

Dockerlith Usage

This was originally a post on Earnest’s engineering blog, but it has been cross-posted to my blog, as I am the original author and did the work described in the post.

Dino Rampage (Global Game Jam 2015)

I attended Global Game Jam 2015 at UT Austin. Austin has a thriving game development community, and I estimate at least 60-70 people turned up for the weekend. Corporate sponsors made it rain free Red Bull and pizza at various times, providing welcome refreshment and energy.

I had the pleasure of working with Tom Long, Stuart Imel, Kez Laczin, and Keith McCormic. Our game was Dino Rampage, in which the player controls a school bus in a doomed attempt to escape a dinosaur park. It can be played here.

End Run Version 2

I decided to pivot the End Run project in a new direction. The new goal of the game is to put players in an environment where every major instrument in the song has a visual that appears when it plays. The drums, for example, trigger fireworks when the player moves on a beat while drums are playing. The prototype level is complete and ready to play! I recommend downloading the Windows version for higher resolution and more responsive controls, but a web version is also available.

Download Windows

Website Link

End Run

End Run is a rhythm/arcade prototype I created during May 2014 and refined in between my summer travels. Players move once per beat toward the end of the level while dodging obstacles. After getting settled into Austin, I chose to pivot the game in a direction that would let players visualize and interact with the music on a deeper level.

The first three levels are refined and complete, and can be played below!

Download Windows
Screenshot Gallery:

Bitcoin’s Transaction Volume

Bitcoin is fascinating to me. One of its cool aspects is that every transaction has public components. There is also a booming industry that enables trading bitcoin for more traditional currencies such as dollars or euros. We can combine public transaction data with public trading data to see how much value flows across the bitcoin network; there is a nice chart that does this.

Unfortunately, the above chart tracks the data in daily increments. This is problematic because the amount of value sent over the network varies greatly from day to day. It also varies by day of the week: weekdays have about 20-40% higher volume than weekend days. I decided to take the daily data from the chart and bin it into weeks, which I think clarifies the dataset considerably. Let’s compare the daily values to the weekly values visually.
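The binning step itself is mechanical; a sketch in awk, assuming one daily USD volume figure per line, oldest first, and weekly totals (a per-week average would be a one-line change):

```shell
# Hypothetical sketch of the daily-to-weekly binning. Input: one daily
# USD transaction-volume figure per line, oldest first. Output: one
# total per 7-day week, with any trailing partial week summed as well.
bin_weekly() {
  awk '{ sum += $1; if (NR % 7 == 0) { print sum; sum = 0 } }
       END { if (NR % 7 != 0) print sum }'
}
```

For example, `bin_weekly < daily_volume.txt` turns 70 daily figures into 10 weekly figures.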

Bitcoin Transaction Volume Snapshot

Bitcoin USD Transaction Volume

It is worth noting that there were huge outlier days in the 5th and 7th weeks. I was able to average out the outlier in the 5th week, but three consecutive days of very abnormal volume in the 7th week prevented me from doing the same there. I began to wonder whether external factors might be driving transaction volume. The price of bitcoin seemed like one possible candidate.

Bitcoin USD Transaction Volume With Price

I was surprised to see that despite considerable setbacks in the price of bitcoin over the previous 6 months, transaction volume appears to be holding steady, or even rising. Transaction volume seems to rise during periods when the price of bitcoin is moving considerably in either direction, so I added exchange volume to the secondary Y axis.

Bitcoin USD Transaction Volume With Exchange Volume

Exchange volume appears considerably more correlated with transaction volume than price is. During periods of high price movement, and thus higher trading volume, transaction volume tends to rise. This could be due to people moving money into and out of exchanges at a greater rate during these periods. A final comparison worth making is the transaction volume from this year against the same period last year.

Bitcoin USD Transaction Volume 2013 to 2014

Over a timespan of months, trends in transaction volume appear only vaguely; over a timespan of years they appear clearly and dramatically. The amount of value the bitcoin network transmitted on a weekly basis is up an average of 277% year over year. The exchange rate of bitcoin has risen and fallen greatly over the past year, but the amount of value the network transmits, and thus its real value to society, appears to be gaining ground steadily.




-Bitcoin Average did not collect price or exchange volume data for June 26th, 2014. I averaged data between June 25th, 2014 and June 27th, 2014 to produce an estimation of June 26th, 2014.

-The graph with the weekly exchange volume does not include data from Chinese exchanges, which I believe to be mostly useless when studying the real economic activity of Bitcoin.

-On May 25th, 2014 the daily transaction volume was $179,516,928. I took the average of every other day in Week 5, $54,934,653, and used that to remove the outlier.

-On June 11th – June 13th, 2014, the daily transaction volume was exceptionally high, distorting that week’s average greatly. I could not think of a reasonable way to remove these outliers, and therefore left them in.

Feel free to use this work in parts or in whole, provided you give me attribution!

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.