The Bazaar: Build System Blues

Welcome back to my series on the wonderful front end Bazaar of technologies! In my last post, Selecting Packages, I talked about the incredible array of choices we're presented with on the front end, and I encouraged readers to embrace the bazaar and craft their own architecture rather than going with a boilerplate or framework. It's not always easy to evaluate whether a package is right for you, so I offered some things to look at when selecting the building blocks for your app's foundation.

One of the next big things you need to think about when constructing your own architecture is the build system. Front end code and tooling have become dramatically more capable in recent years, but browsers must focus on stability and backwards compatibility, so innovations in JavaScript are necessarily slow to appear there. That doesn't mean you can't use them, however. Transpiling and post-processing have turned JavaScript into merely a build target, allowing developers to write applications in higher-level languages like ES2015, ClojureScript, PureScript, and more, and transform them into cross-browser-compliant JavaScript.

In addition, most front end stacks have other types of pre- and post-processing to do for the application's CSS, HTML, and whatever else. As our tools become more capable and complex, the build process has grown into a primary part of our development workflow - an often tangled and confusing part.

The Evolution of the Build

In earlier days, Node & front end developers often wrote their build process using a Makefile, or a Rakefile for Rubyists. Many Node developers wrote light systems using npm scripts. Builds were often difficult to understand and maintain, with front end developers simply allowing their back end colleagues to roll the tasks into their build.

Eventually, "Cowboy" Ben Alman shared Grunt, his solution for a task-based, batteries-included build system that focused on configuration over code. You selected the plugins you wanted, created a monolithic config file, and hoped everything worked just the way you wanted. In most cases it did work - while our builds were simple and small. As our front end applications became more complex, however, we began to run into problems. Configuration wrangling became tedious, we had to construct fragile temp-file workflows, and we had to tweak or rewrite plugins to get them to do exactly what we wanted. Many of us longed for code rather than configuration, so that we could easily construct the exact workflow we needed for our app.

Eric Schoffstall, better known as "Contra", made waves by releasing Gulp. With its focus on Node streams, small modules, and interacting with libraries directly through the Node API, Gulp was the more idiomatic solution that many of us in the Node community were eagerly awaiting. For a time, this too worked quite well. Eventually, however, our builds became large, opaque, and almost as difficult to maintain as they had been with Grunt. Errors in particular were very challenging to manage, with third-party tools like "gulp-plumber" popping up to try to patch the holes. Through our trials and tribulations with Gulp, many of us learned that it may be easier to cut out the middleman and work directly with the Node libraries we depend on. The experience left us well prepared to do so, because in many cases we were already interacting with those libraries' APIs.

Though other build tools like Broccoli and Brunch continued to sprout, many developers have come back around to npm scripts as the ideal solution. Scripts in npm provide an easy system for launching commands within the context of the current Node environment. Combined with plain Node scripts kept in a scripts folder, this strategy allows infinite customization while still keeping the familiar interface of npm run. It takes more effort to construct your build this way, but your build system is tailored precisely to your application's needs.

On the other side of the coin, many developers in search of an easier build experience have gone towards Webpack - an opinionated, configuration-driven build tool that grew from a simple browser bundler into a full-featured build tool with a dev server and hot reload features.

What does your application need?

The build system you select can strongly influence the way your application grows over time. It's important to take some time to imagine how your application will expand in the future so that you can pick the system that offers the right balance between convenience and customization.

If you're looking for something quick and easy, and you're okay with occasionally changing the way you write portions of your application to suit what the build system wants, then your best bet at the moment is likely Webpack. Used with "presets", such as Henrik Joreteg's hjs-webpack, Webpack can actually be quick and easy. You may be in the dark when things go wrong, however, because error messages can sometimes be cryptic and difficult to work out - especially if you're using a preset that you didn't put together yourself.

If you want precision, customization, and idiomatic Node, then you'll probably want to construct a custom build system for your application using npm scripts. If you're unafraid of Node IO, streams, and processes, then you likely have what it takes to dive in and create your own pipeline.

Building Your Build System

If you decide to choose your own adventure, it can be a little daunting at first to figure out how to reproduce some of the features you've come to rely on in build tools. The combination of the npm "scripts" configuration section and plain Node CLI scripts actually provides everything you need in most cases, however. Below, I'll cover a few techniques I use to make this approach worthwhile.

Context

When a command in scripts is run, the PATH includes node_modules/.bin, meaning that you can include packages in your devDependencies to gain access to their shell commands in an npm script. This allows you to use tools like Babel or Webpack without requiring the user to install them globally.
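For example, a package.json excerpt might wire this up like so (babel-cli here is just an illustration):

"devDependencies": {
  "babel-cli": "^6.0.0"
},
"scripts": {
  "build": "babel src -d build"
}

Running npm run build then finds the babel executable in node_modules/.bin, with no global install required.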

Subtasks

I commonly group tasks into "subtasks" using : as a separator. This makes it easier to find and work on related commands. Here's an example, including some other techniques we'll cover:

"scripts": {
  "build": "NODE_PATH=$NODE_PATH:src NODE_ENV=$npm_package_config_env babel src -s -d build --copy-files --color=always",
  "build:all": "npm run parallel -- 'npm run build' 'npm run commons -- build'",
  "build:prod": "npm run build --my-package:env=production",
  "build:watch": "npm run build -- -w",
},

Your "top level" command without the separator can be a default action, or it can be a combination of actions. You can run other npm scripts, such as subtasks, simply by invoking them as you would on the command line.
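For instance, a top-level test command might simply chain its subtasks (a hypothetical sketch - the lint and unit scripts here are illustrative):

"test": "npm run test:lint && npm run test:unit",
"test:lint": "eslint src",
"test:unit": "mocha"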

Config Variables

To make it easy to modify your config at runtime but provide sensible defaults, you can use npm's config settings block. It allows you to define variables that can be set via command line flag and accessed via a shell variable.

"config": {
  "env": "development"
},

In this config, env defaults to the string "development". I use it in the build script like this:

"build": "NODE_PATH=$NODE_PATH:src NODE_ENV=$npm_package_config_env babel src -s -d build --copy-files --color=always",

The $npm_package_config_env environment variable is set by npm, and it defaults to the value you provided in your config declaration. To override this, you pass command-line flags:

"build:prod": "npm run build --my-package:env=production",

In this case, I'm using the npm package name for my package, which is "my-package" in the example. I append ":env" to indicate that env is the variable I'm setting, then I provide a value of "production". When the "build" task is called, it will receive a NODE_ENV value of "production".

Passing Extra Arguments

To enable watch mode, I pass an additional argument to my npm run invocation inside the scripts block:

"build:watch": "npm run build -- -w",

The -w flag is placed after a double dash, --, so that npm doesn't swallow the argument. The double dash tells npm not to handle the remaining arguments itself but to hand them off to the command being invoked. So, in this case, the -w is passed on to whatever is run in the npm run build command, which is the Babel invocation. Babel receives the -w flag and goes into watch mode.

Running Commands in Parallel

One of the advantages of a build tool can be easy parallel execution. Well, you're in luck, because there's a fantastic unix tool to enable this as well - GNU parallel. To install it on OS X using Homebrew:

$ brew install parallel

I create a new command in my scripts declaration to use parallel just the way I like it:

"parallel": "parallel --no-notice --line-buffer --halt now,fail=1 :::",

You'll want to dig into the manual to fully pick that command line apart, but essentially it runs the commands passed to it in parallel, and fails the command immediately if one of the subcommands fails.

Here's how I use it:

"build:all": "npm run parallel -- 'npm run build' 'npm run commons -- build'",

In this example, I'm passing subcommands to parallel using the extra arguments feature of npm scripts. It will run my main build and my commons build concurrently, and fail the command if either build fails.

Secondary Builds

In my example here, I have a secondary build that I'm calling out to. This could be a standard library, or a common tools project, or anything that you use and build along with multiple projects. For my purpose, I call out to a secondary build like this:

"commons": "NODE_PATH=$NODE_PATH:build NODE_ENV=$npm_package_config_env node scripts/commons.js",

This calls a special script that I've written inside the scripts directory of my repository.
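In sketch form, it looks something like this - the COMMONS_PATH variable and the .localrc layout are illustrative stand-ins rather than the exact script:

const fs = require('fs')
const path = require('path')
const { execSync } = require('child_process')

const rcPath = path.join(__dirname, '..', '.localrc')

// Load the saved commons location, or take it from an environment
// variable on the first run and cache it for next time.
let commonsPath
if (fs.existsSync(rcPath)) {
  commonsPath = JSON.parse(fs.readFileSync(rcPath, 'utf8')).commonsPath
}
if (process.env.COMMONS_PATH) {
  commonsPath = process.env.COMMONS_PATH
  fs.writeFileSync(rcPath, JSON.stringify({ commonsPath }))
}
if (!commonsPath) {
  console.error('Set COMMONS_PATH to the location of your commons library.')
  process.exit(1)
}

// Run the requested npm script (e.g. `build`) inside the commons library
const task = process.argv[2] || 'build'
execSync(`npm run ${task}`, { cwd: commonsPath, stdio: 'inherit' })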

This is rather specific to my use case, but what it does is save the location of the commons library in a .localrc file, and then call npm run commands within the context of the commons library.

Responding to Filesystem Events

The best way I've found to watch the filesystem and respond to events is chokidar-cli. It's a command line interface wrapped around the excellent chokidar watcher library, and I find it works better than Mocha's built-in watcher for running tests:

"test:watch": "chokidar -t 300 --silent 'src/**/*.{js,jsx}' -c 'npm test'",

With this command, I'm watching the src directory for JavaScript and JSX files, and running npm test on change, throttled so the command runs at most once every 300ms.

Simple Conditionals

For my clean command, I want to avoid error messages when the directories I'm removing don't exist. I use simple inline shell if statements for that:

"clean": "if [ -d build ]; then rm -r build; fi && if [ -d dist ]; then rm -r dist; fi",

Dev Server

In my own setup, I use a quick custom Express server with Webpack middleware for local application development. Here's what the dev command looks like:

"dev": "NODE_PATH=$NODE_PATH:node_modules:build NODE_ENV=$npm_package_config_env node scripts/dev-server.js",

This calls out to another file in my scripts directory.
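Here's a simplified sketch of that server - the port, the webpack config path, and the index.html location are illustrative assumptions:

const path = require('path')
const express = require('express')
const webpack = require('webpack')
const webpackDevMiddleware = require('webpack-dev-middleware')
const webpackHotMiddleware = require('webpack-hot-middleware')
const config = require('../webpack.config')

const app = express()
const compiler = webpack(config)

// Serve the bundle from memory and enable hot reloading
app.use(webpackDevMiddleware(compiler, {
  publicPath: config.output.publicPath,
}))
app.use(webpackHotMiddleware(compiler))

// Any request not handled above falls through to index.html
app.get('*', (req, res) => {
  res.sendFile(path.resolve(__dirname, '..', 'src', 'index.html'))
})

app.listen(3000, () => {
  console.log('Dev server listening at http://localhost:3000')
})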

The special bit here is the app.get('*') call. It makes sure that every request not handled further up the middleware chain simply returns the contents of index.html, which enables easy client-side routing.

Putting it all Together

The strategies above may be all you need to put together the perfect custom build for your application, but as it grows more complex, your needs will grow as well. As you grow your build process, I'd encourage you to take a Unix-influenced approach every step of the way, creating small decoupled pieces that you can re-arrange later to suit your needs. Take advantage of Node's excellent support for unix pipes, and handle your "streaming" with actual unix processes working together rather than tying your build process to Node streams or a particular build tool.
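As a hypothetical illustration of that style, a stylesheet task can pipe one process into another instead of streaming inside Node (node-sass and postcss-cli here are stand-ins for whatever tools you use):

"styles": "node-sass src/styles/main.scss | postcss --use autoprefixer -o build/main.css"

Each stage is an independent process, so you can swap either side of the pipe without touching the other.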

With this approach, as your build grows you can use a variety of different tools and techniques without too much effort adapting to each, because your build process is modular and generic. It will take some extra effort up front, but in the long run I think you'll be glad you took full control of your build process.

If you'd like to talk about putting together your own build process, or chat about anything else web dev related with a great group of people, find me in the Denver Devs and Reactiflux communities online.

Thanks for reading!
