Integrating Auth0 into a Webpack Project

I’ve got a nice webpack-based React application moving towards “completion” (and I put that in quotes because I think a project is never really completed). One of the things I want to do is to integrate Auth0 – I like the presentation of the sign-in project. This article is not about how to configure Auth0 – they do an excellent job of that. Rather, it is about how to get Auth0 working in a Webpack environment. The example webpack project that they provide is, quite simply, wrong (UPDATE: Auth0 corrected the issues within 2 days of this blog being written. Another thing to love about Auth0 – a responsive team!). Here is how to really do it.

Install required components

Along with webpack, you need the Auth0 Lock package and a few uncommon loaders:

npm install --save auth0-lock
npm install --save-dev transform-loader json-loader brfs packageify ejsify

The auth0-lock package is the actual Auth0 UI. The json-loader is for including JSON files in the bundle. The next three packages (brfs, packageify and ejsify) are the same browserify transforms used by the Browserify version of the auth0-lock build. That leaves transform-loader, which lets you run any browserify transform within webpack – in other words, code designed for browserify can be bundled by webpack.

Wondering where Auth0 went wrong with the sample? They left off brfs, packageify and ejsify from the devDependencies in the package.json file.

Adjust the webpack.config.js to compile Auth0

I work on a PC, not a Mac. As a result, the loaders provided within the sample did not work. The path separator is different between a PC and a Mac. Here are my loaders:

loaders: [
    // Javascript & React JSX Files
    { test: /\.jsx?$/, loader: jsxLoader, exclude: /node_modules/ },

    // Auth0-Lock Build
    { 
        test: /node_modules[\\\/]auth0-lock[\\\/].*\.js$/, 
        loaders: [ 'transform-loader/cacheable?brfs', 'transform-loader/cacheable?packageify' ]
    },
    { 
        test: /node_modules[\\\/]auth0-lock[\\\/].*\.ejs$/, 
        loader: 'transform-loader/cacheable?ejsify' 
    },
    { 
        test: /\.json$/, 
        loader: 'json' 
    }
]

Note that my standard loader excludes the node_modules directory. The auth0-lock rules above explicitly match files inside node_modules/auth0-lock, so that package is still processed despite the exclusion.

Use Auth0Lock in your React code

I have the following in a React login controller:

import Auth0Lock from 'auth0-lock';

// and later, within the component

    /**
     * Lifecycle event - called when the component is about to mount -
     *  creates the Auth0 Lock Object
     */
    componentWillMount() {
        if (!this.lock)
            this.lock = new Auth0Lock('rKxvwIoKdij6mwpsSvqi7doafDiGR3LA', 'shellmonger.auth0.com');
    }

    /**
     * Event Handler to handle when the user clicks on Sign In button
     *
     * @param {SyntheticEvent} event the button click
     * @returns {boolean} the result
     */
    onClickedLogin(event) {
        const authOptions = {
            closable: true,
            socialBigButtons: true,
            popup: true,
            rememberLastLogin: true,
            authParams: {
                scope: 'openid email name'
            }
        };

        /**
         * Callback for the authentication pop-up
         * @param {Error} err - the error (or null)
         * @param {Object} profile - the user profile
         * @param {string} token - the JWT token
         */
        const authCallback = (err, profile, token) => {
            this.lock.hide();       // Hide the auth0 lock after the callback

            if (err) {
                console.error('Auth0 Error: ', err);
                this.setState({ error: err });
                return;
            }

            console.info('token = ', token);
            console.info('profile = ', profile);
        };

        this.lock.show(authOptions, authCallback);

        // Click is handled here - nowhere else.
        event.preventDefault();
        return false;
    }

Obviously, this does not actually do anything other than print stuff on the console. You will want to store the token and use that to access any resources you want. I’ve got a full redux store update happening when I get a valid login back.
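
For illustration, here is a minimal sketch of what that success branch can look like when it dispatches into a Redux store. The action type, the store import and the localStorage key are all hypothetical – they are not taken from the actual project:

// Hypothetical sketch - assumes a configured Redux store exported from
// './store' and a reducer that handles a 'LOGGED_IN' action.
// (import store from './store'; would sit at the top of the file)

const authCallback = (err, profile, token) => {
    this.lock.hide();                               // hide the lock widget either way

    if (err) {
        this.setState({ error: err });              // surface the failure in component state
        return;
    }

    localStorage.setItem('id_token', token);        // keep the JWT for later API calls
    store.dispatch({                                // push the login into the Redux store
        type: 'LOGGED_IN',
        profile: profile,
        token: token
    });
};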

Creating JavaScript Libraries with TypeScript and WebPack

I’ve been on a tear recently learning React, Redux and other pieces for developing a nice UI-driven application. I needed to switch attention this week. I wanted to write a JavaScript library. The library had to have several features before I even got to the code.

  • Written in TypeScript
  • Released as an ES5.1 UMD library
  • Usable as a named define, AMD and CommonJS library

I like using Webpack – it takes care of a lot of packaging for me. First off, let me introduce my basic library – and it is very basic. I have already done the following:

git init
npm init --yes

I’ve also altered the package.json file so that it has all my relevant bits in it. I’ve created a src directory with two TypeScript classes in it. The first one is called TestClass.ts and it will be exposed as the library entry point. The second one is called SubClass.ts and it will not be exposed as part of the library. This allows me to test private classes. You can find the src directory on my GitHub repository.
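
The real files live in that repository; as a rough sketch (this is illustrative – the actual classes may differ), the pair looks something like this:

// src/SubClass.ts - internal helper, not exposed outside the library (sketch)
export class SubClass {
    greet(name: string): string {
        return `hello, ${name}`;
    }
}

// src/TestClass.ts - the class exposed as the library entry point (sketch)
import { SubClass } from './SubClass';

export class TestClass {
    private helper = new SubClass();

    constructor(private name: string) {
    }

    greeting(): string {
        return this.helper.greet(this.name);
    }
}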

On to the actual mechanics of the code. Let’s start by looking at my TypeScript config files – I have two. The first is the tsconfig.json file:

{
  "compileOnSave": false,
  "compilerOptions": {
    "declaration": false,
    "jsx": "react",
    "module": "commonjs",
    "noImplicitAny": true,
    "preserveConstEnums": true,
    "removeComments": false,
    "sourceMap": true,
    "target": "es5"
  },
  "exclude": [
    "node_modules"
  ]
}

Most of the options are fairly common in TypeScript libraries. The important settings (and why):

  • “target” must be ES5 – my target language
  • “compileOnSave” is set to false to prevent the atom-typescript plugin from compiling on the fly
  • “sourceMap” is set to true so that I can integrate source-maps into the library
  • “declaration” is set to false so that the .d.ts files are not emitted

Why don’t we emit the .d.ts files? Because one file is generated per source file. Generally you want one file per library so that it is easy to load in a browser context. As a result, I’ll deal with the .d.ts file later on – probably as part of a gulp workflow.

Pretty much everything else is up to you. I highly recommend looking over the tsconfig.json file docs and the TypeScript compiler options for all options.

My other file is tslint.json – I’m going to use tslint to check my files prior to compilation, but I like some changes to the linting rules.

{
  "rules": {
    "quotemark": [ true, "single" ]
  }
}

I like to use single-quotes instead of double-quotes in my code. There is a big list of tslint rules that you can change – you should really understand them and make a reasonable decision based on your coding style.

On to Webpack. For webpack, we need to install some loaders:

npm install --save-dev webpack ts-loader tslint-loader typescript tslint

One thing to note – use ts-loader, not typescript-loader. Typescript-loader has not been updated for a while and is out of date with respect to its peer dependency – typescript. I want to use the latest version of typescript and ts-loader seems to keep up to date with the latest changes in typescript.

Now, onto the webpack.config.js:

var webpack = require('webpack'),
    path = require('path'),
    yargs = require('yargs');

var libraryName = 'MyLib',
    plugins = [],
    outputFile;

if (yargs.argv.p) {
  plugins.push(new webpack.optimize.UglifyJsPlugin({ minimize: true }));
  outputFile = libraryName + '.min.js';
} else {
  outputFile = libraryName + '.js';
}

var config = {
  entry: [
    __dirname + '/src/TestClass.ts'
  ],
  devtool: 'source-map',
  output: {
    path: path.join(__dirname, '/dist'),
    filename: outputFile,
    library: libraryName,
    libraryTarget: 'umd',
    umdNamedDefine: true
  },
  module: {
    preLoaders: [
      { test: /\.tsx?$/, loader: 'tslint', exclude: /node_modules/ }
    ],
    loaders: [
      { test: /\.tsx?$/, loader: 'ts', exclude: /node_modules/ }
    ]
  },
  resolve: {
    root: path.resolve('./src'),
    extensions: [ '', '.js', '.ts', '.jsx', '.tsx' ]
  },
  plugins: plugins,

  // Individual Plugin Options
  tslint: {
    emitErrors: true,
    failOnHint: true
  }
};

module.exports = config;

There are, as is usual with webpack configurations, two areas to take a look at. The output section is the bit that emits this as a library. Note that I make the library property the name of the library – this is also the name of the global defined object containing the library. In this case, I expect to be able to use MyLib.TestClass in a browser when it is loaded. The other area is the module section. This contains a preLoader for running lint and a standard loader for compiling the TypeScript.

Note that I’ve created a block that determines whether -p was used on webpack – if it was, I generate the optimized version of the library. If not, then I generate the debug version of the library. There are several ways to accomplish this; I just picked one I knew would work.

My eventual library will contain React components, so I’ve set things up to compile JSX embedded in TypeScript. This includes setting the jsx flag in the tsconfig.json and adding the match for .tsx in the module section of the webpack.config.js file.
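
For example (a hypothetical component, not part of the template repository – it would also require react to be installed and probably declared as an external), a .tsx source file in the library could look like this:

import * as React from 'react';

export interface GreetingProps {
    name: string;
}

// Hypothetical React component written in TypeScript. With "jsx": "react" in
// tsconfig.json and the .tsx test in webpack.config.js, this compiles straight
// into the library bundle.
export class Greeting extends React.Component<GreetingProps, {}> {
    render() {
        return <div className="greeting">Hello, {this.props.name}!</div>;
    }
}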

My final section is in the package.json file:

{
  "name": "example-typescript-webpack",
  "version": "0.1.0",
  "description": "An example Webpack deployment for using TypeScript as a library",
  "main": "dist/MyLib.min.js",
  "scripts": {
    "prepublish": "webpack --debug; webpack -p"
  },
  "keywords": [
    "example",
    "typescript"
  ],
  "author": "Adrian Hall <adrian@shellmonger.com>",
  "license": "MIT",
  "devDependencies": {
    "ts-loader": "^0.8.1",
    "tslint": "^3.5.0",
    "tslint-loader": "^2.1.0",
    "typescript": "^1.8.2",
    "webpack": "^1.12.14",
    "yargs": "^4.2.0"
  }
}

The main entry is important – it points to my production library and allows a developer to import the library in a CommonJS or AMD scenario. The scripts section is also interesting – the prepublish script (which can also be run directly with npm run prepublish) generates both versions of the library into the dist directory. I’m using prepublish because I don’t want to check the dist directory into source control. When I run npm publish, npm generates the right files for me first and then publishes them. The same thing also happens when running npm install locally.

The net effect of that is that I can exclude the dist directory from my GitHub repository and still release a solid ES5 library.

Finally, let’s talk about testing. I want to do some preliminary testing against the library. I can test the CommonJS version easily enough with mocha and chai – here is my test/commonjs.tests.js file for that:

var expect = require('chai').expect;
var mylib = require('../dist/MyLib.min.js');

describe('TestClass', function () {
  it('is contained within MyLib as CommonJS', function () {
    expect(mylib).to.be.an('object');
    expect(mylib.TestClass).to.not.be.null;
  });

  it('can be instantiated', function () {
    var t = new mylib.TestClass('foo');
    expect(t).to.exist;
  });
});

I’ve also written in the past about running tests in the browser, so I can do the AMD and Global versions there. In short, there is absolutely no reason to not test all three versions of the library before releasing.
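
As a sketch of the global (browser) check – assuming mocha and chai are loaded on the test page and dist/MyLib.min.js has been pulled in via a script tag – it can be as simple as:

// test/global.tests.js - hypothetical browser-side test for the UMD global
describe('TestClass (global build)', function () {
    it('is exposed as window.MyLib', function () {
        chai.expect(window.MyLib).to.be.an('object');
        chai.expect(window.MyLib.TestClass).to.exist;
    });

    it('can be instantiated', function () {
        var t = new window.MyLib.TestClass('foo');
        chai.expect(t).to.exist;
    });
});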

You can find my template library on my GitHub Repository.

Gulp and Webpack – Better Together

I’ve used gulp as a workflow engine before, but I’d pretty much given up on it because webpack did so much of what I needed. However, flying back from my vacation, I was reminded why I still need it. Not everything in my workflow is actually involved in creating bundles. In particular, some of my stuff is loaded from a CDN – things like core-js and some icon fonts. When I am developing without the Internet (like on an airplane), I’d still like to use them. I need to copy the libraries that I normally grab from the CDN into the local public area. That requires something other than webpack.

This raises the question – how can one convert the build I had been doing with webpack into something that gulp runs? Well, it turns out that there is a recipe for that. Here is my new Gulpfile.js:

var eslint = require('gulp-eslint'),
    gulp = require('gulp'),
    gutil = require('gulp-util'),
    webpack = require('webpack'),
    webpackConfig = require('./webpack.config.js');

var files = {
    client: [ 'client/**/*.js', 'client/**/*.jsx' ],
    server: [ 'server/**/*.js' ]
};

gulp.task('build', [
    'webpack:build'
]);

gulp.task('lint', [
    'server:lint',
    'webpack:lint'
]);

gulp.task('server:lint', function () {
    return gulp.src(files.server)
        .pipe(eslint())
        .pipe(eslint.format())
        .pipe(eslint.failAfterError());
});

gulp.task('webpack:lint', function () {
    return gulp.src(files.client)
        .pipe(eslint())
        .pipe(eslint.format())
        .pipe(eslint.failAfterError());
});

gulp.task('webpack:build', function (callback) {
    webpack(webpackConfig, function (err, stats) {
        if (err)
            throw new gutil.PluginError('webpack:build', err);
        gutil.log('[webpack:build] Completed\n' + stats.toString({
            assets: true,
            chunks: false,
            chunkModules: false,
            colors: true,
            hash: false,
            timings: false,
            version: false
        }));
        callback();
    });
});

The task you want to look at is the webpack:build task. This simply calls the webpack() API. Normally, the stats.toString() call will contain a whole host of information many hundreds of lines long – I only want the summary, so I’ve turned off the things I don’t want to see.

I’ve also added two tasks for checking the files with eslint. I tend to run the linters both on their own and as part of the client build. My webpack configuration still specifies that linting is done during the build, which allows me to continue to use the development server; now, however, I can also run linting separately.

Now that I have this in place, I can rig my server to do a development build. Here are all the pieces:

Step 1: Install the libraries

I use font-awesome, material design icons and core-js in my project:

npm install --save font-awesome mdi core-js

Step 2: Create a task that copies the right files into the public area

Here is the code snippet for copying the files to the right place:

var eslint = require('gulp-eslint'),
    gulp = require('gulp'),
    gutil = require('gulp-util'),
    webpack = require('webpack'),
    webpackConfig = require('./webpack.config.js');

var files = {
    client: [ 'client/**/*.js', 'client/**/*.jsx' ],
    server: [ 'server/**/*.js' ],
    libraries: [
        './node_modules/font-awesome/@(css|fonts)/*',
        './node_modules/mdi/@(css|fonts)/*',
        './node_modules/core-js/client/*'
    ]
};
var destination = './public';

gulp.task('libraries:copy', function () {
    return gulp.src(files.libraries, { base: './node_modules' })
        .pipe(gulp.dest(destination));
});

Note that I’m not interested in copying all the files from the packages into my web area. In general, the package contains much more than you need. For example, font-awesome contains less and sass files – not really needed in my project. Take a look at what comes along with the package and only copy what you need. You can find out about the syntax of the filename glob by reading the Glob primer in node-glob.

Step 3: Update your configuration to specify the locations of the libraries.

I added the following to the config/default.json:

{
    "port": 3000,
    "env": "development",
    "base": "/",
    "library": {
        "core-js": "//cdnjs.cloudflare.com/ajax/libs/core-js/2.0.2/core.min.js",
        "mdi": "//cdn.materialdesignicons.com/1.4.57/css/materialdesignicons.min.css",
        "font-awesome": "//maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css"
    }
}

The library block specifies the normal locations of the libraries. In this case, they are all out on the Internet on a CDN somewhere. In my config/development.json file, I specify their new locations:

{
    "env": "development",
    "base": "https://grumpy-wizards.azurewebsites.net/",
    "library": {
        "core-js": "core-js/client/core.min.js",
        "mdi": "mdi/css/materialdesignicons.min.css",
        "font-awesome": "font-awesome/css/font-awesome.min.css"
    }
}

When I import the config, I can read the library location with config.get('library.core-js'); (or whatever the library is).

Step 4: Update the home page configuration

In server/static/index.js, I have a nice function for loading a HTML file. I want to replace the libraries as I do the env and base configuration:

function loadHtmlFile(filename) {
    var contents = '', file = path.join(__dirname, filename);
    // Cache the substituted contents so each file is only read and processed once
    if (!Object.prototype.hasOwnProperty.call(fileContents, filename)) {
        contents = fs.readFileSync(file, 'utf8'); // eslint-disable-line no-sync
        fileContents[filename] = contents
            .replace(/\$\{config.base\}/g, config.get('base'))
            .replace(/\$\{config.env\}/g, config.get('env'))
            .replace(/\$\{config.library.font-awesome}/g, config.get('library.font-awesome'))
            .replace(/\$\{config.library.mdi}/g, config.get('library.mdi'))
            .replace(/\$\{config.library.core-js}/g, config.get('library.core-js'))
            ;
    }
    return fileContents[filename];
}
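
For completeness, that function gets wired up to the route that serves the home page – roughly like this (the router variable and route path are illustrative; the actual wiring in server/static/index.js may differ):

// Sketch: serving the substituted index.html through an Express router
router.get('/', function (req, res) {
    res.type('text/html').send(loadHtmlFile('index.html'));
});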

I’ve got a relatively small number of libraries, so the overhead of a templating engine is not worth it right now. However, if I grew the number of libraries more, I’d probably switch this over to a template engine like EJS. I also need to update my index.html file to match:

<!DOCTYPE html>
<html>

<head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Grumpy Wizards</title>
    <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700"/>
    <link rel="stylesheet" href="${config.library.mdi}"/>
    <link rel="stylesheet" href="${config.library.font-awesome}">
</head>

<body>
    <div id="pageview"></div>

    <script>
        window.GRUMPYWIZARDS = {
            env: '${config.env}',
            base: '${config.base}'
        };
    </script>
    <script src="${config.library.core-js}"></script>
    <script src="vendor.bundle.js"></script>
    <script src="grumpywizards.js"></script>
</body>
</html>

Step 5: Update package.json to copy the libraries to the right place before running nodemon

I added a new script to my package.json:

  "scripts": {
    "build": "gulp build",
    "prenodemon": "gulp libraries:copy",
    "nodemon": "nodemon --watch server ./server.js",
    "start": "node ./server.js"
  },

With all this done, I now have two modes:

  • In development mode, run by npm run nodemon, I copy the libraries to the right place and then serve those libraries locally
  • In production mode, run by NODE_ENV=production npm start, I serve the libraries from a CDN, saving my bandwidth

If I change the libraries that are copied into the public area, I will have to stop and restart the server. That is a relatively rare thing (I only have three libraries), so I’m willing to make that a part of my workflow when it happens.

As always, grab the latest source from my GitHub Repository.

Splitting Vendor and App Javascript Files with Webpack

It’s been a couple of weeks since I last messed with my Webpack configuration as I was learning Redux. I’ve noticed, however, that my code is growing ever larger. Part of that is the sheer number of libraries I have – the full list appears in the vendor entry point below – and that is probably fairly common.

If the webpack numbers are to be believed, those libraries account for approximately 90% of the code base. The real kicker is that they rarely change – certainly not as much as my main codebase during development. Even when my application is in production, I can place these libraries on a CDN. The question is, of course, how do I build two JavaScript webpack bundles? I want one for the libraries I use and one for my code.

It turns out it is relatively simple.

Step 1: Update your webpack.config.js to support vendor bundles

Here is my new webpack.config.js, with the relevant additions in the entry and plugins sections:

'use strict';

/* global __dirname */
var config = require('config'),
    path = require('path'),
    webpack = require('webpack');

var jsxLoader = (config.get('env') === 'development') ? 'react-hot!babel' : 'babel';

var configuration = {
    devtool: 'source-map',
    entry: {
        app: [ path.join(__dirname, 'client/app.jsx') ],
        vendor: [
            'history',
            'isomorphic-fetch',
            'material-ui',
            'md5',
            'radium',
            'react',
            'react-dom',
            'react-redux',
            'react-router',
            'react-tap-event-plugin',
            'redux',
            'redux-logger',
            'redux-promise',
            'redux-thunk'
        ]
    },
    module: {
        loaders: [
            // JavaScript and React JSX Files
            { test: /\.jsx?$/, loader: jsxLoader, exclude: /node_modules/ },
            { test: /\.jsx?$/, loader: 'eslint', exclude: /node_modules/ },
        ]
    },
    output: {
        path: path.join(__dirname, 'public'),
        publicPath: '/',
        filename: 'grumpywizards.js'
    },
    plugins: [
        new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.bundle.js'),
        new webpack.optimize.UglifyJsPlugin({ mangle: false, compress: { warnings: false }}),
        new webpack.NoErrorsPlugin(),
        new webpack.DefinePlugin({ 'process.env.NODE_ENV': `"${config.env}"` })
    ],
    resolve: {
        modulesDirectories: [ 'node_modules' ],
        extensions: [ '', '.js', '.jsx' ]
    },
    target: 'web',

    // Loader options
    eslint: {
        failOnWarning: false,
        failOnError: true
    }
};

if (config.env === 'development') {
    configuration.entry.app.unshift(
        'webpack/hot/dev-server',
        'webpack-hot-middleware/client'
    );
    configuration.plugins.push(new webpack.HotModuleReplacementPlugin());
}

module.exports = configuration;

The entry object is the new part. The old entry was just an array, which is equivalent to { app: <array> }. In addition to the app entry point, I’ve created a vendor entry point listing the libraries I want bundled into the vendor bundle file. The CommonsChunkPlugin line in the plugins section does the actual bundling of the vendor bundle – the app bundle is handled just like before.

Note that I had to change my usage of material-ui. I included specific sub-components like this before:

import { Card, CardHeader } from 'material-ui/lib/card';

Material-UI exports everything from the main library entrypoint as well as through sub-modules, so this import becomes:

import { Card, CardHeader } from 'material-ui'

It does mean that the entire material-ui library will be included in the vendor bundle, but the bundle is easier to maintain.

Finally, because we moved the app into an object, the update for the development environment needs to be adjusted to ensure the dev server and hot module loading middleware are added properly.

If you run webpack -p (or npm run build if you have it configured) then you will see two bundles created. The new bundle is vendor.bundle.js. Note the sizes of the two bundles. The vendor.bundle.js is probably a significant part of your app.

Step 2: Add the vendor bundle to your index.html

Now that I have a vendor bundle, I need to load it. Since my code is written in ES2015, I need to load it after the existing core-js library and before my app code:

<!DOCTYPE html>
<html>

<head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Grumpy Wizards</title>
    <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700"/>
    <link rel="stylesheet" href="//cdn.materialdesignicons.com/1.4.57/css/materialdesignicons.min.css"/>
    <link rel="stylesheet" href="//maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css">
</head>

<body>
    <div id="pageview"></div>

    <script>
        window.GRUMPYWIZARDS = {
            env: '${config.env}',
            base: '${config.base}'
        };
    </script>
    <script src="//cdnjs.cloudflare.com/ajax/libs/core-js/2.0.2/core.min.js"></script>
    <script src="vendor.bundle.js"></script>
    <script src="grumpywizards.js"></script>
</body>

</html>

Since I’m using serve-static to serve the JavaScript bundles from my NodeJS server, the vendor.bundle.js file won’t be re-loaded unless I actually change it. This will make reloading quicker and will ultimately be better for my users as well.

As always, you can find the code on my GitHub Repository.

Hot Reloading with Webpack

I’ve created an awesome webpack configuration over the last few posts. I am handling both ES2015 compilation and SCSS stylesheet loading, together with CDN support for libraries and linting of the source files and stylesheets. However, there is still a tooling issue. I need to stop the server and rebuild every time I make a change to anything. I’d really rather this happen automatically, so that the public files are built for me as soon as I save a file.

There happen to be two mechanisms of doing this – the “no-code” way and a mechanism that involves code. The idea is that I won’t be restarting anything at the end of this blog post – it will all happen for me.

The No Code Way of Rebuilding

My application is split into two parts – a server side built on top of ExpressJS and a client side that is loaded from the ExpressJS server. I need to deal with both of these elements. Building the public files automatically is easily done with Webpack and is a great first step. Here it is:

webpack --watch

I’ve got this established in my package.json file:

  "scripts": {
    "clean": "rimraf public",
    "pretest": "eslint server/src/**/*.js",
    "test": "mocha --require ignore-styles --recursive --compilers js:babel-register --reporter spec server/test",
    "watch": "webpack --watch",
    "prestart": "webpack -p",
    "start": "node ./bin/www"
  },

I open up two windows to use this mechanism. In the first window, I run npm start – this runs my server and builds the initial files. Once my server is listening for connections, I switch to my other window and run npm run watch. The Webpack process runs whenever I save a file in my editor. I can see the lint output straight away and change things. Then, when I’m ready to test my code, I can reload the browser and be testing straight away.

Of course, this doesn’t help me when I’m changing the server. I’m going to look at another utility for that functionality: nodemon. I can alter my watch command like this:

  "scripts": {
    "clean": "rimraf public",
    "pretest": "eslint server/src/**/*.js",
    "test": "mocha --require ignore-styles --recursive --compilers js:babel-register --reporter spec server/test",
    "watch": "webpack --watch",
    "nodemon": "nodemon --watch server --watch config --watch bin ./bin/www",
    "prestart": "webpack -p",
    "start": "node ./bin/www"
  },

In this case, nodemon is going to restart my server when anything used by the server changes – that includes three directories for me. That also means that the server will restart if I am changing the tests under server. However, I don’t expect to be doing live tests and mocha tests at the same time. I still need two windows. In the server window, I’m going to run npm run nodemon.

Something a little more complex…

The above mechanism is a good solution, but it does have its drawbacks. Firstly, I have to reload my page to see changes; I’m developing a single-page application, so that means restarting the application just to check a stylesheet change – I would much rather have just the stylesheet reloaded. Secondly, it takes two windows. I’d rather the watching and reloading happen within the server process during development and then be served normally afterwards. This requires some code changes.

Firstly, let’s talk about NODE_ENV. This is an environment variable that you can use to decide what to do. I have configured my config directory to bring it into my configuration. My config/default.json now looks like this:

{
    "port": 3000,
    "env": "development",
    "auth": {
        "clientid": "NOT-SET",
        "secret": "NOT-SET",
        "domain": "NOT-SET"
    }
}

My config/custom-environment-variables.json is also changed to map NODE_ENV to the proper place:

{
    "port": "PORT",
    "env": "NODE_ENV",
    "auth": {
        "clientid": "AUTH0_CLIENTID",
        "secret": "AUTH0_CLIENTSECRET",
        "domain": "AUTH0_DOMAIN"
    }
}
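
With that mapping in place, any module can read the effective environment through the config package – a minimal sketch (nothing project-specific here beyond the keys defined above):

// Sketch: NODE_ENV=production overrides the "development" default
// from config/default.json via custom-environment-variables.json.
var config = require('config');

var isDevelopment = config.get('env') === 'development';
console.log('Running in ' + config.get('env') + ' mode on port ' + config.get('port'));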

On to the webpack.config.js. I needed to make some changes to the output and plugins sections:

    output: {
        path: path.join(__dirname, 'public'),
        publicPath: '/',
        filename: '[name].js'
    },
    plugins: [
        new ExtractTextPlugin('grumpywizards.css')
    ],

I’ve moved the directory in which the files are stored into the output.path property, which means removing that directory prefix from output.filename and from the ExtractTextPlugin filename. In addition, I’ve defined the output.publicPath property – this is the location on the server from which the files will be served.

Once you have done these changes, you should be able to run npm run prestart and the files will be generated in the same place as normal. Nothing has changed yet – just moving around some paths. Now for the magic.

In the server/src/app.js, I’m using the following call to set up /public as my static file area:

        app.use(staticFiles('public', {
            dotfiles: 'ignore',
            etag: true,
            index: false,
            lastModified: true
        }));

The plan is to replace this with webpack middleware (the webpack-dev-middleware package) that will serve up the same files but rebuild them when they change. However, I only want this to happen in development. In production, I want the original static file serving to happen:

    if (config.env === 'development') {
        var compiler = webpack(webpackConfig);

        app.use(devServer(compiler, {
            publicPath: webpackConfig.output.publicPath || '/'
        }));
    } else {
        app.use(staticFiles('public', {
            dotfiles: 'ignore',
            etag: true,
            index: false,
            lastModified: true
        }));
    }
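
For reference, that snippet assumes a handful of requires near the top of server/src/app.js – the dev server is the webpack-dev-middleware package (the variable names are mine; the relative path to webpack.config.js is illustrative):

// Assumed requires for the development/production branch above
var webpack = require('webpack');
var devServer = require('webpack-dev-middleware');        // serves the bundles from memory
var staticFiles = require('serve-static');                // used for the production branch
var webpackConfig = require('../../webpack.config.js');   // path is illustrative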

Save that and run npm run nodemon, then watch the log for output like the following:

Child extract-text-webpack-plugin:
    chunk    {0} extract-text-webpack-plugin-output-filename 5.38 kB [rendered]
        [0] ./~/css-loader?sourceMap!./~/postcss-loader!./~/sass-loader?sourceMap!./~/stylelint-loader!./client/src/components/Page.scss 3.87 kB {0} [built]
        [1] ./~/css-loader/lib/css-base.js 1.51 kB {0} [built]
webpack: bundle is now VALID.

Once webpack reports that the bundle is VALID, you can load the site in your web browser.

I only need one window with this new configuration – the second window is gone and I have one command to run when I want to start the server. However, I’m still reloading the page to bring in the changes.

Hot Reloading

This is where it gets interesting. There are facilities within webpack to actually notify the browser when files have changed. This functionality is called Hot Module Reloading, or HMR for short. HMR requires changes to your webpack configuration and additional modules to work. This took a little time to master, so here is the recipe. Firstly, you need to make some changes to your webpack.config.js:

var webpack = require('webpack');

module.exports = {
    devtool: 'source-map',
    entry: [
        'webpack/hot/dev-server',
        'webpack-hot-middleware/client',
        path.join(__dirname, 'client/src/app.jsx')
    ],
    output: {
        path: path.join(__dirname, 'public'),
        publicPath: '/',
        filename: 'grumpywizards.js'
    },
    plugins: [
        new webpack.HotModuleReplacementPlugin(),
        new ExtractTextPlugin('grumpywizards.css')
    ],

I’ve added the definitions for the webpack hot middleware client to my entry. The entry becomes an array now. That means that the [name] tag in the output.filename property is meaningless, so I’ve replaced it with an explicit filename. Finally, I’ve added the HotModuleReplacementPlugin to the list of plugins.

Now, onto the server/src/app.js file:

        var compiler = webpack(webpackConfig);

        app.use(devServer(compiler, {
            publicPath: webpackConfig.output.publicPath || '/',
            stats: { colors: true }
        }));
        app.use(hotServer(compiler, {
            log: console.log
        }));

There is a second piece of middleware here – hotServer – which is provided by the npm package webpack-hot-middleware. Don’t try to use Winston in place of console.log here – it doesn’t work.

Once you have done this and restarted your server from scratch, you will note that there are some messages on the browser console that are interesting:

[HMR] Waiting for update signal from WDS...
[HMR] connected
client.js:106 [HMR] bundle rebuilding
client.js:108 [HMR] bundle rebuilt in 408ms
process-update.js:25 [HMR] Checking for updates on the server...
process-update.js:59 [HMR] The following modules couldn't be hot updated: (Full reload needed)
process-update.js:64 [HMR]  - ./client/src/components/Header.scss

This tells you that hot module reloading is working, but the extracted stylesheets cannot be hot updated – a full reload is still needed for CSS changes. Not to worry – we can use that NODE_ENV environment variable to turn off the ExtractTextPlugin when NODE_ENV is development. We can also turn off hot reloading when NODE_ENV is not development. This is possible because webpack.config.js is just another JavaScript file.
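
A minimal sketch of that conditional logic (the exact shape in my repository differs, but the idea is the same – pick the style handling and plugins based on the environment):

// Sketch: webpack.config.js choosing loaders/plugins from NODE_ENV via node-config
var config = require('config'),
    webpack = require('webpack'),
    ExtractTextPlugin = require('extract-text-webpack-plugin');

var isDevelopment = config.get('env') === 'development';

var scssLoader = isDevelopment
    ? 'style!css?sourceMap!sass?sourceMap'                                 // hot-reloadable styles
    : ExtractTextPlugin.extract('style', 'css?sourceMap!sass?sourceMap'); // combined CSS file

var plugins = isDevelopment
    ? [ new webpack.HotModuleReplacementPlugin() ]
    : [ new ExtractTextPlugin('grumpywizards.css') ];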

I’ve done some other optimizations around the code as well. For those that want to know:

  • I’ve separated out the index page and made the loading of the CSS optional based on the NODE_ENV
  • I’ve separated out the static pages into their own module
  • I’ve also adjusted the webpack.config.js to set up the right configuration based on NODE_ENV

Between all this, I am now hot-reloading all my files in development, but still serving up a combined CSS file in production. Check out the work in my GitHub Repository.

Integrating Stylesheet Linting with Webpack

I converted my eslint setup to Webpack in the last article, so I figured it was time to go all the way and integrate my full lint and test environment into the Webpack workflow. I’m not sure about test (since my test files are completely separate), but linting can be incorporated easily. Since I’d already done eslint, sass-lint incorporation was next.

Unfortunately, herein lie problems. Firstly, sass-lint has issues – most notably, it has problems with certain rules around indentation that I happen to like. So, what are the alternatives? Well, there is scss-lint, which seems to be unmaintained (one of my requirements for using a library is that it is actively maintained), and there is stylelint. I was very satisfied with stylelint from a command-line point of view – it corrected all of the issues that I had with sass-lint.

Stylelint didn’t have a loader. Fortunately, those are relatively easy to write, so I wrote one. You can find it published on npmjs.com.

Now that I have the loader, how do I use it? I need to install it from npmjs.com:

npm install --save-dev stylelint-loader stylelint

My loader has stylelint as a peerDependency. Peer dependencies have to be installed separately, so I am installing stylelint at the same time. This allows you to use an alternate version of stylelint – you aren’t tied to the version provided with the loader.

Onto configuration. As with all things Webpack, I am configuring this within the webpack.config.js file:

    module: {
        preLoaders: [
            // Javascript
            { test: /\.jsx?$/, loader: 'eslint', exclude: /node_modules/ },
            // Stylesheets
            { test: /\.s(a|c)ss$/, loader: 'stylelint' }
        ],
        loaders: [
            // Javascript
            { test: /\.jsx?$/, loader: 'babel', exclude: /node_modules/ },
            // Stylesheets
            { test: /\.css$/, loader: ExtractTextPlugin.extract( 'style', 'css?sourceMap') },
            { test: /\.s(a|c)ss$/, loader: ExtractTextPlugin.extract( 'style', 'css?sourceMap!sass?sourceMap') },
            // Font Definitions
            { test: /\.svg$/, loader: 'url?limit=65000&mimetype=image/svg+xml&name=public/fonts/[name].[ext]' },
            { test: /\.woff$/, loader: 'url?limit=65000&mimetype=application/font-woff&name=public/fonts/[name].[ext]' },
            { test: /\.woff2$/, loader: 'url?limit=65000&mimetype=application/font-woff2&name=public/fonts/[name].[ext]' },
            { test: /\.[ot]tf$/, loader: 'url?limit=65000&mimetype=application/octet-stream&name=public/fonts/[name].[ext]' },
            { test: /\.eot$/, loader: 'url?limit=65000&mimetype=application/vnd.ms-fontobject&name=public/fonts/[name].[ext]' }
        ]
    },

This should be familiar syntax by now – the preloader doesn’t actually do any bundling, so it’s an ideal place to put linters. I need to configure my stylelint-loader – it requires me to specify the Stylelint configuration file location so that it can be passed into the library:

    stylelint: {
        configFile: path.join(__dirname, './.stylelint.config.js')
    },

One of the nice things about stylelint is that they also publish standard configurations to npm. So I can do this:

npm install --save-dev stylelint-config-suitcss

And then include this with some overrides in my .stylelint.config.js file:

module.exports = {
    extends: [
        'stylelint-config-suitcss'
    ],
    rules: {
        'function-url-quotes': 'single',
        'indentation': [ 4, { warn: true } ],
        'string-quotes': 'single'
    }
};

Adding linting was relatively straightforward since I’d done most of the hard work when I did eslint. As always, find the code on my GitHub Repository. Both eslint and stylelint are run automatically when my project is built, allowing for an instant check on coding style and common errors.

Using eslint with Webpack

This morning I realized my eslint configuration was not running. I had set up the wrong command in the pretest script of my package.json, and as a result none of the files were being run through eslint – it was just silently failing. My old gulp configuration had a gulp.src() definition that used globbing to find the files, so eslint was always fed a list of files rather than having to do a recursive search. I discovered this quite by accident: I had added a stage-1 proposal for static class properties to my BabelJS configuration and had not updated my eslint configuration to compensate. I expected an error – I didn’t get one. What doesn’t happen is relevant too.

Wouldn’t it be nice if Webpack could do some pre-tests on my files prior to loading them?

Well, it turns out that this is possible. Like many things in Webpack, it turned out to be really easy. First of all, I needed a new loader for eslint:

npm install --save-dev eslint-loader

This should be fairly common now. There is an extensive list of pluggable loaders available for Webpack, just like there are plugins for grunt, gulp and browserify. It seems every single tool must be extensible these days – something I really like about the ecosystem. Now, on to the configuration within webpack.config.js:

var ExtractTextPlugin = require('extract-text-webpack-plugin');

module.exports = {
    entry: {
        grumpywizards: './client/src/app.jsx'
    },
    devtool: 'source-map',
    module: {
        preLoaders: [
            // Javascript
            { test: /\.jsx?$/, loader: 'eslint', exclude: /node_modules/ }
        ],
        loaders: [
            // Javascript
            { test: /\.jsx?$/, loader: 'babel', exclude: /node_modules/ },
            // Stylesheets
            { test: /\.css$/, loader: ExtractTextPlugin.extract( 'style', 'css?sourceMap') },
            { test: /\.scss$/, loader: ExtractTextPlugin.extract( 'style', 'css?sourceMap!sass?sourceMap') },
            // Font Definitions
            { test: /\.svg$/, loader: 'url?limit=65000&mimetype=image/svg+xml&name=public/fonts/[name].[ext]' },
            { test: /\.woff$/, loader: 'url?limit=65000&mimetype=application/font-woff&name=public/fonts/[name].[ext]' },
            { test: /\.woff2$/, loader: 'url?limit=65000&mimetype=application/font-woff2&name=public/fonts/[name].[ext]' },
            { test: /\.[ot]tf$/, loader: 'url?limit=65000&mimetype=application/octet-stream&name=public/fonts/[name].[ext]' },
            { test: /\.eot$/, loader: 'url?limit=65000&mimetype=application/vnd.ms-fontobject&name=public/fonts/[name].[ext]' }
        ]
    },
    externals: {
        'react': 'React',
        'react-dom': 'ReactDOM'
    },
    output: {
        filename: 'public/[name].js'
    },
    eslint: {
        failOnWarning: false,
        failOnError: true
    },
    sassLoader: {
        includePaths: [ 'client/style' ]
    },
    plugins: [
        new ExtractTextPlugin('public/grumpywizards.css')
    ]
};

The preLoaders array is the new section – it is still a list of loaders, but they are run before the loaders section. There is another section called postLoaders – you can guess what that does. I also needed to configure what happens when eslint fails. By default, things just carry on. I want the build to carry on if there is a warning (the warnings are still printed), but an error should stop the build. Unfortunately, loaders operate on a per-file basis, so this is really “stop on the first error”.
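
As a hypothetical illustration of a postLoader (this project does not use one), coverage instrumentation is a common case – it runs over the output of the normal loaders:

    module: {
        // ...preLoaders and loaders as above...
        postLoaders: [
            // Instrument the already-transpiled code for coverage reporting;
            // assumes the istanbul-instrumenter-loader package is installed.
            { test: /\.jsx?$/, loader: 'istanbul-instrumenter', exclude: /node_modules|test/ }
        ]
    },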

There was one other change to support Webpack I needed to make – to my .eslintrc.js. I use require() to bring in the stylesheets within my JSX files. I could switch them all over to use import (the ES6 method) but I like the require method as it puts a visual distinction for me within my code. I needed to add the commonjs environment to the .eslintrc.js file in client/src:

var OFF = 0, WARN = 1, ERROR = 2;

module.exports = exports = {
    env: {
        'es6': true,        // We are writing ES6 code
        'browser': true,    // for the browser
        'commonjs': true    // and use require() for stylesheets
    },
    ecmaFeatures: {
        'jsx': true,
        'modules': true
    },
    plugins: [
        'react'
    ],

Don’t forget to also set up your eslint settings for stage-1 proposals.
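
For the stage-1 static class properties in particular, the stock eslint parser cannot handle the syntax; the usual fix is to point eslint at the babel-eslint parser (installed with npm install --save-dev babel-eslint) – a sketch of the addition to .eslintrc.js:

module.exports = exports = {
    // Use the Babel parser so stage-1 syntax (static class properties, etc.)
    // does not trip up eslint. Assumes babel-eslint is installed.
    parser: 'babel-eslint',

    // ...env, ecmaFeatures and plugins as shown above...
};

You can get the code so far from my GitHub Repository.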