React, Redux and Routing

In the process of building my application, I’ve ended up with a curious combination. My React application uses react-router to handle page transitions within the single-page application. It does this by putting the currently displayed page in the hash of the location. As a result, the URLs end up looking like http://localhost:3000/#/home?q=982hihu.

However, I’m using Redux as my application state store. The idea is that all my application state is in the state store. Except for the react-router state – that is stored in the location hash.

This is obviously less than ideal. I want my application state to be in the Redux store – not in other places. More importantly, this is something that others have had to deal with as well, and there are packages out there that handle this situation. I’m going to integrate react-router with Redux using react-router-redux.

Step 1: Update the Store

First off, install the react-router-redux package:

npm install --save-dev react-router-redux

Now update the client/redux/store.js with the code necessary to integrate the library:

import { createStore, combineReducers, applyMiddleware } from 'redux';
import createLogger from 'redux-logger';
import thunkMiddleware from 'redux-thunk';
import promiseMiddleware from 'redux-promise';
import { createHistory } from 'history';
import { syncHistory, routeReducer } from 'react-router-redux';

import { requestAuthInfo } from './redux/actions/auth';
import * as reducers from './redux/reducers';

// Combine the reducers for the store
const appReducers = combineReducers({
    ...reducers,
    routing: routeReducer
});

// Move the react-router stuff into Redux
export const history = createHistory();
const reduxRouterMiddleware = syncHistory(history);

// Apply all the middleware for Redux
const middlewares = applyMiddleware(
    reduxRouterMiddleware,
    thunkMiddleware,
    promiseMiddleware,
    createLogger()
);

// Create the Redux Store
export const store = createStore(appReducers, middlewares);
// reduxRouterMiddleware.listenForReplays(store);

// Dispatch the initial action
store.dispatch(requestAuthInfo());

There are a couple of wrinkles here. Firstly, you may not be familiar with using spread operators to combine objects. Let’s say I have the following:

const a = { b: 1, c: 2 };
const d = {
    ...a,
    d: 3
};
// d is { b: 1, c: 2, d: 3 }

This is an excellent shorthand for merging objects. The result is that I have three reducers – the two I provide and the routeReducer that react-router-redux provides.
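
One note: object spread in object literals is a Babel-supported proposal rather than core ES2015. If your Babel configuration doesn’t include it, Object.assign – which is part of ES2015 proper – performs the same merge:

const a = { b: 1, c: 2 };
const d = Object.assign({}, a, { d: 3 });
// d is { b: 1, c: 2, d: 3 }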

The second wrinkle is that I need to link the history object to both the Router and the store. It’s the go-between for the two pieces.

After those two wrinkles, the store setup is fairly simple. I intend to set up replay at some point, so I’ve left a commented-out reminder to listen for replays on the store. It’s not needed right now.

Step 2: Update the Router

My main app.jsx file contains the router. I’ve updated it to listen to the exported history object from the store:

import React from 'react';
import ReactDOM from 'react-dom';
import injectTapEventPlugin from 'react-tap-event-plugin';
import { Provider } from 'react-redux';
import { Router } from 'react-router';
import { StyleRoot } from 'radium';
import AppRoutes from './pages';
import { store, history } from './redux/store';

// Needed for onTouchTap - Can go away when react 1.0 release
// Check this repo: https://github.com/zilverline/react-tap-event-plugin
injectTapEventPlugin();

let pageStyle = {
    bottom: 0,
    left: 0,
    margin: 0,
    padding: 0,
    position: 'fixed',
    right: 0,
    top: 0
};

let onUpdate = () => { window.scrollTo(0, 0); };

// render the page
ReactDOM.render(
    <Provider store={store}>
        <StyleRoot style={pageStyle}>
            <Router history={history} onUpdate={onUpdate}>
                {AppRoutes}
            </Router>
        </StyleRoot>
    </Provider>,
    document.getElementById('pageview')
);

This code is fairly simple. The import near the top brings in both the store and the history that we export from the redux/store.js module, and the history prop on the Router links the router to our Redux-enabled history object.

Step 3: Update the routing actions

Normally, one would use the Link component from react-router to update the route. With Redux in the mix, one dispatches an action to update the routing state instead. Here is the Home.jsx component, which shows the general form of the new linkage:

import Radium from 'radium';
import React from 'react';
import { connect } from 'react-redux';
import { routeActions } from 'react-router-redux';

@Radium
class Home extends React.Component {
    /**
     * React property types
     * @type {Object}
     * @readonly
     */
    static propTypes = {
        dispatch: React.PropTypes.func.isRequired,
        phase: React.PropTypes.string.isRequired
    };

    /**
     * Render the component
     * @returns {JSX.Element} the rendered component
     * @overrides React.Component#render
     */
    render() {
        const dispatch = this.props.dispatch;
        const page1 = () => { return dispatch(routeActions.push('/page1')); };

        return (
            <div id="homePage">
                <h1>{'Home'}</h1>
                <ul>
                    <li><button onClick={page1}>Page 1</button></li>
                </ul>
            </div>
        );
    }
}

/*
** Link the Home component to the Redux store
*/
export default connect(
    (state) => {
        return {
            phase: state.auth.phase
        };
    })(Home);

The routeActions import brings in a set of action creators provided by the react-router-redux library. The main action creator here is push, which pushes the requested route onto the history and hence changes the page you are on.
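
push isn’t the only action creator in the set – the others mirror the history API. A rough sketch, dispatching through the store exported from redux/store.js (the '/search' path and query string are hypothetical, purely for illustration):

import { routeActions } from 'react-router-redux';
import { store } from './redux/store';   // path depends on where this file lives

store.dispatch(routeActions.push('/page1'));       // push a new entry and navigate
store.dispatch(routeActions.replace('/home'));     // replace the current entry
store.dispatch(routeActions.push({ pathname: '/search', search: '?q=widgets' }));
store.dispatch(routeActions.goBack());             // same as the browser back button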

A Small Issue – the page route

One of the small issues I had was with the reload case. Let’s say I go to http://localhost:3000 – this loads my application, exactly as expected. Then I click through to the Home page – this changes the URL to http://localhost:3000/home and the home page is displayed. If I then click the reload button, I get an error message.

The problem is that the route (without the hash) is sent to the server and the server does not know how to handle it. Basically, I need to ensure that any unknown routes cause the server to load the initial page. This is done in my server/static/index.js module – this module serves my home page:

var config = require('config');
var express = require('express');
var fs = require('fs');
var path = require('path');
var serveStatic = require(`./static.${config.get('env')}.js`);

var fileContents = {};
var router = express.Router(); // eslint-disable-line new-cap

/**
 * Load the specified HTML file, caching the contents
 * in the fileContents object.
 * @param {string} filename the file name to load
 * @returns {string} the contents of the file
 */
function loadHtmlFile(filename) {
    var contents = '', file = path.join(__dirname, filename);
    if (!Object.prototype.hasOwnProperty.call(fileContents, filename)) {
        contents = fs.readFileSync(file, 'utf8'); // eslint-disable-line no-sync
        fileContents[filename] = contents
            .replace(/\$\{config.base\}/g, config.get('base'))
            .replace(/\$\{config.env\}/g, config.get('env'))
            .replace(/\$\{config.library.font-awesome}/g, config.get('library.font-awesome'))
            .replace(/\$\{config.library.mdi}/g, config.get('library.mdi'))
            .replace(/\$\{config.library.core-js}/g, config.get('library.core-js'))
            ;
    }
    return fileContents[filename];
}

// Home Page
router.get('/', (request, response) => {
    return response
        .status(200)
        .type('text/html')
        .send(loadHtmlFile('index.html'));
});

// Service static files - different for dev vs. prod
serveStatic(router);

// Ensure that anything not routed is captured here
router.get(/.*/, (request, response) => {
    return response
        .status(200)
        .type('text/html')
        .send(loadHtmlFile('index.html'));
});

module.exports = router;

The catch-all route at the bottom gets called AFTER everything else. This means that any request that hasn’t already been responded to will return the home page. Of course, this also means that if you go to http://localhost:3000/foo (or any other page that doesn’t have a client-side route), then you will get a blank page unless you have a default route. So have a default route. You can do this easily in your route definition:

const routes = (
    <Route path="/" component={Chrome}>
        <IndexRoute component={Home}/>
        <Route path="home" component={Home}/>
        <Route path="page1" component={PageOne}/>
        <Redirect path="*" to="/" />
    </Route>
);

The Redirect needs to be the very last route. It basically says “anything that hasn’t already been matched, redirect to the home page”. Pretty cool.

Wrap Up

Another day – another React library. I like the modularity, but I’m thinking that there must be a Yeoman generator or something to enable a lot of this scaffolding code. I’m just cringing at adding yet another module or library to my code.

However, that’s reality in the React world. As always, my code is on my GitHub Repository.

Testing ExpressJS Web Services

Let’s say you have a web application written in NodeJS and you want to test it. What’s the best way to go about that? Fortunately, this is a common enough problem that there are modules and recipes to go along with it.

Separating Express from HTTP

ExpressJS contains syntactic sugar to implement a complete web service. You will commonly see code like this:

var express = require('express');

var app = express();
// Do some other stuff here
app.listen(3000);

Unfortunately, this means that you have to make real HTTP calls to test the API. That’s a problem because spinning up a server and going over the network doesn’t lend itself to fast, isolated tests. Fortunately, there is an easier way. It involves separating the Express application from the HTTP logic. First of all, let’s create a web-application.js file. Here is mine:

import bodyParser from 'body-parser';
import compression from 'compression';
import express from 'express';
import logCollector from 'express-winston';
import staticFiles from 'serve-static';

import logger from './lib/logger';
import apiRoute from './routes/api';

/**
 * Create a new web application
 * @param {boolean} [logging=true] - if true, then enable transaction logging
 * @returns {express.Application} an Express Application
 */
export default function webApplication(logging = true) {
    // Create a new web application
    let webApp = express();

    // Add in logging
    if (logging) {
        webApp.use(logCollector.logger({
            winstonInstance: logger,
            colorStatus: true,
            statusLevels: true
        }));
    }

    // Add in request/response middleware
    webApp.use(compression());
    webApp.use(bodyParser.urlencoded({ extended: true }));
    webApp.use(bodyParser.json());

    // Routers - Static Files
    webApp.use(staticFiles('wwwroot', {
        dotfiles: 'ignore',
        etag: true,
        index: 'index.html',
        lastModified: true
    }));

    // Routers - the /api route
    webApp.use('/api', apiRoute);

    // Default Error Logger - should be added after routers and before other error handlers
    webApp.use(logCollector.errorLogger({
        winstonInstance: logger
    }));

    return webApp;
}

Yes, it’s written in ES2015 – I do all my work in ES2015 right now. The export is a function that creates my web application. I’ve got a couple of extra modules – an API route (which is an ExpressJS Router object) and a logging module.

Note that I’ve provided a logging parameter to this function. Setting logging=false turns off the transaction logging. I want transaction logging when I am running this application in production. That same logging gets in the way of the test results display when I am running tests though. As a result, I want a method of turning it off when I am testing.
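
A sketch of how I’d flip that flag – the NODE_ENV check here is my own convention, not something the code above requires:

import webApplication from './web-application';

// Turn transaction logging off when running under the test runner
const logging = process.env.NODE_ENV !== 'test';
const webApp = webApplication(logging);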

I also have an http-server.js file that handles the HTTP logic:

import http from 'http';

import logger from './lib/logger';
import webApplication from './web-application';

// Create the Express application and store the port setting on it
let webApp = webApplication();
webApp.set('port', process.env.PORT || 3000);

logger.info('Booting Web Application');
let server = http.createServer(webApp);
server.on('error', (error) => {
    if (error.syscall !== 'listen') {
        throw error;
    }
    if (error.code) {
        logger.error(`Cannot listen for connections (${error.code}): ${error.message}`);
    }
    throw error;
});
server.on('listening', () => {
    let addr = server.address();
    logger.info(`Listening on ${addr.family}/(${addr.address}):${addr.port}`);
});
server.listen(webApp.get('port'));

This uses the Node.js HTTP module to create a web server and start listening on a TCP port. This is pretty much the same code that is used by ExpressJS when you call webApp.listen(). Finally, I have a server.js file that registers BabelJS as my ES2015 transpiler and runs the application:

require('babel-register');
require('./src/http-server');
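
As an aside, the webApp.listen() sugar I’m avoiding does essentially the same thing under the covers – Express 4 implements listen() roughly like this:

app.listen = function listen() {
    var server = http.createServer(this);
    return server.listen.apply(server, arguments);
};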

The Web Application Tests

I’ve placed all my source code in the src directory (except for the server.js file, which is in the project root). I’ve got another directory for testing called test. It has a mocha.opts file with the following contents:

--compilers js:babel-register

This automatically compiles all my tests from ES2015 using BabelJS prior to executing the tests. Now, for the web application tests:

/// <reference path="../../typings/mocha/mocha.d.ts"/>
/// <reference path="../../typings/chai/chai.d.ts"/>
import { expect } from 'chai';
import request from 'supertest';

import webApplication from '../src/web-application';

describe('src/web-application.js', () => {
    let webApp = webApplication(false);

    it('should export a get function', () => {
        expect(webApp.get).to.be.a('function');
    });

    it('should export a set function', () => {
        expect(webApp.set).to.be.a('function');
    });

    it('should provide a /api/settings route', (done) => {
        request(webApp)
            .get('/api/settings')
            .expect('Content-Type', /application\/json/)
            .expect(200)
            .end((err) => {
                if (err) {
                    return done(err);
                }
                done();
            });
    });
});

First note that I’m creating the web application by passing the logging parameter of false. This turns off the transaction logging. Set it to true to see what happens when you leave it on. You will be able to see quite quickly that the test results get drowned out by the transaction logging.

My http-server.js file relies on a webApp having a get/set function to store the port setting. As a result, the first thing I do is check to see whether those exist. If I update express and they decide to change the API on me, these tests will point that out.

The real meat is in the third test – the one that exercises the /api/settings route. This uses supertest – a web API testing facility that pretends to be the HTTP module from Node, listening on a port. You send requests into the webApp using supertest instead of the HTTP module. ExpressJS handles the request and sends the response back to supertest, and that allows you to check the response.

There are two parts to the test. The first is the construction of an actual request:

    request(webApp)
        .get('/api/settings')

Supertest uses superagent underneath to actually make the requests. Once you have linked in the ExpressJS application, you can send a GET, POST, DELETE or any other verb. DELETE is a special case because delete is a reserved word in JavaScript – use del() instead:

    request(webApp)
        .del('/tables/myTable/1')

You can add custom headers. For example, I do a bunch of work with azure-mobile-apps – I can test that with:

    request(webApp)
        .get('/tables/myTable/1')
        .set('ZUMO-API-VERSION', '2.0.0')

Check out the superagent documentation for more examples of the API.
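
As another sketch, a POST with a JSON body looks like this – the /api/items route and payload are hypothetical, just to show the shape of the call:

    request(webApp)
        .post('/api/items')
        .send({ name: 'widget' })
        .set('Accept', 'application/json')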

The second part of the request is the assertions. You can assert on anything – a specific header, the status code or body content. For example, you might want to assert on the status code:

    request(webApp).get('/api/settings')
        .expect(200)

You can also expect a body. For example:

    request(webApp).get('/index.html')
        .expect(/<html>/)

Note the use of the regular expression here. That pattern is really common. You can also check for a specific header:

    request(webApp).get('/index.html')
        .expect('X-My-Header', /value/);

Once you have your sequence of assertions, you need to end the request. Since superagent and supertest are asynchronous, you need to handle the test asynchronously as well. That involves taking a ‘done’ parameter in the test and calling it after the test is over. You pass a callback into the .end() method:

    request(webApp).get('/index.html')
        .expect('X-My-Header', /value/)
        .end((error) => {
            done(error);
        });

Wrapping up

The supertest module, when combined with mocha, allows you to run test suites without spinning up a server and that enables you to increase your test coverage of a web service to almost 100%. With this, I’ll now be able to test my entire API surface automatically.

MVC, MVVM and Frameworks

I’ve been writing a whole bunch about MVC architectures – client side and server side.  But I hit a problem.  You see, MVC and MVVM are pretty simple concepts.  Here is a typical diagram that I see when looking at MVC descriptions:

[Figure: the typical MVC diagram – a Controller that loads the Model and passes data to the View]

It’s nice and simple.  The controller loads the model and passes some form of data to the View.  The problem is this – where is the user and where is the data?  How do these actually interact?  This is actually a key point in understanding the architecture and the place that a framework – any framework – occupies in it.  I think the following is a much more representative architectural diagram:

[Figure: a more representative diagram – user, dispatcher, controller, adapter, model/view-model and view]

This makes more sense to me.  The user submits a request to a dispatcher.  The dispatcher decides which controller to pass the request to.  The controller asks the adapter to give it one or more models to complete the request.  In the case of MVVM, these models are transitioned into a View-Model, usually through some sort of data binding.  This new model (or view-model) is passed into the View rendering system, which renders the appropriate view and kicks it back to the core dispatcher so that the dispatcher can respond to the user.

It’s much messier than the plain MVC (or MVVM) design pattern.  However, it’s implementable, and I can see the pieces I need to build in order to achieve the desired web application.  This architecture can be implemented on the client side or the server side, and both sides have frameworks that assist.

Frameworks provide functionality that allows you to skip the boilerplate code that inevitably comes with writing such an architecture.  Most frameworks have some sort of dispatcher (normally called a router, though it does much more than that) and most have some sort of adapter logic (usually called an ORM, or Object-Relational Mapper).  In between, frameworks provide a pattern for controllers, models and views that can be used to enforce consistency across the application.

On the server side, I have two go-to languages – C# and JavaScript.  I use ASP.NET as my framework of choice on the C# server side.  I can map my diagram directly to ASP.NET:

  • ASP.NET provides the dispatcher, with the ability to configure a route map in its startup class.
  • The Controller class can be inherited to create custom controllers.
  • The Model is a plain-old class.
  • The View is handled by Razor syntax.
  • The Adapter is generally handled by Entity Framework.

For server-side JavaScript, the mapping is a little messier.  ExpressJS gives me the dispatcher (its router) and a view layer via pluggable template engines, but the rest of the pieces come from other libraries.

I haven’t really gotten into Models and Adapters, although I can see libraries such as Mongoose (for MongoDB) playing a part there.  However, there are Node/Express MVC frameworks out there – I want to investigate Locomotive and SailsJS at some point, for example.
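
To make that concrete, here is a minimal sketch of how the pieces line up in a bare ExpressJS application – the route, model and template names are hypothetical, purely for illustration:

var express = require('express');

var app = express();
app.set('view engine', 'ejs');    // the View rendering system (assumes ejs is installed)

// Model – a plain object standing in for whatever the Adapter would return
function getUser(id) {
    return { id: id, name: 'Example User' };
}

// Controller – asks for a model and hands a view-model to the view
function userController(request, response) {
    var user = getUser(request.params.id);
    response.render('user', { user: user });    // renders views/user.ejs
}

// Dispatcher – the router decides which controller handles the request
app.get('/users/:id', userController);

app.listen(3000);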

On the client side, things are definitely messier.  There are a host of different frameworks – Angular, Aurelia, Ember, Knockout, Meteor and React / Flux, along with many others.  I’ve found the TodoMVC site to have a good list of frameworks worth looking at.  Some of these are being upgraded to handle ES2015 syntax, some are already there and some are never going to be there.

One thing to note about frameworks.  They are all opinionated.  ASP.NET likes the controllers to be in a Controllers namespace.  Angular likes you to use directives.  Aurelia likes SystemJS and jspm.  Whatever it is, you need to know those opinions and how they will affect things.

The web isn’t the only place one can use frameworks.  The MVC architecture is not limited to web development – it shows up in applications of any complexity.  For example, you can see MVC in WinForms, mobile applications, Mac OS X applications and Linux applications.

I want my application to be rendered client-side, which means I need to take a look at client-side frameworks and put together a working list.

I’m not going to bother with Backbone, Meteor, Knockout or any other older or smaller framework.  This is my own time and I don’t want to spend a ton of time on investigation.  I pretty much know what Aurelia can provide.  To investigate the others, I needed a small site I could implement – something that wasn’t “just another task organizer” (TodoMVC).  To that end, I decided to create a three-page application.

  • Page 1 – the home page – will get loaded initially and contain a Polymer-based carousel
  • Page 2 – a linked page – will load data from the Internet (the Flickr example from the Aurelia app)
  • Page 3 – an authenticated page – will load data from the local server if the user is authenticated

In addition to the three pages, I’m going to ensure that the navigation is separated logically from the actual pages and that – where possible – the pages are written in ES2015.  I want separation of concerns, so I expect the models to be completely separate from the views and controllers.

Each version of the application will be implemented on top of a Node/ExpressJS server that serves up just what is needed.  In this way, I will be able to see the install specifics for each framework.  You can see my starter project on my GitHub Repository.  I hope you will enjoy this series of blog posts as I cover each framework.