Implementing a User Preferences Service in NodeJS with Azure Table Storage

One of the great things about cloud services is that you can take advantage of price-competitive solutions and use the best technology for the problem rather than shoehorning it into an existing solution. Take, for example, user preferences. I want to store a series of preferences for my applications on a per-user basis. I already have a SQL instance, so the obvious way to do this is to create a user preferences table and store a single row per user with a JSON blob that I can return to the user at any point.

Not so fast. Is a SQL service really the best way to store JSON documents? You aren’t taking advantage of any of the relational database functionality – it’s just a store. Why not use a NoSQL service in the cloud instead?

In this article, I’m going to implement a user preferences store in my NodeJS application based on Azure Table Storage – a simple NoSQL store.

The Code

My application is a NodeJS/ExpressJS application. I have authentication built in that provides a user object in req.user. To implement my API, I’m going to implement an ExpressJS Router that provides all the functionality I need. To link in a Router, I need to do the following in my main code:

webApp.use(staticFiles(config.webRoot, config.options.staticFiles));

// ... JSON Web Token parser and other API routes are mounted here ...

webApp.use('/api/userprefs', userprefs());

webApp.listen(config.webPort, function () {
  console.log('Listening on port ' + config.webPort);
});
The JSON Web Token parser runs earlier in the middleware chain and provides the req.user object (including the all-important user ID); the webApp.use('/api/userprefs', userprefs()) call links in my user preferences object. The userprefs is a function that is required like this:

var userprefs = require('./userprefs');

Now, on to the userprefs.js file – this is where the entire API is created:

var express = require('express'),
    bodyParser = require('body-parser'),
    azureStorage = require('azure-storage');

module.exports = exports = function () {
  var router = express.Router(),
      store = azureStorage.createTableService(),
      tableName = 'userprefs',
      partitionKey = 'P1';

  // Create the table if it does not already exist
  store.createTableIfNotExists(tableName, function (error, result, response) {
    if (error) {
      throw error;
    }
    if (result) {
      console.info('[userPrefs] Created new Azure Table Store');
    }
  });

The top of the file brings in an important dependency – azure-storage – the NodeJS client library for the Azure Storage service. I’ve followed the simple NodeJS library tutorial to create a store object, and I’ve named my table and chosen a partition key. I also create the table if it doesn’t exist.

  /*
   * Middleware for this router that does an authentication check - the
   * req.user MUST be set and it MUST contain an id - if it doesn't, then
   * we cannot access the specifics of the user preferences store
   */
  router.use(function authCheck(req, res, next) {
    if (!req.user || !req.user.id) {
      res.status(401).send('Unauthorized');
      return;
    }
    return next();
  });

The next step is to put in a simple authorization check – this goes before any of the routes in my router. Basic version – if I’m not authenticated, send a 401 Unauthorized status.
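The check can be exercised in isolation with stubbed req and res objects – a minimal sketch (the stub shapes and the stand-alone copy of authCheck here are illustrative, not the router code itself):

```javascript
// Stand-alone copy of the authCheck logic for illustration
function authCheck(req, res, next) {
  if (!req.user || !req.user.id) {
    res.status(401).send('Unauthorized');
    return;
  }
  next();
}

// Minimal stubs that record what the middleware does
var sent = [];
var res = {
  status: function (code) { this.code = code; return this; },
  send: function (body) { sent.push({ code: this.code, body: body }); }
};

authCheck({}, res, function () { sent.push('next'); });                         // no user -> 401
authCheck({ user: { id: 'user-1' } }, res, function () { sent.push('next'); }); // authenticated -> next()

console.log(JSON.stringify(sent)); // [{"code":401,"body":"Unauthorized"},"next"]
```

An anonymous request never reaches the routes below; an authenticated one falls through to next().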

  /*
   * Middleware - handles the GET /api/userprefs - sends an object.  If
   * the user does not have any user prefs, sends an empty object
   */
  router.get('/', function (req, res) {
    // Retrieve the object from the userprefs store
    store.retrieveEntity(tableName, partitionKey, req.user.id, function (err, result, response) {
      if (err && err.statusCode === 404) {
        console.info('[userPrefs] No Userprefs for user', req.user.id, ': sending empty object');
        res.status(200).send({});
        return;
      }

      if (err) {
        console.error('[userPrefs] Retrieval for user', req.user.id, 'failed: ', err);
        res.status(500).send('Internal Server Error (Storage)');
        return;
      }

      // Remove the system fields and ensure the content type is correct
      delete response.body.PartitionKey;
      delete response.body.RowKey;
      delete response.body.Timestamp;
      delete response.body['odata.metadata'];
      response.headers['content-type'] = 'application/json;charset=utf-8';

      // The response gives us all the information we need
      res.status(response.statusCode).set(response.headers).send(response.body);
    });
  });

I’m implementing two interfaces – a GET of /api/userprefs will get the user preferences for the authenticated user. Since I’m always operating on the authenticated user, I’ve decided to use the user ID as the row key. The Azure Table Storage API gives me just about everything I need, and I could just return the statusCode, headers and body from the API. However, I want the response to be the same as what was sent to the API service, so I need to remove the metadata from the response before I send it to the user.
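To make the clean-up step concrete, here is a minimal stand-alone sketch – the response body below is a hypothetical example of the shape retrieveEntity hands back, not a real response:

```javascript
// Hypothetical entity body, shaped like a Table Storage retrieval response
var body = {
  'odata.metadata': 'https://example.table.core.windows.net/$metadata#userprefs/@Element',
  PartitionKey: 'P1',
  RowKey: 'user-1',
  Timestamp: '2016-01-01T00:00:00Z',
  theme: 'dark',
  fontSize: 14
};

// Strip the Table Storage system fields so the client only sees its own data
['PartitionKey', 'RowKey', 'Timestamp', 'odata.metadata'].forEach(function (field) {
  delete body[field];
});

console.log(JSON.stringify(body)); // {"theme":"dark","fontSize":14}
```

What remains is exactly the object the client originally POSTed.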

  /*
   * Middleware - handles the POST /api/userprefs - body must be a json
   * object.  If the object is not a JSON object, then return 400 Bad Request
   * If the object is a JSON object, then replace the original user prefs and
   * send a 200 Success
   */
  router.post('/', bodyParser.json(), function (req, res) {
    // check out req.body - must be there and be an object
    if (!req.body || typeof req.body !== 'object') {
      res.status(400).send('Invalid Body');
      return;
    }

    // Construct a new object from the parameters within the body
    var entGen = azureStorage.TableUtilities.entityGenerator,
        tableObject = {
          PartitionKey: entGen.String(partitionKey),
          RowKey: entGen.String(req.user.id)
        };

    // Iterate over the parameters and add them to the tableObject
    // Types are inferred when not specified
    for (var key in req.body) {
      // Skip the system properties
      if (key === 'PartitionKey' || key === 'RowKey' || key === 'odata.metadata' || key === 'Timestamp') {
        continue;
      }
      tableObject[key] = { '_': req.body[key] };
    }

    // Store the tableObject in the table store
    store.insertOrReplaceEntity(tableName, tableObject, function (error, result, response) {
      if (error) {
        console.error('[userPrefs] insertOrReplaceEntity: ', error);
        res.status(500).send('Internal Server Error (Storage)');
        return;
      }

      // The response gives us all the information we need
      res.status(response.statusCode).send(response.body);
    });
  });

The POST interface is a little more complex. The code is fed a JSON object in the body (decoded by bodyParser.json()). If the code receives a valid object, it is converted into the form that Azure Table Storage requires and then inserted. If the row already exists, the Azure Table Storage API will automatically replace the existing row. As with the GET version, the response returned from the Azure Table Storage API is “correct” – in this case, no adjustments are required.
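The conversion step can be sketched without the Azure dependency – properties written as { '_': value } have their types inferred by the library (entGen.String additionally attaches an explicit Edm.String type annotation); the helper name below is hypothetical:

```javascript
// Hypothetical helper mirroring the body-to-entity conversion, using the
// raw { '_': value } property form that the azure-storage library accepts
function toTableObject(partitionKey, rowKey, body) {
  var tableObject = {
    PartitionKey: { '_': partitionKey },
    RowKey: { '_': rowKey }
  };
  for (var key in body) {
    // Skip the system properties so a client cannot overwrite them
    if (key === 'PartitionKey' || key === 'RowKey' || key === 'odata.metadata' || key === 'Timestamp') {
      continue;
    }
    tableObject[key] = { '_': body[key] };
  }
  return tableObject;
}

// A malicious RowKey in the body is silently dropped
var entity = toTableObject('P1', 'user-1', { theme: 'dark', RowKey: 'evil' });
console.log(JSON.stringify(entity));
// {"PartitionKey":{"_":"P1"},"RowKey":{"_":"user-1"},"theme":{"_":"dark"}}
```

Skipping the system properties is what guarantees that a client can never change the partition or row key of its own entity.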

  return router;
};

Finally, I return the router. This router can be “mounted” just like any other express middleware.

Testing the Code

I like to test code locally before committing it to Azure. To do this, I downloaded the Azure Storage Emulator. This can be easily installed on a Windows PC (sorry, Mac users – you are out of luck). Once installed, it is located at C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator, so make sure you add that directory to your path. To initialize the storage emulator, run:

AzureStorageEmulator.exe init

Once the storage emulator is initialized, you can start it:

AzureStorageEmulator.exe start

You will also need to set some environment variables. I have a small PowerShell script with the following:

$env:AZURE_STORAGE_ACCOUNT = "devstoreaccount1"
$env:AZURE_STORAGE_ACCESS_KEY = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
$env:AZURE_STORAGE_CONNECTION_STRING = "UseDevelopmentStorage=true"

The account, access key and connection string are all defined by the Azure Storage Emulator and cannot be changed.

You can now start your project and use Postman to send GET and POST requests to /api/userprefs to see the results. Don’t forget that you need to send an Authorization header, or the auth check will reject the request with a 401 status.

Going to Production

I’ve walked through creating an Azure App Service Web App with continuous deployment before. I’m going to assume that your application has been deployed properly via continuous deployment. Once there, add a new service, select Data + Storage and then a Storage Account:


Once you click on that, you will be asked what deployment model you want. You want to change it from Classic to Resource Manager. Click on Create.

In the Create storage account page, give the storage account a unique name. Most importantly, click on Select existing under Resource group, then select the name of your web app’s resource group:


The other options (redundancy, diagnostic logging and region) are up to you. I’d suggest placing the storage in the same region as your web application.

Once the storage account is created, open up the storage account and select Settings then Access Keys:


You want to copy the connection string for one of the keys. I use KEY1. Now, go back to your web app and click on Settings then Application Settings. You want to add an App setting for AZURE_STORAGE_CONNECTION_STRING – copy the connection string in from the storage account.


Don’t forget to click on the save button. Once saved, restart your web site so that the code will read in the new app setting. At this point, you should be able to use the userprefs API endpoint just like you did in your test – it’s just a different URI to access.