30 Days of Zumo.v2 (Azure Mobile Apps): Day 24 – Push with Tags

I introduced push as a concept in the last article, but I left a teaser – push to a subset of users with tags. Tags are a piece of metadata that equates to “interests”, and they are how you would implement features such as “push-to-user” and “push-to-group”. They can be literally anything. Before I can get there, though, I need to be able to register for tags.

Dirty little secret – the current registration API allows you to request tags, but it actually ignores them. There is a good reason for this – if you allow the client to specify tags, it may register for tags that it isn’t allowed to use. For example, let’s say you implement a tag prefixed with “_email:”. Could a user register for a tag containing someone else’s email address by hacking the REST request? The answer, unfortunately, was yes. That could happen. Don’t let it happen to you.

Today I’m going to implement a custom API that replaces the regular push installations endpoint. My endpoint is going to define two distinct sets of tags – a whitelist of tags that the user can subscribe to (anything not an exact match in the list will be thrown out); and a set of dynamic tags based on the authentication record.

The Client

Before I can do anything, I need to be able to request tags. I’ve got an Apache Cordova app and can do requests for tags simply in the register() method:

    /**
     * Event Handler for response from PNS registration
     * @param {object} data the response from the PNS
     * @param {string} data.registrationId the registration Id from the PNS
     * @event
     */
    function handlePushRegistration(data) {
        var pns = 'gcm';
        var templates = {
            tags: ['News', 'Sports', 'Politics', '_email_myboss@microsoft.com' ]
        };
        client.push.register(pns, data.registrationId, templates);
    }

The registration takes an object called “templates”, which contains the list of tags as an array. All the other SDKs have something similar to this. You will notice that I’ve got three tags that are “normal” and one that is special. I’m going to create a tag list that will strip out the ones I’m not allowed to have. For example, if I list ‘News’ and ‘Sports’ as valid tags, I expect the ‘Politics’ tag to be stripped out. In addition, the ‘_email’ tag should always be stripped out since it is definitely not mine.

Note that a tag cannot start with the $ sign – that’s a reserved symbol for Notification Hubs. Don’t use it.
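To make the expectation concrete, the server-side filtering can be sketched as a small helper (the names here are hypothetical – the real Node.js and ASP.NET implementations follow below):

```javascript
// Hypothetical sketch of the server-side tag policy: keep only exact
// (case-insensitive) matches against a whitelist, then append "automatic"
// tags derived from the authenticated user.
function computeTags(requestedTags, whitelist, user) {
    var tags = [];
    (requestedTags || []).forEach(function (tag) {
        if (whitelist.indexOf(tag.toLowerCase()) !== -1) {
            tags.push(tag.toLowerCase());
        }
    });
    if (user) {
        tags.push('_userid_' + user.id);
        if (user.emailaddress) tags.push('_email_' + user.emailaddress);
    }
    return tags;
}
```

With a whitelist of ['news', 'sports'], the client request above would be reduced to ['news', 'sports'] plus whatever automatic tags the server decides on – the ‘Politics’ and ‘_email’ entries never make it to Notification Hubs.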

The Node.js Version

The Node.js version is relatively simple to implement, but I had to do some work to coerce the SDK into letting me register a replacement for the push installations endpoint:

var express = require('express'),
    serveStatic = require('serve-static'),
    azureMobileApps = require('azure-mobile-apps'),
    authMiddleware = require('./authMiddleware'),
    customRouter = require('./customRouter'),
    pushRegistrationHandler = require('./pushRegistration');

// Set up a standard Express app
var webApp = express();

// Set up the Azure Mobile Apps SDK
var mobileApp = azureMobileApps({
    notificationRootPath: '/.push/disabled'
});

mobileApp.use(authMiddleware);
mobileApp.tables.import('./tables');
mobileApp.api.import('./api');
mobileApp.use('/push/installations', pushRegistrationHandler(mobileApp.configuration));

The require of './pushRegistration' brings in my push registration handler. The notificationRootPath setting moves the old push registration handler to “somewhere else”. Finally, the mobileApp.use('/push/installations', …) call registers my new push registration handler in its place. Now, let’s look at the ‘./pushRegistration.js’ file:

var express = require('express'),
    bodyParser = require('body-parser'),
    notifications = require('azure-mobile-apps/src/notifications'),
    log = require('azure-mobile-apps/src/log');

module.exports = function (configuration) {
    var router = express.Router(),
        installationClient;

    if (configuration && configuration.notifications && Object.keys(configuration.notifications).length > 0) {
        router.use(addPushContext);
        router.route('/:installationId')
            .put(bodyParser.json(), put, errorHandler)
            .delete(del, errorHandler);

        installationClient = notifications(configuration.notifications);
    }

    return router;

    function addPushContext(req, res, next) {
        req.azureMobile = req.azureMobile || {};
        req.azureMobile.push = installationClient.getClient();
        next();
    }

    function put(req, res, next) {
        var installationId = req.params.installationId,
            installation = req.body,
            tags = [],
            user = req.azureMobile.user;

        // White list of all known tags
        var whitelist = [
            'news',
            'sports'
        ];

        // Logic for determining the correct list of tags
        (installation.tags || []).forEach(function (tag) {
            if (whitelist.indexOf(tag.toLowerCase()) !== -1)
                tags.push(tag.toLowerCase());
        });
        // Add in the "automatic" tags
        if (user) {
            tags.push('_userid_' + user.id);
            if (user.emailaddress) tags.push('_email_' + user.emailaddress);
        }
        // Replace the installation tags requested with my list
        installation.tags = tags;

        installationClient.putInstallation(installationId, installation, user && user.id)
            .then(function (result) {
                res.status(204).end();
            })
            .catch(next);
    }

    function del(req, res, next) {
        var installationId = req.params.installationId;

        installationClient.deleteInstallation(installationId)
            .then(function (result) {
                res.status(204).end();
            })
            .catch(next);
    }

    function errorHandler(err, req, res, next) {
        log.error(err);
        res.status(400).send(err.message || 'Bad Request');
    }
};

The important code here is in the put() handler. Normally, the requested tags would just be dropped. Instead, I take the tags that are offered and run them through a whitelist filter. I then add some “automatic” tags (but only if the user is authenticated).

Note that this version was adapted from the Azure Mobile Apps Node.js Server SDK version. I’ve just added the logic to deal with the tags.

ASP.NET Version

The ASP.NET Server SDK comes with a built-in controller that I need to replace. It’s added to the application during the App_Start phase with this:

            // Configure the Azure Mobile Apps section
            new MobileAppConfiguration()
                .AddTables(
                    new MobileAppTableConfiguration()
                        .MapTableControllers()
                        .AddEntityFramework())
                .MapApiControllers()
                .AddPushNotifications() /* Adds the Push Notification Handler */
                .ApplyTo(config);

I can just comment out the AddPushNotifications() line and the /push/installations controller is removed, allowing me to replace it. I’m not a confident ASP.NET developer – I’m sure there is a better way of doing this. I’ve found, however, that creating a Custom API and calling that custom API is a better way of doing the registration. It’s not a problem with the code within the controller; it’s a problem of routing. In my client, instead of calling client.push.register(), I’ll call client.invokeApi(). This version is in the Client.Cordova project:

    /**
     * Event Handler for response from PNS registration
     * @param {object} data the response from the PNS
     * @param {string} data.registrationId the registration Id from the PNS
     * @event
     */
    function handlePushRegistration(data) {
        var apiOptions = {
            method: 'POST',
            body: {
                pushChannel: data.registrationId,
                tags: ['News', 'Sports', 'Politics', '_email_myboss@microsoft.com' ]
            }
        };

        var success = function () {
            alert('Push Registered');
        }
        var failure = function (error) {
            alert('Push Failed: ' + error.message);
        }

        client.invokeApi("register", apiOptions).then(success, failure);
    }

Now I can write a POST handler as a Custom API in my backend:

using System.Web.Http;
using Microsoft.Azure.Mobile.Server.Config;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Security.Principal;
using Microsoft.Azure.Mobile.Server.Authentication;
using System.Linq;
using Microsoft.Azure.NotificationHubs;
using System.Web.Http.Controllers;

namespace backend.dotnet.Controllers
{
    [Authorize]
    [MobileAppController]
    public class RegisterController : ApiController
    {
        protected override void Initialize(HttpControllerContext context)
        {
            // Call the original Initialize() method
            base.Initialize(context);
        }

        [HttpPost]
        public async Task<HttpResponseMessage> Post([FromBody] RegistrationViewModel model)
        {
            if (!ModelState.IsValid)
            {
                return new HttpResponseMessage(HttpStatusCode.BadRequest);
            }

            // We want to apply the push registration to an installation ID
            var installationId = Request.GetHeaderOrDefault("X-ZUMO-INSTALLATION-ID");
            if (installationId == null)
            {
                return new HttpResponseMessage(HttpStatusCode.BadRequest);
            }

            // Determine the right list of tags to be handled
            List<string> validTags = new List<string>();
            foreach (string tag in model.tags)
            {
                if (tag.ToLower().Equals("news") || tag.ToLower().Equals("sports"))
                {
                    validTags.Add(tag.ToLower());
                }
            }
            // Add on the dynamic tags generated by authentication - note that the
            // [Authorize] attribute means we are authenticated.
            var identity = await User.GetAppServiceIdentityAsync<AzureActiveDirectoryCredentials>(Request);
            validTags.Add($"_userid_{identity.UserId}");

            var emailClaim = identity.UserClaims.Where(c => c.Type.EndsWith("emailaddress")).FirstOrDefault();
            if (emailClaim != null)
            {
                validTags.Add($"_email_{emailClaim.Value}");
            }

            // Register with the hub
            await CreateOrUpdatePushInstallation(installationId, model.pushChannel, validTags);

            return new HttpResponseMessage(HttpStatusCode.OK);
        }

        /// <summary>
        /// Update an installation with notification hubs
        /// </summary>
        /// <param name="installationId">The installation</param>
        /// <param name="pushChannel">the GCM Push Channel</param>
        /// <param name="tags">The list of tags to register</param>
        /// <returns></returns>
        private async Task CreateOrUpdatePushInstallation(string installationId, string pushChannel, IList<string> tags)
        {
            var pushClient = Configuration.GetPushClient();

            Installation installation = new Installation
            {
                InstallationId = installationId,
                PushChannel = pushChannel,
                Tags = tags,
                Platform = NotificationPlatform.Gcm
            };
            await pushClient.CreateOrUpdateInstallationAsync(installation);
        }
    }

    /// <summary>
    /// Format of the registration view model that is passed to the custom API
    /// </summary>
    public class RegistrationViewModel
    {
        public string pushChannel { get; set; }

        public List<string> tags { get; set; }
    }
}

The real work here is done by the CreateOrUpdatePushInstallation() method. This uses the Notification Hub SDK to register the device according to my rules. Why write it as a Custom API? Well, I need things provided by virtue of the [MobileAppController] attribute – things like the linked notification hub and authentication. However, that attribute automatically links the controller into the /api namespace, thus overriding my intent of replacing the push installation endpoint. There are ways of avoiding the association, but is it worth the effort? My thought is no, which is why I switched over to a Custom API. I get finer control over invokeApi rather than worrying about whether the Azure Mobile Apps SDK is doing something weird.

Wrap Up

I wanted to send two important messages here. Firstly, use the power of Notification Hubs by taking charge of the registration process yourself. Secondly, do the logic on the server – not the client. It’s tempting to say “just do what my client says”, but rogue operators don’t think that way. You need to protect the services that you pay for so that only you are using them, and you can only effectively do that from the server.

Next time, I’ll take a look at a common pattern for push that will improve the offline performance of your application. Until then, you can find the code on my GitHub Repository.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 22 – More on Custom APIs

I’ve been working on custom APIs for the last couple of days. Custom APIs are awesome for executing random pieces of code to do things like “complete an order” or “send a message”. One of the awesome things is that I get access to everything that Azure Mobile Apps has access to. This includes authentication information for the current user, the table controllers and the SQL instance. In ASP.NET, using these things is relatively straightforward. It’s “just” Entity Framework, and that is very well documented. Not so in the Node.js world. In this post, I’m going to go through some examples of what I can do with custom APIs through a series of samples.

Cleaning Up Offline Sync

When I was doing the offline sync items, I said that cleaning up the database for offline sync requires a separate process. In offline sync, records are marked as deleted by setting the deleted field to true. At the same time, the updatedAt field is set to the timestamp for the operation. I want to be able to do something like “delete all records where deleted is true and updatedAt is older than 7 days”. There are many mechanisms for doing this. I’m going to set this up as a custom API that is authenticated and requires the user to be in a valid list.

I want my API to be accessible via /api/cleanup – so I’m going to create a cleanup.js file with the following contents:

var allowedUsers = [
    'photoadrian@outlook.com'
];

var api = {
    /**
     * POST /api/cleanup - deletes records during cleanup
     */
    post: function (req, res, next) {

        // Authorization - you must be in the allowed set of users
        if (allowedUsers.indexOf(req.azureMobile.user.emailaddress) === -1) {
            res.status(401).send('Unauthorized');
            return;
        }

        // Now execute the SQL Query
        var query = {
            sql: 'DELETE FROM TodoItem WHERE deleted = 1 AND updatedAt < DATEADD(d, -7, GETDATE())',
            parameters: []
        };

        req.azureMobile.data.execute(query)
            .then(function (results) {
                res.json(results);
            })
            .catch(next);
    },
};

api.post.access = 'authenticated';

module.exports = api;

The major takeaway of this example is that the entire context of the request from the table controllers is available under req.azureMobile. That means you can access the authenticated user information using req.azureMobile.user, for example. Where did the req.azureMobile.user.emailaddress come from? It’s the custom middleware I wrote back in Day 10 that queried the authentication endpoint. It’s available to both custom APIs and Tables. It’s actually available to any endpoint that appears after the Azure Mobile Apps middleware has been added to the Express router.

The other big piece I am using is the req.azureMobile.data object. This is a direct link to the data provider. One of the methods there is an execute statement that “executes a statement against the native provider”. In other words, you need to know what your data provider is going to do with the statement. The default mssql provider just runs the provided SQL and returns the result as a promise.
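If you ever need to pass values into such a statement, prefer the parameters array over string concatenation. Here is a hedged sketch – buildCleanupQuery is a hypothetical helper, and whether named parameters are honoured this way depends on your data provider (the default mssql provider passes the { sql, parameters } object through to the underlying driver):

```javascript
// Hypothetical sketch: build the cleanup statement with a named parameter
// instead of embedding the value in the SQL string. The { sql, parameters }
// shape matches what req.azureMobile.data.execute() is given above. Note the
// table name itself cannot be parameterized in T-SQL, so it must come from a
// trusted constant, never from user input.
function buildCleanupQuery(table, days) {
    return {
        sql: 'DELETE FROM ' + table +
            ' WHERE deleted = 1 AND updatedAt < DATEADD(d, -@days, GETDATE())',
        parameters: [{ name: 'days', value: days }]
    };
}
```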

Customized Searching

One of the things I can do with the custom API is to return an ExpressJS router object instead of an API object. This offers significantly more flexibility. A boilerplate looks like this:

var express = require('express'),
    bodyParser = require('body-parser');

module.exports = function (configuration) {
    var router = express.Router();

    // Set up the sub-routes here

    return router;
};

For example, let’s say I have two tables – one is called categories and one is called items. Each item is in one or more categories. There are a number of ways I can do this, but the basics are that I want to have a REST endpoint called /api/search/category. Here is some example code:

var express = require('express'),
    bodyParser = require('body-parser');

module.exports = function (configuration) {
    var router = express.Router();

    // Retrieve all records in the specified category
    router.get('/category/:category', function (req, res, next) {
        req.azureMobile.tables('items')
            .where({ category: req.params.category })
            .read()
            .then(results => res.json(results))
            .catch(next); // it is important to catch any errors and log them
    });

    return router;
};

I can issue a GET /api/search/category/foo via Postman and receive all the items that have the specified category.

WARNING: Don’t try to be too clever with nesting of promises. I’ve seen some code that does a query and then another query based on the results of the first. This causes major delays in the custom code while the promises for each query are resolved, resulting in large waits for data. In general, compose your tables for mobile usage and try not to do more than one SQL operation per API call.
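When two reads are genuinely independent, you can at least issue them concurrently rather than chaining them. A minimal sketch of the difference (fetchCategories and fetchItems are hypothetical stand-ins for table reads):

```javascript
// Sequential: the second round trip does not start until the first finishes,
// so the total latency is the sum of both queries.
function sequentialSearch(fetchCategories, fetchItems) {
    return fetchCategories().then(function (categories) {
        return fetchItems().then(function (items) {
            return { categories: categories, items: items };
        });
    });
}

// Parallel: both reads are in flight at once, so the total latency is
// roughly that of the slower query.
function parallelSearch(fetchCategories, fetchItems) {
    return Promise.all([fetchCategories(), fetchItems()]).then(function (r) {
        return { categories: r[0], items: r[1] };
    });
}
```

Even so, the advice stands: a single composed query per API call beats either of these.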

Offline Custom APIs

One of the drawbacks of custom APIs is that you have to be online to call them. How do you deal with Custom APIs in an offline sense? Well, first realize that not every custom API is suitable for offline use. It depends on the situation – you need to understand what the custom API is trying to achieve and whether it is suitable to be made “offline-capable”. As an example, I can see an upload-file operation being made offline-capable, but the search-categories task being online only. The clean-up task is really an administrative task and you want to control the timing of it; I wouldn’t even place that functionality within a mobile app.

One way to implement an offline custom API is to use an offline table. Insert a record to initiate the request (together with all of the information necessary to execute the custom API). In your table controller’s insert method, just execute your custom API and then mark the record as deleted in the database. This has the added benefit that you know who executed the API, when it was initiated and when it was executed.
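As a sketch of that pattern (the context object mirrors the shape the Node.js SDK passes to table operation hooks, and executeCustomAction is a hypothetical stand-in for the real work):

```javascript
// Hypothetical sketch of the offline custom API pattern: when the queued
// request record arrives via sync, run the real action, then soft-delete the
// record so it disappears from clients on the next pull (while remaining in
// the database as an audit trail).
function offlineApiInsert(context, executeCustomAction) {
    return executeCustomAction(context.item).then(function () {
        context.item.deleted = true;  // soft-delete the request record
        return context.execute();     // persist through the normal pipeline
    });
}
```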

Wrap Up

I haven’t finished with Custom APIs yet, but this is the bulk of the work. You should be able to write relatively complex applications that combine data access, offline sync and custom workflows with ease. I’m going to move onto a new topic in the next post – push notifications.

Until then, you can find my code on my GitHub Repository.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 20 – Custom API

Thus far, I’ve covered authentication and table controllers in both the ASP.NET world and the Node.js world. I’ve got two clients – an Apache Cordova one and a Universal Windows one – and I’ve got two servers – a Node.js one and an ASP.NET one. I’ve looked at what it takes to bring in existing SQL tables. It’s time to move on.

Not everything that you want to do can fit into a nice table controller. Sometimes, you need to do something different. Let’s take, for example, the application key. When we had Mobile Services, the API had an application key. It was meant to secure “the API” – in other words, only your applications could access the API. Others would need to know the application key to get into the API. This is insanely insecure and easily defeated. Anyone downloading your app and installing a MITM sniffer will be able to figure out the application key – it’s in a header, after all. Then, all the attacker needs to do is use the REST endpoint with your application key, and your API is as open as before. It’s trivial – which is why pretty much no-one who understands security will produce an API with an application key any more. It doesn’t buy you anything.

How about a secure approach? When you have a mobile app out there, you have to register it with the various app stores – the Google Play Store, Apple iTunes or the Microsoft Store. The only apps that can use the push notification systems (GCM for Google, APNS for Apple and WNS for Microsoft) are registered apps. So, use a Custom API to request a token. The token is sent via the push notification scheme for the device and is unique to the session. Add that token to the headers, and then your API looks for it. This technique is really secure, but it relies on your application being able to receive push notifications and needs your application registered with the stores. In addition, push notifications sometimes take time. Would you want the first experience of your app to be a five-minute delay for “registration”?

There is a middle ground. Use a Custom API to create a per-device token. The token can be used for only a certain amount of time before it expires, thus limiting the exposure. Each time the token expires, it must be re-acquired from the server. It isn’t fully secure – your API can still get hijacked. However, it makes the process much more costly, and that, in the end, is probably enough.

Version 1: The Node.js Easy API

You can use the Easy API if you meet all the following criteria:

  • You have created the server with the Node.js Quickstart
  • You have not modified the main application code

If you followed Day 1, then this doesn’t apply to you. Easy Tables and Easy API are only available with a specially configured server that is deployed when you use the Quickstart deployment. Any other deployment pretty much doesn’t work.

Here is how to use Easy API after creating the server. Firstly, go to the Settings menu for your App Service and click on the Easy APIs option. (If you do not have access to Easy APIs, then this will also tell you – in which case, use Version 2 instead). Click on the + Add button and fill in the form:

[Screenshot: day-20-p1 – the Easy API creation form]

I’m only going to access this API via GET, so I’ve disabled the others. For the GET API, I’m enabling anonymous access. I can also select authenticated access. Easy APIs integrates with your regular mobile authentication – the same authentication token used for table access.

Once the API is created, click on the API and then click on Edit script. This will open Visual Studio Online, allowing you to edit the script in the browser. A blueprint has been implemented for me:

module.exports = {
    //"get": function (req, res, next) {
    //}
}

Not much there – next is my code. The version I’m going to use is this:

var md5 = require('md5');
var jwt = require('jsonwebtoken');

module.exports = {
    "get": function (req, res, next) {
        var d = new Date();
        var now = d.getUTCFullYear() + '-' + (d.getUTCMonth() + 1) + '-' + d.getUTCDate();
        console.info('NOW = ', now);
        var installID = req.get('X-INSTALLATION-ID');
        console.info('INSTALLID = ', installID);
        
        if (typeof installID === 'undefined') {
            console.info('NO INSTALLID FOUND');
            res.status(400).send({ error: "Invalid Installation ID" });
            return;
        }
        
        var subject = now + installID;
        var token = md5(subject);
        console.info('TOKEN = ', token);
        
        var payload = {
            token: token
        };
        
        var options = {
            expiresIn: '4h',
            audience: installID,
            issuer: process.env.WEBSITE_SITE_NAME || 'unk',
            subject: subject
        };
        
        var signedJwt = jwt.sign(payload, installID, options);
        res.status(200).send({ jwt: signedJwt });
    }
};

This won’t work yet – that’s because the md5 and jsonwebtoken modules are not yet available. I can install these through Kudu. Go back to the Azure Portal, select your App Service, then Tools, followed by Kudu. Click on the PowerShell version of the Debug console, change directory into site/wwwroot, then type the following into the console:

npm install --save md5 jsonwebtoken

Did you know you can download your site for backup at any time from here? Just click on the Download icon next to the wwwroot folder.

Version 2: The Node.js Custom API

If you aren’t a candidate for the Easy API, then you can still use Custom APIs and the same code. However, you need to add Custom APIs into your code. Place the code below into the api/createKey.js file and add the npm packages to the package.json file.

In the Easy API version, there is also a createKey.json file. In the Custom API version, the authentication information is placed in the JavaScript file, like this:

var md5 = require('md5');
var jwt = require('jsonwebtoken');

var api = {
    "get": function (req, res, next) {
        var d = new Date();
        var now = d.getUTCFullYear() + '-' + (d.getUTCMonth() + 1) + '-' + d.getUTCDate();
        console.info('NOW = ', now);
        var installID = req.get('X-INSTALLATION-ID');
        console.info('INSTALLID = ', installID);
        
        if (typeof installID === 'undefined') {
            console.info('NO INSTALLID FOUND');
            res.status(400).send({ error: "Invalid Installation ID" });
            return;
        }
        
        var subject = now + installID;
        var token = md5(subject);
        console.info('TOKEN = ', token);
        
        var payload = {
            token: token
        };
        
        var options = {
            expiresIn: '4h',
            audience: installID,
            issuer: process.env.WEBSITE_SITE_NAME || 'unk',
            subject: subject
        };
        
        var signedJwt = jwt.sign(payload, installID, options);
        res.status(200).send({ jwt: signedJwt });
    }
};

api.get.access = 'anonymous';

module.exports = api;

In addition, the custom API system must be loaded in the main server.js file:

var express = require('express'),
    serveStatic = require('serve-static'),
    azureMobileApps = require('azure-mobile-apps'),
    authMiddleware = require('./authMiddleware');

// Set up a standard Express app
var webApp = express();

// Set up the Azure Mobile Apps SDK
var mobileApp = azureMobileApps();
mobileApp.use(authMiddleware);
mobileApp.tables.import('./tables');
mobileApp.api.import('./api');

// Create the public app area
webApp.use(serveStatic('public'));

// Initialize the Azure Mobile Apps, then start listening
mobileApp.tables.initialize().then(function () {
    webApp.use(mobileApp);
    webApp.listen(process.env.PORT || 3000);
});

Once published (or, if you are doing continuous deployment, once the code is checked into the relevant branch of your source-code control system), this will operate exactly the same as the Easy API version.

Version 3: The Node.js Custom Middleware

Both the Easy API and Custom API use the same underlying code for their implementation. You have access to the whole Azure Mobile Apps environment (more on that in a later blog post). However, you are limited in the routes that you can use: you have four verbs (so no HEAD, for example) and very little in the way of variable routes. Sometimes, you want to take control of the routes and verbs. Maybe you want to produce a composed API with a two-level Id structure, or you really want to do REST “properly” (which isn’t much, but there are some accepted norms). There are many constraints to the Easy API / Custom API route in Node.js – most notably that the routes are relatively simple. Fortunately, the Node.js SDK uses ExpressJS underneath, so you can just spin up a Router and do the same thing. I’ve placed the following code in the server.js file:

var express = require('express'),
    serveStatic = require('serve-static'),
    azureMobileApps = require('azure-mobile-apps'),
    authMiddleware = require('./authMiddleware'),
    customRouter = require('./customRouter');

// Set up a standard Express app
var webApp = express();

// Set up the Azure Mobile Apps SDK
var mobileApp = azureMobileApps();
mobileApp.use(authMiddleware);
mobileApp.tables.import('./tables');
mobileApp.api.import('./api');

// Create the public app area
webApp.use(serveStatic('public'));

// Initialize the Azure Mobile Apps, then start listening
mobileApp.tables.initialize().then(function () {
    webApp.use(mobileApp);
    webApp.use('/custom', customRouter);
    webApp.listen(process.env.PORT || 3000);
});

Note that I’m putting the custom middleware after I’ve added the Azure Mobile App to the ExpressJS app. Ordering is important here – if I place it before, then authentication and table controllers will not be available – I might need those later on. The customRouter object must export an express.Router:

var express = require('express');
var jwt = require('jsonwebtoken');
var md5 = require('md5');

var router = express.Router();

router.get('/createKey', function (req, res, next) {
    var d = new Date();
    var now = d.getUTCFullYear() + '-' + (d.getUTCMonth() + 1) + '-' + d.getUTCDate();
    console.info('NOW = ', now);
    var installID = req.get('X-INSTALLATION-ID');
    console.info('INSTALLID = ', installID);

    if (typeof installID === 'undefined') {
        console.info('NO INSTALLID FOUND');
        res.status(400).send({ error: "Invalid Installation ID" });
        return;
    }

    var subject = now + installID;
    var token = md5(subject);
    console.info('TOKEN = ', token);

    var payload = {
        token: token
    };

    var options = {
        expiresIn: '4h',
        audience: installID,
        issuer: process.env.WEBSITE_SITE_NAME || 'unk',
        subject: subject
    };

    var signedJwt = jwt.sign(payload, installID, options);
    res.status(200).send({ jwt: signedJwt });
});

module.exports = router;

The actual code here is identical once you get past the change to an ExpressJS Router – in fact, I can put the algorithm in its own library to make it easier to include. The advantage of this technique is flexibility, but at the expense of complexity. I can easily add any routing scheme and use any verb since I’m just using the ExpressJS SDK. It really depends on your situation as to whether the complexity is worth it. This technique is really good for producing composed APIs where you have really thought out the mechanics of the API (as opposed to Easy API which is really good for a one-off piece of functionality). My advice is to either use Custom Middleware or Custom APIs though – don’t mix and match.

Note that this technique does not put APIs under /api – the Azure Mobile Apps SDK takes this over (which is part of the reason why you shouldn’t mix and match).

Version 4: The ASP.NET Custom API

Finally, let’s talk about ASP.NET implementation. There is already a well-known implementation for APIs in ASP.NET, so just do the same thing! The only difference is some syntactic sugar to wire up the API into the right place and to handle responses in such a way that our application can handle them. To add a custom controller, right-click on the Controllers node and use Add -> Controller… to add a new controller. The Azure Mobile Apps Custom Controller should be right at the top:

day-20-p3

Here is the default scaffolding:

using System.Web.Http;
using Microsoft.Azure.Mobile.Server.Config;

namespace backend.dotnet.Controllers
{
    [MobileAppController]
    public class CreateKeyController : ApiController
    {
        // GET api/CreateKey
        public string Get()
        {
            return "Hello from custom controller!";
        }
    }
}

The important piece here is the [MobileAppController] – this will wire the API controller into the right place and register some handlers so the objects are returned properly. I expanded on this in a similar way to my Node.js example:

using System.Web.Http;
using Microsoft.Azure.Mobile.Server.Config;
using System.Web;
using System.Net;
using System;
using System.Security.Cryptography;
using System.Text;
using System.Diagnostics;
using System.IdentityModel.Tokens;
using System.Collections.Generic;
using Jose;

namespace backend.dotnet.Controllers
{
    [MobileAppController]
    public class CreateKeyController : ApiController
    {
        // GET api/CreateKey
        public Dictionary<string, string> Get()
        {
            var now = DateTime.UtcNow.ToString("yyyy-M-d");
            Debug.WriteLine($"NOW = {now}");
            var installID = HttpContext.Current.Request.Headers["X-INSTALLATION-ID"];
            if (installID == null)
            {
                throw new HttpResponseException(HttpStatusCode.BadRequest);
            }
            Debug.WriteLine($"INSTALLID = {installID}");

            var subject = $"{now}{installID}";   // matches the Node version: now + installID
            var token = createMD5(subject);
            var issuer = Environment.GetEnvironmentVariable("WEBSITE_SITE_NAME");
            if (issuer == null)
            {
                issuer = "unk";
            }
            Debug.WriteLine($"SUBJECT = {subject}");
            Debug.WriteLine($"TOKEN = {token}");

            // The JWT "exp" claim is seconds since the Unix epoch, not milliseconds
            var expires = ((TimeSpan)(DateTime.UtcNow.AddHours(4) - new DateTime(1970, 1, 1))).TotalSeconds;
            var payload = new Dictionary<string, object>()
            {
                { "aud", installID },
                { "iss", issuer },
                { "sub", subject },
                { "exp", expires },
                { "token", token }
            };

            byte[] secretKey = Encoding.ASCII.GetBytes(installID);
            var result = new Dictionary<string, string>()
            {
                { "jwt", JWT.Encode(payload, secretKey, JwsAlgorithm.HS256) }
            };

            return result;
        }

        /// <summary>
        /// Compute an MD5 hash of a string
        /// </summary>
        /// <param name="input">The input string</param>
        /// <returns>The MD5 hash as a string of hex</returns>
        private string createMD5(string input)
        {
            using (MD5 md5 = MD5.Create())
            {
                byte[] ib = Encoding.ASCII.GetBytes(input);
                byte[] ob = md5.ComputeHash(ib);
                StringBuilder sb = new StringBuilder();
                for (int i = 0; i < ob.Length; i++)
                {
                    sb.Append(ob[i].ToString("x2"));   // lowercase hex, matching the Node md5 output
                }
                return sb.ToString();
            }
        }
    }
}

Most of this code is the C# equivalent of the Node code I posted earlier in the article. I’m using jose-jwt to implement the JWT signing. The algorithm is identical, so you should be able to use the same client code with either a Node or ASP.NET backend. Want it authenticated? Just add an [Authorize] attribute to the method.

Testing the API

In all cases, you should be able to do a Postman request to GET /api/createKey (or /custom/createKey if you are using the Node custom middleware technique) with an X-INSTALLATION-ID header that is a unique ID (specifically, a GUID):

day-20-p2

If you don’t submit an X-INSTALLATION-ID, then you should get a 400 Bad Request error.

What are Custom APIs good for?

I use this type of custom API commonly to provide additional settings to my clients or to kick off a process. Some examples of simple Custom APIs:

  • Push to a Tag from a client device
  • Get enabled features for a client
  • Get an Azure Storage API Key for uploading files

The possibilities are really open to what you can dream up.

What are Custom APIs not good for?

Custom APIs are not good candidates for offline usage. There are ways to queue up changes for synchronization when you are back online, but these generally end up being a hacked-up version of a table controller – the client inserts a record into the offline table, and when it syncs, the backend processes the custom API during the insert operation. I cringe writing that. A better idea is to implement a proper offline queue mechanism. Either way, custom APIs are not a good fit for an offline sync scenario.
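The offline queue idea can be sketched in a few lines. This is a minimal sketch, not a real implementation – the class name is mine, and the invoke callback stands in for whatever actually calls your custom API:

```javascript
// Queue custom API calls while offline; replay them when connectivity returns.
function OfflineQueue(invoke) {
    this.pending = [];
    this.invoke = invoke;   // e.g. a wrapper around your API client
}

OfflineQueue.prototype.call = function (name, body, online) {
    if (online) {
        return this.invoke(name, body);
    }
    this.pending.push({ name: name, body: body });   // defer until drain()
};

OfflineQueue.prototype.drain = function () {
    var queued = this.pending;
    this.pending = [];
    queued.forEach(function (item) {
        this.invoke(item.name, item.body);
    }, this);
};

// usage: queue one call while offline, then replay it
var sent = [];
var queue = new OfflineQueue(function (name, body) { sent.push(name); });

queue.call('createKey', {}, false);   // offline: queued, not sent
queue.drain();                        // back online: replayed
console.log(sent.length);             // 1
```

A production version would also need persistence (so the queue survives an app restart) and retry handling, but the shape is the same.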

Next Steps

I only covered the various server APIs this time. In the next article, I’ll take a look at calling the custom API from the clients and adjusting the request properties so that special headers can be inserted. After that, I’m going to cover accessing the Azure Mobile Apps data and authentication objects from within your custom API so that you can do some interesting things with data.

Until then, you can check all four implementations at my GitHub Repository.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 14 – Linking Existing Tables

Thus far, everything I’ve done on the server has been associated with a new database and a new table. Most of the time, however, I am linking to an existing database and an existing table. Azure Mobile Apps requires that tables provide certain fields. In this article, I’m going to look at linking an existing table and exposing it as a Mobile Table.

I’m going to assume you have already gotten the database linked to the Azure Mobile Apps via the Data Connections Blade. You need to provide a suitable user for accessing the database (the user must have enough rights to read and write data).

Step 1: Prepare the table

There are five fields that should be considered “system fields” in all mobile tables. These fields need to be named specifically and have specific types:

Field Name   Field Type          Default Value
----------   -----------------   ----------------
id           nvarchar(255)       newid()
createdAt    datetimeoffset(3)   sysutcdatetime()
updatedAt    datetimeoffset(3)   null
version      timestamp           (auto-generated)
deleted      bit                 0

You can create an empty table like this:

CREATE TABLE [dbo].[tableName]
(
  [id] [nvarchar](255) NOT NULL CONSTRAINT [DF_tableName_id]  DEFAULT (CONVERT([nvarchar](255),newid(),(0))),
  [createdAt] [datetimeoffset](3) NOT NULL CONSTRAINT [DF_tableName_createdAt]  DEFAULT (CONVERT([datetimeoffset](3),sysutcdatetime(),(0))),
  [updatedAt] [datetimeoffset](3) NULL,
  [version] [timestamp] NOT NULL,
  [deleted] [bit] NULL DEFAULT ((0)),

  PRIMARY KEY NONCLUSTERED ([id] ASC) 
)

Similarly, you can use an ALTER TABLE statement to adjust an existing table instead. The major concern here is, of course, the id field – this field is commonly configured as an auto-incrementing number. Unfortunately, that is not suitable for mobile clients – two clients may add records independently (perhaps while offline), and your service has to deal with the resulting id collisions.

In addition to the table update, you need to create a trigger to maintain the updatedAt field:

CREATE TRIGGER [TR_tableName_InsertUpdateDelete] ON [dbo].[tableName]
AFTER INSERT, UPDATE, DELETE AS
BEGIN
  SET NOCOUNT ON;
  IF TRIGGER_NESTLEVEL() > 3 RETURN;
  UPDATE [dbo].[tableName] 
    SET [updatedAt] = CONVERT (DATETIMEOFFSET(7),SYSUTCDATETIME())
    FROM INSERTED WHERE INSERTED.id = [dbo].[tableName].[id]
END

Step 2: Define the table

If you are using an existing table, you generally don’t want Azure Mobile Apps changing it for you. As a result, you should turn off dynamic schema and define the columns you want to be included in the view:

var table = require('azure-mobile-apps').table();

table.dynamicSchema = false;
table.columns = {
    column1: 'string',
    column2: 'number',
    column3: 'boolean',
    column4: 'datetime'
};

module.exports = table;

The type translations are:

Azure Mobile Apps type   SQL Server type
----------------------   -----------------
boolean                  bit
datetime                 datetimeoffset(7)
number                   float(53)
string                   nvarchar(max)

If you need your database locked down further and your types don’t conform to the ones above, you will need to switch to an ASP.NET project. The ASP.NET project has access to Entity Framework, which lets you lock down the model and generate more specific SQL types. JavaScript is more forgiving in the model specification, but at the price of flexibility in the SQL types available to you.

You can, of course, do all the authentication, per-table middleware and per-operation middleware that I’ve been discussing throughout this series. This table definition only includes the definition of the actual table.

Alternative Schema

Let’s say your table isn’t in the [dbo] schema – it might be in [myschema] instead. You can handle this by setting the data.schema option. This option is backed by an App Setting – MS_TableSchema – so there are two ways of setting it. First, you can alter the initializer of Azure Mobile Apps in your app.js:

var mobileApp = azureMobileApps({
    data: {
        schema: 'myschema'
    }
});

However, this doesn’t provide any flexibility. Let’s say you have a test version of your database in schema ‘test’ and the production database in ‘production’ – how do you handle that? The answer is to set the App Setting instead:

  1. Log onto the Azure Portal and select your App Service
  2. Click on All Settings then Application Settings
  3. Under App settings, add a new app setting. The key should be MS_TableSchema and the value should be the schema name.

If you use deployment slots for deploying your site, you can make the app setting a deployment slot setting – this lets you specify different schemas for (for example) staging vs. production automatically, since slot settings stay with the slot when you swap.
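The two approaches can also be combined defensively. The SDK reads MS_TableSchema for you, so this sketch just makes the fallback explicit in code:

```javascript
// Build the Azure Mobile Apps configuration from the app setting,
// falling back to the default dbo schema when the setting is absent
// (e.g. when running locally without any App Settings configured).
var config = {
    data: {
        schema: process.env.MS_TableSchema || 'dbo'
    }
};

// pass this object to azureMobileApps(config) at startup
console.log(config.data.schema);
```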

Next Time

It’s probably time I cover a killer feature of Azure Mobile Apps – Offline Sync. This is not just a feature for accessing data when the network is unavailable – it’s also a performance feature, especially when there is a lot of data to transfer.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 11 – Table Access

There are times when accessing one table for your data is just not enough. Let’s take an example. I have a simple task list app right now. I’ve got it running on all the mobile platforms – iOS, Android and Windows. But I can only see my tasks. What if I want to show my tasks to someone else? I’m going to put together a fairly simple adjustment to my application during this article: a “Friends” page in my Universal Windows application, where I can enter the email address of a friend so that they can see my tasks (with a different icon).

The Friends Table

To start with, I need a friends table. This is a new table definition called friend.js, and it is just like the todoitem.js table:

var azureMobileApps = require('azure-mobile-apps');

// Create a new table definition
var table = azureMobileApps.table();

// Require authentication
table.access = 'authenticated';

// CREATE operation
table.insert(function (context) {
    context.item.userId = context.user.emailaddress;
    return context.execute();
});

// READ operation
table.read(function (context) {
    context.query.where({ userId: context.user.emailaddress });
    return context.execute();
});

// UPDATE operation
table.update(function (context) {
    context.query.where({ userId: context.user.emailaddress });
    context.item.userId = context.user.emailaddress;
    return context.execute();
});

// DELETE operation
table.delete(function (context) {
    context.query.where({ userId: context.user.emailaddress });
    return context.execute();
});

module.exports = table;

I’ve also created a new Universal Windows client that processes this table. You can find the client at my GitHub repository. It has a People tab and a Tasks tab. The People tab allows you to enter email addresses into the table so that other people can view your data. Note that this is in no way how I would write this code in reality – there are too many ways to abuse it – but it works as a proof of concept. I’m also not going to cover the .NET client; this is about the backend code, not the frontend.

The question for this blog post is this: how do I limit my table view by another table? In this case, I want to view all records that I am allowed to see – either because I own them or because someone has given me permission to view them. This will be a change to the read() method in the todoitem.js table controller. I am splitting this into two halves. In the first half, I will get a list of all the user IDs whose records I can view:

// READ operation
table.read(function (context) {
    return context.tables('friend')
        .where({ viewer: context.user.emailaddress })
        .select('userId')
        .read()
        .then(function (friends) {
            console.log('READ: Response from SQL Query = ', friends);
            var list = friends.map(function (f) { return f.userId; });
            list.push(context.user.emailaddress);   // push() returns the length, so don't chain it
            console.log('READ: List of user ids = ', list);
            
            // TODO: Adjust the query according to the requirements
            return context.execute();
        });
});

Let’s take this step by step. First, context.tables('friend') gets a reference to the friend table. I then construct a query for all records where the current user is marked as a viewer. The table itself has just two columns – userId is the “owner” and viewer is the email address I place in the table in the UI. A row reads “viewer can read the tasks for userId”. Once I’ve got that, I limit the returned columns to just userId and finally call read() to execute the query. The result is a Promise that resolves to an array of objects, each with a single property (the userId field).

There are more methods that you can use that are “LINQ-like”. I’ve already introduced where, select and read. You can also use:

  • orderBy(field) and orderByDescending(field) to do sorting
  • skip(count) and take(count) to implement paging
  • includeTotalCount() to add a total number of records field to the results

Once I’ve resolved the promise, I need to convert the list into an array of email addresses – these become the viewership list. Notice that I also add the current user’s own email address to the list.

Next, I need to adjust my database search to return tasks owned by anyone in the list of email addresses I’ve just created. This is actually quite hard to do. One way of altering the query that is sent to the database is to provide a function:

// READ operation
table.read(function (context) {
    return context.tables('friend')
    .where({ viewer: context.user.emailaddress })
    .select('userId')
    .read()
    .then(function (friends) {
        var list = friends.map(function (f) { return f.userId; })
        list.push(context.user.emailaddress);
        context.query.where(function(list) { return this.userId in list; }, list);
        return context.execute().then(function (results) {
          // Process the results
          console.log('results = ', results);
          return results;
        });
    });
});

This is a different mechanism for altering the query. Underneath, the function is compiled to an AST by esprima and then converted to SQL from that compiled form. You never have to worry about this until it bites you: because you see JavaScript, you assume you can use any JavaScript in the where method. In fact, there is a limited set of things you can do inside the function. Within it, this refers to the current item. You can pass in one or two arguments (such as lists), and anything you need to use must be passed in – you can’t rely on the surrounding closure, because the function is never actually executed as JavaScript. Inside the function, there is a set of methods you can use:

  • toUpperCase()
  • toLowerCase()
  • trim()
  • indexOf()
  • concat()
  • substring() or substr()
  • replace()
  • getFullYear()
  • getUTCFullYear()
  • getYear()
  • getDate()
  • getUTCDate()
  • in

Obviously, most of these only work on specific data types – strings and dates, mostly. However, you can use in with arrays, and you can also use math operations and boolean logic inside the function. I’m using this functionality to change the query to “records I am allowed to view”. My final step in this rather complicated read logic is to change the results so that I return a computed “shared” flag:

// READ operation
table.read(function (context) {
    return context.tables('friend')
    .where({ viewer: context.user.emailaddress })
    .select('userId')
    .read()
    .then(function (friends) {
        var list = friends.map(function (f) { return f.userId; })
        list.push(context.user.emailaddress);
        context.query.where(function(list) { return this.userId in list; }, list);
        return context.execute().then(function (results) {
          results.forEach(function (item) {
            // Item is shared if the owner of item is not current owner
            item.shared = (item.userId !== context.user.emailaddress);
          });
          return results;
        });
    });
});

Here, the forEach loop in the results handler adds a computed field called shared that is true when the record does not belong to the current user. I also need to add some logic to remove the shared flag from incoming records:

// CREATE operation
table.insert(function (context) {
    context.item.userId = context.user.emailaddress;
    delete context.item.shared;
    delete context.item.Shared;
    return context.execute();
});

// UPDATE operation
table.update(function (context) {
    context.query.where({ userId: context.user.emailaddress });
    context.item.userId = context.user.emailaddress;
    delete context.item.shared;
    delete context.item.Shared;
    return context.execute();
});

Next Steps

Today, I’ve gone through the more esoteric adjustments to the query you can do, doing sub-queries and manipulating the results. In the next article, I’m going to start looking at doing conflict resolution.

In the meantime, you can download my client and server from my GitHub repositories.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 10 – Middleware

I’ve covered the basics of table controllers in the last two posts, covering both per-table configuration and per-operation configuration. Today I’m going to talk about extensibility and middleware. Middleware is a fairly basic concept – it provides a way to insert custom code into the HTTP request/response pipeline. Express has a whole section on middleware, and pretty much all Express modules are written as middleware. Here is a template:

function myMiddleware(request, response, next) {
    // Do something with request

    // Return something with response or next

    next();  // Call the next middleware
}

app.use(myMiddleware);

Azure Mobile Apps is written as a set of middleware, but it’s been constructed to be interchangeable. Let’s look at a sample service. Here is the app.js:

var app = require('express')(),
    zumo = require('azure-mobile-apps')();

app.use(function (req, res, next) {
  console.log('express.use - azureMobile = ', typeof req.azureMobile !== 'undefined');
  return next();
});

zumo.use(function (req, res, next) {
  console.log('zumo.use - azureMobile = ', typeof req.azureMobile !== 'undefined');
  return next();
});

zumo.tables.import('./tables');
zumo.tables.initialize().then(() => {
  app.use(zumo);
  app.listen(3000);
});

And my sample table:

var table = require('azure-mobile-apps').table();

table.use(function (req, res, next) {
  console.log('table.use: azureMobile = ', typeof req.azureMobile !== 'undefined');
  return next();
}, table.operation);

table.read(function (context) {
  console.log('tableop: azureMobile = ', typeof context.req.azureMobile !== 'undefined');
  return context.execute();
});

module.exports = table;

If you run this locally and do a GET of the /tables/todoitem endpoint, you will see the following console output:

express.use - azureMobile = false
zumo.use - azureMobile = true
table.use: azureMobile = true
tableop: azureMobile = true

This gives you a very clear indication of what is available and the most appropriate place to add code. As an example, let’s say the majority of your code depends on the email address being available – for example, you are implementing the personal table. You may want to add the email address to the user object. You can easily do this with zumo.use middleware. Add a file authMiddleware.js with the following contents:

var authCache = {};

function authMiddleware(request, response, next) {
  if (typeof request.azureMobile.user.id !== 'undefined') {
    if (typeof authCache[request.azureMobile.user.id] !== 'undefined') {
      request.azureMobile.user.emailaddress = authCache[request.azureMobile.user.id];
      return next();   // cache hit – skip the getIdentity() call
    }
    request.azureMobile.user.getIdentity().then(function (userInfo) {
      if (typeof userInfo.aad.claims.emailaddress !== 'undefined') {
        authCache[request.azureMobile.user.id] = userInfo.aad.claims.emailaddress;
        request.azureMobile.user.emailaddress = authCache[request.azureMobile.user.id];
      }
      next();
    });
  } else {
    next();
  }
}

module.exports = authMiddleware;

This code does some primitive caching of the result from getIdentity(), but otherwise pulls out the email address just like I did in the Personal Table article. Instead of just using it, though, it puts it in the context for use later.

What’s this request.azureMobile? It’s the context and has all the same properties that I discussed in the table operations article.

Now that I have my auth middleware written, I can use this to adjust my original personal table. Firstly, I need to link it into the app.js:

var express = require('express'),
    serveStatic = require('serve-static'),
    azureMobileApps = require('azure-mobile-apps'),
    authMiddleware = require('./authMiddleware');

// Set up a standard Express app
var app = express();
var mobileApp = azureMobileApps({
    homePage: true,
    swagger: true
});

mobileApp.use(authMiddleware);
mobileApp.tables.import('./tables');
mobileApp.api.import('./api');

I can now adjust my table controller to make it much simpler:

var azureMobileApps = require('azure-mobile-apps');

// Create a new table definition
var table = azureMobileApps.table();

// Require authentication
table.access = 'authenticated';

// CREATE operation
table.insert(function (context) {
  context.item.userId = context.user.emailaddress;
  return context.execute();
});

// READ operation
table.read(function (context) {
  context.query.where({ userId: context.user.emailaddress });
  return context.execute();
});

// UPDATE operation
table.update(function (context) {
  context.query.where({ userId: context.user.emailaddress });
  context.item.userId = context.user.emailaddress;
  return context.execute();
});

// DELETE operation
table.delete(function (context) {
  context.query.where({ userId: context.user.emailaddress });
  return context.execute();
});

module.exports = table;

Note that I’m not using getIdentity() any more. For reference, here is what my read method used to look like:

table.read(function (context) {
    return context.user.getIdentity().then(function (userInfo) {
        context.query.where({ userId: userInfo.aad.claims.emailaddress });
        return context.execute();
    });
});

I’ve isolated my getIdentity() call into the middleware so that the email address is available everywhere.

Doing Some Authorization

You can also do some basic authorization using the same technique. For example (assuming the group list is small), you can configure Azure AD to provide the list of groups in a claim. To configure AAD to emit the groups as claims:

  1. Log onto the Azure Portal and go to your Azure AD Configuration.
  2. Click on Manage Manifest (at the bottom of the page), then Download Manifest – this will download a JSON file.
  3. Edit the downloaded file with a JSON editor (I use Atom)
  4. Set the “groupMembershipClaims” value to “SecurityGroup”

    day-10-p1

  5. Save the file and upload it by using Manage Manifest -> Upload Manifest

  6. Click on the APPLICATIONS tab
  7. Click on your Web Application (not the Native Client)
  8. Scroll to the bottom under “permissions to other applications”
  9. Click on the Delegated Permissions and check the box next to “Read directory data”

    day-10-p2

  10. Click on Save

Now that you have configured the application to return groups, you need a group to return.

  1. Click on the back arrow to return to the top level of your directory
  2. Click on the GROUPS tab
  3. Click on ADD GROUP in the bottom banner
  4. Enter a name for the group, set the Group Type to Security, then click on the Tick.
  5. Add a member using the ADD MEMBER button in the bottom banner
  6. Click on the user you wish to add, then click on the + in a circle. Repeat for multiple users.
  7. Finally, click on the tick.
  8. Click on PROPERTIES and make a note of the Object ID

The Object ID is the GUID of the group and is what will show up in the groups list. After doing all this and re-running your client, the getIdentity() method will return the group information in the returned value, but not in the most accessible form:

day-10-p3

Note the item highlighted in user_claims. This is the following, when properly laid out:

"user_claims": [
    // Some other stuff
    { "typ": "groups", "val": "0d270bc3-6431-4d9c-8a05-ba7fa0b5463c" },
    { "typ": "groups", "val": "92d92697-1242-4d38-9c1d-00f3ea0d0640" },
    // More stuff
],

We can convert this into something usable using the following (this is the authMiddleware.js file):

var authCache = {};

/**
 * Reducer method for converting groups into an array
 * @param {string[]} target the accumulator for the array
 * @param {object} claim the current claim being processed
 * @returns {string[]} the accumulator
 */
function groupReducer(target, claim) {
    if (claim.typ === 'groups')
        target.push(claim.val);
    return target;
}
/**
 * Middleware for adding the email address and security groups to the
 * request.azureMobile.user object.
 * @param {express.Request} request the Express request
 * @param {express.Response} response the Express response
 * @param {function} next the next piece of middleware
 * @returns {any} the result of the next middleware
 */
function authMiddleware(request, response, next) {
    if (typeof request.azureMobile.user === 'undefined')
        return next();
    if (typeof request.azureMobile.user.id === 'undefined')
        return next();

    if (typeof authCache[request.azureMobile.user.id] === 'undefined') {
        request.azureMobile.user.getIdentity().then(function (userInfo) {
            var groups = userInfo.aad.user_claims.reduce(groupReducer, []);
            console.log('userInfo.aad.claims = ', userInfo.aad.claims);
            var email = userInfo.aad.claims.emailaddress || userInfo.aad.claims.upn;
            authCache[request.azureMobile.user.id] = {
                emailaddress: email,
                groups: groups
            };

            request.azureMobile.user.emailaddress = authCache[request.azureMobile.user.id].emailaddress;
            request.azureMobile.user.groups = authCache[request.azureMobile.user.id].groups;
            next();
        });
    } else {
        request.azureMobile.user.emailaddress = authCache[request.azureMobile.user.id].emailaddress;
        request.azureMobile.user.groups = authCache[request.azureMobile.user.id].groups;
        next();
    }
}

module.exports = authMiddleware;

The groupReducer function converts the object-form claims into an array of group ids; authMiddleware uses it to reduce user_claims into something I can actually use, just as I did for the email address. Now I can use that in my table controller to implement group-based authorization. For that, I need the Object ID of the group I am interested in. For the purposes of this test, I’ve got a group called Administrators with an Object ID of 92d92697-1242-4d38-9c1d-00f3ea0d0640 – this was obtained from the Azure AD configuration screens in the Azure Portal. Authorization is then easily done within the table controller:

// DELETE operation
table.delete(function (context) {
    console.log('DELETE: context.user = ', context.user);
    // Authorization - if Administrators is not in the group list, don't allow deletion
    if (context.user.groups.indexOf('92d92697-1242-4d38-9c1d-00f3ea0d0640') < 0) {
        console.log('user is not a member of Administrators');
        context.res.status(401).send('Only Administrators can do this');
        return;
    }

    console.log('user is a member of Administrators');
    context.query.where({ userId: context.user.emailaddress });
    return context.execute();
});

If you log in as a member of the Administrators group, then you will be able to delete; other users will see the error message.

Of course, this isn’t really convenient – who wants to hand-code the authorization for each group? Instead, let’s show off a little middleware for the operation. First the middleware, which I’ve put at the top of the file but could just as easily be in its own file:

function isAdministrator(request, response, next) {
    if (request.azureMobile.user.groups.indexOf('92d92697-1242-4d38-9c1d-00f3ea0d0640') < 0) {
        response.status(401).send('Only Administrators can do this');
        return;
    }
    next();
}

I do the same check as before in this middleware. If the user is not a member of the required group, the same response (401 Unauthorized) is returned with a custom message. You may remember that I can integrate middleware at the table level using this:

table.use(isAdministrator, table.operation);

I can also do this on a per-operation basis – table.use is just a shortcut for applying the middleware to each individual operation. So:

table.delete.use(isAdministrator, table.operation);

Putting it together, I can code my delete script as follows:

// DELETE operation
function isAdministrator(request, response, next) {
    if (request.azureMobile.user.groups.indexOf('92d92697-1242-4d38-9c1d-00f3ea0d0640') < 0) {
        response.status(401).send('Only Administrators can do this');
        return;
    }
    next();
}

table.delete.use(isAdministrator, table.operation);
table.delete(function (context) {
    context.query.where({ userId: context.user.emailaddress });
    return context.execute();
});
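Hand-coding the check per group gets old quickly. A small factory (a sketch – the factory name and the stubbed objects are mine; the group id is the Administrators id from above) produces the same middleware for any group:

```javascript
// Returns an Express-style middleware that rejects users who are not
// members of the given Azure AD group (by Object ID).
function requiresGroup(groupId) {
    return function (request, response, next) {
        var user = request.azureMobile.user;
        if (!user || !user.groups || user.groups.indexOf(groupId) < 0) {
            response.status(401).send('Only members of the required group can do this');
            return;
        }
        next();
    };
}

// quick local check with stubbed request/response objects
var allowed = false;
var stubResponse = { status: function () { return { send: function () {} }; } };
var middleware = requiresGroup('92d92697-1242-4d38-9c1d-00f3ea0d0640');

middleware(
    { azureMobile: { user: { groups: ['92d92697-1242-4d38-9c1d-00f3ea0d0640'] } } },
    stubResponse,
    function () { allowed = true; }
);
console.log(allowed);   // true – the member gets through to next()
```

It wires up the same way as before, e.g. table.delete.use(requiresGroup('92d92697-1242-4d38-9c1d-00f3ea0d0640'), table.operation);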

Next Steps

Authorization is a major milestone for most real applications. However, we are still operating within a single table without any interactivity between tables. I’d like to adjust that with my next article – going both multi-table and providing linkage between the tables.

Until then, I’ve uploaded my code to my GitHub Repositories. To make it easy, I’ve separated server and client.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 8 – Table Controller Basics

Over the last six articles, I’ve covered just about everything there is to know about authentication. It’s time to turn my attention to data – specifically, SQL table data. Azure Mobile Apps has two SDKs on the server side: an ASP.NET 4.6 version and a Node.js version. The Node.js version is also used for the “in-portal” version of Azure Mobile Apps. I’m going to do another series of Zumo Days explicitly about the ASP.NET version of Azure Mobile Apps, so stay tuned for that.

So, what is a Table Controller?

If you strip away all the mobile-specific pieces of Azure Mobile Apps, it’s an OData v3 data source at its core. A table controller responds to the GET, POST, PUT and DELETE HTTP requests and gives you access to the data in the SQL table that backs it.

The bad way of defining a Table Controller

You don’t need anything special to define a table controller. In your app.js file, do the following:

var express = require('express'),
    zumo = require('azure-mobile-apps');

var app = express(), 
    mobile = zumo();

mobile.tables.add('todoitem', tableDefinition);

// Rest of the app definition

We will get to the table definition later on. This is a bad way of defining a table: it doesn’t provide any separation of duties, and it mixes table configuration into your application code. Don’t do this.

The good way of defining a Table Controller

Create a tables directory. For each table that you need to expose to your mobile applications, create a JavaScript file with the same name. For example, let’s look at the TodoItem table. This would be called todoitem.js, with the following contents:

var table = require('azure-mobile-apps').table();

module.exports = table;

In your app.js file, you need to import your table files:

var express = require('express'),
    azureMobileApps = require('azure-mobile-apps');

var app = express(),
    zumo = azureMobileApps({
        homePage: true,
        swagger: true
    });

zumo.tables.import('./tables');
app.use(zumo);
app.listen(process.env.PORT || 3000);

Defining a Table

There are several options that you can define on a table – the tableDefinition from earlier. Here are all the options:

  • name – the name of the endpoint. You don’t need to set this – it’s set from the name of the file (in import mode) or the first parameter of the add() method.
  • databaseTableName – the name of the SQL table you are using for this endpoint. It’s assumed to be the same as name if you don’t set it.
  • schema – the schema that the SQL table is in. By default, that’s the dbo schema, or whatever is set in the MS_SchemaName app setting.
  • access – one of ‘anonymous’, ‘authenticated’ or ‘disabled’ – controls access to this endpoint.
  • dynamicSchema – a boolean. If true, turns dynamic schema on (more on this later).
  • columns – a definition of the model to export if dynamicSchema is not turned on.
  • maxTop – the maximum number of rows to return in one page of results.
  • softDelete – true if soft delete is turned on.

You can define all of these in your table definition file like this:

var table = require('azure-mobile-apps').table();

table.name = 'todoitem';
table.schema = 'dbo';
table.databaseTableName = 'Tasks';
table.access = 'disabled';
table.dynamicSchema = true;
table.maxTop = 1000;
table.softDelete = false;

module.exports = table;

Dynamic or Static Schema?

The Node.js Server SDK defines four data types for you to use:

  • string converted to NVARCHAR(255)
  • number converted to FLOAT(53)
  • boolean converted to BIT
  • datetime converted to DATETIMEOFFSET(7)

The “converted to” indicates what column type the value will be in the SQL Azure table. If you turn dynamicSchema on by setting the property to true in the table definition, then the SDK will accept any field and create fields if necessary. This is great during development because you don’t have to think about what your tables are going to look like – you just send data to the backend and it will store it. However, this is really problematic during production – malicious users can overwhelm your database and store data secretly, using up your data allotment. You will want to lock down the schema before going to production. Let’s take a look at a more typical todoitem schema:

var table = require('azure-mobile-apps').table();

table.dynamicSchema = false;
table.columns = {
    userid: 'string',
    text: 'string',
    complete: 'boolean',
    due: 'datetime',
    alert: 'number'
};

module.exports = table;

This shows off all the data types. There is one more wrinkle here. The Azure Mobile Apps SDK automatically creates tables with dynamic schema turned on at first insert. That table creation doesn’t happen when you are using a static schema. You have to initialize your database like this:

var express = require('express'),
    azureMobileApps = require('azure-mobile-apps');

var app = express(),
    zumo = azureMobileApps({
        homePage: true,
        swagger: true
    });

zumo.tables.import('./tables');
zumo.tables.initialize().then(() => {
    app.use(zumo);
    app.listen(process.env.PORT || 3000);
});

The initialize() method creates the database tables, returning a Promise that resolves when the creation is complete. In this case, I don’t start listening for requests until the database is created. If I don’t do that, there is the possibility that I receive a request that I cannot fulfill – the SDK will return an HTTP 500 status (Internal Server Error) in this case. If you have dynamic schema turned on, then initialize() becomes a no-op. As a result, you should ALWAYS include the initialize() call to ensure your users never get a 500 Internal Server Error. In fact, I recommend you never use dynamic schema – it’s a bad idea unless you are just playing around.

Seeding Data

Since you are running initialize() all the time now, you may want to store some data by default:

var table = require('azure-mobile-apps').table();

table.dynamicSchema = false;
table.columns = {
    userid: 'string',
    text: 'string',
    complete: 'boolean',
    due: 'datetime',
    alert: 'number'
};

table.seed = [
  { userid: '', text: 'My First Item', complete: false, due: null, alert: 7 }
];

module.exports = table;

When you have a seed property, the data contained within the array will be pushed into the table when the table is created. If the table already exists (and so initialize() skips over the table), then seeding doesn’t happen.

Controlling Access

You can control access to the table as a whole with the access property:

  • ‘authenticated’ means you must be logged in to access the endpoint. If you are not authenticated, a 401 Unauthorized response will be returned.
  • ‘anonymous’ means anyone can access the endpoint.
  • ‘disabled’ means the table is not visible. The server will return a 405 Method Not Allowed response.

Normally, you would only use ‘authenticated’ as an access value at the table level. You can also set access at the operation level, which is where the other values become useful. Anonymous access is the default.
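The three access values map onto the responses listed above. Here is a small sketch of that decision, purely to pin down the semantics (this mirrors the list, not the SDK's internal code):

```javascript
// Map an access setting plus the caller's login state to an HTTP outcome,
// following the rules in the list above.
function checkAccess(access, isLoggedIn) {
    if (access === 'disabled') return 405;                     // Method Not Allowed
    if (access === 'authenticated' && !isLoggedIn) return 401; // Unauthorized
    return 200;                                                // request proceeds
}
```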

Soft Delete

When a user sends an HTTP DELETE request, there are two possible behaviors. The first is a hard delete: the record is removed from the database table immediately. The second is a soft delete: the deleted property on the record is set to true, and the record is no longer returned by the server unless explicitly requested. This is useful in multi-client situations where some clients are using offline sync – the sync table will be updated to remove the record. You will, however, have to remove the actual database record manually, since it will no longer be removed from the database through the SDK.

You can configure soft delete as an option on the table like this:

table.softDelete = true;

Next Steps

Today I took a look at almost all of the table-level options that you can configure with the Azure Mobile Apps SDK. There is one more, but I’ll leave that for a later post as it’s particularly awesome. I know – teaser! In the next article, I’m going to go down one level and look at what I can configure at the HTTP operation level. In the meantime, you can review the API documentation for the Azure Mobile Apps SDK.