30 Days of Zumo.v2 (Azure Mobile Apps): Day 29: Post-Processing Files with Azure Functions

There are times when you want things to happen that aren’t driven by an API call – maybe it’s a cron job that cleans out deleted records, or a post-processing job that looks at images you have uploaded and runs some process on them (such as facial recognition, Cognitive Services, or just plain image manipulation). Whatever the case, Azure Mobile Apps doesn’t handle this easily. It’s time for a new service.

That service is Azure Functions. Azure Functions is a dynamic compute facility that runs code in response to a trigger – an event such as a timer firing, a queue message arriving, or a file being uploaded. I introduced a whole bunch of concepts there, so let’s look at each one.

Dynamic compute is awesome, and it’s the central theme of Functions. You may have heard of the “serverless” movement. It’s a new word for an old idea – you don’t have to manage the resources yourself; the platform does it for you. If you have used Parse Server or another MBaaS in the past, it’s the same concept: you never had to worry about the VM running the code in those services. Dynamic compute grabs the resources necessary to run the job and then releases them afterwards. You only pay for the resources you consume. It’s the logical endpoint of the promise of cloud computing.

Events, or triggers, can be almost anything. Yes, they could be an HTTP endpoint (although I am not recommending that right now – I’ll explain at the end). They could also be a timer, an object placed into a storage queue, or a file uploaded to blob storage.

Functions are not Web Apps. You can’t use the same project you used for your mobile backend to define your functions. Right now, the best way to define your functions is through the Azure Portal.

Creating a Function for cleaning up the TodoItem table

Let’s create a simple function based on a timer. Our mobile TodoItem table has soft delete enabled. That means that instead of deleting records directly, records are marked as deleted by setting the deleted column to 1. I need to clean those up, so let’s define a simple function that runs every Sunday morning at 3am and deletes all the records marked for deletion. If you want to be more conservative – deleting only records marked for deletion that are older than 7 days, for example – it’s just a change to the SQL statement.
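For example, the seven-day variant is a one-line change. This sketch assumes the ASP.NET backend defaults (a pluralized TodoItems table with Deleted and UpdatedAt system columns) – adjust the names to match your own schema:

```sql
-- Delete only soft-deleted records that were last touched more than 7 days ago.
-- Table and column names assume the ASP.NET Mobile Apps backend defaults.
DELETE FROM TodoItems
WHERE Deleted = 1
  AND UpdatedAt < DATEADD(day, -7, SYSUTCDATETIME());
```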

Start by going to https://functions.azure.com, clicking the big friendly Get Started button, and logging in with your Azure credentials. Click on the + NEW button, select Web + Mobile, then Function App. Give the function app a name, select a resource group and App Service plan, and click on Create.

[Screenshot: creating the Function App in the Azure Portal]

Note that creating a function app creates a storage account as well. This is required: since you don’t actually know where your functions will run, they need a common storage area for your code, logs, and so on. It isn’t serverless per se (which is a silly term, IMHO – there is always a server), but that is what the term means. You don’t define a server for this app to run on, but it needs a little bit of shared storage.

Once the function app has been deployed, the function app blade will open. It’s very different from the other blades, with a unique management experience.

[Screenshot: the function app management blade]

In this case, I want to create something akin to a cron job – a timer job. I’m going to write mine in C#, but you could write something similar in Node.js, for example. You can also click on + New function, in which case you get access to the full range of scenarios; the Timer Function is in the Core set of scenarios near the end. Give the function a name. The cron schedule I picked is 0 0 3 ? * SUN *, which is compatible with Quartz – a great .NET library for doing cron-type things. (As you’ll see below, that turns out not to be the format Functions expects.)

[Screenshot: creating the timer function]

Once I click on Create, I am in a code editor. Here is the code for a basic function that deletes all the records marked as deleted:

#r "System.Configuration"
#r "System.Data"

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    var connStr = ConfigurationManager.ConnectionStrings["sqldb"].ConnectionString;
    using (var conn = new SqlConnection(connStr))
    {
        // The connection must be opened before executing a command.
        await conn.OpenAsync();
        var sqlCmd = "DELETE FROM TodoItems WHERE deleted = 1";
        using (var cmd = new SqlCommand(sqlCmd, conn))
        {
            var rows = await cmd.ExecuteNonQueryAsync();
            log.Info($"{rows} rows deleted.");
        }
    }
}

A couple of notes. First, note the #r lines at the top – these bring in references, just as if you were adding a reference to a project in Visual Studio. Second, the code is not in a class. You could wrap it in a namespace and a static class and add a Main method to make it a console app – in fact, this is a great way of testing locally – but you don’t really need to. Once you click on Save, the function is compiled and will run on the schedule you set.
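A minimal sketch of that local-testing wrapper might look like the following. The namespace, class name, and the use of Console.WriteLine in place of the WebJobs TraceWriter are all assumptions for illustration – the point is just that the function body drops into an ordinary console app unchanged:

```csharp
using System;
using System.Data.SqlClient;
using System.Threading.Tasks;

namespace FunctionTestHarness
{
    public static class Program
    {
        // Same body as the function, with Console.WriteLine standing in
        // for log.Info so we don't need the WebJobs SDK locally.
        public static async Task RunAsync(string connStr)
        {
            using (var conn = new SqlConnection(connStr))
            {
                await conn.OpenAsync();
                var sqlCmd = "DELETE FROM TodoItems WHERE deleted = 1";
                using (var cmd = new SqlCommand(sqlCmd, conn))
                {
                    var rows = await cmd.ExecuteNonQueryAsync();
                    Console.WriteLine($"{rows} rows deleted.");
                }
            }
        }

        public static void Main(string[] args)
        {
            // Pass the SQL connection string as the first argument.
            RunAsync(args[0]).Wait();
        }
    }
}
```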

If you are using the ASP.NET version of the Mobile Apps Server SDK, then your database table will be plural – i.e. if you have a model called TodoItem, the database table will be TodoItems. If you are using the Node.js version of the Mobile Apps Server SDK, the table and the table controller are the same thing – make sure you have the right SQL statement for your backend!
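In SQL terms, the difference between the two backends looks like this (column casing follows each backend’s defaults):

```sql
-- ASP.NET backend: Entity Framework pluralizes the model name.
DELETE FROM TodoItems WHERE Deleted = 1;

-- Node.js backend: the table name matches the table definition.
DELETE FROM TodoItem WHERE deleted = 1;
```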

Before I can run this, I need a SQL Connection String. I’ve named it (imaginatively) “sqldb”. To set that connection string:

  • Click on function app settings at the top of the page
  • Click on Go to App Service Settings
  • In the new blade, click on Application Settings (you will probably have to scroll down)
  • Scroll down to Connection Strings. Enter sqldb (or whatever you used) in the Name field and your SQL connection string in the Value field
  • Click on Save

I tend to use cut-and-paste from my Mobile App for this step so I get the SQL Connection String right.

Back in your Function app (and more importantly, in your Function), there is a Run button at the bottom of the page, just waiting to be clicked. It’s at this point I realized that my cron statement was wrong – the log tells you exactly what is wrong pretty much straight away. When you are creating a timer Function, your cron string needs to be in the format Functions expects. If you are wondering what a cron string looks like in general, check out Wikipedia. My cron string should be 0 0 3 * * 0 (six fields – second minute hour day month day-of-week). Don’t try to use any of the extended formats – just the basics. You can change this in the Integrate tab.
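For reference, timer triggers use six-field expressions in the form {second} {minute} {hour} {day} {month} {day-of-week}. A few valid examples:

```
0 0 3 * * 0      at 03:00 every Sunday
0 */5 * * * *    every five minutes
0 30 9 * * 1-5   at 09:30 on weekdays
```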

Now you can click on Run. You will see a 202 Accepted in the panel to the right of the Run button (the Output panel) and the logs will capture any exceptions thrown by the function.

A more complicated example: Post-processing Images

Now that I’ve got the basics covered, let’s do something a little more complicated: image processing. In this example, I’m going to run a function whenever something is uploaded to a specific container in my blob storage. There is a mechanism whereby you can insert a record into an Azure Mobile Apps table by calling the table controller, but I would need to set up an “app key”. App keys are insecure, and you definitely don’t want them out in the wild. For the purposes of this project the app key would stay entirely within Azure, but it’s still not secure – you are setting up a back door, and back doors have a tendency to leak or get misused. As a result, I’m going to use the same mechanism as above to insert data: a SQL command that inserts directly into the SQL database.

Before continuing, you should set up a new Mobile Apps Table Controller called Image with a model that contains Filename, Height and Width fields. If you’ve followed along, I covered this way back in Day 17. If you can’t figure it out, check out the Backend for the file-upload project on my GitHub repository. I’ve added the code there.

I’m going to add a new Function to my function app. This time I’m going to select a BlobTrigger for C#. After naming my function, I need to select my storage account and the path to the storage. The path can be a specific file within a container, or it can be the container name itself. I’m going to use the container name. So, here is my trigger definition:
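The portal stores that trigger definition in the function’s function.json file. A sketch of what the binding looks like – the container name (userdata) and the connection setting name are assumptions for illustration:

```json
{
  "bindings": [
    {
      "name": "blob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "userdata/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}
```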


Switching over to the code:

using System;

public static void Run(string myBlob, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed: {myBlob}");
}

The myBlob value will get filled with the contents of the file. That’s a problem for me because my files are likely images. There are alternatives to a string – you can use a Stream or a CloudBlockBlob, for example. The Stream can be used if you don’t care about the additional metadata and only care about the contents of the file. I’m going to use a CloudBlockBlob.

Looking for the variables you can use as parameters for the Run method? Check out the cheat sheet (link downloads a PDF)

Note that with the string binding, the contents of your blob – which are likely binary if you are uploading images – go straight into the log. Time to write some code for the image processing:

#r "System.Configuration"
#r "System.Data"
#r "System.Drawing"
#r "Microsoft.WindowsAzure.Storage"

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Drawing;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task Run(CloudBlockBlob blob, TraceWriter log)
{
    log.Info($"Processing file: {blob.Name}");

    // Create an image object from a stream
    Bitmap image = new Bitmap(await blob.OpenReadAsync());
    // Get the information for the table
    int width = image.Width;
    int height = image.Height;
    string id = Guid.NewGuid().ToString();
    // Update the Images database.  Note that we have a bug - if I am using
    // the same filename, then this code adds a new record - it doesn't update
    // the old record.
    var connStr = ConfigurationManager.ConnectionStrings["sqldb"].ConnectionString;
    using (var conn = new SqlConnection(connStr))
    {
        // The connection must be opened before executing a command.
        await conn.OpenAsync();
        var sqlCmd = "INSERT INTO Images (Id, Filename, Height, Width, Deleted) VALUES (@id, @name, @height, @width, @deleted)";
        using (var cmd = new SqlCommand(sqlCmd, conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            cmd.Parameters.AddWithValue("@name", blob.Name);
            cmd.Parameters.AddWithValue("@height", height);
            cmd.Parameters.AddWithValue("@width", width);
            cmd.Parameters.AddWithValue("@deleted", false);
            log.Info($"Executing INSERT:('{id}','{blob.Name}',{height},{width})");
            await cmd.ExecuteNonQueryAsync();
        }
    }
}

This is fairly basic code. I load the stream into a Bitmap object, grab the metadata and insert into the database table. I do need to ensure that the database table exists (which I did in this example by adding an Image table to my application).
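One way to address the duplicate-record bug noted in the code comment would be an upsert keyed on the filename. A sketch, assuming the same column names and parameters as the INSERT above:

```sql
-- Insert a new record, or update the existing one if the filename matches.
MERGE Images AS target
USING (SELECT @name AS Filename) AS source
  ON target.Filename = source.Filename
WHEN MATCHED THEN
  UPDATE SET Height = @height, Width = @width
WHEN NOT MATCHED THEN
  INSERT (Id, Filename, Height, Width, Deleted)
  VALUES (@id, @name, @height, @width, 0);
```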

Wrap Up

Functions is an awesome new feature for mobile developers. It allows you to do some relatively complex processing using ad-hoc resources that are only consumed (and paid for) when required. The possibilities are astounding: Cognitive Services, for instance – determining what a photo is of and tagging the image automatically – or Machine Learning. Capabilities like these used to be the province of those willing to build out their own backend processing. In addition to the triggers, you can also add input and output bindings. These allow you, for example, to push the blob URI into a queue, deliver a push notification, or create a Mobile Apps record – without the additional code that would normally go along with it.

I mentioned at the beginning why I would not recommend using Functions as an HTTP endpoint (despite all the resourcing being there). Many endpoints incur a startup cost. For example, connecting to an idle App Service spins up the service, which connects to your database, ensures all the tables are created, and so on. Since dynamic compute spins up resources for every request, you incur this cost on every request. It’s a small limitation – there are plenty of reasons to use Functions without also turning it into a web service.

Functions is still in preview, but I can’t wait to see what happens with it. The team is active everywhere and it’s free right now, so give it a spin.

Want the code for the changes I made for images? They are at my GitHub Repository.

7 thoughts on “30 Days of Zumo.v2 (Azure Mobile Apps): Day 29: Post-Processing Files with Azure Functions”

  1. Some beginners questions on the blogs so far.

    Is it possible to use authentication from another API? Say I have one API just doing authentication and another API providing REST calls – can the REST API use the authentication that already happened in the authentication API? Therefore going cross-domain from https://todoapp_authentication.azurewebsites.net to https://todoapp_restapi.azurewebsites.net. If a strategy like this is possible (decoupling the API functionality), is it wise or not, from experience?

    I get an error when running the UWP social app login – say the user selects Google as the login option and the Google popup appears on a desktop PC. When the user just closes the login screen without providing username and password details, the UWP app crashes at the following code:

    ==> await clientAzureMobileService.LoginAsync(MobileServiceAuthenticationProvider.Google);
    How can I catch this error, as it is already in a try/catch statement?

    When playing around with the sync tables and the local store DB, I often find that the file/service stops working, especially when I experiment with adding fields to the todolist object.

    var localstore = new MobileServiceSQLiteStore(“localstore.db”);

    I assume there is a sync issue or file locking issue on the local store DB. How can I easily reset or recreate the localstore file? I currently just rename the file to, say, localstore2.db and that works fine, but I’m concerned about the “old” files. How do I delete these SQLite files / drop all the tables, or find them on my PC? I have searched for them but can’t find them ;))

    Thanks in advance.


    • Hi Pinox, You can (and probably should, unless there is a good reason for it) combine your mobile REST API and authentication. It’s easy to handle authentication – APIs tagged with [Authorize] (ASP.NET) or table.access = ‘authenticated’ (Node) are authenticated – the others aren’t. Really, the only reason you would have for separating them is if the two API sets had differing requirements for authentication (e.g. one used Azure AD and one used Facebook). For the sync tables, I find that my database falls into a bad state during development – wiping out the table is easy – it’s in your AppData folder for the app – a search from your home directory will find it on Windows. You can just delete the db file then restart your app.


      • Thanks Adrian – what about running a live mobile app in production? If I change the table fields of my backend, what would be the best solution to change the localstore on the user’s mobile device?


      • Great question. Off the top of my head, I’d create a “config API” that wasn’t authenticated – I generally have one of these to send down details of the backend that I need. Include in the config API the “schema version” – just a self-incrementing number that you control. In your client, store the schema version you have received. If the schema version changes, delete the localstore.db and re-initialize / re-download all the data. There may be a better way, so I’m going to talk to folks at the office on this one.

