Working with DocumentDb

In my last post, I introduced working with HTTP CRUD APIs with Azure Functions. My intent in all this is to create a proof of concept service that emulates the Azure Mobile Apps service, but using Azure Functions and the dynamic (or consumption-based) SKU. This means that you pay for the API only when it is being used, but it scales seamlessly as your needs grow. In addition, I’m going to make the backing store for this API a NoSQL store based on another Azure resource – DocumentDb.

Fortunately for me, DocumentDb has a nice Node.js driver. I’m going to promisify the callback-based SDK with bluebird. There are a number of samples available for the DocumentDb driver. For instance, here is my docdb-driver/database.js file:

module.exports = {
    createDatabase: function (client, databaseId, callback) {
        client.createDatabase({ id: databaseId }, callback);
    },

    deleteDatabase: function (client, databaseId, callback) {
        client.deleteDatabase(`dbs/${databaseId}`, callback);
    },

    findDatabaseById: function (client, databaseId, callback) {
        var qs = {
            query: 'SELECT * FROM root r WHERE r.id = @id',
            parameters: [
                { name: '@id', value: databaseId }
            ]
        };

        client.queryDatabases(qs).toArray(function (err, results) {
            if (err) {
                callback(err, null);
            } else {
                callback(null, (results.length === 0) ? null : results[0]);
            }
        });
    },

    listDatabases: function (client, callback) {
        client.readDatabases().toArray(callback);
    },

    readDatabase: function (client, database, callback) {
        client.readDatabase(database._self, callback);
    },

    readDatabaseById: function (client, databaseId, callback) {
        client.readDatabase(`dbs/${databaseId}`, callback);
    }
};

This is based on callbacks, rather than promises. So my docdb-driver/index.js file uses promisify to convert them to promises:

var Promise = require('bluebird');
var collection = require('./collection');
var database = require('./database');
var docops = require('./document');

var dbCache = {};

var createDatabase = Promise.promisify(database.createDatabase);
var findDatabaseById = Promise.promisify(database.findDatabaseById);

function ensureDatabaseExists(client, databaseId) {
    if (databaseId in dbCache) {
        return Promise.resolve(dbCache[databaseId]);
    }

    return findDatabaseById(client, databaseId).then((dbRef) => {
        if (dbRef === null) {
            return createDatabase(client, databaseId).then((result) => {
                dbCache[databaseId] = result;
                return result;
            });
        }
        dbCache[databaseId] = dbRef;
        return dbRef;
    });
}

module.exports = {
    createCollection: Promise.promisify(collection.createCollection),
    listCollections: Promise.promisify(collection.listCollections),
    readCollection: Promise.promisify(collection.readCollection),
    readCollectionById: Promise.promisify(collection.readCollectionById),
    getOfferType: Promise.promisify(collection.getOfferType),
    changeOfferType: Promise.promisify(collection.changeOfferType),
    deleteCollection: Promise.promisify(collection.deleteCollection),

    createDatabase: createDatabase,
    deleteDatabase: Promise.promisify(database.deleteDatabase),
    ensureDatabaseExists: ensureDatabaseExists,
    findDatabaseById: findDatabaseById,
    listDatabases: Promise.promisify(database.listDatabases),
    readDatabase: Promise.promisify(database.readDatabase),
    readDatabaseById: Promise.promisify(database.readDatabaseById),

    createDocument: Promise.promisify(docops.createDocument)
};

I’m going to extend this driver package over time. Sometimes I use the straight API from the DocumentDb driver (see the readDatabase() method). Sometimes, however, I want to do something extra. The ensureDatabaseExists() method is an example of this. I want to find the database in the service and create it only if it doesn’t exist.
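
To show how the promisified driver reads from a caller’s perspective, here is a minimal sketch (the client construction is covered in the next section; the database id is arbitrary):

var DocumentDb = require('documentdb');
var driver = require('./docdb-driver');

// Build a client from the same environment variables used later in this post.
var client = new DocumentDb.DocumentClient(
    process.env['DocumentDbHost'],
    { masterKey: process.env['DocumentDbAccountKey'] }
);

// Find-or-create the database, then list its collections.
driver.ensureDatabaseExists(client, 'AzureMobile')
    .then((dbRef) => driver.listCollections(client, dbRef))
    .then((collections) => console.log(`Found ${collections.length} collections`))
    .catch((error) => console.error(error));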

Back to the Azure Function I’m developing. DocumentDb mainly stores “documents” – JSON blobs of associated data. It organizes these documents into “collections” and collections into a “database”. In the Azure Mobile Apps equivalent, the collection would be a table and the individual rows or entities would be documents. My first requirement is to ensure that the database and collection are initialized properly (in todoitem/index.js):

var DocumentDb = require('documentdb');
var driver = require('../docdb-driver');

/**
 * Global Settings Object
 */
var settings = {
    host: process.env['DocumentDbHost'],
    accountKey: process.env['DocumentDbAccountKey'],
    database: 'AzureMobile',
    connectionPolicy: undefined,
    consistencyLevel: 'Session',
    pricingTier: 'S1',
    table: 'todoitem'
};

// Store any references we receive here as a cache
var refs = {
    initialized: false
};

/**
 * Routes the request to the table controller to the correct method.
 *
 * @param {Function.Context} context - the table controller context
 * @param {Express.Request} req - the actual request
 */
function tableRouter(context, req) {
    var res = context.res;
    var id = context.bindings.id;

    initialize(context).then(() => {
        switch (req.method) {
            case 'GET':
                if (id) {
                    getOneItem(req, res, id);
                } else {
                    getAllItems(req, res);
                }
                break;

            case 'POST':
                insertItem(req, res);
                break;

            case 'PUT':
                replaceItem(req, res, id);
                break;

            case 'DELETE':
                deleteItem(req, res, id);
                break;

            default:
                res.status(405).json({ error: "Operation not supported", message: `Method ${req.method} not supported` });
        }
    }).catch((error) => {
        context.log(`[tableRouter] Error: ${error.message}`);
        res.status(500).json({ error: "Internal Server Error", message: error.message });
    });
}

/**
 * Initialize the DocumentDb Driver
 * @param {Function.Context} context - the table controller context
 * @param {function} context.log - used for logging
 * @returns {Promise}
 */
function initialize(context) {
    if (refs.initialized) {
        context.log('[initialize] Already initialized');
        return Promise.resolve(refs);
    }

    context.log(`[initialize] Creating DocumentDb client ${settings.host} # ${settings.accountKey}`);
    refs.client = new DocumentDb.DocumentClient(
        settings.host,
        { masterKey: settings.accountKey },
        settings.connectionPolicy,
        settings.consistencyLevel
    );

    context.log(`[initialize] EnsureDatabaseExists ${settings.database}`);
    return driver.ensureDatabaseExists(refs.client, settings.database)
        .then((dbRef) => {
            context.log(`[initialize] Initialized Database ${settings.database}`);
            refs.database = dbRef;
            return driver.listCollections(refs.client, refs.database);
        })
        .then((collections) => {
            context.log(`[initialize] Found ${collections.length} collections`);
            const collection = collections.find(c => { return (c.id === settings.table); });
            context.log(`[initialize] Collection = ${JSON.stringify(collection)}`);
            if (typeof collection !== 'undefined') return collection;
            context.log(`[initialize] Creating collection ${settings.table}`);
            return driver.createCollection(refs.client, settings.pricingTier, refs.database, settings.table);
        })
        .then((collectionRef) => {
            context.log(`[initialize] Found collection`);
            refs.table = collectionRef;
            refs.initialized = true;
        });

}

Let's take this in steps. Firstly, I set up the settings. The important things here are the DocumentDbHost and the DocumentDbAccountKey. If you have created a DocumentDb within the Azure Portal, click on the Keys menu item. The DocumentDbHost is the URI field and the DocumentDbAccountKey is the PRIMARY KEY field. If you are running the Azure Function locally, then you will need to set these as environment variables before starting the func host. If you are running the Azure Function within Azure, you need to make these App Settings. An example of setting these locally in PowerShell:

$env:DocumentDbHost = "https://mydocdb.documents.azure.com:443/"
$env:DocumentDbAccountKey = "fuCZuSomeLongStringaLNKjIiMSEyaojsP05ywmevI7K2yCY9dYLRuCQPd3dMnvg=="
func run test-func --debug

When you use Postman (for example, a GET http://localhost:7071/tables/todoitem), you will see the initialize() method get called. This method returns a Promise that, when resolved, allows the request to continue. In the initialize() method, I short-circuit the initialization if it has already been done. If it has not, I fill in the refs object. This object is used by the individual CRUD operations, so it needs to be populated. The client, database, and collection that we need are found or created. At the end, we resolve the promise and set the initialized flag to true (so future calls are short-circuited).

There is a race condition here. If two requests come in to a “cold” function, they will both run the initialization together, and the “create database” and “create collection” calls may be duplicated, causing an exception in one of the requests. I’m sure I could fix this, but it’s a relatively rare case: once the database and collection are created, the possibility of the race goes away.
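
If I did want to close that gap, one approach would be to cache the initialization promise itself rather than a boolean flag, so concurrent cold requests share a single in-flight initialization. A minimal sketch (not part of the current code; initializeDriver stands in for the promise chain shown above):

var initializePromise = null;

function initialize(context) {
    // Every concurrent caller receives the same in-flight promise, so the
    // database and collection are only ever created once.
    if (initializePromise === null) {
        initializePromise = initializeDriver(context);
    }
    return initializePromise;
}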

If you run this (either locally or within Azure Functions), you will see the following output in the log window:

[Screenshot: initialization messages in the log window]

If you’ve done something wrong, you will see the exception and you can debug it using the normal methods. Want a primer? I’ve written a blog post about it.

In the next post, I’ll cover inserting, deleting and updating records. Until then, check out the code on my GitHub repository.

Writing HTTP CRUD in Azure Functions

Over the last two posts, I’ve introduced writing Azure Functions locally and deploying them to the cloud. It’s time to do something useful with them. In this post, I’m going to introduce how to write a basic HTTP router. If you follow my blog and other work, you’ll see where this is going pretty quickly. If you are only interested in Azure Functions, you’ll have to wait a bit to see how this evolves.

Create a new Azure Function

I started by installing the latest azure-functions-cli package:

npm i -g azure-functions-cli

Then I created a new Azure Function App:

mkdir dynamic-tables
cd dynamic-tables
func new

Finally, I created a function called todoitem:

[Screenshot: creating the todoitem function with func new]

Customize the HTTP Route Prefix

By default, any HTTP trigger is bound to /api/_function_ where function is the name of your function. I want full control over where my function exists. I’m going to fix this in the host.json file:

{
    "id":"6ada7ae64e8a496c88617b7ab6682810",
    "http": {
        "routePrefix": ""
    }
}

The routePrefix is the important thing here. The value is normally “/api”, but I’ve cleared it. That means I can put my routes anywhere.

Set up the Function Bindings

In the todoitem directory are two files. The first, function.json, describes the bindings. Here is the version for my function:

{
    "disabled": false,
    "bindings": [
        {
            "name": "req",
            "type": "httpTrigger",
            "direction": "in",
            "authLevel": "function",
            "methods": [ "GET", "POST", "PATCH", "PUT", "DELETE" ],
            "route": "tables/todoitem/{id:alpha?}"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}

This function is triggered by HTTP and accepts five methods: GET, POST, PUT, PATCH and DELETE. In addition, I’ve defined a route that contains an optional string for an id. I can, for example, do a GET /tables/todoitem/foo and it will be accepted. On the outbound side, I want to respond to requests, so I’ve got a response object. The HTTP trigger for Node is modelled after ExpressJS, so the req and res objects are mostly equivalent to the ExpressJS Request and Response objects.

Write the Code

The code for this function is in todoitem/index.js:

/**
 * Routes the request to the table controller to the correct method.
 *
 * @param {Function.Context} context - the table controller context
 * @param {Express.Request} req - the actual request
 */
function tableRouter(context, req) {
    var res = context.res;
    var id = context.bindings.id;

    switch (req.method) {
        case 'GET':
            if (id) {
                getOneItem(req, res, id);
            } else {
                getAllItems(req, res);
            }
            break;

        case 'POST':
            insertItem(req, res);
            break;

        case 'PATCH':
            patchItem(req, res, id);
            break;

        case 'PUT':
            replaceItem(req, res, id);
            break;

        case 'DELETE':
            deleteItem(req, res, id);
            break;

        default:
            res.status(405).json({ error: "Operation not supported", message: `Method ${req.method} not supported` });
    }
}

function getOneItem(req, res, id) {
    res.status(200).json({ id: id, message: "getOne" });
}

function getAllItems(req, res) {
    res.status(200).json({ query: req.query, message: "getAll" });
}

function insertItem(req, res) {
    res.status(200).json({ body: req.body, message: "insert"});
}

function patchItem(req, res, id) {
    res.status(405).json({ error: "Not Supported", message: "PATCH operations are not supported" });
}

function replaceItem(req, res, id) {
    res.status(200).json({ body: req.body, id: id, message: "replace" });
}

function deleteItem(req, res, id) {
    res.status(200).json({ id: id, message: "delete" });
}

module.exports = tableRouter;

I use a tableRouter method (and that is what our function calls) to route the HTTP call to the right CRUD method. It’s up to me to put whatever CRUD code I need to execute and respond to the request in those additional methods. In this case, I’m just returning a 200 status (OK) and some JSON data. One key piece is differentiating between a GET /tables/todoitem and a GET /tables/todoitem/foo. The former is meant to return all records and the latter is meant to return a single record. If the id is set, we call the single-record GET method; if not, we call the multiple-record GET method.

What’s the difference between PATCH and PUT? In REST semantics, PATCH is used when you want to do a partial update of a record; PUT is used when you want to send the full record. This CRUD recipe uses both, but you may decide to use one or the other.
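
To make that concrete, imagine an existing record { "id": "foo", "text": "Buy milk", "complete": false }. Illustrative payloads (not part of the sample code) might look like this:

// PUT /tables/todoitem/foo - the full record travels with the request
{ "id": "foo", "text": "Buy milk", "complete": true }

// PATCH /tables/todoitem/foo - only the fields being changed travel
{ "complete": true }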

Running Locally

As with the prior blog post, you can run func run test-func --debug to start the backend and get ready for the debugger. You can then use Postman to send requests to your backend. (Note: Don’t use func run todoitem --debug – this will cause a crash at the moment!). You’ll get something akin to the following:

[Screenshot: the response in Postman]

That’s it for today. I’ll be progressing on this project for a while, so expect more information as I go along!

Deploying Azure Functions Automatically

In my last post, I went over how to edit, run and debug Azure Functions on your local machine. Eventually, however, you want to place these functions in the cloud. They are, after all, designed to do things in the cloud on dynamic compute. There are two levels of automation you can use:

  1. Continuous Deployment
  2. Automated Resource Creation

Most of you will be interested in continuous deployment. That is, you create your Azure Functions app once and then you just push updates to it via a source code control system. However, a true DevOps mindset requires “configuration as code”, so we’ll go over how to download an Azure Resource Manager (ARM) template for the function app and resource group.

Creating a Function App in the Portal

Creating an Azure Functions app in the portal is a straightforward process.

  1. Create a resource group.
  2. Create a function app in the resource group.

Log into the Azure portal. Select Resource Groups in the left-hand menu (which may be under the “More Services” link in your case), then click on the + Add link in the top bar to create a new resource group. Give it a unique name (it only has to be unique within your subscription), select a nominal location for the resource group, then click on Create:

[Screenshot: creating a resource group]

Once the resource group is created, click into it and then select + Add inside the resource group. Enter “Function App” in the search box, select it, and click on Create:

[Screenshot: creating a Function App]

Fill in the name and select the region that you want to place the Azure Functions in. Ensure the “Consumption Plan” is selected. This is the dynamic compute plan, so you only pay for resources when your functions are actually being executed. The service will create an associated storage account for storing your functions in the same resource group.

Continuous Deployment

In my last blog post, I created a git repository to hold my Azure Function code. I can now link this git repository to the Function App in the cloud as follows:

  • Open the Function App.
  • Click the Function app settings link in the bottom right corner.
  • Click the Go to App Service Settings button.
  • Click the Deployment Credentials menu option.
  • Fill in the form for the deployment username and password (twice).
  • Click Save at the top of the blade.

You need to know the username and password of your git repository in the cloud that is attached to your Function App so that you can push to it. You’ve just set those credentials.

  • Click the Deployment options menu option.
  • Click Choose Source.
  • Click Local Git Repository.
  • Click OK.

I could have just as easily linked my function app to GitHub, Visual Studio Team Services or BitBucket. My git repository is local to my machine, so a local git repository is suitable for this purpose.

  • Click the Properties menu option.
  • Copy the GIT URL field.

I now need to add the Azure hosted git repository as a remote on my local git repository. To do this, open a PowerShell console, change directory to the function app and type the following:

git remote add azure <the-git-url>
git push azure master

This will push the contents of the git repository up to the cloud, which will then do a deployment of the functions for you. You will be prompted for your username and password that you set when setting up the deployment credentials earlier.

[Screenshot: git push output during deployment]

Once the deployment is done, you can switch back to the Function App in the portal and you will see that your function is deployed. An important factor is that you are now editing the files associated with the function on your local machine. You can no longer edit the files in the cloud, as any changes would be overwritten by the next deployment from your local machine. To remind you of this, Azure Functions displays a helpful warning:

[Screenshot: the read-only warning in the Functions portal]

If you edit your files on the local machine, remember to push them to the Azure remote to deploy them.
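
The day-to-day loop ends up looking something like this:

git add .
git commit -m "Update the todoitem function"
git push azure master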

Saving the ARM Template

You are likely to only need the Azure Function process shown above. However, in case you like checking in the configuration as code, here is how you do it. Firstly, go to your resource group:

[Screenshot: the resource group blade]

Note the menu item entitled Automation script – that’s the one you want. The portal will generate an Azure Resource Manager (ARM) template plus a PowerShell script or CLI script to run it. Click on Download to download all the files – you will get a ZIP file.

Before extracting the ZIP file, you need to unblock it. In the File Explorer, right-click on the ZIP file and select Properties.

[Screenshot: the ZIP file Properties dialog]

Check the Unblock box and then click on OK. You can now extract the ZIP file with your favorite tool. I just right-click and select Extract All….

Creating a new Azure Function with the ARM Template

You can now create a new Azure Function App with the same template as the original by running .\deploy.ps1 and filling in the fields. Yep – it’s that simple!

Creating and Debugging Azure Functions Locally

I’ve written about Azure Functions before as part of my Azure Mobile Apps series. Azure Functions is a great feature of the Azure platform that allows you to run custom code in the cloud in a “serverless” manner. In this context, “serverless” doesn’t mean “without a server”. Rather, it means that the server is abstracted away from you. In my prior blog post, I walked through creating an Azure Function using the web UI, which is a problem when you want to check your Azure Functions into source control and deploy them as part of your application.

UPDATE: the Azure App Service team has released a blog post on the new CLI tools.

This is the first in a series of blog posts. I am going to walk through a process by which you can write and debug Azure Functions on your Windows 10 PC, then check the code into your favorite source control system and deploy it in a controlled manner. In short – real world.

Getting Ready

Before you start, let’s get the big elephant out of the way. The actual runtime is Windows-only. Sorry, Mac users. The runtime relies on the 4.x .NET Framework, and you don’t have that. Boot into Windows 10. You can still create functions locally, but you will have to publish them to the cloud to run them. There is no local runtime on a Mac.

To get your workstation prepped, you will need the following:

  • Node
  • Visual Studio Code
  • Azure Functions Runtime

Node is relatively easy. Download the Node package from nodejs.org and install it as you would any other package. You should be able to run the node and npm programs from your command line before you continue. Visual Studio Code is similarly easy to download and install. You can download additional extensions if you like; if you write functions in C#, I would definitely download the C# extension.

The final bit is the Azure Functions Runtime. This is a set of tools produced by the Azure Functions team to create and run Functions locally; it is based on Yeoman. To install:

npm install -g yo generator-azurefunctions azure-functions-cli

WARNING There is a third-party module called azure-functions which is not the same thing at all. Make sure you install the right thing!

After installing, the func command should be available:

[Screenshot: the func command output]

Once you have these three pieces, you are ready to start working on Azure Functions locally.

Creating an Azure Functions Project

Creating an Azure Functions project uses the func command:

mkdir my-func-application
cd my-func-application
func init

Note that func init creates a git repository as well – one less thing to do! Our next step is to create a new function. The Azure Functions CLI uses Yeoman underneath, which we can call directly using yo azurefunctions:

[Screenshot: the yo azurefunctions generator]

You can create as many functions as you want in a single function app. In the example above, I created a simple HTTP triggered function written in JavaScript. This can be used as a custom API in a mobile app, for example. The code for my trigger is stored in test-func\index.js:

module.exports = function(context, req) {
    context.log('Node.js HTTP trigger function processed a request. RequestUri=%s', req.originalUrl);

    if (req.query.name || (req.body && req.body.name)) {
        context.res = {
            // status: 200, /* Defaults to 200 */
            body: "Hello " + (req.query.name || req.body.name)
        };
    }
    else {
        context.res = {
            status: 400,
            body: "Please pass a name on the query string or in the request body"
        };
    }
    context.done();
};

and the binding information is in test-func\function.json:

{
    "disabled": false,
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}

Running the function

To run the Azure Functions Runtime for your function app, use func run test-func.

The runtime is kicked off first. It monitors the function app for changes, so any changes you make to the code are reflected as soon as you save the file. If you are running something that can be triggered manually (like a cron job), it will run immediately. For my HTTP trigger, I need to hit the HTTP endpoint – in this case, http://localhost:7071/api/test-func.
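
Given the code above, a request that carries a name should come back with a greeting, and one without should produce a 400. For example:

GET http://localhost:7071/api/test-func?name=Adrian
--> 200 OK, body: Hello Adrian

GET http://localhost:7071/api/test-func
--> 400 Bad Request, body: Please pass a name on the query string or in the request body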

Note that the runtime uses the version of Node that you installed and runs on your local machine, yet it can still be triggered by whatever you set up. If you set up a blob trigger from a storage account, then that will trigger. You do have to set up the environment properly: App Service (and Functions) app settings appear as environment variables to the runtime, so when you run locally, you will need to set each app setting as an environment variable of the same name. Do this before you use func run for the first time.
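
For example, a function that reads a hypothetical MyStorageConnection app setting works unchanged in both places – locally it picks up the environment variable, in Azure it picks up the App Setting:

module.exports = function (context, req) {
    // App settings in Azure and environment variables locally both
    // surface through process.env.
    var connectionString = process.env['MyStorageConnection'];
    context.log(`Using connection ${connectionString}`);
    context.done();
};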

Debugging the function

Running the function is great, but I want to debug the function – set a breakpoint, inspect the internal state of the function, etc. This can be done easily in Visual Studio Code as the IDE has an integrated Node debugger.

  • Run func run test-func --debug
  • Go into the debugger within Visual Studio Code, and set a breakpoint
  • Switch to the Debugger tab and hit Start (or F5)

    [Screenshot: setting a breakpoint in Visual Studio Code]

  • Trigger the function. Since I have a HTTP triggered function, I’m using Postman for this:

    [Screenshot: triggering the function from Postman]

  • Note that the function is called and I can inspect the internals and step through the code:

    [Screenshot: execution paused at the breakpoint]

You can now step through the code, inspect local variables and generally debug your script.

Next up

In the next article, I’ll discuss programmatically creating an Azure Function app and deploying our function through some of the deployment mechanisms we have available to us in Azure App Service.

30 Days of Zumo.v2 (Azure Mobile Apps): Day 29: Post-Processing Files with Azure Functions

There are times when you don’t want things to happen as a result of an API call – maybe it’s a cron job that cleans out deleted records; or maybe it’s a post-processing job that looks at images you have uploaded and runs some process on them (such as facial recognition, cognitive services or just plain image handling). Whatever the case, this isn’t handled within Azure Mobile Apps easily. It’s time for a new service.

That service is Azure Functions: a dynamic compute facility that runs “code” in response to a trigger – an event. I introduced a whole bunch of terms there, so let’s look at each one.

Dynamic Compute is awesome and it’s the central theme of functions. You may have heard of the “serverless” movement. It’s the new word for an old idea – you don’t have to manage the resources yourself – the platform does it for you. If you had used the Parse server or another MBaaS in the past – same concept. You never had to worry about the VM running the code in those services. Dynamic Compute grabs the resources necessary to run the job and then releases the resources after it is run. You only pay for the resources that you consume. It’s like the ultimate endpoint of the promise of cloud computing.

Events or Triggers can be anything. Yes, they could be an HTTP endpoint (although I am not recommending that right now – I’ll explain at the end). They could also be a timer, an object placed into a storage queue, or a file uploaded to blob storage.

Functions are not Web Apps. You can’t use the same project you used for your mobile backend to define your functions. Right now, the best way to define your functions is through the Azure Portal.

Creating a Function for cleaning up the TodoItem table

Let’s create a simple function based on a timer. Our mobile TodoItem table has soft-delete enabled. That means that instead of deleting the records directly, the records are marked as deleted by setting the deleted column to 1. I need to clean up those, so let’s define a simple function that runs every Sunday morning at 3am and deletes all the records marked for deletion. It’s just a SQL statement change if you want to be more prescriptive about this and delete all the records marked for deletion that are older than 7 days, for example.
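
(With the ASP.NET server SDK’s schema, which stamps each record with an updatedAt column, that variant might look like DELETE FROM TodoItems WHERE deleted = 'True' AND updatedAt < DATEADD(day, -7, GETUTCDATE()) – treat the column names as assumptions and check them against your own database.)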

Start by going to https://functions.azure.com, clicking on the big friendly Get Started button and logging in with your Azure credentials. Click on the + NEW button then select Web + Mobile, then Function App. Give the function app a name, select a resource group and app service plan and click on create.

[Screenshot: creating the Function App]

Note that creating a function app creates a storage account as well. This is required: your functions need a common storage area, since you don’t actually know where they will run. It’s not serverless per se (a silly term, IMHO – there is always a server), but that is what the term means: you don’t define a server for this app to run on, yet it still needs a little bit of shared storage to hold your code, logs, etc.

Once the function app has been deployed, the function app blade will open. It’s very different from the other blades with a unique management experience.

[Screenshot: the Function App management blade]

In this case, I want to create something akin to a cron job – a timer job. I’m going to write my timer job in C#, but you could write something similar in Node.js, for example. You can also click on + New function, which gives you access to the full range of scenarios; the Timer function is in the Core set of scenarios near the end. Give the function a name. The cron schedule I picked is 0 0 3 ? * SUN *, which is compatible with Quartz – a great .NET library for doing cron-type things.

[Screenshot: configuring the timer function]

Once I click on Create, I am in a code editor. Here is the code for a basic function that deletes all the records marked as deleted:

#r "System.Configuration"
#r "System.Data"

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    var connStr = ConfigurationManager.ConnectionStrings["sqldb"].ConnectionString;
    
    using (var conn = new SqlConnection(connStr))
    {
        conn.Open();
        var sqlCmd = "DELETE FROM TodoItems WHERE deleted = 'True'";
        using (var cmd = new SqlCommand(sqlCmd, conn))
        {
            var rows = await cmd.ExecuteNonQueryAsync();
            log.Info($"{rows} rows deleted.");
        }
    }
}

A couple of notes. Firstly, note the #r lines at the top – these bring in references, just as if you were adding a reference to a project in Visual Studio. Secondly, the code is not in a class. You could wrap it in a namespace and a static class and add a Main method to make it a console app – in fact, that is a great way of testing locally. However, you don’t really need to. Once you click on Save, the function is compiled and will run on the schedule you set.

If you are using the ASP.NET version of the Mobile Apps Server SDK, your database table will be plural – i.e. if you have a model called TodoItem, the database table will be TodoItems. If you are using the Node.js version of the Mobile Apps Server SDK, the database table has the same name as the table definition – make sure you have the right SQL statement for your backend!

Before I can run this, I need a SQL Connection String. I’ve named it (imaginatively) “sqldb”. To set that connection string:

  • Click on function app settings at the top of the page
  • Click on Go to App Service Settings
  • In the new blade, click on Application Settings (you will probably have to scroll down)
  • Scroll down to Connection Strings, enter sqldb (or whatever you used) in the Name field. Enter your SQL Connection String in the Value box
  • Click on Save

I tend to use cut-and-paste from my Mobile App for this step so I get the SQL Connection String right.

Back in your Function app (and more importantly, in your Function)… At the bottom of the page, there is a Run button, just waiting to be clicked. It’s at this point I realized that my cron statement was wrong. Note that the log tells you exactly what is wrong pretty much straight away. When you are creating a timer Function, your cron string needs to be correct. If you are wondering what a correct cron string looks like, check out Wikipedia. My cron string should be 0 0 3 * * 0 (six fields – second, minute, hour, day, month, day-of-week). Don’t try to use any of the extended formats – just the basics. You can change this in the Integrate tab.
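
Underneath, the Integrate tab is just editing the function’s function.json; the schedule lives in the timer binding. A sketch of the shape (not copied from the portal):

{
    "bindings": [
        {
            "name": "myTimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 0 3 * * 0"
        }
    ]
}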

Now you can click on Run. You will see a 202 Accepted in the panel to the right of the Run button (the Output panel) and the logs will capture any exceptions thrown by the function.

A more complicated example: Post-processing Images

Now that I’ve got the basics covered, let’s do something a little more complicated – image processing. In this example, I’m going to run a function whenever something is uploaded to a specific container in my blob storage. There is a mechanism whereby you can insert a record into an Azure Mobile Apps table by calling the table controller, but I would need to set up an “app key”. App keys are insecure and you definitely don’t want them out in the wild. For the purposes of this project, the app key wouldn’t actually leave Azure, but it’s still not secure: you are setting up a back door, and back doors have a tendency to leak out or get misused. As a result, I’m going to use the same mechanism as above to insert data – a SQL command directly against the SQL database.

Before continuing, you should set up a new Mobile Apps Table Controller called Image with a model that contains Filename, Height and Width fields. If you’ve followed along, I covered this way back in Day 17. If you can’t figure it out, check out the Backend for the file-upload project on my GitHub repository. I’ve added the code there.

I’m going to add a new Function to my function app. This time I’m going to select a BlobTrigger for C#. After naming my function, I need to select my storage account and the path to the storage. The path can be a specific file within a container, or it can be the container name itself. I’m going to use the container name. So, here is my trigger definition:

[Screenshot: the blob trigger definition]

Switching over to the code:

using System;

public static void Run(string myBlob, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed: {myBlob}");
}

The myBlob value will get filled with the contents of the file. That’s a problem for me because my files are likely images. There are alternatives to a string – you can use a Stream or a CloudBlockBlob, for example. The Stream can be used if you don’t care about the additional metadata and only care about the contents of the file. I’m going to use a CloudBlockBlob.

Looking for the variables you can use as parameters for the Run method? Check out the cheat sheet (link downloads a PDF)

Note that this logs the contents of your blob – which is likely to be binary if you are uploading images – straight into the log. Time to write some code for the image processing:

#r "System.Configuration"
#r "System.Data"
#r "System.Drawing"
#r "Microsoft.WindowsAzure.Storage"

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Drawing;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task Run(CloudBlockBlob blob, TraceWriter log)
{
    log.Info($"Processing file: {blob.Name}");

    // Create an image object from a stream
    Bitmap image = new Bitmap(await blob.OpenReadAsync());
    
    // Get the information for the table
    int width = image.Width;
    int height = image.Height;
    string id = Guid.NewGuid().ToString();
    
    // Update the Images database.  Note that we have a bug - if I am using
    // the same filename, then this code adds a new record - it doesn't update
    // the old record.
    var connStr = ConfigurationManager.ConnectionStrings["sqldb"].ConnectionString;
    log.Info($"Using Connection String {connStr}");
    
    using (var conn = new SqlConnection(connStr))
    {
        conn.Open();
        var sqlCmd = "INSERT INTO Images (Id, Filename, Height, Width, Deleted) VALUES(@id, @name, @height, @width, @deleted)";
        var cmd = new SqlCommand(sqlCmd, conn);
        
        cmd.Parameters.AddWithValue("@id", id);
        cmd.Parameters.AddWithValue("@name", blob.Name);
        cmd.Parameters.AddWithValue("@height", height);
        cmd.Parameters.AddWithValue("@width", width);
        cmd.Parameters.AddWithValue("@deleted", false);
        
        log.Info($"Executing INSERT:('{id}','{blob.Name}',{height},{width})");
        await cmd.ExecuteNonQueryAsync();
    }
}

This is fairly basic code. I load the stream into a Bitmap object, grab the metadata and insert into the database table. I do need to ensure that the database table exists (which I did in this example by adding an Image table to my application).

Wrap Up

Functions is an awesome new feature for mobile developers. It allows you to do some relatively complex processing using ad-hoc resources that are only used (and paid for) when required. The possibilities are really astounding. For instance, Cognitive Services – being able to determine what a photo is of and automatically tagging the image – or Machine Learning are possible. These were previously only the province of those who wanted to get deep into backend processing. In addition to the triggers, you can also add input and output bindings. These allow you, for example, to automatically push the blob Uri onto a queue, deliver a push notification, or create a Mobile Apps record – without the additional code that goes along with it.
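
As an illustration, a queue output binding added to the function’s bindings might look something like this (a sketch – the queue name is hypothetical):

{
    "name": "outputQueueItem",
    "type": "queue",
    "direction": "out",
    "queueName": "processed-images",
    "connection": "AzureWebJobsStorage"
}

Assigning a value to the binding (context.bindings.outputQueueItem in Node, or an out parameter in C#) then enqueues the message without any storage SDK code.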

I mentioned at the beginning why I would not recommend using Functions as an HTTP endpoint (despite all the resourcing being there). A lot of endpoints incur a startup cost. For example, connecting to a quiet App Service spins up the service, which connects to your database, ensures all the tables are created, and so on. Since dynamic compute spins up resources for every request, you incur this cost on every request. This limitation is a small one – there are plenty of reasons to use Functions without also making it into a web service.

Functions is still in preview, but I can’t wait to see what happens with it. The team is active everywhere and it’s free right now, so give it a spin.

Want the code for the changes I made for images? They are at my GitHub Repository.