30 Days of Zumo.v2 (Azure Mobile Apps): Day 27: File Handling (v1)

One of the oft requested features is file upload. It comes in two forms. Firstly, the simple version: “I want to upload a file”. Secondly, the database version: “I want to upload a file and link it to a record within an offline table”. Both of these have valid use cases, so I should cover both of them. For the version that links to an offline table, it would be nice for the file upload to be offline capable as well.

On to actual file uploads to Azure. The key here is: Don’t use Azure Mobile Apps to upload files. I know what you are thinking…. if I’m not using Azure Mobile Apps for file upload, how do I do it? Well, the answer lies in how Azure Storage works and what you need to do to upload a file. I’ll still be using Azure Mobile Apps as you will see.

Azure Storage is an awesome flexible (and cheap) cloud storage solution. Like other cloud storage solutions, it offers multiple modes – Blob, File, Table and Queue Storage. I’m only going to be looking at Blob Storage today. You can read about the various types in the introduction from Azure. The key fact about Azure Storage is that it already has a lot of the features you need – things like restartable uploads, for example – in the REST interface. It also has a bunch of SDKs available for just about all the major platforms – including UWP, iOS, Android and Xamarin.

So, what’s the problem? Can’t I just go ahead and use the SDK? Well, sort of. There is a process here. For each request, you need to:

  1. Ensure that a storage account exists (just as you would for your SQL Azure instance)
  2. Find or Create a Storage Container for each user (or some other segmentation that makes sense)
  3. Create a Storage Access Key that allows an authenticated user to write to the storage container
  4. Then use the Azure Storage SDK to upload the file

Let’s take a look at each one:

Create a Storage Account

You only need to create a storage account once per backend. Once it’s created, it’s linked for the duration. Creating a storage account is pretty much identical to any other service within Azure:

  1. Go to your Resource Group. You will want to store the Storage Account with your other items so that when you delete the Resource Group, the storage gets deleted as well.
  2. Click on the + Add button at the top of the Resource Group blade to add a resource:


  3. Enter “Storage” in the search box and hit Enter. The right Storage Account template should be first in the list:

  4. Click on the right “Storage Account” template, then click on the Create button.
  5. Fill in the blade details. Enter a name and place the storage account in the same region as your mobile backend. Also select the right replication scheme and storage account type. Once done, click on Create.


Let’s take a moment to talk about replication scheme and storage account type. There are two types of storage account – General Purpose allows access to all four types of storage – blob, file, table and queue. Blob Storage only allows access to, well, blob storage. There is no downside to selecting General Purpose. You also have four different replication schemes. Azure Storage maintains three copies of your data. You get to choose where the copies are stored:

  • Locally Redundant Storage – the copies are stored in the same data center
  • Zone Redundant Storage – the copies are stored in different data centers in the same region
  • Geo-Redundant Storage – six copies (instead of three) distributed across multiple regions
  • Read-Access Geo-Redundant – a copy is maintained in a different region but is read-only

If you are just testing, go with LRS – it’s the cheapest. If you are going into production, make sure your replication strategy matches your needs. Also note that there may be one-time charges for converting from LRS to one of the others. There are also special notes on Zone Redundant Storage, so read the docs on this carefully.

You’ve also got an option between Standard and Premium performance. You can get up to 50Gbps throughput with premium storage, at a premium cost. Premium storage is really designed for VM disks, so you are unlikely to need it for this use case.

Now that I have a storage account, I can move on to the next step.

Link Storage Account to Mobile Backend

The next step is to link the storage account to the mobile backend App Service. To do this:

  1. Select your App Service
  2. Click on All Settings, then Data Connections
  3. Click on + Add
  4. In the drop-down (that normally shows SQL Database), select Storage
  5. Click on Connection String – it should be set correctly (but correct it if you need to), then click on OK
  6. Click on OK to add the connection string.

The storage account connection string will appear as the environment variable CUSTOMCONNSTR_MS_AzureStorageAccountConnectionString – which is a mouthful. You can check it in the environment section of Kudu.

Create a Shared Access Signature

The shared access signature is a URI that encompasses everything needed to access storage in a delegated manner. You can decide to limit the rights and give a time limit on the shared access signature. One must be created for each file to be uploaded, so this makes for a great custom API example. In this code, I’m going to create a storage container for the user (if one does not already exist), then I’m going to securely create a write key for the storage container. How securely? Well, I’m going to ensure that the user is authenticated. I’m also going to limit the amount of time that the storage access key is valid. You can tune the amount of time provided, but make it small – 30 minutes is plenty to upload a file. Here is my Custom API:

using System.Web.Http;
using Microsoft.Azure.Mobile.Server.Config;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using System.Threading.Tasks;
using System.Security.Claims;
using System.Diagnostics;

namespace Backend.Controllers
{
    [MobileAppController]
    public class GetStorageTokenController : ApiController
    {
        public GetStorageTokenController()
        {
            ConnectionString = Environment.GetEnvironmentVariable("CUSTOMCONNSTR_MS_AzureStorageAccountConnectionString", EnvironmentVariableTarget.Process);
            Debug.WriteLine($"[GetStorageTokenController$init] Connection String = {ConnectionString}");
            StorageAccount = CloudStorageAccount.Parse(ConnectionString);
            BlobClient = StorageAccount.CreateCloudBlobClient();
        }

        public string ConnectionString { get; set; }

        public CloudStorageAccount StorageAccount { get; set; }

        public CloudBlobClient BlobClient { get; set; }

        // GET api/GetStorageToken
        [Authorize]
        public async Task<StorageTokenViewModel> Get()
        {
            // Get the container name for the user
            Debug.WriteLine($"[GetStorageTokenController] Get()");
            var claimsPrincipal = User as ClaimsPrincipal;
            var sid = claimsPrincipal.FindFirst(ClaimTypes.NameIdentifier).Value.Substring(4); // strip off the sid: from the front
            string containerName = $"container-{sid}";
            Debug.WriteLine($"[GetStorageTokenController] Container Name = {containerName}");

            // Create the container if it does not yet exist
            CloudBlobContainer container = BlobClient.GetContainerReference(containerName);
            Debug.WriteLine($"[GetStorageTokenController] Got Container Reference");
            try
            {
                // This will throw a StorageException, which results in a 500 Internal Server Error on the outside
                await container.CreateIfNotExistsAsync();
                Debug.WriteLine($"[GetStorageTokenController] Container is created");
            }
            catch (StorageException ex)
            {
                Debug.WriteLine($"[GetStorageTokenController] Cannot create container: {ex.Message}");
            }

            // Create a blob URI - based on a GUID
            var blobName = Guid.NewGuid().ToString("N");
            Debug.WriteLine($"[GetStorageTokenController] Blob Name = {blobName}");
            var blob = container.GetBlockBlobReference(blobName);
            Debug.WriteLine($"[GetStorageTokenController] Got Blob Reference");

            // Create a policy for the blob access
            var blobPolicy = new SharedAccessBlobPolicy
            {
                // Set start time to five minutes before now to avoid clock skew.
                SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
                // Allow Access for the next 60 minutes (according to Azure)
                SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60),
                // Allow read, write and create permissions
                Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
            };
            Debug.WriteLine($"[GetStorageTokenController] Got Blob SAS Policy");

            return new StorageTokenViewModel
            {
                Name = blobName,
                Uri = blob.Uri,
                SasToken = blob.GetSharedAccessSignature(blobPolicy)
            };
        }
    }

    public class StorageTokenViewModel
    {
        public string Name { get; set; }
        public Uri Uri { get; set; }
        public string SasToken { get; set; }
    }
}

Make sure you add the WindowsAzure.Storage NuGet package to your project – this contains the Azure Storage SDK. You can test this with Postman like this:


Thinking of using this information to store a file yourself? Remember that the SAS token is time-limited – it’s only valid for 60 minutes, so it won’t work by the time you read this.
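As an aside, a SAS token is just a query string, and its `se` parameter carries the signed expiry time. A client could use that to check for a stale token before attempting an upload. Here is a minimal sketch under that assumption – the `IsSasExpired` helper is my own invention, not part of any SDK:

```csharp
using System;
using System.Text.RegularExpressions;

static class SasTokens
{
    // Returns true when the token's "se" (signed expiry) parameter is in
    // the past, or when no expiry is present (treat as unusable).
    public static bool IsSasExpired(string sasToken, DateTimeOffset now)
    {
        var match = Regex.Match(sasToken, @"(?:^|[?&])se=([^&]+)");
        if (!match.Success) return true;
        // The expiry is URL-encoded ISO 8601, e.g. 2016-06-03T12%3A00%3A00Z
        var expiry = DateTimeOffset.Parse(Uri.UnescapeDataString(match.Groups[1].Value));
        return expiry <= now;
    }
}
```

In practice you would just request a fresh token from the custom API, but a check like this avoids a guaranteed-to-fail round trip to storage.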

Uploading a File

Now that I’ve got a custom API for generating a SAS token, all I need to do is use the client.invokeApi method to generate a SAS token for my upload, then use the Azure Storage SDK to actually upload the file. To test this, I’ve got a new Xamarin Forms application – it does pretty much the same as the UWP and Cordova applications that I’ve been using. Xamarin Forms is used for cross-platform development, so the resulting applications are native applications and I can compile for iOS, Android and Universal Windows simultaneously. I’ve added a button to the ItemList.xaml UI for handling a file upload:

          <StackLayout Orientation="Horizontal" HorizontalOptions="FillAndExpand" Padding="10">
            <Button BackgroundColor="Teal" Command="{Binding AddNewItemCommand}" Text="Add New Item" TextColor="White" />
            <Button BackgroundColor="Purple" Command="{Binding UploadFileCommand}" Text="Upload File" TextColor="White" />
          </StackLayout>

The handler for this is located in the ItemListViewModel.cs file:

        Command c_uploadFile;
        public Command UploadFileCommand
        {
            get { return c_uploadFile ?? (c_uploadFile = new Command(async () => await ExecuteUploadFileCommand())); }
        }

        async Task ExecuteUploadFileCommand()
        {
            if (IsBusy) return;
            IsBusy = true;

            try
            {
                await UserDialogs.Instance.ActionSheetAsync("Upload File", "Cancel", "Upload File");
            }
            catch (Exception ex)
            {
                Debug.WriteLine($"[ExecuteUploadFileCommand] Error: {ex.Message}");
            }
            finally
            {
                IsBusy = false;
            }
        }

This code doesn’t actually do anything other than pop up a dialog box with the appropriate buttons for me. I’m not even that interested in the dialog box. This is just serving as a temporary holding area so I can move through the process of uploading a file. This is my base code before I’ve added any of the upload file logic. My first step is to get the SAS token. I’ve copied the StorageTokenViewModel from my Backend project to my shared Xamarin Forms project (it’s in the Models directory). Getting the storage token is now easy. I’ve expanded the ICloudService to include the following definition:

        Task<StorageTokenViewModel> GetStorageToken();

I’ve also added appropriate implementations in the cloud service implementations. The important one is in AzureCloudService.cs:

        public Task<StorageTokenViewModel> GetStorageToken()
        {
            return MobileService.InvokeApiAsync<StorageTokenViewModel>("GetStorageToken", HttpMethod.Get, null);
        }

I can now do the first part in my try block of my ItemListViewModel.cs:

                var storageToken = await cloudService.GetStorageToken();
                await UserDialogs.Instance.AlertAsync($"URI = {storageToken.Uri}{storageToken.SasToken}", "Got SAS Token", "OK");

When you run this code, wait for the item list to refresh and then click on the Upload File button – you will get something akin to this:


I’m going to need some assistance with a mobile file picker. I’m going to use the Xamarin Media Plugin from James Montemagno. James has produced a number of high quality plugins for Xamarin, so check out his collection. The easiest way to install this plugin is to get it from NuGet. You can add it directly within Visual Studio. Don’t forget to add the NuGet package to all the Xamarin projects. Once this is done, adding a photo picker is simple:

                // Xamarin Media Plugin to get a picture
                await CrossMedia.Current.Initialize();
                var file = await CrossMedia.Current.PickPhotoAsync();

I’m going to assume that requesting the storage token will always work if I am online, so I added the photo picker before the call to GetStorageToken(). My final piece of code uploads the file I’ve picked to the SAS location I’ve received from the custom API.

You will want to add the WindowsAzure.Storage SDK v7.0.1-preview edition. This is a specific version that is compatible with Xamarin Forms. You can find it in the NuGet Package Manager in Visual Studio by checking the “Include Prerelease” checkbox. As always, add the NuGet package to all the Xamarin Forms projects.

Here is the completed code with the additional upload code highlighted:

            Uri storageUri = null;
            try
            {
                // Xamarin Media Plugin to get a picture
                await CrossMedia.Current.Initialize();
                // note: file will be null if the user cancels the picker
                var file = await CrossMedia.Current.PickPhotoAsync();

                // Get the storage token from the custom API
                var storageToken = await cloudService.GetStorageToken();
                storageUri = new Uri($"{storageToken.Uri}{storageToken.SasToken}");

                // Store the MediaFile to the storage token
                var blob = new CloudBlockBlob(storageUri);
                var stream = file.GetStream();
                await blob.UploadFromStreamAsync(stream);
                UserDialogs.Instance.SuccessToast("File Upload Complete", null, 1500);
            }
            catch (Exception ex)
            {
                UserDialogs.Instance.Alert(storageUri?.ToString(), ex.Message, "OK");
            }

It’s important to note that you don’t upload files – you upload a stream of data. Fortunately, the MediaFile type has taken that into account and provides a handy method for converting the file into a stream. The file upload is done asynchronously and I’ve added a toast for successful upload. If the upload fails, the SAS URI is displayed so you can debug further.

Looking at your uploaded file

You can look at your files easily enough by using the Resource browser in the portal:


Ensure you are selecting the storage account that is linked to your mobile backend. The container name (in my case) is the SID of the user, so it’s not obvious who this belongs to. The alternative would be to use GetIdentitiesAsync to get a more reasonable username (potentially munging the email address). Note that you can set up the storage account to provide read-only access to the files generally. In this case, the Uri property on the StorageTokenViewModel object holds the URI that you can use to access the photo later.
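If you do go down that route, remember that container names are restrictive – 3 to 63 characters, lowercase letters, digits and single hyphens, starting with a letter or number – so any munged email address needs sanitizing first. Here is a minimal sketch of such a helper; the `MakeContainerName` name and the `user-` fallback prefix are my own inventions:

```csharp
using System;
using System.Text.RegularExpressions;

static class ContainerNames
{
    // Munge an email address (or any user identifier) into a valid
    // Azure Storage container name.
    public static string MakeContainerName(string email)
    {
        // Lowercase, then replace any run of invalid characters with a single hyphen
        var name = Regex.Replace(email.ToLowerInvariant(), "[^a-z0-9]+", "-").Trim('-');
        // Enforce the 63-character maximum
        if (name.Length > 63) name = name.Substring(0, 63).Trim('-');
        // Enforce the 3-character minimum
        return name.Length >= 3 ? name : $"user-{name}";
    }
}
```

For example, jane.doe@example.com becomes jane-doe-example-com.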


The main caveat is that I can only upload a file while I am online. I don’t have any checks for online vs. offline. That’s easily added by leveraging another Xamarin plugin from James Montemagno (the Connectivity Plugin), but it doesn’t solve the underlying problem that this solution is online only. I also have not implemented any of the improvements suggested by the Azure Storage SDK – like progress reporting, restartable uploads and so on. Including those would require a whole UX design, so I have not bothered.
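For reference, the restartable-upload story in the SDK is built on the block API: upload each chunk with PutBlockAsync, then commit the whole list with PutBlockListAsync, so a failed chunk can be retried without re-sending the rest of the file. Here is a hedged sketch against the same WindowsAzure.Storage SDK used above – the `UploadInBlocksAsync` helper, its block size and the progress hook are my own choices, not anything the SDK provides:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

static class ChunkedUploader
{
    public static async Task UploadInBlocksAsync(CloudBlockBlob blob, Stream stream, int blockSize = 256 * 1024)
    {
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];
        int read, index = 0;
        while ((read = await stream.ReadAsync(buffer, 0, blockSize)) > 0)
        {
            // Block IDs must be base64 strings of equal length
            var blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
            await blob.PutBlockAsync(blockId, new MemoryStream(buffer, 0, read), null);
            blockIds.Add(blockId);
            // A progress callback for the UI could be invoked here
        }
        // Commit the uploaded blocks as the blob's contents
        await blob.PutBlockListAsync(blockIds);
    }
}
```

This needs a live storage account (and a SAS URI) to run, which is exactly why I left it out of the walkthrough.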

I could also deal with a bunch of deficiencies in the blob name generation within the custom API. Right now, all blob names are generated as GUIDs, which isn’t the most friendly. To allow easier browsing, I should at least be looking at appending the file extension of the original file (especially since I have it at that point) and making the container specification easier. This is definitely left as an exercise for the reader.
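As a starting point for that exercise, here is a sketch of a friendlier blob name generator, assuming a hypothetical extension value passed up from the client (for example, as a query parameter on the custom API). The `MakeBlobName` helper is illustrative only; note the whitelist, so a malicious extension cannot inject path segments into the blob URI:

```csharp
using System;
using System.Text.RegularExpressions;

static class BlobNames
{
    // Generate a GUID-based blob name, appending a sanitized copy of the
    // original file's extension when one is supplied.
    public static string MakeBlobName(string extension)
    {
        var name = Guid.NewGuid().ToString("N");
        // Keep only alphanumerics - strips dots, slashes and anything else unsafe
        var ext = Regex.Replace(extension ?? "", "[^A-Za-z0-9]", "").ToLowerInvariant();
        return ext.Length > 0 ? $"{name}.{ext}" : name;
    }
}
```

With this in place, an uploaded photo shows up as something like d8b2…e41f.jpg instead of a bare GUID, which makes browsing the container much more pleasant.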

Next time, I’ll look at the offline aspect of file upload and attaching pictures to specific items. Until then, my code is in the file-upload solution on my GitHub repository.
