Working with Azure Storage and Azure Functions

Azure Function – Blob Trigger

Azure Functions is an on-demand cloud service that provides all the continually updated infrastructure and resources needed to run your applications. You focus on the pieces of code that matter most to you, and Functions handles the rest, providing serverless compute for Azure. You can use Functions to build web APIs, respond to database changes, process IoT streams, manage message queues, and more. The Blob storage trigger starts a function when a new or updated blob is detected, and the blob contents are provided as input to the function. The Blob storage trigger requires a general-purpose storage account; Storage V2 accounts with hierarchical namespaces are also supported.
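As a sketch of what a blob trigger binding looks like (the container name, function name, and class name here are illustrative placeholders, not part of this tutorial's project), the path pattern in the BlobTrigger attribute both selects the container to watch and captures the blob name into a parameter:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobTriggerSketch
{
    [FunctionName("BlobTriggerSketch")]
    public static void Run(
        // "{name}" is a binding expression: it captures the blob's name and
        // makes it available as the 'name' parameter below. A pattern such as
        // "samples-workitems/{name}.png" would restrict the trigger to .png blobs.
        [BlobTrigger("samples-workitems/{name}")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Blob {name} received, {blob.Length} bytes");
    }
}
```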

Prerequisites

  1. Azure Subscription
  2. Azure Storage Account (Tutorial to create Azure Storage)
  3. Visual Studio or Visual Studio Code
  4. Working knowledge of Visual Studio, C#, Azure Storage and Azure Cloud

Step 1

Create a new project in Visual Studio and choose the default Azure Functions template.

On the next screen, give your project a name and click Create.

On the Create a new Azure Functions Application screen, select Blob Trigger; in this example we are working with blob triggers. Azure Functions integrates with Azure Storage via triggers and bindings. Integrating with Blob storage allows you to build functions that react to changes in blob data as well as read and write values.

Here you can also select the storage account you have in your Azure subscription.

Once created, the solution loads in Solution Explorer. You can see below that I have a project called BlobTrigger. Open the .cs file from Solution Explorer. Please note that I have already renamed the .cs file to a meaningful name.

In this example we have one Azure Storage account with two containers, container1 and container2. I will be leveraging the Microsoft.Azure.WebJobs SDK to trigger an event that copies files from container1 to container2 whenever a file is uploaded to container1. Below is the code in the .cs file, which uses the simple CopyTo method to perform the copy.

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace BlobTrigger
{
    [StorageAccount("BlobConnectionString")]
    public static class BlobCopyTrigger
    {
        [FunctionName("BlobCopyTrigger")]
        public static void Run(
            // Fires when a blob is created or updated in container1.
            [BlobTrigger("container1/{name}")] Stream inputBlob,
            // Output binding: a writable blob with the same name in container2.
            [Blob("container2/{name}", FileAccess.Write)] Stream outputBlob,
            string name,
            ILogger log)
        {
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {name} \n Size: {inputBlob.Length} bytes");

            try
            {
                inputBlob.CopyTo(outputBlob);
            }
            catch (Exception e)
            {
                // Pass the exception as the first argument so its details are logged.
                log.LogError(e, "Failed to copy blob {name}", name);
            }
        }
    }
}
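The copy itself relies on Stream.CopyTo, which reads from the stream's current position to its end and writes everything to the destination. The same behavior can be seen with in-memory streams, independent of Azure; the following is a standalone sketch, not part of the function project:

```csharp
using System;
using System.IO;

public static class CopyToDemo
{
    // Copies the remaining bytes of 'source' into a byte array,
    // mirroring what the function body does with the blob streams.
    public static byte[] Copy(Stream source)
    {
        using var destination = new MemoryStream();
        source.CopyTo(destination);
        return destination.ToArray();
    }

    public static void Main()
    {
        var payload = new byte[] { 1, 2, 3, 4 };
        // All four bytes are copied because the stream position starts at 0.
        Console.WriteLine(Copy(new MemoryStream(payload)).Length); // prints 4
    }
}
```

Note that if the stream's position has already been advanced (for example by an earlier read), CopyTo only copies the bytes that remain.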

Define the Azure Storage connection string in the local.settings.json file as shown below:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "BlobConnectionString": "DefaultEndpointsProtocol=https;AccountName=myazurestorage;AccountKey=mykey==;EndpointSuffix=core.windows.net"
  }
}
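Keep in mind that local.settings.json is only used for local development and is not deployed. After publishing, the same BlobConnectionString setting must exist as an application setting on the function app. One way to set it, assuming the Azure CLI is installed and using placeholder app and resource group names, is:

```
# Placeholder names: replace <MyFunctionApp>, <MyResourceGroup>, and the
# connection string value with your own before running.
az functionapp config appsettings set \
  --name <MyFunctionApp> \
  --resource-group <MyResourceGroup> \
  --settings "BlobConnectionString=<your-storage-connection-string>"
```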

Once the solution is set up, you can test the function locally. To make sure there are no errors in your solution, click the play button in Visual Studio and let the solution build and run. You should see the screen below if everything works.

Please note that the lines in green starting with "Executing" are from my test, when I uploaded a file to container1. At this point you should be able to run the same test and confirm that you see the Executing messages as well.

Now it is time to publish your Azure function to Azure. To do this, right-click the project in Solution Explorer and click “Publish”.

On the next screen, select the appropriate plan. In my case I chose the Azure Functions Consumption plan, created a new plan, and clicked Create. On the next screen, select the subscription, resource group, location, and storage account (this is different from the storage account used above; it is used by the function app itself, for example to store logs).
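If you prefer the command line over the Visual Studio publish dialog, Azure Functions Core Tools provides an equivalent, assuming the tools are installed and the function app already exists (the app name below is a placeholder):

```
# Publishes the current project to an existing function app.
func azure functionapp publish <MyFunctionApp>
```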

Then you should be ready to publish as shown below.

In the next article I will show you how to configure some of the settings of the published Azure function, such as configuring Application Insights for better logging, configuration key usage, app key usage, etc.

Common scenarios for Azure Functions

The following are a common, but by no means exhaustive, set of scenarios for Azure Functions.

Build a web API: Implement an endpoint for your web applications using the HTTP trigger
Process file uploads: Run code when a file is uploaded or changed in blob storage
Build a serverless workflow: Chain a series of functions together using durable functions
Respond to database changes: Run custom logic when a document is created or updated in Cosmos DB
Run scheduled tasks: Execute code on pre-defined timed intervals
Create reliable message queue systems: Process message queues using Queue Storage, Service Bus, or Event Hubs
Analyze IoT data streams: Collect and process data from IoT devices
Process data in real time: Use Functions and SignalR to respond to data in the moment
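For instance, the scheduled-tasks scenario maps to the timer trigger. A minimal sketch (the function name, class name, and schedule here are illustrative) looks like this:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

public static class ScheduledTaskSketch
{
    [FunctionName("ScheduledTaskSketch")]
    public static void Run(
        // NCRONTAB expression: fires every five minutes.
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer,
        ILogger log)
    {
        log.LogInformation("Timer fired");
    }
}
```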

Published by Narayan Sujay Somasekhar

• 12+ years of experience leading the build of BI and Cloud Data Platform solutions using cloud technologies such as Snowflake, Azure Synapse, Databricks, and AWS Redshift.
• Over 8 years as a Data Analytics and Engineering practice leader with a demonstrated history of working with management consulting firms across the Tax & Accounting, Finance, and Power & Utility industries.
• Experience in managing the team roadmap and delivering actionable data insights to sales, product, marketing, and senior leadership.
• Strong background in Data Technology Solutions delivery and Data Automation for business processes using various tools.
• Expertise in bringing data-driven IT strategic planning to align metrics and communicate data changes across reporting, Enterprise Data Warehouses, Data Lakes, and Customer Relationship Management systems.
• Experienced working with cross-functional teams, Data Scientists/Analysts, and Business Managers in building Data Science and Data Engineering practices from the ground up.
• Experienced in designing and implementing NLP solutions with a focus on sentiment analysis, opinion mining, and key phrase extraction using Azure Cognitive Services and Amazon Comprehend.
• Extensive programming experience with SQL, Python, C#, R, and Scala.
