
Azure Function – Blob Trigger
Azure Functions is an on-demand cloud service that provides all the continually updated infrastructure and resources needed to run your applications. You focus on the pieces of code that matter most to you, and Functions handles the rest, providing serverless compute for Azure: you can use Functions to build web APIs, respond to database changes, process IoT streams, manage message queues, and more. The Blob storage trigger starts a function when a new or updated blob is detected, and the blob contents are provided as input to the function. The Blob storage trigger requires a general-purpose storage account; Storage V2 accounts with hierarchical namespaces are also supported.
Prerequisites
- Azure Subscription
- Azure Storage Account (Tutorial to create Azure Storage)
- Visual Studio or Visual Studio Code
- Working knowledge of Visual Studio, C#, Azure Storage and Azure Cloud
Step 1
Create a new project in Visual Studio and choose the default Azure Functions template.

On the next screen, give your Visual Studio project a name and click Create.

On the Create a new Azure Functions Application screen, select Blob trigger, since this example works with blob triggers. Azure Functions integrates with Azure Storage via triggers and bindings; integrating with Blob storage lets you build functions that react to changes in blob data as well as read and write values.
Here you can also select the storage account you have in your Azure subscription.

Once created, the solution loads in Solution Explorer. You can see below that I have a project called BlobTrigger. Open the .cs file in Solution Explorer; note that I have already renamed the .cs file to a meaningful name.

In this example we have one Azure Storage account with two containers, container1 and container2. I will be leveraging the Microsoft.Azure.WebJobs SDK to trigger an event that copies files from container1 to container2 whenever a file is uploaded to container1, using the simple CopyTo method. Below is the code in the .cs file.
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace BlobTrigger
{
    // Resolve the storage connection from the "BlobConnectionString" app setting
    [StorageAccount("BlobConnectionString")]
    public static class BlobCopyTrigger
    {
        [FunctionName("BlobCopyTrigger")]
        public static void Run(
            // Fires when a blob is created or updated in container1
            [BlobTrigger("container1/{name}")] Stream inputBlob,
            // Output binding: a writable blob with the same name in container2
            [Blob("container2/{name}", FileAccess.Write)] Stream outputBlob,
            string name,
            ILogger log)
        {
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {name} \n Size: {inputBlob.Length} bytes");
            try
            {
                inputBlob.CopyTo(outputBlob);
            }
            catch (Exception e)
            {
                // Pass the exception as the first argument so the stack trace is logged
                log.LogError(e, "Copy failed for blob {Name}", name);
            }
        }
    }
}
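As a side note, Stream.CopyTo blocks the worker thread while the copy runs; for larger blobs an asynchronous variant is usually preferred. The following is only an illustrative sketch of what that could look like with the same bindings (it is not part of the original sample, and assumes a using System.Threading.Tasks; directive):

```csharp
// Illustrative async variant of the copy function (sketch, not the original sample)
[FunctionName("BlobCopyTriggerAsync")]
public static async Task RunAsync(
    [BlobTrigger("container1/{name}")] Stream inputBlob,
    [Blob("container2/{name}", FileAccess.Write)] Stream outputBlob,
    string name,
    ILogger log)
{
    log.LogInformation($"Copying blob {name} ({inputBlob.Length} bytes)");
    // CopyToAsync frees the worker thread while the copy is in flight
    await inputBlob.CopyToAsync(outputBlob);
}
```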
Define the Azure Storage connection string in the local.settings.json file, as shown below:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "BlobConnectionString": "DefaultEndpointsProtocol=https;AccountName=myazurestorage;AccountKey=mykey==;EndpointSuffix=core.windows.net"
  }
}
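Keep in mind that local.settings.json is only used on your machine and is not deployed. Once the function runs in Azure, the same BlobConnectionString value must exist as an application setting on the Function App. One way to add it is with the Azure CLI; a sketch, where MyFunctionApp and MyResourceGroup are placeholder names:

```shell
# Add the BlobConnectionString app setting to the published Function App
# (MyFunctionApp and MyResourceGroup are placeholder names)
az functionapp config appsettings set \
  --name MyFunctionApp \
  --resource-group MyResourceGroup \
  --settings "BlobConnectionString=DefaultEndpointsProtocol=https;AccountName=myazurestorage;AccountKey=mykey==;EndpointSuffix=core.windows.net"
```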
Once the solution is set up, you can test the function locally. To make sure there are no errors in your solution, click the play button in Visual Studio and let the solution build and run. You should see the screen below if everything works.

Please note that the lines in green starting with Executing are from my test, when I uploaded a file to container1. At this point you should be able to test and see the Executing messages as well.
Now it is time to publish your Azure Function to Azure. To do this, right-click the project in Solution Explorer and click “Publish”.

On the next screen, select the appropriate plan; in my case I chose the Azure Functions Consumption plan, created a new plan, and clicked Create. On the following screen, select the subscription, resource group, location, and storage account (this is a different storage account from the one used above; it is used for your Azure Function's own logs and runtime storage).

Then you should be ready to publish as shown below.
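If you prefer the command line over the Visual Studio publish dialog, Azure Functions Core Tools can deploy the same project; a sketch, assuming the Function App created above is named MyFunctionApp and you are signed in with the Azure CLI:

```shell
# Publish the local project folder to an existing Function App
# (MyFunctionApp is a placeholder for your Function App name)
func azure functionapp publish MyFunctionApp
```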

In the next article I will show you how to configure some of the settings in the published Azure Function, such as configuring Application Insights for better logging, configuration key usage, app key usage, and so on.

Common scenarios for Azure Functions
The following is a common, but by no means exhaustive, set of scenarios for Azure Functions.
| If you want to… | then… |
| --- | --- |
| Build a web API | Implement an endpoint for your web applications using the HTTP trigger |
| Process file uploads | Run code when a file is uploaded or changed in blob storage |
| Build a serverless workflow | Chain a series of functions together using durable functions |
| Respond to database changes | Run custom logic when a document is created or updated in Cosmos DB |
| Run scheduled tasks | Execute code on pre-defined timed intervals |
| Create reliable message queue systems | Process message queues using Queue Storage, Service Bus, or Event Hubs |
| Analyze IoT data streams | Collect and process data from IoT devices |
| Process data in real time | Use Functions and SignalR to respond to data in the moment |
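For example, the first scenario in the table, a web API endpoint, is structurally very similar to the blob trigger shown above; only the trigger attribute changes. A minimal sketch (the function name and greeting are illustrative, and it assumes the Microsoft.Azure.WebJobs.Extensions.Http package plus usings for Microsoft.AspNetCore.Http and Microsoft.AspNetCore.Mvc):

```csharp
// Illustrative HTTP-trigger sketch: responds to GET /api/HelloApi?name=...
[FunctionName("HelloApi")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
    ILogger log)
{
    // Read an optional "name" query parameter from the request
    string name = req.Query["name"];
    return new OkObjectResult($"Hello, {name ?? "world"}");
}
```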