Azure Data Factory is Azure’s cloud ETL service for scale-out serverless data integration and data transformation. It offers a code-free UI for intuitive authoring and single-pane-of-glass monitoring and management. You can also lift and shift existing SSIS packages to Azure and run them with full compatibility in ADF. SSIS Integration Runtime offers a fully managed service, so you don’t have to worry about infrastructure management.

Recently, while architecting a big data platform solution, I came across a scenario where I had to give my end users the ability to trigger pipeline runs with button clicks from a web app via Power BI. One of my favorite ways to handle on-demand user triggers is the C# .NET Microsoft.Azure.Management library, which has a rich set of classes for exactly this type of scenario. Below is the .NET code that triggers a run of any pipeline.
Make sure to install the Microsoft.Azure.Management.DataFactory and Microsoft.IdentityModel.Clients.ActiveDirectory packages from the NuGet Package Manager.
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Rest;
using Microsoft.Azure.Management.ResourceManager;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Azure.Management.DataFactory.Models;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
class pipeline
{
    public IDataFactoryManagementClient client;

    // Service principal and subscription details - replace with your own values
    private string applicationId = "<applicationId>";
    private string clientSecret = "<clientSecret>";
    private string subscriptionId = "<subscriptionId>";
    private string tenantID = "<tenantID>";

    public void create_adf_client()
    {
        // Authenticate against Azure AD and create a data factory management client
        var context = new AuthenticationContext("https://login.windows.net/" + tenantID);
        var clientCredential = new ClientCredential(applicationId, clientSecret);
        var result = context.AcquireTokenAsync(
            "https://management.azure.com/", clientCredential).Result;
        var cred = new TokenCredentials(result.AccessToken);
        client = new DataFactoryManagementClient(cred)
        {
            SubscriptionId = subscriptionId
        };
    }

    public pipeline()
    {
        create_adf_client();
    }

    public void StartPipeline(string resourceGroup, string dataFactoryName, string pipelineName)
    {
        Console.WriteLine("Creating pipeline run...");
        var runResponse = client.Pipelines.CreateRunWithHttpMessagesAsync(
            resourceGroup, dataFactoryName, pipelineName).Result.Body;
        Console.WriteLine("Pipeline run ID: " + runResponse.RunId);
    }
}
Call it from the Main function (ConfigurationManager requires a reference to System.Configuration and a using System.Configuration; directive).
static void Main(string[] args)
{
    var runPipeline = new pipeline();

    // Resource group and data factory names are read from App.config appSettings
    string resourceGroup = ConfigurationManager.AppSettings["resourceGroup"];
    string dataFactoryName = ConfigurationManager.AppSettings["dataFactoryName"];

    // The pipeline name is passed as the first command line argument
    if (args == null || args.Length == 0)
    {
        Console.WriteLine("Please pass the pipeline name as the first argument.");
    }
    else
    {
        runPipeline.StartPipeline(resourceGroup, dataFactoryName, args[0]);
    }
}
Once the console app is compiled and built, you can use an agent service such as SQL Server Agent jobs, Power Automate, or Logic Apps via T-SQL stored procedures (a managed gateway needs to be set up for this to work) to call the console app.
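If you go the T-SQL route, a minimal sketch of a stored procedure wrapping the console app could look like the following. The procedure name, the executable path, and the use of xp_cmdshell (which must be enabled on the instance) are all assumptions for illustration, not part of the original setup.

-- Minimal sketch: wrap the console app so it can be called from a SQL Agent job,
-- Power Automate, or a Logic App. Assumes xp_cmdshell is enabled and the exe path
-- below is a hypothetical placeholder.
CREATE PROCEDURE dbo.usp_TriggerAdfPipeline
    @PipelineName NVARCHAR(200)
AS
BEGIN
    DECLARE @cmd NVARCHAR(4000);
    -- Pass the pipeline name as the first command line argument
    SET @cmd = N'"C:\Jobs\AdfPipelineTrigger.exe" "' + @PipelineName + N'"';
    EXEC master..xp_cmdshell @cmd;
END

A SQL Agent job step (or a Logic App calling the procedure through the gateway) can then run something like EXEC dbo.usp_TriggerAdfPipeline @PipelineName = N'MyPipeline'; to kick off the ADF pipeline on demand.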