
Expose Parameters to Azure Data Factory Pipeline

The Azure Data Factory connector is great for executing a pipeline; however, it is severely limiting because you cannot pass parameters to the pipeline. The request is to expose the parameters defined for the pipeline, the same way parameters are exposed for a stored procedure: once you pick the pipeline, if it has parameters, show them in the action settings so we can pass values in. This would be a HUGE improvement to the connector.
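For context, under the hood a pipeline run with parameters is just Data Factory's Create Pipeline Run REST call, whose request body is the parameters as a JSON object. A minimal sketch of what the connector would pass through (the subscription, resource group, factory, and pipeline names below are placeholders, not values from this thread):

```python
import json

API_VERSION = "2018-06-01"  # ADF REST API version for createRun

def build_create_run_request(subscription_id, resource_group, factory,
                             pipeline, parameters):
    """Build the URL and JSON body for ADF's Create Pipeline Run call.

    The body is simply the pipeline parameters as a JSON object, which
    is exactly what an enhanced connector would need to expose.
    """
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
        f"/factories/{factory}/pipelines/{pipeline}/createRun"
        f"?api-version={API_VERSION}"
    )
    body = json.dumps(parameters)
    return url, body

# Example with placeholder names:
url, body = build_create_run_request(
    "<sub-id>", "<rg>", "<factory>", "CopyPipeline",
    {"ParamDF": "value1", "ParamDF2": "value2"},
)
```

The actual POST also needs an Azure AD bearer token, omitted here for brevity.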

 

Thanks

Status: New
Comments
New Member

This is possible, though not easy. Please read Matt's blog:

 

http://blogs.adatis.co.uk/matthow/post/Using-ADF-V2-Activities-in-Logic-Apps

 

Thanks Matt for this great blog.

 

With kind regards,

Jeroen

Frequent Visitor

Hi,

 

I can see that there has been an addition to the ADF - Create Pipeline Run connector to allow parameters to be set within the flow interface. However, I am struggling to set the parameters using it. Has anyone had any luck?

 

Thanks

Chris

Anonymous
Not applicable

@C_Roberts_BRS 

I was having a similar problem, but finally figured it out. 

 

First, in the main screen of the App, concatenate the contents of the text input boxes (where the end user will enter individual parameters) into JSON format:

1. Use triple quotes and Char(34) to make sure each parameter is formatted as a STRING (the only way it will work), concatenating the contents of the labels into JSON format.

2. Attach the following Flow to a button, and use the JSON-formatted parameters as the argument for .Run(). The Flow then parses the JSON-formatted argument referenced by the button on the App interface; simply copy and paste a JSON with sample data into the "Sample payload to generate schema" box and Flow will do the rest.

3. Connect to your Data Factory pipeline and, in the "parameters" box, enter a JSON-formatted object with dynamic content for the string values (make sure to place double quotes around the dynamic content).

As a final note, make sure the names of the parameters you use throughout the process match the names of the parameters in your Data Factory. Mine are simply ParamDF and ParamDF2, but generally there will be more parameters, so keep track of them throughout the entire workflow. My parameters in Data Factory are ParamDF and ParamDF2, consistent with what is passed through the entire workflow.
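To show what the Char(34)/triple-quote concatenation has to produce, here is a rough Python sketch of the same idea: a JSON object whose values are all forced to strings, reusing the ParamDF/ParamDF2 names from above (the helper name is mine, not from the thread):

```python
import json

def build_parameter_json(values):
    """Build the JSON-formatted parameter string the Flow expects.

    Mirrors the Char(34) concatenation in the app: every value is
    coerced to a string, since string parameters are the only form
    reported to work in the "parameters" box here.
    """
    return json.dumps({name: str(value) for name, value in values.items()})

payload = build_parameter_json({"ParamDF": "2023-01-01", "ParamDF2": 42})
```

If the concatenated label in the app round-trips through json.loads like this payload does, the Parse JSON step in the Flow should accept it.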

Hope this helps