Hochmanj-5166 asked · SaurabhSharma-msft commented

How to pass arguments from Pipeline to Data Flow parameter of type 'map'

If one has a data flow with a parameter of type map[string,string], how do you pass in the argument when calling it from a pipeline? See image below.

If I'm not mistaken, you cannot create a map in pipelines using the '->' notation. I tried a few approaches, such as string templates and equivalent JSON, but none of them seem to work.

87006-image.png


azure-data-factory

1 Answer

SaurabhSharma-msft answered · SaurabhSharma-msft commented

Hi @hochmanj-5166,

Thanks for using Microsoft Q&A!

I have tried passing values to a data flow parameter of type map[string,string] and it works fine for me. When you select the Data Flow activity on the pipeline canvas, you will be presented with the parameters defined for that specific data flow, and you can then pass the values using either data flow expressions or pipeline expressions.
87151-image.png
When "Data flow expressions" are used, you can access the functions, parameters, or any other schema defined in your data flow; when you use "Pipeline expressions", you can refer to system variables, functions, and pipeline parameters/variables. Also, the pipeline expression type does not need to match the parameter type defined in the data flow. Please refer to the documentation.
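For example, if I recall the data flow expression language correctly, the map parameter value can be entered as a data flow expression using the `->` notation (the column names here are placeholders):

```
['old_column_name1' -> 'new_column_name1', 'old_column_name2' -> 'new_column_name2']
```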
I have tried both of these ways and was able to pass the values correctly against a map data type. Please refer to the gif below.
87161-dataflow-mapdatatype.gif

Hope this helps. Please let me know if you have any questions.

Thanks
Saurabh


Please do not forget to "Accept the answer" if the information provided helps you, so that it can help others in the community.




Am I able to compose the argument within the pipeline from a Lookup on a dataset, rather than hard-coding the argument into the pipeline?


@hochmanj-5166 Can you please share more details on how you are generating the values from the Lookup, and what the source of the dataset is (CSV, SQL)?


Well, I have a JSON dataset with a schema like {"old_column_name1": "new_column_name1", "old_column_name2": "new_column_name2"}.
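If the Lookup returns that JSON as a string, one option is to build the map expression string in the pipeline before handing it to the data flow. A minimal sketch of that logic in Python (the `['k' -> 'v']` map-literal notation is an assumption about the data flow expression syntax, and `to_mapping_expression` is a hypothetical helper name):

```python
import json

def to_mapping_expression(json_text):
    """Turn a JSON rename map into a data-flow-style map literal string.

    Assumes the ['old' -> 'new'] notation; adjust to whatever the
    map parameter actually accepts.
    """
    mapping = json.loads(json_text)
    pairs = ", ".join(f"'{old}' -> '{new}'" for old, new in mapping.items())
    return f"[{pairs}]"

expr = to_mapping_expression(
    '{"old_column_name1": "new_column_name1", "old_column_name2": "new_column_name2"}'
)
print(expr)  # ['old_column_name1' -> 'new_column_name1', 'old_column_name2' -> 'new_column_name2']
```

In an actual pipeline the same concatenation could be done with string expression functions over the Lookup output.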

Ultimately, I'm trying to rename columns, so if there's a better approach to this I'm all ears. The direction I was going was passing a map to a data flow and renaming columns in either a Select or Derived Column transformation, along the lines of:

                 ~ new column name = map[$$] or map[name] or something similar.

Perhaps I'm approaching this wrong.

For example, it seems this is a similar approach in PySpark that I found online:

                 from functools import reduce  # reduce is not a builtin in Python 3
                 # fold over the index positions, renaming one column per step,
                 # seeded with the original DataFrame (range replaces Python 2's xrange)
                 df = reduce(lambda data, idx: data.withColumnRenamed(oldColumns[idx], newColumns[idx]),
                             range(len(oldColumns)), df)

or with pandas for example,

                 df = df.rename(columns={'oldName1': 'newName1', 'oldName2': 'newName2'})
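The map[$$]-style lookup above can be sketched in plain Python, treating each row as a dict and the parameter as a rename map (the function and column names are placeholders, not ADF syntax):

```python
def rename_keys(row, mapping):
    """Rename dict keys per the mapping; keys absent from the mapping
    keep their original name (mirroring map[$$] with a fallback to $$)."""
    return {mapping.get(key, key): value for key, value in row.items()}

row = {"old_column_name1": 1, "untouched": 2}
mapping = {"old_column_name1": "new_column_name1"}
print(rename_keys(row, mapping))  # {'new_column_name1': 1, 'untouched': 2}
```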
