After the parse I do a flatten transformation (using a rule-based mapping to flatten an array in the complex type hierarchy), which lets me save a CSV file. Every report type needs a different complex type, which after flattening gives me a different set of columns. I was hoping to create one dynamic mapping data flow rather than having to create a separate mapping data flow per source file.
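The rule-based flatten step described above can be sketched outside ADF. This is a minimal Python illustration (not ADF code); the record shape, the `items` array, and the rule format are all hypothetical stand-ins for whatever the real report's complex type contains:

```python
import csv
import io

def flatten(record, array_path, column_rules):
    """Yield one flat row per element of the array at array_path.

    column_rules maps an output column name to (scope, path), where
    scope is "item" (resolve inside the array element) or "root"
    (resolve from the top-level record).
    """
    for element in record[array_path]:
        row = {}
        for column, (scope, path) in column_rules.items():
            node = element if scope == "item" else record
            for key in path:  # walk the remaining path into the nested value
                node = node[key]
            row[column] = node
        yield row

# Hypothetical report record with a nested array to explode.
record = {
    "reportId": "R-1",
    "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 5}],
}

# The "rule-based mapping": swap this dict to get a different column set.
rules = {
    "report_id": ("root", ["reportId"]),
    "sku":       ("item", ["sku"]),
    "qty":       ("item", ["qty"]),
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rules))
writer.writeheader()
for row in flatten(record, "items", rules):
    writer.writerow(row)
print(buf.getvalue())
```

Because the column set is derived from the rules dictionary rather than hardcoded, a different report type only needs a different `rules` value, which is the reusability the question below is after.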
Parameterize mapping data flow parse transform
I have an Azure mapping data flow in Synapse. As part of the data flow I need to use the parse transformation on an embedded string. The parse transformation requires me to define the complex type it generates, and I can do that for a particular source file. However, I would like to parameterize the mapping data flow so I can use it to process different files that have different complex types. What I would like to do is pass in the complex type definition (or perhaps part of it) as a parameter to the mapping data flow, rather than hardcoding the column type in the parse transformation, and then have the parse transformation build the output column type dynamically from that parameter. Is there a way to do something like this?
Other options I've considered are creating a custom mapping data flow for every source file, or possibly using a conditional split, but both of those require dev work for every new source file. It would be nice to have a reusable data flow that I can simply parameterize for different source files.
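To make the goal concrete: the idea is that the "complex type definition" becomes a runtime value instead of a design-time setting. A minimal Python sketch of that pattern (an illustration only, not ADF or Data Flow Script; the schema format and function names are invented for the example):

```python
import json

# Map type names (as they might appear in a passed-in schema parameter)
# to Python casts. Hypothetical, minimal set.
CASTS = {"string": str, "integer": int, "double": float}

def parse_embedded(raw, schema):
    """Parse an embedded JSON string using a schema supplied as a parameter.

    schema is a {column_name: type_name} dict, so the same routine can
    handle any report type without a hardcoded complex type.
    """
    doc = json.loads(raw)
    return {name: CASTS[type_name](doc[name]) for name, type_name in schema.items()}

# Two different "source files" need only two different schema parameters.
sales_schema = {"sku": "string", "qty": "integer", "price": "double"}
row = parse_embedded('{"sku": "A", "qty": "3", "price": "1.5"}', sales_schema)
print(row)
```

In a mapping data flow, by contrast, the parse transformation's output type is part of the design-time script, which is why the answer below points out that a purely parameter-driven type leaves downstream transformations nothing typed to bind to.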
2 answers
Kiran-MSFT 691 Reputation points Microsoft Employee
2021-06-17T04:27:21.593+00:00 If this is a parameter, you would not be able to do any downstream transformations after the parse. What is the purpose of parsing if you don't intend to access any of the information in it?