generic transformation

asked by arkiboys

Hello,
At present, the pipeline works fine: it reads a config file and, based on the config values, goes to the required sources and lands the data into Blob Storage containers accordingly.
For example, based on the parameter passed, around 50 filtered tables get transferred to Blob Storage containers.
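For context, a single entry in that config might look something like the sketch below (the field names are purely illustrative assumptions, not my actual schema):

```python
# Hypothetical config entries driving the existing ingestion pipeline.
# Every field name here is an illustrative assumption.
config = [
    {
        "table_name": "dbo.Customers",
        "source_connection": "sql-server-main",
        "filter": "Region = 'EMEA'",
        "sink_container": "landing",
        "sink_folder": "customers/",
    },
    # ...one entry per table, roughly 50 in total
]
```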

What I would like to have is as follows:

From the first container, each table should be transferred to a second container once the required fields have gone through the necessary transformations, where required.
Doing the above perhaps means I could create a separate data flow for each table, and in each of those data flows have the necessary checks to see whether the fields need any transformations, etc.
Question: how can I do these checks/transformations generically, so that I do not end up with so many data flows and rather have one that handles all the transformations?
A long while back I did the exact same thing in SSIS using a C# script task, but I am not sure how similar tasks can be done in Data Factory; a rough sketch of the idea is below.
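Something along these lines (a rough Python/pandas sketch with a made-up rule format, just to illustrate the idea; this is not actual Data Factory syntax, and all names are assumptions):

```python
import pandas as pd

# Hypothetical per-table transformation rules, loaded from the same kind
# of config file the pipeline already reads. All names are assumptions.
rules = {
    "dbo.Customers": [
        {"column": "first_name", "op": "trim"},
        {"column": "signup_date", "op": "cast", "to": "datetime64[ns]"},
    ],
}

def apply_rules(table_name: str, df: pd.DataFrame) -> pd.DataFrame:
    """Apply whatever rules the config defines for the given table."""
    for rule in rules.get(table_name, []):
        col = rule["column"]
        if rule["op"] == "trim":
            df[col] = df[col].str.strip()
        elif rule["op"] == "cast":
            df[col] = df[col].astype(rule["to"])
    return df
```

The single generic data flow would then simply dispatch on the rule list for whichever table is being processed.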

Any suggestions?

Thank you

azure-data-factory

Hello @arkiboys,
Thanks for the ask and for using the Microsoft Q&A platform.
Can you please elaborate a bit more on this point? An example of the kind of transformation you are referring to will help.

"From the first container, each table should be transferred to a second container once the required fields have gone through the necessary transformations, where required."

Thanks
Himanshu




1 Answer

arkiboys answered

Any transformation, for example trimming, applying some logic, or even checking for datatypes, etc. A sketch of a generic datatype check is below.
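For instance, a generic datatype check could look something like this (an illustrative Python/pandas sketch; the expected-type mapping coming from the per-table config is an assumption on my part):

```python
import pandas as pd

def check_types(df: pd.DataFrame, expected: dict[str, str]) -> list[str]:
    """Return the columns whose dtype differs from the expected one.

    `expected` maps column name -> dtype string, e.g. {"amount": "float64"}.
    """
    return [
        col
        for col, dtype in expected.items()
        if col in df.columns and str(df[col].dtype) != dtype
    ]
```

A mismatch could then either trigger a cast or route that table down an error-handling path.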
Thanks
