question

arkiboys asked · nasreen-akter answered

dataflow sink to update the source file

dataflow1 has the following:
source1 --> aggregate --> sink1

source1 uses dataset dsDatacompanies
sink1 uses dataset dsDatacompanies

Note that source1 reads a .csv file, aggregate then gets the distinct rows, and sink1 writes to the same file that source1 reads from.

Is this OK, or should the sink file be different from the source file?

Thank you

azure-data-factory

1 Answer

nasreen-akter answered

Hi @arkiboys,

I would create a separate output file rather than overwriting the source file. Note that you can use the same dataset for both the SOURCE and the SINK; you just have to parameterize the dataset. For example, if you want to create the output file in a different folder, parameterize the folder path; if you want to create it in the same folder with a different name, parameterize the filename. Thanks!
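A minimal sketch of what such a parameterized dataset definition might look like. The linked service name `ls_adls`, file system `data`, and folder `companies` are placeholders for illustration; only the dataset name `dsDatacompanies` comes from the question. The data flow's source could then pass one value for `fileName` (e.g. the original .csv) and the sink a different one:

```json
{
  "name": "dsDatacompanies",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "ls_adls",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "fileName": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "data",
        "folderPath": "companies",
        "fileName": {
          "value": "@dataset().fileName",
          "type": "Expression"
        }
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}
```

With this in place, the source settings can reference the dataset with `fileName = 'companies.csv'` and the sink with `fileName = 'companies_distinct.csv'`, so the original file is never overwritten mid-read.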
