I have been using Azure Data Factory for various ETL activities: copying from various data sources, transforming the data, and loading it into other destinations. Sometimes the pipeline I'm creating becomes very complex (it's not a simple transformation). For example, I need to connect to an external server (using REST), get data, perform many steps, and finally write to different files. This requires a good amount of logic, which makes the Data Factory pipeline look very complex and difficult to read. Is there a better alternative where I can code all of this instead of using the predefined Azure Data Factory activities?
I could write a Python/Java program and, instead of using those predefined boxes, write my own custom code. Maybe Synapse or Databricks or something else could be used. Which is the better alternative (at a similar cost)?
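To make the question concrete, here is a minimal sketch of the kind of custom code I have in mind, roughly what the pipeline does today: fetch JSON from a REST endpoint, apply transformation logic, and write the result out as a file. All names here (`fetch_records`, `transform`, the record fields) are hypothetical placeholders for my actual logic, and in Databricks/Synapse this would live in a notebook or Spark job:

```python
import csv
import json
from urllib.request import urlopen


def fetch_records(url, opener=urlopen):
    """Fetch a JSON array of records from a REST endpoint.

    The `opener` is injectable so the fetch can be stubbed out in tests
    instead of hitting a live server.
    """
    with opener(url) as resp:
        return json.loads(resp.read())


def transform(records):
    """Example transformation step: keep active records, normalize names."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in records
        if r.get("active")
    ]


def write_csv(rows, path):
    """Final step: write the transformed rows to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "name"])
        writer.writeheader()
        writer.writerows(rows)
```

In Data Factory, each of these three steps becomes a separate activity (Web activity, Data Flow, Copy activity) wired together on the canvas; in code it is three readable functions I can unit-test and version-control.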