Hello everyone,
I'm trying to incrementally ingest data from a Delta table (Delta Lake format) stored in an Azure Data Lake Storage Gen2 account into another Delta table in a different Azure Data Lake Storage Gen2 account, using Azure Data Factory's Data Flow.
With a Filter transformation I retrieve only the new data to ingest (based on technical fields), and, as you can see in the attached snapshot, the source data is correctly filtered inside the Data Flow.
The Data Flow produces a new version of the Delta table, but the new Parquet files it creates contain both updated and unchanged data. Is it possible to create new table versions that contain only new/updated data?
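For context, the incremental filter step I have in mind works like the sketch below. It is a minimal illustration in plain Python, not the actual Data Flow logic; the field name `modified_ts` and the stored watermark are assumptions standing in for my "technical fields".

```python
from datetime import datetime

# Hypothetical source rows; "modified_ts" stands in for the technical
# field used to detect changed records (an assumption for illustration).
rows = [
    {"id": 1, "modified_ts": datetime(2023, 1, 1)},
    {"id": 2, "modified_ts": datetime(2023, 2, 1)},
    {"id": 3, "modified_ts": datetime(2023, 3, 1)},
]

# Watermark of the last successful run (assumed persisted between runs).
last_watermark = datetime(2023, 1, 15)

def incremental_rows(rows, watermark):
    """Keep only rows changed after the previous run's watermark."""
    return [r for r in rows if r["modified_ts"] > watermark]

new_rows = incremental_rows(rows, last_watermark)
print([r["id"] for r in new_rows])  # only rows newer than the watermark
```

The filter itself behaves as expected; my question is only about what ends up in the Parquet files of the new Delta table version on the sink side.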
