Hi everyone,
I'm trying to load data from Azure Data Lake Storage Gen2 into an Azure Synapse dedicated SQL pool.
The target table has the following definition for distribution and partition:
WITH
(
DISTRIBUTION = ROUND_ROBIN,
CLUSTERED COLUMNSTORE INDEX,
PARTITION
(
[PARTITION_ID] RANGE LEFT FOR VALUES (1)
)
)
The table has around 320 columns, so I suspect the row width is playing a role here.
When the insert starts (via the COPY command), I get this error after a couple of minutes:
Operation on target CD_EDL_STG_DWH failed: ErrorCode=UserErrorSqlDWCopyCommandError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL DW Copy Command operation failed with error 'Unable to allocate 178149 KB for columnstore compression because it exceeds the remaining memory from total allocated for current resource class and DWU. Please rerun query at a higher resource class, and also consider increasing DWU. See 'https://aka.ms/sqldw_columnstore_memory' for assistance.'
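From the linked article, my understanding is that the loading user's resource class can be raised to get a bigger memory grant per query, something like this (the user name below is just a placeholder, not my real login):

```sql
-- Grant the loading user a larger static resource class,
-- which increases the memory grant available for columnstore compression.
EXEC sp_addrolemember 'largerc', 'load_user';
```

I haven't tried this yet, because I suspect the root cause is the table design rather than raw memory.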
I have no idea how to tackle this. The only info I gathered was from running the DBCC PDW_SHOWSPACEUSED command.
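For reference, this is how I ran it (table name anonymized):

```sql
-- Show per-distribution row counts and space usage for the target table
DBCC PDW_SHOWSPACEUSED('dbo.MyStagingTable');
```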
Could someone help me troubleshoot this?
I think I have enough DWU and that this is a design problem.