question

CalderonFernandezJoseExternoEmpres-0090 asked MartinJaffer-MSFT commented

Good morning, I have a problem with a pipeline:

ErrorCode=UserErrorSqlDWCopyCommandError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL DW Copy Command operation failed with error 'HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: ClassCastException: ',Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: ClassCastException: ,Source=.Net SqlClient Data Provider,SqlErrorNumber=106000,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=106000,State=1,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: ClassCastException: ,},],'

azure-data-factory azure-data-lake-storage

Hello @CalderonFernandezJoseExternoEmpres-0090 and welcome to Microsoft Q&A.

It looks like you ran into trouble reading data from a Synapse dedicated pool, or reading data from ADLS Gen2. Could you please share some more details?

Did this happen only once, or does it happen again every time you try?
Is the source SQL DW, or Data Lake?
If this is a copy activity, what is the sink?

The "Unexpected error encountered filling record reader buffer: ClassCastException" suggests to me there might be a record with the wrong data type, or perhaps the schema changed.


If you found your own solution, please share with the community.


1 Answer

CalderonFernandezJoseExternoEmpres-0090 answered MartinJaffer-MSFT commented

Good morning, let me give some context. I have data that I transform into Parquet and leave in a storage account, and then load into Synapse SQL with a pipeline. These .parquet files are about 8.1 KiB each, and there are around 300 of them. The strange thing is that the new files are the ones throwing the error message mentioned in my first post, because the historical files have been loading correctly.


Thanks for your time


Good evening @CalderonFernandezJoseExternoEmpres-0090

If the historical data is loading correctly and only the new data has a problem, and you are using the same process to load both, then something must have changed between historical and new.

This type of error message does not sound like a file-size problem; it points to the contents of the new files.
