Question

Murugeswari-2987 asked

Blob datatype conversion

We have migrated a few of our tables from on-premises Oracle to a Dedicated SQL Pool via ADF. One of these tables has a BLOB datatype column. The records in that on-premises (Oracle) table are in the format shown below:

[image: screenshot of the source Oracle table records]


But after migrating to the dedicated pool, we can see the column datatype has changed to nvarchar and the records are in the format shown below:

[image: screenshot of the Dedicated SQL Pool records after migration]

So, is there any way to validate the data?

Thanks


Tags: azure-data-factory, azure-synapse-analytics

Hi @Murugeswari-2987 ,

Thanks for posting your query on the Microsoft Q&A Platform.

Could you please help us understand how you are loading data into the dedicated SQL pool? Is it using a Copy activity, or some other way?


Hi Maheer,

Yes, data was loaded into the Dedicated SQL Pool through a Copy activity. I tried two ways:

  1. With staging - from on-premises Oracle to Blob storage (.gz format), then from Blob storage to the dedicated SQL pool

  2. Without staging - direct copy from Oracle to Synapse

With both methods, I am getting data like in the above image.


1 Answer

ShaikMaheer-MSFT answered

Hi @Murugeswari-2987 ,

Thanks for posting your query on the Microsoft Q&A Platform.

Azure Synapse SQL does not support the BLOB data type, hence you are seeing the BLOB column converted as you mentioned above. Click here to learn more about the data types supported in Azure Synapse SQL.

Also, when we use a Copy activity with Oracle as the source, the BLOB data type is first converted to an interim data type and then loaded to the sink. Click here to see the interim data types used for Oracle data types.
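The screenshots are not visible here, but binary BLOB content landing in an nvarchar column typically surfaces as a textual encoding of the original bytes. Below is a minimal Python sketch, purely illustrative (the exact representation the Copy activity writes may differ), of how the same payload looks as hex versus Base64:

```python
import base64

# Hypothetical BLOB payload, e.g. the first bytes of a PNG stored in Oracle.
blob_bytes = b"\x89PNG\r\n\x1a\n" + b"\x00" * 16

# Two common textual renderings of binary data; an nvarchar column in the
# dedicated SQL pool would end up holding something along these lines.
hex_text = blob_bytes.hex().upper()                         # '89504E470D0A1A0A0000...'
base64_text = base64.b64encode(blob_bytes).decode("ascii")  # 'iVBORw0KGgoAAAA...'

print("hex   :", hex_text)
print("base64:", base64_text)
```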

If you would like to validate the data, you could validate the other columns and check whether all rows were loaded, or you could validate that column's data alone between the stage (Blob storage) and the sink.
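For the second approach, here is a rough Python sketch of what such a check could look like. Everything in it is hypothetical: the connection details, the table and column names, and the assumption that the sink's nvarchar column holds a hex rendering of the original bytes. It also compares directly against the Oracle source rather than the staged .gz files, but the same idea applies:

```python
import hashlib

import oracledb  # python-oracledb driver for the on-premises source
import pyodbc    # ODBC driver for the dedicated SQL pool

# Hypothetical connection details and object names.
ORACLE_DSN = "host:1521/service"
SYNAPSE_CONN = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<workspace>.sql.azuresynapse.net;Database=<db>;Uid=<user>;Pwd=<pwd>"
)

def source_checksums():
    """MD5 checksum of each BLOB in the Oracle source table, keyed by id."""
    with oracledb.connect(user="<user>", password="<pwd>", dsn=ORACLE_DSN) as conn:
        cur = conn.cursor()
        cur.execute("SELECT id, blob_col FROM my_table")
        # BLOB columns come back as LOB objects; read() returns the raw bytes.
        return {row[0]: hashlib.md5(row[1].read()).hexdigest() for row in cur}

def sink_checksums():
    """MD5 checksum of each row in the dedicated SQL pool, assuming the
    nvarchar column holds a hex rendering of the original bytes."""
    with pyodbc.connect(SYNAPSE_CONN) as conn:
        cur = conn.cursor()
        cur.execute("SELECT id, blob_col FROM dbo.my_table")
        return {
            row[0]: hashlib.md5(bytes.fromhex(row[1].removeprefix("0x"))).hexdigest()
            for row in cur
        }

def main():
    src, snk = source_checksums(), sink_checksums()
    # Row-count check first, then a per-row content check.
    print(f"source rows: {len(src)}, sink rows: {len(snk)}")
    mismatched = [key for key, digest in src.items() if snk.get(key) != digest]
    print(f"rows whose BLOB content does not match: {len(mismatched)}")

if __name__ == "__main__":
    main()
```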

By the way, the video below helps with data validation logic in ADF.
https://www.youtube.com/watch?v=i2PkwNqxj1E

We can also use the Assert transformation for data quality and validation checks.
https://www.youtube.com/watch?v=_NzWpTRxt0s

Hope this helps. Please let us know if you have any further queries.


Please consider hitting the Accept Answer button. Accepted answers help the community as well.


Thanks, Maheer, for your solution.
