Ingestion mapping from parquet source to Kusto using ADF copy activity failing because of mapping kind?

youssef jouan 0 Reputation points
2024-02-19T01:18:40.67+00:00

Hi, I have a blob storage with Parquet files that I want to ingest into Kusto using a copy activity in ADF V2. I created my source dataset as a Parquet source and my sink dataset to point to my Kusto table. I also created an ingestion mapping in Kusto for Parquet, following the documentation (example below):

```
.create-or-alter table TestTable ingestion parquet mapping "MappingTest"
'['
'  {"Column": "id", "Properties": {"Path": "$.id"}},'
'  {"Column": "BLE", "Properties": {"Path": "$.BLE"}},'
'  {"Column": "schema_version", "Properties": {"Path": "$.schema_version"}},'
'  {"Column": "timestamp", "Properties": {"Path": "$.timestamp"}}'
']'
```
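
If it helps, I also double-checked that the mapping exists on the table and has the expected kind with the following management command (assuming I'm reading the docs right):

```
// List the Parquet ingestion mappings on the table to confirm "MappingTest" was created
.show table TestTable ingestion parquet mappings
```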

So far everything works fine, and my Parquet source columns are exactly the same as my Kusto table columns.
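For context, this is roughly how my copy activity is wired up (a sketch of the activity JSON; the pipeline and dataset names are placeholders, and the mapping is referenced by name in the sink):

```json
{
  "name": "CopyParquetToKusto",
  "type": "Copy",
  "inputs": [ { "referenceName": "ParquetBlobDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "KustoTestTableDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "ParquetSource" },
    "sink": {
      "type": "AzureDataExplorerSink",
      "ingestionMappingName": "MappingTest"
    }
  }
}
```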

When I run the copy activity in ADF, it complains that the mapping kind should be CSV rather than Parquet, and I am not sure why. I have troubleshot this for a while and done some research, but could not find any useful information on why this error is popping up.

The error:

```
ErrorCode=KustoMappingReferenceHasWrongKind,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Mapping reference should be of kind 'Csv'. Mapping reference: 'MappingTest'. Kind 'Parquet'.,Source=Microsoft.DataTransfer.Runtime.KustoConnector,'
```

Any help would be much appreciated! Thank you!

Tags: Azure Blob Storage · Azure Data Explorer · Azure Data Factory