Data Explorer Pools : Data Flow - Invalid token (empty value)

Mahasubramanian Maharajan 1 Reputation point
2022-01-10T09:24:03.43+00:00

Hi team,

We are trying to integrate Data Explorer pool data in a Synapse workspace, specifically through Mapping Data Flow. I can connect and read the data using an Azure Data Explorer linked service with an integration dataset in pipelines and Spark notebooks, but it does not work in a data flow. I tried the same dataset as the source, both as an integration dataset and inline.
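
For context, reading the same data from the Data Explorer pool in a Synapse Spark notebook works along these lines (a rough sketch only; the linked service, database, and query names below are placeholders, not our actual values):

```python
# Synapse Spark notebook: read from the Data Explorer pool through the workspace linked service.
# All names below are placeholders for illustration.
df = (spark.read
      .format("com.microsoft.kusto.spark.synapse.datasource")
      .option("spark.synapse.linkedService", "AzureDataExplorerLS")  # linked service name (placeholder)
      .option("kustoDatabase", "mydatabase")                         # Data Explorer database (placeholder)
      .option("kustoQuery", "MyTable | take 10")                     # KQL query (placeholder)
      .load())

df.show()
```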

```
Error details
Error code: DFExecutorUserError
Activity ID: 1707222b-0dac-4997-acbd-cb8aaeb12a2e

Details: Invalid token (empty value) is returned from service side with url: https://DMEastus.svc.datafactory.azure.com/sparkjob/GetMsiTokenV2 and payload: {"LinkedIRCredential":"","Resource":"https://dataexpool.xxx-cmd-df-synp-01-dev.kusto.azuresynapse.net/","EntityName":"xxx-cmd-df-synp-01-dev"}
```

Tags: Azure Data Explorer · Azure Synapse Analytics · Azure Data Factory

1 answer

  1. KranthiPakala-MSFT 46,422 Reputation points Microsoft Employee
    2022-01-18T20:40:45.513+00:00

    Hello @Mahasubramanian Maharajan ,

    Thanks for the additional details.

    As per the discussion with the product team, it has been identified that Data Explorer pools created in a Synapse workspace need extra configuration to obtain an AAD token in data flows, which is why you are facing this issue. We have tested both ADF and Synapse, and both fail.

    As per the latest information from the product team, they are working on a fix for this, with a tentative ETA of the end of this month (please note that this is only an estimate).

    A temporary workaround suggested by the product team is to use the Copy activity. If you want to apply transformations, use a Copy activity to move the data from Data Explorer to ADLS Gen2/Blob storage, and then use Gen2/Blob as the source in your data flow (a rough sketch of such a Copy activity is shown below).
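
    For illustration, the Copy activity in that workaround pipeline could look roughly like the following outline. This is only a sketch under assumptions: the activity name, the dataset references (AdxSourceDataset, Gen2ParquetDataset), and the KQL query are placeholders, and the corresponding datasets and linked services are assumed to already exist in your workspace.

    ```json
    {
      "name": "CopyAdxToGen2",
      "type": "Copy",
      "inputs":  [ { "referenceName": "AdxSourceDataset", "type": "DatasetReference" } ],
      "outputs": [ { "referenceName": "Gen2ParquetDataset", "type": "DatasetReference" } ],
      "typeProperties": {
        "source": {
          "type": "AzureDataExplorerSource",
          "query": "MyTable | where ingestion_time() > ago(1d)"
        },
        "sink": {
          "type": "ParquetSink"
        }
      }
    }
    ```

    The data flow would then read from the Gen2/Blob dataset (or an equivalent inline Parquet source) instead of the Data Explorer pool until the fix is released.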

    Once the fix is applied, it will automatically work in Synapse/ADF data flows without any changes on your side, so there is no need to re-grant permissions or recreate the resources.

    I will keep this thread posted once the fix for the issue is released.

    Sorry for the inconvenience caused by this issue.

    Hope this info helps. Please let us know if you have any further queries.
