Azure Data Factory Copy data activity stuck in Queued state.

Vimlesh Rivonkar 0 Reputation points
2024-04-04T07:00:32.6633333+00:00

Hello,

I am trying to run an Azure Data Factory pipeline with a Copy data activity. The pipeline is supposed to read an Excel file from Azure Blob Storage and copy the data to an Azure SQL database. Until yesterday it was working, but when I try to debug it today it gets stuck in the Queued status. Can you please guide me on debugging this issue?

Thanks

Vimlesh Rivonkar

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

3 answers

  1. Vinodh247-1375 11,396 Reputation points
    2024-04-04T07:17:33.36+00:00

    Hi Vimlesh Rivonkar,

    Thanks for reaching out to Microsoft Q&A.

    Check the following points to narrow down the issue:

    1. Check whether there is a service outage from Azure.
    2. Check the concurrency settings. Pipelines sometimes get stuck in the "Queued" status because of their concurrency settings, so ensure the concurrency property is set appropriately (a sketch for checking it programmatically follows this list). See this --> https://learn.microsoft.com/en-us/answers/questions/417761/azure-data-factory-pipelines-are-sometimes-stuck-i
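
    If you prefer to check the concurrency setting programmatically, here is a minimal sketch using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and pipeline names are placeholders you would replace with your own values.

        # Minimal sketch: inspect a pipeline's concurrency setting with the Python SDK.
        # Assumes: pip install azure-identity azure-mgmt-datafactory
        # The subscription/resource-group/factory/pipeline names are placeholders.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient

        subscription_id = "<subscription-id>"   # placeholder
        resource_group = "<resource-group>"     # placeholder
        factory_name = "<data-factory-name>"    # placeholder
        pipeline_name = "<pipeline-name>"       # placeholder

        adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

        pipeline = adf_client.pipelines.get(resource_group, factory_name, pipeline_name)
        # concurrency is None when no limit is set (the default); an explicit value
        # caps how many runs of this pipeline can execute at the same time.
        print(f"Concurrency setting for '{pipeline_name}': {pipeline.concurrency}")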

    Please 'Upvote' (Thumbs-up) and 'Accept as answer' if the reply was helpful. This will benefit other community members who face the same issue.


  2. Smaran Thoomu 10,720 Reputation points Microsoft Vendor
    2024-04-08T08:39:37.33+00:00

    Hi @Vimlesh Rivonkar

    Thank you for reaching out to us regarding your Azure Data Factory pipeline with copy data activity. We understand that you are facing an issue where your pipeline is stuck in the Queued status.

    We would like to inform you that an outage was reported in the Europe region, which might have affected your pipeline. The issue has since been resolved and the service has been restored. We suggest you rerun your pipeline and check whether the issue persists; if it does, please let us know and we will investigate further.
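
    If it helps, a new run can also be triggered programmatically. Below is a minimal sketch using the azure-mgmt-datafactory Python SDK; the resource names are placeholders, and note that this starts a fresh run of the pipeline rather than resuming the queued one.

        # Minimal sketch: trigger a fresh run of the pipeline with the Python SDK.
        # Resource names are placeholders; this starts a new run, it does not
        # resume the queued one.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient

        adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

        run_response = adf_client.pipelines.create_run(
            "<resource-group>", "<data-factory-name>", "<pipeline-name>"
        )
        print(f"Started pipeline run: {run_response.run_id}")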

    Please note that if your pipeline has a concurrency policy, you should verify that there are no old pipeline runs in progress.
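
    If you want to check for old runs without using the Monitoring view, here is a minimal sketch that queries the factory for runs still marked InProgress or Queued over the last 45 days, using the azure-mgmt-datafactory Python SDK; the resource names are placeholders.

        # Minimal sketch: list pipeline runs still InProgress/Queued in the last 45 days.
        # Resource names are placeholders.
        from datetime import datetime, timedelta, timezone

        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient
        from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

        adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

        now = datetime.now(timezone.utc)
        filters = RunFilterParameters(
            last_updated_after=now - timedelta(days=45),
            last_updated_before=now,
            filters=[RunQueryFilter(operand="Status", operator="In",
                                    values=["InProgress", "Queued"])],
        )

        runs = adf_client.pipeline_runs.query_by_factory(
            "<resource-group>", "<data-factory-name>", filters
        )
        for run in runs.value:
            print(run.pipeline_name, run.run_id, run.status, run.run_start)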

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click 'Accept Answer' and 'Yes' for 'Was this answer helpful'. And if you have any further queries, do let us know.


  3. Pinaki Ghatak 2,400 Reputation points Microsoft Employee
    2024-05-10T20:43:15.55+00:00

    Hello @Vimlesh Rivonkar

    There could be various reasons why your pipeline is stuck in the queued status. It could be due to hitting concurrency limits, service outages, network failures, and so on.

    Let's try to troubleshoot the issue step by step. First, check if your pipeline has a concurrency policy assigned to it. If it does, go to the Monitoring view, and make sure there's nothing in the past 45 days that's in progress.

    If there is something in progress, you can cancel it, and the new pipeline run should start. If there are no old pipeline runs in progress, it is possible that your run was impacted by a transient network issue, credential failures, services outages, etc.
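
    If you do find an old run holding the concurrency slot, it can also be cancelled programmatically. Here is a minimal sketch with the azure-mgmt-datafactory Python SDK, assuming you already have the blocking run's ID; the resource names and run ID are placeholders.

        # Minimal sketch: cancel a stuck/blocking pipeline run by its run ID.
        # Resource names and the run ID are placeholders.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient

        adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

        # is_recursive=True also cancels child runs triggered by Execute Pipeline activities.
        adf_client.pipeline_runs.cancel(
            "<resource-group>", "<data-factory-name>", "<run-id>", is_recursive=True
        )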

    If this happens, Azure Data Factory has an internal recovery process that monitors all the runs and starts them when it notices something went wrong.

    You can rerun pipelines and activities as described here: https://docs.microsoft.com/en-us/azure/data-factory/monitor-visually#rerun-pipelines-and-activities.

    You can also rerun activities if you canceled an activity or had a failure, as described under 'Rerun from activity failures'. The recovery process runs every hour, so if your run is stuck for more than an hour, create a support case. If the issue persists, please provide more information, such as the error message or any other details you see in the pipeline run details; that will help me give you a more specific solution. Let me know if this helps.
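
    If you need to pull the error message out of the run details programmatically, here is a minimal sketch that queries the activity runs for a given pipeline run with the azure-mgmt-datafactory Python SDK; the resource names and run ID are placeholders, and the time window is just an example.

        # Minimal sketch: fetch activity-level status and error details for a pipeline run.
        # Resource names, run ID, and the time window are placeholders/examples.
        from datetime import datetime, timedelta, timezone

        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient
        from azure.mgmt.datafactory.models import RunFilterParameters

        adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

        now = datetime.now(timezone.utc)
        activity_runs = adf_client.activity_runs.query_by_pipeline_run(
            "<resource-group>", "<data-factory-name>", "<pipeline-run-id>",
            RunFilterParameters(last_updated_after=now - timedelta(days=7),
                                last_updated_before=now),
        )
        for activity in activity_runs.value:
            print(activity.activity_name, activity.status, activity.error)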


    I hope that this response has addressed your query and helped you overcome your challenges. If so, please mark this response as Answered. This will not only acknowledge our efforts, but also assist other community members who may be looking for similar solutions.
