Question

Asked by EricksonWinter-9685

pipeline operation requesting more cores than available

I am getting this error when running a pipeline from a trigger. I am trying to identify where to adjust this request:

Operation on target Each Source Table failed: Activity failed because an inner activity failed; Inner activity name: PushToSQL, Error: Your ADF cluster requested 8 vcores. However, the workspace only has 2 vcores available out of quota of 50 vcores for node size family [MemoryOptimized]. Try reducing the numbers of vcores requested or increasing your vcore quota.

azure-synapse-analytics

1 Answer

Answered by KranthiPakala-MSFT

Hi @EricksonWinter-9685,

Welcome to Microsoft Q&A forum and thanks for reaching out.

I assume that you are trying to use a Data Flow in your Synapse pipeline. If that is the case, the minimum cluster size to run a Data Flow is 8 vCores, but it seems your workspace only has 2 vCores available out of its quota of 50.
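For reference, the number of vCores a Data Flow activity requests is controlled by the compute settings on the activity (or its Azure Integration Runtime). In the pipeline definition it looks roughly like the sketch below; the activity and Data Flow names are illustrative, and the `compute` block follows the public Execute Data Flow activity schema:

```json
{
  "name": "PushToSQL",
  "type": "ExecuteDataFlow",
  "typeProperties": {
    "dataFlow": {
      "referenceName": "MyDataFlow",
      "type": "DataFlowReference"
    },
    "compute": {
      "coreCount": 8,
      "computeType": "MemoryOptimized"
    }
  }
}
```

Here `coreCount: 8` is already the minimum, so in a case like yours the request cannot be reduced further and a quota increase is the way forward.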

Every Azure Synapse workspace comes with a default quota of vCores that can be used for Spark. The quota is split between the user quota and the Data Flow quota so that neither usage pattern uses up all the vCores in the workspace. The quota differs by subscription type but is symmetrical between user and Data Flow. If you request more vCores than remain in the workspace, you will get the error above.

To resolve this issue, you need to request a capacity increase via the Azure portal by creating a new support ticket.

Step 1: Create a new support ticket. Select the issue type Service and subscription limits (quotas) and the quota type Azure Synapse Analytics.


[Attachment: 124476-image.png]

Step 2: In the Details tab, click Enter details, choose the quota type Apache Spark (vCore) per workspace, select the workspace, and enter the requested quota.

[Attachment: 124531-image.png]

Step 3: Select a support method and create the ticket.

For more details, refer to Quotas and resource constraints in Apache Spark for Azure Synapse.

Hope this helps. Do let us know if you have any further queries.








EricksonWinter-9685 commented:

Thank you, I am waiting to validate this works now. One point of clarification: why would my pipeline/dataflow work when I'm debugging, but throw this error when I trigger the pipeline?


KranthiPakala-MSFT commented:

Hi @EricksonWinter-9685,

Thanks for your response. AFAIK, the debug session runs on a separate sandbox cluster, which is why your debug runs are successful.


Hi @EricksonWinter-9685,

Just checking in to see if the above suggestion was helpful. If it answers your query, please do click "Accept Answer" and/or Up-Vote, as it might be beneficial to other community members reading this thread. If you have any further queries, do let us know.
