Change node size in Azure Data Factory (Spark)

Joaquin Chemile 41 Reputation points
2020-06-06T00:51:51.307+00:00

Hello!

How do I change the node size in Azure Data Factory for HDInsight?

[Screenshot: cluster configuration]

I want to create a Standard_D13_v2 cluster, but I couldn't find a tutorial on how to do it.


Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 78,986 Reputation points Microsoft Employee
    2020-06-08T07:47:15.743+00:00

    @Joaquin Chemile Welcome to the Microsoft Q&A platform.

    Note: If you want to create Standard_D13_v2 sized head nodes, specify Standard_D13_v2 as the value for Head node size under Advanced properties.

    [Screenshot: custom node size settings in the ADF HDInsight linked service]

    Specifying node sizes: see the Sizes of Virtual Machines article for the string values you need to specify for the properties mentioned in the previous section. The values need to conform to the cmdlets and APIs referenced in that article. As the article shows, a data node of the Large (default) size has 7 GB of memory, which may not be enough for your scenario.

    For more details, refer to "Compute environments supported by Azure Data Factory - Node sizes".
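
    For reference, here is a minimal sketch of what an on-demand HDInsight (Spark) linked service JSON could look like with the node sizes set to Standard_D13_v2. The angle-bracket placeholders and the storage linked service name are illustrative, not taken from your setup; headNodeSize and dataNodeSize are the properties documented in the article linked above.

    ```json
    {
        "name": "HDInsightOnDemandLinkedService",
        "properties": {
            "type": "HDInsightOnDemand",
            "typeProperties": {
                "clusterType": "spark",
                "clusterSize": 4,
                "timeToLive": "00:15:00",
                "hostSubscriptionId": "<subscription id>",
                "clusterResourceGroup": "<resource group>",
                "tenant": "<tenant id>",
                "servicePrincipalId": "<service principal id>",
                "servicePrincipalKey": {
                    "type": "SecureString",
                    "value": "<service principal key>"
                },
                "linkedServiceName": {
                    "referenceName": "AzureBlobStorageLinkedService",
                    "type": "LinkedServiceReference"
                },
                "headNodeSize": "Standard_D13_v2",
                "dataNodeSize": "Standard_D13_v2"
            }
        }
    }
    ```

    Deploying a linked service along these lines and referencing it from your HDInsight activity should give you Standard_D13_v2 nodes when the on-demand cluster is created.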

    Hope this helps. Do let us know if you have any further queries.

    ----------------------------------------------------------------------------------------

    Do click on "Accept Answer" and upvote the post if it helps you; this can be beneficial to other community members.

    1 person found this answer helpful.
