# Submit jobs from R Tools for Visual Studio
R Tools for Visual Studio (RTVS) is a free, open-source extension for the Community (free), Professional, and Enterprise editions of Visual Studio 2017 and Visual Studio 2015 Update 3 or higher. RTVS is not available for Visual Studio 2019.
RTVS enhances your R workflow with tools such as the R Interactive window (REPL), IntelliSense (code completion), plot visualization through R libraries such as ggplot2 and ggvis, R code debugging, and more.
## Set up your environment
1. Install R Tools for Visual Studio.
2. Select the Data science and analytical applications workload, then select the R language support, Runtime support for R development, and Microsoft R Client options.
3. Make sure you have public and private keys for SSH authentication.
4. Install PuTTY to provide a compute context that can run RevoScaleR functions from your local client on your HDInsight cluster.
5. Optionally, apply the Data Science Settings to your Visual Studio environment, which provides a new workspace layout for the R tools:
   1. To save your current Visual Studio settings, use the Tools > Import and Export Settings command, select Export selected environment settings, and specify a file name. To restore those settings later, use the same command and select Import selected environment settings.
   2. Go to the R Tools menu item, then select Data Science Settings....
   3. Using the approach in the first substep, you can also save and restore your personalized data scientist layout, rather than repeating the Data Science Settings command.
## Execute local R methods
1. Create your HDInsight ML Services cluster.
2. Install the RTVS extension.
3. Download the samples zip file. Open `examples/Examples.sln` to launch the solution in Visual Studio.
4. Open the `1-Getting Started with R.R` file in the `A first look at R` solution folder.
5. Starting at the top of the file, press Ctrl+Enter to send each line, one at a time, to the R Interactive window. Some lines might take a while as they install packages.

   Alternatively, you can select all lines in the R file (Ctrl+A), then either execute them all (Ctrl+Enter) or select the Execute Interactive icon on the toolbar.
After running all the lines in the script, you should see an output similar to this:
## Submit jobs to an HDInsight ML Services cluster
Using Microsoft ML Server/Microsoft R Client from a Windows computer equipped with PuTTY, you can create a compute context that runs distributed RevoScaleR functions from your local client on your HDInsight cluster. Use `RxSpark` to create the compute context, specifying your username, the Apache Hadoop cluster's edge node, SSH switches, and so forth.
The ML Services edge node address on HDInsight is `CLUSTERNAME-ed-ssh.azurehdinsight.net`, where `CLUSTERNAME` is the name of your ML Services cluster.
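If you script against several clusters, the edge node hostname can be derived from the cluster name. A minimal sketch (the helper name is hypothetical, not part of RevoScaleR):

```r
# Hypothetical helper: build the ML Services edge node SSH hostname
# from an HDInsight cluster name
edgeNodeHostname <- function(clusterName) {
  paste0(clusterName, "-ed-ssh.azurehdinsight.net")
}

edgeNodeHostname("r-cluster")   # "r-cluster-ed-ssh.azurehdinsight.net"
```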
Paste the following code into the R Interactive window in Visual Studio, altering the values of the setup variables to match your environment.
```r
# Set up variables that connect the compute context to your HDInsight cluster
mySshHostname <- 'r-cluster-ed-ssh.azurehdinsight.net'  # HDI secure shell hostname
mySshUsername <- 'sshuser'                              # HDI SSH username
mySshClientDir <- "C:\\Program Files (x86)\\PuTTY"
mySshSwitches <- '-i C:\\Users\\azureuser\\r.ppk'       # Path to your private SSH key
myHdfsShareDir <- paste("/user/RevoShare", mySshUsername, sep = "/")
myShareDir <- paste("/var/RevoShare", mySshUsername, sep = "/")
mySshProfileScript <- "/usr/lib64/microsoft-r/3.3/hadoop/RevoHadoopEnvVars.site"

# Create the Spark cluster compute context
mySparkCluster <- RxSpark(
    sshUsername = mySshUsername,
    sshHostname = mySshHostname,
    sshSwitches = mySshSwitches,
    sshProfileScript = mySshProfileScript,
    consoleOutput = TRUE,
    hdfsShareDir = myHdfsShareDir,
    shareDir = myShareDir,
    sshClientDir = mySshClientDir
)

# Set the current compute context to the Spark compute context defined above
rxSetComputeContext(mySparkCluster)
```
Execute the following commands in the R Interactive window:
```r
rxHadoopCommand("version")  # should return version information
rxHadoopMakeDir("/user/RevoShare/newUser")  # creates a new folder in your storage account
rxHadoopCopy("/example/data/people.json", "/user/RevoShare/newUser")  # copies file to new folder
```
You should see an output similar to the following:
Verify that `rxHadoopCopy` successfully copied the `people.json` file from the example data folder to the newly created `/user/RevoShare/newUser` folder:
1. From your HDInsight ML Services cluster pane in Azure, select Storage accounts from the left-hand menu.
2. Select the default storage account for your cluster, making note of the container/directory name.
3. Select Containers from the left-hand menu on your storage account pane.
4. Select your cluster's container name, browse to the user folder (you might have to click Load more at the bottom of the list), then select RevoShare, then newUser. The `people.json` file should be displayed in the newUser folder.
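If you prefer to verify from the R Interactive window instead of the Azure portal, RevoScaleR's Hadoop helpers can list the destination folder. A sketch, assuming the Spark compute context from the previous steps is still active (this requires a live cluster connection):

```r
# List the contents of the destination folder over the active
# Spark compute context; people.json should appear in the output
rxHadoopListFiles("/user/RevoShare/newUser")
```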
After you are finished using the current Apache Spark context, you must stop it. You cannot run multiple contexts at once.
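Stopping the context can be done from the R Interactive window. A minimal sketch, assuming the `mySparkCluster` context created earlier (this requires a live cluster connection):

```r
# Stop the Spark application backing the compute context
rxStopEngine(mySparkCluster)

# Switch back to the default local compute context before creating a new one
rxSetComputeContext("local")
```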
## Next steps

- Compute context options for ML Services on HDInsight
- Combining ScaleR and SparkR provides an example of airline flight delay predictions.