Deploy .NET for Apache Spark worker and user-defined function binaries
This how-to provides general instructions on how to deploy .NET for Apache Spark worker and user-defined function (UDF) binaries. You learn which environment variables to set, as well as some commonly used parameters for launching applications with `spark-submit`.
Configurations describe the general environment variable and parameter settings needed to deploy .NET for Apache Spark worker and user-defined function binaries.
When deploying workers and writing UDFs, there are a few commonly used environment variables that you may need to set:
| Environment Variable | Description |
|---|---|
| `DOTNET_WORKER_DIR` | Path where the `Microsoft.Spark.Worker` binary has been generated. It's used by the Spark driver and is passed to Spark executors. If this variable is not set up, the Spark executors search the path specified in the `PATH` environment variable. |
| `DOTNET_ASSEMBLY_SEARCH_PATHS` | Comma-separated paths where `Microsoft.Spark.Worker` loads assemblies. Note that if a path starts with ".", the working directory is prepended. In yarn mode, "." represents the container's working directory. e.g. `C:\Users\<user name>\<mysparkapp>\bin\Debug\<dotnet version>` |
| `DOTNET_WORKER_DEBUG` | If you want to debug a UDF, set this environment variable to `1` before running `spark-submit`. |
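On Linux or macOS, the variables above can be exported in the shell before launching your application. The following is a minimal sketch; the install and publish paths are placeholders, not actual locations from this article.

```shell
# Hypothetical locations -- substitute the paths where you extracted
# Microsoft.Spark.Worker and where your app's published assemblies live.
export DOTNET_WORKER_DIR=/opt/Microsoft.Spark.Worker
export DOTNET_ASSEMBLY_SEARCH_PATHS=./udfs,/opt/mySparkApp/bin/Debug

# Optional: set this to 1 only while debugging a UDF.
export DOTNET_WORKER_DEBUG=1
```

On Windows, set the same variables via `setx` or the System Properties dialog, then open a new command window so the values take effect.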
Once the Spark application is bundled, you can launch it using `spark-submit`. The following table shows some of the commonly used options:
| Parameter | Description |
|---|---|
| `--class` | The entry point for your application, e.g. `org.apache.spark.deploy.dotnet.DotnetRunner`. |
| `--master` | The master URL for the cluster, e.g. `yarn`. |
| `--deploy-mode` | Whether to deploy your driver on the worker nodes (`cluster`) or locally as an external client (`client`). Default: `client`. |
| `--conf` | Arbitrary Spark configuration property in `key=value` format. |
| `--files` | Comma-separated list of files to be placed in the working directory of each executor. |
| `--archives` | Comma-separated list of archives to be extracted into the working directory of each executor. |
| `application-jar` | Path to a bundled jar including your application and all dependencies, e.g. `hdfs://<path to your jar>/microsoft-spark-<version>.jar`. |
| `application-arguments` | Arguments passed to the main method of your main class, if any, e.g. `hdfs://<path to your app>/<your app>.zip <your app name> <app args>`. |

Specify all of the above parameters before `application-jar` when launching applications with `spark-submit`; otherwise they will be ignored. For more information, see the `spark-submit` options and running Spark on YARN details.
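To see how these options fit together, here is a sketch of a minimal local-mode launch; the jar name, application assembly, and paths are placeholders you would replace with your own.

```shell
# Minimal local-mode launch (a sketch -- all names and paths are placeholders).
# DotnetRunner starts the .NET app ("dotnet mySparkApp.dll") as the driver.
spark-submit \
--class org.apache.spark.deploy.dotnet.DotnetRunner \
--master local \
microsoft-spark-<version>.jar \
dotnet mySparkApp.dll
```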
Frequently asked questions
When I run a Spark app with UDFs, I get a `FileNotFoundException` error. What should I do?
Error: [Error] [TaskRunner] ProcessStream() failed with exception: System.IO.FileNotFoundException: Assembly 'mySparkApp, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' file not found: 'mySparkApp.dll'
Answer: Check that the `DOTNET_ASSEMBLY_SEARCH_PATHS` environment variable is set correctly. It should be the path that contains your `mySparkApp.dll`.
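As a quick sanity check, you can verify that at least one entry in `DOTNET_ASSEMBLY_SEARCH_PATHS` actually contains your application assembly. This is a sketch only; the `/tmp/udfs` path and `mySparkApp.dll` name below are example values, not assumptions about your setup.

```shell
# Example value only -- use your real search paths and assembly name.
DOTNET_ASSEMBLY_SEARCH_PATHS="/tmp/udfs"
mkdir -p /tmp/udfs && touch /tmp/udfs/mySparkApp.dll

found=0
# Walk the comma-separated list and look for the assembly in each entry.
IFS=','
for p in $DOTNET_ASSEMBLY_SEARCH_PATHS; do
  [ -f "$p/mySparkApp.dll" ] && found=1
done
unset IFS
echo "found=$found"   # found=1 when the assembly is discoverable
```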
After I upgraded my .NET for Apache Spark version and reset the `DOTNET_WORKER_DIR` environment variable, why do I still get the following error?
Error: Lost task 0.0 in stage 11.0 (TID 24, localhost, executor driver): java.io.IOException: Cannot run program "Microsoft.Spark.Worker.exe": CreateProcess error=2, The system cannot find the file specified.
Answer: Try restarting your PowerShell window (or other command windows) first so that it picks up the latest environment variable values. Then start your program.
After submitting my Spark application, I get the error `System.TypeLoadException: Could not load type 'System.Runtime.Remoting.Contexts.Context'`. What should I do?
Error: [Error] [TaskRunner] ProcessStream() failed with exception: System.TypeLoadException: Could not load type 'System.Runtime.Remoting.Contexts.Context' from assembly 'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=...'.
Answer: Check the `Microsoft.Spark.Worker` version you are using. There are two versions: .NET Framework 4.6.1 and .NET Core 2.1.x. In this case, `Microsoft.Spark.Worker.net461.win-x64-<version>` (which you can download) should be used, since `System.Runtime.Remoting.Contexts.Context` is only available for .NET Framework.
How do I run my Spark application with UDFs on YARN? Which environment variables and parameters should I use?
Answer: To launch the Spark application on YARN, the environment variables should be specified as `spark.yarn.appMasterEnv.[EnvironmentVariableName]`. See the following example using `spark-submit`:
```shell
spark-submit \
--class org.apache.spark.deploy.dotnet.DotnetRunner \
--master yarn \
--deploy-mode cluster \
--conf spark.yarn.appMasterEnv.DOTNET_WORKER_DIR=./worker/Microsoft.Spark.Worker-<version> \
--conf spark.yarn.appMasterEnv.DOTNET_ASSEMBLY_SEARCH_PATHS=./udfs \
--archives hdfs://<path to your files>/Microsoft.Spark.Worker.net461.win-x64-<version>.zip#worker,hdfs://<path to your files>/mySparkApp.zip#udfs \
hdfs://<path to jar file>/microsoft-spark-2.4.x-<version>.jar \
hdfs://<path to your files>/mySparkApp.zip mySparkApp
```