I created a new .NET Core C# Web API project and added the .NET for Apache Spark library. My idea is to submit the Web API to Spark and then run Spark queries via HTTP.
The project works from Visual Studio, but after submitting the code to Spark I cannot access the web server: every request returns 404 Not Found.
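To make the idea concrete, an endpoint would look roughly like this (simplified sketch; the route, the `people.json` file name, and the query are illustrative, not my exact code):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.Spark.Sql;

[ApiController]
[Route("[controller]")]
public class SparkController : ControllerBase
{
    // Illustrative endpoint: runs a Spark query per HTTP request.
    [HttpGet("count")]
    public long GetCount()
    {
        // GetOrCreate() attaches to the session started by spark-submit.
        SparkSession spark = SparkSession.Builder().GetOrCreate();
        DataFrame df = spark.Read().Json("people.json"); // hypothetical input file
        return df.Count();
    }
}
```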
In the Spark log I can see that the web service has started and is listening on port 5000. Here is the log:
C:\Users\krstic\Desktop\spark proba>spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner --master local .\bin\Debug\net5.0\microsoft-spark-3-1_2.12-2.0.0.jar dotnet .\bin\Debug\net5.0\SparkWebApiRest.dll
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Spark/spark-3.1.2-bin-hadoop3.2/jars/spark-unsafe_2.12-3.1.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/09/03 13:38:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/09/03 13:38:05 INFO DotnetRunner: Starting DotnetBackend with dotnet.
21/09/03 13:38:05 INFO DotnetBackend: The number of DotnetBackend threads is set to 10.
21/09/03 13:38:05 INFO DotnetRunner: Port number used by DotnetBackend is 62071
21/09/03 13:38:05 INFO DotnetRunner: Adding key=spark.jars and value=file:/C:/Users/krstic/Desktop/spark%20proba/./bin/Debug/net5.0/microsoft-spark-3-1_2.12-2.0.0.jar to environment
21/09/03 13:38:05 INFO DotnetRunner: Adding key=spark.app.name and value=org.apache.spark.deploy.dotnet.DotnetRunner to environment
21/09/03 13:38:05 INFO DotnetRunner: Adding key=spark.submit.pyFiles and value= to environment
21/09/03 13:38:05 INFO DotnetRunner: Adding key=spark.submit.deployMode and value=client to environment
21/09/03 13:38:05 INFO DotnetRunner: Adding key=spark.master and value=local to environment
info: Microsoft.Hosting.Lifetime[0]
Now listening on: http://localhost:5000
info: Microsoft.Hosting.Lifetime[0]
Now listening on: https://localhost:5001
info: Microsoft.Hosting.Lifetime[0]
Application started. Press Ctrl+C to shut down.
info: Microsoft.Hosting.Lifetime[0]
Hosting environment: Production
info: Microsoft.Hosting.Lifetime[0]
Content root path: C:\Users\krstic\Desktop\spark proba
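For reference, the host setup is essentially the default ASP.NET Core 5 template (simplified; `Startup` is the scaffolded class, nothing custom added):

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) =>
        CreateHostBuilder(args).Build().Run();

    // Default template: Kestrel binds to localhost:5000/5001 unless overridden.
    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                webBuilder.UseStartup<Startup>();
            });
}
```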
I'm testing on my local machine (Windows 10), running as administrator.
I'd be happy to hear any suggestions.