Shuts down the remote Spark application and switches to a local compute context. All subsequent rx* function calls run in the local compute context. In pyspark-interop mode, if the Spark application was started through pyspark APIs, rx_spark_disconnect does not shut down the remote Spark application; it only disassociates from it. Run 'help(revoscalepy.rx_spark_connect)' for more information about interop.
The Spark compute context to be terminated by rx_spark_disconnect. If None, the current compute context is used.
from revoscalepy import rx_spark_connect, rx_spark_disconnect

cc = rx_spark_connect()
rx_spark_disconnect(cc)