Something I didn't think would be particularly complex, but there are no details on how to do it in the MS help files (apache-spark-job-definitions).
I've created a Spark notebook that I want to call from a Spark job definition. How do I do that? Or am I overthinking it, and should I just follow exactly how it's done for Databricks?
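
For context, what I was hoping would work is something like the sketch below: a PySpark script used as the job definition's main definition file that triggers the notebook. This is only a guess, assuming the `mssparkutils` helper is available in the job's runtime; the notebook name, timeout, and parameters are placeholders.

```python
# main.py - uploaded as the main definition file of the Spark job definition.
# Minimal sketch, assuming mssparkutils is available outside a notebook session.
from notebookutils import mssparkutils

# Hypothetical notebook name and parameters - replace with the real ones.
# Second argument is a timeout in seconds; third is a dict of parameters
# passed to the notebook's parameter cell.
exit_value = mssparkutils.notebook.run("MyNotebook", 600, {"run_date": "2024-01-01"})

print(f"Notebook returned: {exit_value}")
```

Is something along these lines the intended pattern, or does a Spark job definition need to reference the notebook some other way?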
