Currently I'm connecting to Databricks from local VS Code via databricks-connect. But every submission fails with a "module not found" error, which means the code in my other Python files isn't found.
I tried:
Moving the code into the same folder as main.py
Importing the file inside the function that uses it
Adding the file via sparkContext.addPyFile
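For context, this is roughly what my addPyFile attempt looked like. Since addPyFile only accepts a single file or an archive, I bundled the project into a zip first (the project path and module names below are placeholders):

```python
import os
import zipfile

def zip_project(src_dir: str, zip_path: str) -> str:
    """Bundle all .py files under src_dir into a zip that
    sparkContext.addPyFile can ship to the executors."""
    with zipfile.ZipFile(zip_path, "w") as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                if name.endswith(".py"):
                    full = os.path.join(root, name)
                    # Store paths relative to src_dir so that
                    # `import pkg.mod` resolves inside the archive.
                    zf.write(full, os.path.relpath(full, src_dir))
    return zip_path

# On the driver side (databricks-connect session) I then ran
# something like this (commented out, needs a live cluster):
#
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
# spark.sparkContext.addPyFile(zip_project("./myproject", "/tmp/myproject.zip"))
```

Even with that, the import error persisted, which is why I suspect the files never reach the cluster's Python path.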
Does anyone have any experience with this? Or is there a better way to work with Databricks for Python projects?
It seems my plain Python code is executed in the local Python environment and only the Spark-related code runs on the cluster, but the cluster does not load all my Python files, which then raises the error.