Hello
This may not be exactly the right tag for Azure Synapse, but my team is evaluating a local development experience for writing code for Synapse Spark job definitions. What is the best practice in this case?
I have a working local dev environment (VS Code + PySpark) and can execute PySpark code there, so in theory I could reuse the same scripts in a Synapse Spark job definition. However, I would like to use the command
spark.read.load('abfss://XXX@XXX.dfs.core.windows.net/XXX/file.csv', format='csv')
to read a file from ADLS Gen2 in both environments, locally and on the remote Spark cluster. Is that possible? Does anyone have experience with this?
Locally it doesn't work.
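For context, here is a minimal sketch of what I have tried to make abfss:// paths resolve locally. It assumes shared-key authentication via the hadoop-azure (ABFS) driver; the account/container names, the environment variable, and the helper function names are my own placeholders, not anything Synapse-specific. On the Synapse cluster this extra config should be unnecessary, since the workspace injects its own credentials.

```python
def abfss_path(container: str, account: str, relative_path: str) -> str:
    """Build an ABFS URI for a file in an ADLS Gen2 account."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{relative_path}"


def build_local_session(account: str, account_key: str):
    """Create a local SparkSession that can read abfss:// paths.

    Assumptions: the hadoop-azure package (which provides the ABFS
    driver) is pulled in via spark.jars.packages, and shared-key auth
    is acceptable for local development. The version shown should be
    matched to your local Spark/Hadoop build.
    """
    from pyspark.sql import SparkSession  # local import keeps abfss_path dependency-free

    return (
        SparkSession.builder
        .appName("local-abfss-test")
        # ABFS driver for abfss:// URIs; version 3.3.4 is an assumption.
        .config("spark.jars.packages", "org.apache.hadoop:hadoop-azure:3.3.4")
        # Shared-key auth for the storage account, local runs only.
        .config(
            f"spark.hadoop.fs.azure.account.key.{account}.dfs.core.windows.net",
            account_key,
        )
        .getOrCreate()
    )
```

Intended usage locally (with the key kept out of source, e.g. in an environment variable):

spark = build_local_session("XXX", os.environ["STORAGE_ACCOUNT_KEY"])
df = spark.read.load(abfss_path("XXX", "XXX", "XXX/file.csv"), format="csv")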