Getting an error while writing dataframe result to a SQL DW table

Chiru 1 Reputation point
2021-10-09T09:52:13.44+00:00

Hi All,

Thanks in advance. I am facing the following error while writing a dataframe result to a SQL DW table.

Used code:

df.write \
    .format("com.databricks.spark.sqldw") \
    .option("url", url) \
    .option("dbTable", "SchemaName.TableName") \
    .option("tempDir", tempDir) \
    .option("encrypt", "true") \
    .option("enableServicePrincipalAuth", "true") \
    .mode("append") \
    .save()

Error:

Caused by: org.apache.spark.sql.AnalysisException: Multiple sources found for parquet (org.apache.spark.sql.execution.datasources.v2.parquet.ParquetDataSourceV2, org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat), please specify the fully qualified class name.

1 answer

  1. PRADEEPCHEEKATLA-MSFT 77,751 Reputation points Microsoft Employee
    2021-10-11T06:54:23.833+00:00

    Hello @Chiru,

    The error message clearly says that multiple sources were found for parquet.

    You can use any one of the fully qualified class names mentioned in the error message as the format when writing.

    Example: df.write.format("org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat").mode(SaveMode.Overwrite).save(outputPath)
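
    The same fix written in PySpark, matching the syntax used in the question (this is only a sketch; outputPath is a placeholder for your target location, not something from the original post):

    # Pass the fully qualified class name instead of the short "parquet" alias,
    # so Spark does not need to resolve the ambiguous short name.
    df.write \
        .format("org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat") \
        .mode("overwrite") \
        .save(outputPath)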

    For more details, you may refer to the SO thread addressing a similar issue.

    Hope this helps. Please let us know if you have any further queries.
