"append" write fails with "ClassCastException" when appending data

2021-09-22T10:00:56.597+00:00

We recently upgraded our cluster from Databricks version 6.3 to 7.3. Databricks version 7.3 doesn't support implicit conversion of data types when appending data with PySpark into an existing table in SQL Server. What can we do to automate the data type conversion? Is there a configuration setting for this?

Azure Databricks

Accepted answer
HimanshuSinha-msft · 19,381 Reputation points · Microsoft Employee
    2021-09-22T18:39:43.863+00:00

    Hello @Manoj Biswal (Harman Connected Services Inc) ,
    Thanks for your question and for using the Microsoft Q&A platform.

    I think setting the option spark.sql.storeAssignmentPolicy to "Legacy" should help. Please read about this here
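    A minimal sketch of applying this setting in PySpark, assuming it is run on the cluster before the append. Spark 3.0 (the basis of Databricks Runtime 7.x) changed the default store-assignment policy from the permissive 2.x behavior to ANSI, which rejects some implicit casts; "LEGACY" restores the old behavior. The table name, JDBC URL, and connection properties below are placeholders, not from the original post.

    ```python
    from pyspark.sql import SparkSession

    # On Databricks, a `spark` session already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    # Revert to the Spark 2.x store-assignment behavior, which allows
    # implicit type coercion when writing into an existing table.
    spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")

    # Hypothetical append into an existing SQL Server table via JDBC:
    # df.write.jdbc(
    #     url="jdbc:sqlserver://<host>:1433;database=<db>",   # placeholder
    #     table="dbo.MyTable",                                # placeholder
    #     mode="append",
    #     properties={"user": "<user>", "password": "<pw>"},  # placeholder
    # )
    ```

    Note that this session-level setting only relaxes the cast check; values that cannot be represented in the target column type may still be silently truncated or nulled, as in Spark 2.x.
    
    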

    Please do let me know how it goes.
    Thanks
    Himanshu


