question

Asked by ManojBiswalHarmanConnectedServices-6525

"append" statement doesn't work while appending the data stating "ClassCastException"

We recently upgraded our cluster from Databricks Runtime 6.3 to 7.3. Runtime 7.3 does not perform implicit data type conversion when appending data with PySpark to an existing SQL Server table, so the append fails with a ClassCastException. How can we automate the data type conversion? Is there a configuration setting for this?
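One way to automate the conversion is to cast the DataFrame columns explicitly to the target table's types before the append. Below is a minimal sketch under assumed names: the target schema, table name, and JDBC variables are hypothetical, not from this thread.

```python
# Hypothetical target schema: SQL Server column -> Spark SQL type string.
# (Assumed for illustration; replace with the real table's types.)
TARGET_TYPES = {
    "id": "int",
    "amount": "decimal(18,2)",
    "created_at": "timestamp",
}

def cast_exprs(target_types):
    """Build SELECT expressions that cast every column explicitly."""
    return [f"CAST({col} AS {typ}) AS {col}" for col, typ in target_types.items()]

# On the cluster, with a DataFrame `df` and JDBC details in hand:
# aligned = df.selectExpr(*cast_exprs(TARGET_TYPES))
# aligned.write.jdbc(url=jdbc_url, table="dbo.target_table",
#                    mode="append", properties=connection_props)
```

The commented lines show the intended use: `selectExpr` applies the casts so the appended frame's schema matches the table, instead of relying on implicit conversion.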

azure-databricks


1 Answer

Answered by HimanshuSinha-MSFT

Hello @ManojBiswalHarmanConnectedServices-6525 ,
Thanks for the question and for using the Microsoft Q&A platform.

I think setting the option spark.sql.storeAssignmentPolicy to "LEGACY" should help. Please read about it here
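For reference, in a Databricks notebook (where `spark` is the predefined SparkSession) the setting can be applied per session. A sketch; LEGACY restores the lenient, Spark 2.x-style casting on insert/append, whereas the Spark 3 (DBR 7.x) default is ANSI, which rejects unsafe casts:

```python
# Session-level config sketch; `spark` is the notebook's SparkSession.
spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
```

The same key can also be set as a cluster-level Spark configuration so it applies to all notebooks on the cluster.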

Please do let me know how it goes.
Thanks
Himanshu



Thanks for your reply, @HimanshuSinha-MSFT.

We have already tried that option, but it is not working. Any other ideas?

Thanks
Manoj.
