Received this stage failure error message when trying to merge new records into a table on Databricks using PySpark

Gunuganti, Rohit 1 Reputation point
2021-09-23T20:10:45.46+00:00

Caused by: java.io.IOException: Operation failed: "The condition specified using HTTP conditional header(s) is not met.", 412, GET, https://*******86fc602dd_1.csv?timeout=90, ConditionNotMet, "The condition specified using HTTP conditional header(s) is not met.

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 2592.0 failed 4 times, most recent failure: Lost task 2.3 in stage 2592.0 (TID 8128, 10.77.68.23, executor 0): com.databricks.sql.io.FileReadException: Error while reading file abfss:REDACTED_LOCAL_PART@prdes1.dfs.core.windows.net/sc993857_a085834b4191492ba15d65686fc602dd_1.csv.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

  1. PRADEEPCHEEKATLA-MSFT 78,576 Reputation points Microsoft Employee
    2021-09-24T09:34:55.577+00:00

    Hello @Gunuganti, Rohit ,

    Welcome to the Microsoft Q&A platform.

    We have seen this error before; below is the recommendation from Databricks.

    Reason: the failures happen on files that have an old modification time and are very unlikely to be modified.

    Next steps: set the flag fs.azure.io.read.tolerate.concurrent.append true as a workaround.

    Please test it by adding the flag on the cluster configuration page and let us know if you are still facing the issue.
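    For reference, a minimal sketch of the session-level equivalent, assuming a standard PySpark/Databricks notebook (the cluster-config entry remains the recommended route):

    ```python
    # Illustrative sketch, not Databricks' official snippet. In the cluster's
    # Spark config text box the equivalent entry is a single line:
    #
    #   fs.azure.io.read.tolerate.concurrent.append true
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Tolerate reads of files that may be appended to concurrently, instead
    # of failing with 412 ConditionNotMet when the file changes mid-read.
    spark.conf.set("fs.azure.io.read.tolerate.concurrent.append", "true")
    ```

    Note that the session-level call only affects the current Spark session; setting it on the cluster configuration page applies it to every job on that cluster.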

    Hope this helps. Please let us know if you have any further queries.

    1 person found this answer helpful.