Apache Spark job fails with Failed to parse byte string

Problem

Spark-submit jobs fail with a Failed to parse byte string: -1 error message.

java.util.concurrent.ExecutionException: java.lang.NumberFormatException: Size must be specified as bytes (b), kibibytes (k), mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). E.g. 50b, 100k, or 250m.
Failed to parse byte string: -1
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec.doExecuteBroadcast(BroadcastExchangeExec.scala:182)
... 108 more
Caused by: java.lang.NumberFormatException: Size must be specified as bytes (b), kibibytes (k), mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). E.g. 50b, 100k, or 250m.
Failed to parse byte string: -1

Cause

The value of the spark.driver.maxResultSize application property is negative.
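As a minimal sketch of how this misconfiguration can arise (the application name below is a placeholder), passing a negative value such as -1 produces a byte string that Spark cannot parse; the NumberFormatException surfaces when the property is first read, for example during a broadcast as in the stack trace above:

import org.apache.spark.sql.SparkSession

// Minimal repro sketch: "-1" is not a valid byte string, so reading
// spark.driver.maxResultSize later fails with NumberFormatException.
val spark = SparkSession.builder()
  .appName("maxResultSize-repro")              // placeholder application name
  .config("spark.driver.maxResultSize", "-1")  // invalid: negative values are rejected
  .getOrCreate()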

Solution

The value assigned to spark.driver.maxResultSize defines the maximum size (in bytes) of the serialized results for each Spark action. You can assign a positive value to the spark.driver.maxResultSize property to define a specific size, or assign a value of 0 for an unlimited maximum size. You cannot assign a negative value to this property.
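For example, a configuration along the following lines (the application name and the 4g figure are placeholders, not tuned recommendations) assigns a valid, explicit limit before the SparkContext is created; the same property can also be passed on the command line with spark-submit --conf spark.driver.maxResultSize=4g:

import org.apache.spark.sql.SparkSession

// Sketch of valid settings for spark.driver.maxResultSize.
val spark = SparkSession.builder()
  .appName("maxResultSize-example")               // placeholder application name
  .config("spark.driver.maxResultSize", "4g")     // explicit 4 GiB limit
  // .config("spark.driver.maxResultSize", "0")   // alternatively, 0 means unlimited
  .getOrCreate()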

If the total size of the serialized results for a job exceeds the spark.driver.maxResultSize value, the job is aborted.

Be careful when setting an excessively high (or unlimited) value for spark.driver.maxResultSize. A high limit can cause out-of-memory errors in the driver if the spark.driver.memory property is not set high enough.
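One way to keep the two properties in step is to raise them together when launching the job. The sketch below uses the Spark launcher API with placeholder paths, class names, and sizes; note that spark.driver.memory generally has to be supplied at launch time (for example via spark-submit or the launcher) rather than from code running inside an already started driver:

import org.apache.spark.launcher.SparkLauncher

// Sketch: launch the application with both limits raised together so that a
// large maxResultSize is backed by enough driver memory.
val appProcess = new SparkLauncher()
  .setAppResource("/path/to/app.jar")             // placeholder application jar
  .setMainClass("com.example.Main")               // placeholder main class
  .setConf("spark.driver.maxResultSize", "4g")    // explicit, positive limit
  .setConf(SparkLauncher.DRIVER_MEMORY, "8g")     // leave headroom above maxResultSize
  .launch()
appProcess.waitFor()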

For more information, see Spark Configuration: Application Properties.