It sounds like Spark containers failed to launch on the worker instances in your cluster.
Based on this old thread:
- Check the status of the worker instances in your cluster. Make sure they are all up and running and that there are no issues with the underlying infrastructure. Also check the worker logs for errors that might be causing the failure.
- Check your cluster configuration for errors or inconsistencies (for example instance types, Spark settings, or init scripts). You can also try adjusting the configuration to see if that resolves the issue.
- Try restarting the cluster; this sometimes clears the problem. Save any important data before restarting.
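For the first step, the cluster event log is usually the quickest place to spot launch failures. A minimal sketch of filtering events for suspicious types is below — the `sample_events` data and the set of event type names are illustrative assumptions based on the general shape of the Databricks Clusters API event payload, not an exact schema:

```python
# Sketch: scan cluster events (e.g. saved JSON from the Clusters API
# "events" endpoint) for types that suggest a launch/termination problem.
# Event structure here is simplified and hypothetical.
sample_events = [
    {"type": "RUNNING", "timestamp": 1690000000000},
    {"type": "TERMINATING", "timestamp": 1690000500000,
     "details": {"reason": {"code": "CONTAINER_LAUNCH_FAILURE"}}},
]

def failure_events(events):
    """Return events whose type suggests a worker launch or loss problem."""
    suspect = {"TERMINATING", "TERMINATED", "NODES_LOST"}
    return [e for e in events if e.get("type") in suspect]

for e in failure_events(sample_events):
    code = e.get("details", {}).get("reason", {}).get("code")
    print(e["type"], code)
```

Running this against real event output (however you fetch it) should surface the termination reason code, which is usually more specific than the generic "containers failed to launch" message.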
More links:
https://stackoverflow.com/questions/75865288/spark-container-launch-failed
https://community.databricks.com/t5/data-engineering/cluster-occasionally-fails-to-launch/td-p/7803