✋ I have searched the open/closed issues and my issue is not listed.
Hello,
We faced an issue recently with the Spark Operator when one of our clients tried to submit an incorrect Spark Application using the pod templates.
The following file contains the faulty YAML declaration (an extra .requests field inside the template specs): incorrect-spark-application.txt
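For illustration, here is a minimal sketch of the kind of invalid declaration described above. The actual file is attached as incorrect-spark-application.txt; the names, values, and the exact placement of the stray field below are hypothetical, based only on the "extra .requests inside the template specs" description:

```yaml
# Hypothetical sketch, not the attached file: a requests block placed
# directly under the container instead of under resources.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.3
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.3.jar
  sparkVersion: 3.5.3
  driver:
    template:
      spec:
        containers:
          - name: spark-kubernetes-driver
            requests:        # invalid: should be nested under resources
              cpu: "1"
              memory: 512Mi
```

Nesting the requests block under resources would make the template valid; presumably the CRD preserves unknown fields inside the pod template, so the typo only surfaces when the controller parses the template, not at submission time.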
The SparkApplication seems valid according to the CR definition, but once it is submitted the controller emits the logs in spark-controller.log, and listing the SparkApplications with kubectl shows output that is mostly empty and stuck:
kubectl get sparkApplications | grep spark-pi
spark-pi 40s
Once in this state, no other SparkApplications can be submitted, and when we tried to restart the controller, it ended up in a CrashLoopBackOff with output like spark-controller-restart.log.
Deleting the SparkApplication fixes the issue, but it would be nice to have a safeguard at submission time that automatically moves the SparkApplication to a FAILED (or similar) status.
Reproduction Code
To reproduce the issue, submit the SparkApplication YAML mentioned above.
Expected behavior
The SparkApplication should be rejected during CR validation.
Actual behavior
The pod templates in the SparkApplication fail validation when parsed against the OpenAPI schemas, leading to this behavior.
Environment & Versions
Kubernetes Version: 1.30.8
Spark Operator Version: 2.1.0
Apache Spark Version: 3.5.3
Webhooks disabled
Additional context
No response
Impacted by this bug?
Give it a 👍. We prioritize the issues with the most 👍.