Cluster Apache Spark configuration not applied

Values set in your cluster's Spark configuration are not applying correctly.

Written by Gobinath.Viswanathan

Last published at: March 4th, 2022


Problem

Your cluster’s Spark configuration values are not applied.


Cause

This happens when Spark configuration values are declared in both the cluster configuration UI and an init script.

When the same Spark configuration key is set in more than one place, the value set in the init script takes precedence, and the cluster ignores the value entered in the UI.
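As an illustration of how an init script can set Spark configuration, the sketch below writes properties to a custom conf file on the driver. This is a common Databricks convention, but the exact path, file name, and property values here are assumptions for the example, not taken from this article. Any key written this way overrides the same key entered in the cluster UI:

```shell
#!/bin/bash
# Illustrative init script sketch (path and values are example assumptions).
# Writes Spark properties to a custom driver conf file; these values take
# precedence over the same keys set in the cluster UI's Spark config field.
cat > /databricks/driver/conf/00-custom-spark.conf <<'EOF'
[driver] {
  "spark.sql.shuffle.partitions" = "200"
}
EOF
```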


Solution

Define your Spark configuration values in one place.

Either set the Spark configuration in the cluster configuration UI or set it in an init script. Do not do both.
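If you choose the cluster configuration UI, Spark properties are entered one per line as space-separated key-value pairs. The property names and values below are illustrative examples only:

```
spark.executor.memory 4g
spark.sql.shuffle.partitions 200
```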
