Cluster Apache Spark configuration not applied

Problem

Your cluster’s Spark configuration values are not applied.

Cause

This happens when Spark configuration values are declared in both the cluster configuration (the UI) and in an init script.

When Spark configuration values are defined in more than one place, the values set in the init script take precedence and the cluster ignores the settings entered in the UI.

Solution

You should define your Spark configuration values in one place.

Either define the Spark configuration in the cluster configuration (the UI) or set it in an init script.

Do not do both.
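If you choose the init script approach, a minimal sketch looks like the following. This is an illustrative example only: the config directory, script name, and the two Spark properties shown are assumptions, not values from this article; substitute the config path and properties your environment actually uses.

```shell
#!/bin/bash
# Hypothetical init script (e.g. set-spark-conf.sh) that writes Spark
# configuration to a spark-defaults.conf file. The directory below is an
# assumption for illustration; SPARK_CONF_DIR can be exported to override it.
CONF_DIR="${SPARK_CONF_DIR:-/tmp/spark-conf}"
mkdir -p "$CONF_DIR"

# Each line is a Spark property name followed by its value.
# These example properties are placeholders, not recommendations.
cat > "$CONF_DIR/spark-defaults.conf" <<'EOF'
spark.sql.shuffle.partitions 64
spark.executor.memory 4g
EOF
```

If you use a script like this, remove the same properties from the cluster configuration UI so the init script remains the single source of truth.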