Adding a configuration setting overwrites all default spark.executor.extraJavaOptions settings

Learn how to resolve overwritten configuration settings in Databricks.

Written by Adam Pavlacka

Last published at: December 8th, 2022


Problem

When you add a configuration setting by entering it in the Apache Spark config text area, the new setting replaces existing settings instead of being appended.


This issue affects clusters running Databricks Runtime 5.1 and below.


Cause

When the cluster restarts, it reads settings from a configuration file created in the Clusters UI and overwrites the default settings.
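Conceptually, the overwrite happens because the cluster-level Spark config behaves like a flat key-to-value map: writing a key replaces its entire previous value rather than merging with it. A minimal Python sketch of that behavior (illustrative only; the option name `-Dmy.new.option=true` is hypothetical, and the default value is abbreviated):

```python
# The cluster's effective Spark config behaves like a flat dict:
# assigning a key replaces its whole value; nothing is merged.
default_conf = {
    "spark.executor.extraJavaOptions": (
        "-XX:ReservedCodeCacheSize=256m -XX:+UseCodeCacheFlushing"
    ),
}

# What you type in the Spark config text area (hypothetical option):
user_conf = {
    "spark.executor.extraJavaOptions": "-Dmy.new.option=true",
}

# On restart the user-supplied value wins outright; the defaults are gone.
effective = {**default_conf, **user_conf}
print(effective["spark.executor.extraJavaOptions"])  # -Dmy.new.option=true
```

This is why the Spark UI shows only the newly added setting: the text area sets the whole value of the key, not an addition to it.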

For example, suppose you add a single new option to the Spark config text area (shown here as the placeholder -D<new.option>):

spark.executor.extraJavaOptions -D<new.option>

Then, in Spark UI > Environment > Spark Properties, only the newly added configuration setting appears under spark.executor.extraJavaOptions. Any existing settings are removed.

For reference, the default settings are:

-XX:ReservedCodeCacheSize=256m -XX:+UseCodeCacheFlushing -Ddatabricks.serviceName=spark-executor-1 -XX:+PrintFlagsFinal -XX:+PrintGCDateStamps -verbose:gc -XX:+PrintGCDetails -Xss4m -Djavax.xml.datatype.DatatypeFactory=com.sun.org.apache.xerces.internal.jaxp.datatype.DatatypeFactoryImpl -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl -Djavax.xml.validation.SchemaFactory=com.sun.org.apache.xerces.internal.jaxp.validation.XMLSchemaFactory -Dorg.w3c.dom.DOMImplementationSourceList=com.sun.org.apache.xerces.internal.dom.DOMXSImplementationSourceImpl
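Because the whole value is one long space-separated string, it is easy to drop a flag when copying it by hand. A small Python sketch (the helper name is illustrative, not a Databricks API) that splits a value into its individual JVM flags so you can eyeball or diff them:

```python
def split_java_options(value: str) -> list[str]:
    """Split a spark.executor.extraJavaOptions value into individual
    JVM flags. Flags are space-separated, and none of the defaults
    contain embedded spaces, so a plain split is sufficient."""
    return value.split()

# Abbreviated defaults, for illustration only:
defaults = (
    "-XX:ReservedCodeCacheSize=256m -XX:+UseCodeCacheFlushing "
    "-Ddatabricks.serviceName=spark-executor-1 -Xss4m"
)
for flag in split_java_options(defaults):
    print(flag)
```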


Solution

To add a new configuration setting to spark.executor.extraJavaOptions without losing the default settings:

  1. In Spark UI > Environment > Spark Properties, select and copy all of the properties set by default for spark.executor.extraJavaOptions.
  2. Click Edit.
  3. In the Spark config text area (Clusters > cluster-name > Advanced Options > Spark), paste the default settings.
  4. Append the new configuration setting below the default settings.
  5. Click outside the text area, then click Confirm.
  6. Restart the cluster.
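Steps 3 and 4 amount to building one value: the copied defaults followed by the new option. A Python sketch of that concatenation (the helper and option names are illustrative, not a Databricks API):

```python
def build_extra_java_options(defaults: str, new_option: str) -> str:
    """Return the full value to paste into the Spark config text area:
    the copied defaults, then the new option appended at the end."""
    return f"{defaults.strip()} {new_option.strip()}"

# Abbreviated defaults and a hypothetical new setting, for illustration:
defaults = "-XX:ReservedCodeCacheSize=256m -XX:+UseCodeCacheFlushing"
line = "spark.executor.extraJavaOptions " + build_extra_java_options(
    defaults, "-Dmy.new.option=true"
)
print(line)
```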

For example, let’s say you paste the following into the Spark config text area, where the final option, -D<new.option>, stands in for your new setting. The new configuration setting is appended after the default settings:

spark.executor.extraJavaOptions -XX:ReservedCodeCacheSize=256m -XX:+UseCodeCacheFlushing -Ddatabricks.serviceName=spark-executor-1 -XX:+PrintFlagsFinal -XX:+PrintGCDateStamps -verbose:gc -XX:+PrintGCDetails -Xss4m -Djavax.xml.datatype.DatatypeFactory=com.sun.org.apache.xerces.internal.jaxp.datatype.DatatypeFactoryImpl -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl -Djavax.xml.validation.SchemaFactory=com.sun.org.apache.xerces.internal.jaxp.validation.XMLSchemaFactory -Dorg.w3c.dom.DOMImplementationSourceList=com.sun.org.apache.xerces.internal.dom.DOMXSImplementationSourceImpl -D<new.option>

After you restart the cluster, both the default settings and the newly added configuration setting appear in Spark UI > Environment > Spark Properties.
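After the restart, you can sanity-check that nothing was lost by comparing the effective value against the flags you expect. A small illustrative check in plain Python (on a real cluster, the effective value comes from Spark UI > Environment > Spark Properties; the helper name and values here are hypothetical):

```python
def missing_flags(effective: str, expected: list[str]) -> list[str]:
    """Return the expected JVM flags that are absent from the effective
    spark.executor.extraJavaOptions value."""
    present = set(effective.split())
    return [flag for flag in expected if flag not in present]

# Abbreviated effective value copied from the Spark UI, for illustration:
effective = "-XX:ReservedCodeCacheSize=256m -Xss4m -Dmy.new.option=true"
expected = ["-XX:ReservedCodeCacheSize=256m", "-Xss4m", "-Dmy.new.option=true"]
print(missing_flags(effective, expected))  # [] means nothing was dropped
```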
