Cannot customize Apache Spark config in Databricks SQL warehouse

You can only configure a limited set of global Spark properties when using a SQL warehouse.

Written by mounika.tarigopula

Last published at: March 15th, 2023

Problem

You want to set Apache Spark configuration properties on a Databricks SQL warehouse the same way you do on standard clusters.

Cause

Databricks SQL is a managed service. You cannot modify the Spark configuration properties on a SQL warehouse. This is by design.

You can only configure a limited set of global Spark properties that apply to all SQL warehouses in your workspace.

Solution

Review the SQL warehouse data access configuration supported properties (AWS | Azure | GCP) documentation for the list of Spark properties you can apply to all SQL warehouses in your workspace.
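
As an illustration, a supported data access property is applied globally through the workspace's SQL warehouse configuration, either in the admin settings UI or programmatically. The following Python sketch assumes the SQL Warehouses workspace configuration REST endpoint (/api/2.0/sql/config/warehouses) and uses spark.databricks.hive.metastore.glueCatalog.enabled purely as an example; confirm any property against the supported list for your cloud before applying it.

import os
import requests

# Sketch only: applies one supported data access property to every SQL warehouse
# in the workspace. Assumes a workspace admin personal access token and the
# SQL Warehouses workspace configuration endpoint.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Fetch the current workspace-level warehouse configuration first, because the
# PUT call replaces the existing configuration object rather than merging into it.
config = requests.get(f"{HOST}/api/2.0/sql/config/warehouses", headers=HEADERS).json()

# Add (or overwrite) one global data access property. Enabling the AWS Glue
# metastore is shown here only as an illustrative, documented property.
entries = {e["key"]: e["value"] for e in config.get("data_access_config") or []}
entries["spark.databricks.hive.metastore.glueCatalog.enabled"] = "true"
config["data_access_config"] = [{"key": k, "value": v} for k, v in entries.items()]

response = requests.put(f"{HOST}/api/2.0/sql/config/warehouses", headers=HEADERS, json=config)
response.raise_for_status()

In the admin settings UI, the equivalent entry in the Data Access Configuration field would be a single line in the form: spark.databricks.hive.metastore.glueCatalog.enabled true. Either way, the property applies to all SQL warehouses in the workspace, not to an individual warehouse.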

If you have a unique requirement to set a specific Spark configuration property and it is preventing you from using Databricks SQL, contact your Databricks representative.
