Cluster’s Apache Spark UI not appearing

Append the default Databricks daemon listener to your custom listener.

Written by david.vega

Last published at: July 3rd, 2025

Problem

Your cluster’s Apache Spark UI is not available, and you see the following error message. 

Could not find data to load UI for driver <driver-id> in cluster <cluster-id>

Cause

This message can appear when a custom Spark config, spark.extraListeners, overwrites the default Databricks daemon listener, com.databricks.backend.daemon.driver.DBCEventLoggingListener.
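For illustration, a Spark config entry like the following triggers this behavior. The class name com.example.MyListener is a hypothetical placeholder for your custom listener; any value that omits the Databricks listener replaces it rather than adding to it.

spark.extraListeners com.example.MyListener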


Solution

  1. Open the affected cluster.
  2. Click the Edit button.
  3. Scroll to Advanced options and click to expand.
  4. Click the Spark option in the vertical menu. 
  5. Within the Spark config field, locate spark.extraListeners.
  6. Append the default Databricks daemon listener to your custom listener. You can use the following example code.
<your-custom-spark-listener-reference>, com.databricks.backend.daemon.driver.DBCEventLoggingListener
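With the default listener appended, the complete entry in the Spark config field looks like the following sketch, again using the hypothetical com.example.MyListener as the custom listener.

spark.extraListeners com.example.MyListener, com.databricks.backend.daemon.driver.DBCEventLoggingListener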


If this Spark setting does not appear in the Spark config field, check whether any init scripts attached to the cluster set it, and adjust the value in the init script instead. You can use the following example code.

"spark.extraListeners" = "<your-custom-spark-listener-reference>, com.databricks.backend.daemon.driver.DBCEventLoggingListener"

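As a sketch, an init script can write this setting into a driver-side Spark configuration file. The script below is a minimal example that assumes the common /databricks/driver/conf location and an example file name, and keeps <your-custom-spark-listener-reference> as a placeholder; adjust both to match your environment.

#!/bin/bash
# Sketch: write a driver-side Spark config file that keeps the default
# Databricks listener alongside the custom one. The file name is an example.
cat << 'EOF' > /databricks/driver/conf/00-custom-spark.conf
[driver] {
  "spark.extraListeners" = "<your-custom-spark-listener-reference>, com.databricks.backend.daemon.driver.DBCEventLoggingListener"
}
EOF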

If checking the init scripts does not reveal any script influencing the cluster, contact Databricks Support to determine alternative causes.