Spark UI is unavailable for any classic cluster in the workspace

Configure your VPC endpoint to allow access to the regional S3 bucket.

Written by aishwarya.sood

Last published at: April 28th, 2025

Problem

You are using a custom VPC for your Databricks workspace and are unable to view the Apache Spark UI for any classic cluster in the workspace. The Spark UI may still be available for serverless compute, but it is not available for any classic all-purpose or jobs compute cluster.

  • All-purpose clusters - Spark UI is not accessible; attempts to open it result in errors or timeouts.
  • Jobs compute clusters - Spark UI is not accessible; users are unable to view job stages, tasks, and other details through the UI.
  • Serverless compute - Spark UI is accessible as expected.


Cause

This issue typically occurs because your custom VPC restricts access to the S3 bucket used by the Spark history server for log storage.
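
To confirm that the network path is the problem, you can test whether the bucket is reachable from a notebook on an affected cluster. The following is a minimal sketch, assuming the ap-southeast-1 bucket named in the Solution section below; substitute your region's bucket. It sends an unsigned request, so no AWS credentials are required: any HTTP response (such as 403) means the network path to S3 is open, while a timeout suggests the VPC is blocking it.

```python
# Minimal reachability check for the regional log storage bucket.
# Run in a notebook on an affected cluster. The bucket name and region
# below are assumptions taken from the ap-southeast-1 example; substitute
# the values for your region.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import ClientError, ConnectTimeoutError, EndpointConnectionError

s3 = boto3.client(
    "s3",
    region_name="ap-southeast-1",
    config=Config(
        signature_version=UNSIGNED,  # unsigned request: no credentials needed
        connect_timeout=5,
        read_timeout=5,
        retries={"max_attempts": 1},
    ),
)

try:
    s3.head_bucket(Bucket="databricks-prod-storage-singapore")
    print("Bucket reachable.")
except ClientError as err:
    # Any HTTP response (for example 403) still proves the network path is open.
    print("Network path open; S3 responded with:", err.response["Error"]["Code"])
except (ConnectTimeoutError, EndpointConnectionError) as err:
    print("Cannot reach the bucket from this VPC:", err)
```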


Solution

Configure your VPC endpoint to allow access to the regional log storage S3 bucket. For example, in the ap-southeast-1 region the bucket is databricks-prod-storage-singapore.
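
As a sketch of what this change can look like, the snippet below uses boto3 to attach a statement permitting access to the bucket on an S3 gateway endpoint. The endpoint ID is hypothetical and the listed actions are an assumption; confirm the exact required actions against the documentation linked below, and merge the statement into your existing endpoint policy rather than replacing it wholesale.

```python
# Sketch: allow access to the regional Databricks log storage bucket on an
# S3 gateway endpoint. The endpoint ID and actions are assumptions; verify
# the required actions against the Databricks documentation for your region.
import json
import boto3

BUCKET_ARN = "arn:aws:s3:::databricks-prod-storage-singapore"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowDatabricksLogStorage",
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [BUCKET_ARN, f"{BUCKET_ARN}/*"],
        }
    ],
}

ec2 = boto3.client("ec2", region_name="ap-southeast-1")

# Note: modify_vpc_endpoint replaces the entire policy document, so merge
# this statement with your current endpoint policy before applying it.
ec2.modify_vpc_endpoint(
    VpcEndpointId="vpce-0123456789abcdef0",  # hypothetical endpoint ID
    PolicyDocument=json.dumps(policy),
)
```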

For more information, review the Addresses for artifact storage, log storage, system tables, and shared datasets buckets documentation, which lists the required S3 bucket names for every region.

Also review the Requirements for bucket policies documentation for the bucket policy requirements that apply when using a custom VPC.