Streamlit app deployed as Databricks App failing with JAVA_GATEWAY_EXITED error

Replace the direct SparkSession instantiation with a supported remote-connection SDK or driver.

Written by Amruth Ashoka

Last published at: July 18th, 2025

Problem

Your Streamlit application deployed as a Databricks App contains a call such as the following. 

from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("QuestionAnswer").getOrCreate()

The app crashes on startup and the container logs the following error.

PySparkRuntimeError: [JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number

Cause

Databricks Apps are lightweight, container-based runtimes designed for UI rendering and light orchestration. They do not ship with an Apache Spark driver, executor, or JVM.

Any call that instantiates a SparkSession (or the lower-level SparkContext) tries to start the Java gateway, which fails because no JVM is available, producing the [JAVA_GATEWAY_EXITED] error.

Solution

Databricks Apps should delegate compute to an existing Databricks cluster or to a Databricks SQL warehouse instead of attempting to create Spark locally.

Replace the direct SparkSession instantiation with one of the supported remote-connection SDKs or drivers, such as:
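Common options include the Databricks SQL Connector for Python (`databricks-sql-connector`), which runs SQL against a remote warehouse, and Databricks Connect (`databricks-connect`), which provides a remote DataFrame API. The sketch below illustrates both patterns; the environment variable names are placeholders for whatever credentials your app is configured with, and the third-party imports are kept inside the functions so the module loads even where those packages are absent.

```python
# Sketch: two remote-compute patterns for a Databricks App.
# Hostnames, HTTP paths, and tokens are illustrative placeholders;
# in a deployed app they would come from the app's environment/resources.
import os


def warehouse_connection_params(env=None):
    """Collect SQL-warehouse settings from the environment.

    The variable names here are assumptions, not a fixed convention;
    raises KeyError if any are missing.
    """
    env = os.environ if env is None else env
    return {
        "server_hostname": env["DATABRICKS_SERVER_HOSTNAME"],
        "http_path": env["DATABRICKS_HTTP_PATH"],
        "access_token": env["DATABRICKS_TOKEN"],
    }


def fetch_rows(query):
    """Option 1: run SQL on a remote warehouse (databricks-sql-connector)."""
    from databricks import sql  # pip install databricks-sql-connector

    with sql.connect(**warehouse_connection_params()) as connection:
        with connection.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()


def get_remote_spark():
    """Option 2: DataFrame API against remote compute (Databricks Connect)."""
    from databricks.connect import DatabricksSession  # pip install databricks-connect

    # Unlike SparkSession.builder, this does not start a local JVM;
    # it connects to compute in your workspace.
    return DatabricksSession.builder.getOrCreate()
```

Either approach keeps the Streamlit container doing only UI work while the heavy lifting happens on remote Databricks compute, which is the deployment model Databricks Apps are designed for.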