Error “Cannot find catalog plugin class for catalog” when using a custom catalog plugin with the JDBC driver

Place the custom JAR file in /databricks/jars and /databricks/hive_metastore_jars using an init script.

Written by rushali.kumari

Last published at: May 27th, 2025

Problem

You’re using a custom catalog plugin in Databricks. The plugin, packaged as a JAR file and uploaded to your workspace, is configured as a library on your cluster.

When you execute a query using the Databricks JDBC driver from an external application (for example, Java code), you encounter the following error.

Error: [_LEGACY_ERROR_TEMP_2215] org.apache.spark.SparkException: Cannot find catalog plugin class for catalog '<catalog-name>': <custom-catalog-plugin-class>.

However, you notice that if the same query is first executed in a Databricks notebook on the same cluster, the catalog plugin initializes successfully, and subsequent queries through the JDBC driver run without errors.
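
For reference, the following is a minimal sketch of the kind of external Java call that hits the error. It assumes a recent Databricks JDBC driver (driver class com.databricks.client.jdbc.Driver with jdbc:databricks:// URLs and token authentication); the workspace host, HTTP path, token, and table identifiers are placeholders for your own values.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CustomCatalogQuery {
    public static void main(String[] args) throws Exception {
        // Explicitly register the driver; optional with recent driver
        // versions that self-register through the JDBC service loader.
        Class.forName("com.databricks.client.jdbc.Driver");

        // Placeholders: substitute your workspace host, cluster HTTP path,
        // and a personal access token.
        String url = "jdbc:databricks://<workspace-host>:443;"
                + "httpPath=<cluster-http-path>;"
                + "AuthMech=3;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Resolving a table in the custom catalog triggers the plugin
             // lookup; without the fix in the Solution section, this raises
             // [_LEGACY_ERROR_TEMP_2215].
             ResultSet rs = stmt.executeQuery(
                     "SELECT * FROM <catalog-name>.<schema-name>.<table-name> LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}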

Cause

The custom catalog plugin is not initialized when queries are executed through JDBC. Running a query in a notebook first initializes the plugin on the cluster, which is why subsequent JDBC queries then succeed.

Solution

Preload the plugin so it is available to JDBC sessions. Use an init script to place the custom JAR file in the appropriate directories.

The following init script defines the two target directories, /databricks/jars and /databricks/hive_metastore_jars, and copies the custom JAR file into each.

#!/bin/bash

# Target directories
TARGET_DIRECTORIES=("/databricks/jars" "/databricks/hive_metastore_jars")

# Copy the JAR file to each target directory
for TARGET_DIR in "${TARGET_DIRECTORIES[@]}"; do
  # Ensure the directory exists before copying
  mkdir -p "$TARGET_DIR"
  cp "<path-to-your-custom-JAR-file>" "$TARGET_DIR/"
done
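
Save the script to a location your cluster can read (for example, a Unity Catalog volume or a workspace file), add it to the cluster as a cluster-scoped init script under Advanced options > Init Scripts, and restart the cluster. The plugin JAR is then in place when the cluster starts, so JDBC sessions can initialize the catalog plugin without first running a query in a notebook.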