AnalysisException error when trying to execute spark.catalog.tableExists()

Upgrade to Databricks Runtime 17.1 or above, or switch to standard compute.

Written by Sahil Singh

Last published at: October 17th, 2025

Problem

You’re working with Databricks Runtime 16.4 LTS or below on dedicated compute. When you try to execute spark.catalog.tableExists("<catalog>.<schema>.<table>"), you receive the following error.

AnalysisException: [RequestId=<request-id> ErrorClass=INVALID_PARAMETER_VALUE.ROW_COLUMN_ACCESS_POLICIES_NOT_SUPPORTED_ON_ASSIGNED_CLUSTERS] Query on table <catalog>.<schema>.<table> with row filter or column mask not supported on assigned clusters.


Cause

The table you’re trying to query is a fine-grained access control (FGAC) table.


Querying FGAC tables using commands like spark.catalog.tableExists() is not supported on dedicated compute running Databricks Runtime versions below 17.1.
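If you want to detect this condition explicitly rather than let the AnalysisException propagate, the check can be wrapped defensively. The following is a minimal sketch, not an official Databricks API; the function name, table name, and error-handling approach are assumptions, and on Databricks the ambient SparkSession is available as spark in notebooks.

```python
# Sketch: wrap catalog.tableExists() so the FGAC limitation on dedicated
# compute (DBR below 17.1) surfaces as a clear, actionable error.
def fgac_safe_table_exists(catalog, fqn):
    """Return catalog.tableExists(fqn); if the call fails with the FGAC
    row-filter/column-mask error class, re-raise with remediation advice."""
    try:
        return catalog.tableExists(fqn)
    except Exception as exc:  # pyspark raises AnalysisException here
        if "ROW_COLUMN_ACCESS_POLICIES_NOT_SUPPORTED_ON_ASSIGNED_CLUSTERS" in str(exc):
            raise RuntimeError(
                f"{fqn} has a row filter or column mask. Query it from "
                "standard compute, or upgrade the dedicated compute to "
                "Databricks Runtime 17.1 or above."
            ) from exc
        raise  # unrelated failure: propagate unchanged

# Usage in a Databricks notebook (spark is predefined; table name is a
# placeholder):
# fgac_safe_table_exists(spark.catalog, "<catalog>.<schema>.<table>")
```

This keeps the original behavior for tables without row filters or column masks and only intervenes when the specific FGAC error class is detected.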


Solution

Upgrade to Databricks Runtime 17.1 or above on your dedicated compute.


Alternatively, if you need to continue using a Databricks Runtime below 17.1, use standard compute instead.


For more information, refer to the Fine-grained access control on dedicated compute (AWS | Azure | GCP) documentation.