Bad Request error when creating a table from a shared catalog

Upgrade the client’s Databricks Runtime version or disable the enableDeletionVectors property on the source table.

Written by girish.sharma

Last published at: December 23rd, 2024

Problem

When you attempt to create a table from a Delta Sharing catalog in your Databricks workspace, you receive an error message.

Example error

UncheckedExecutionException: io.delta.sharing.spark.util.UnexpectedHttpStatus: HTTP request failed with status: HTTP/1.1 400 Bad Request

Example error details

"error_code" : "BAD_REQUEST", "message" : "Failed request to sharing server\nEndpoint: https://<workspace-name>.databricks.com:443/api/2.0/delta-sharing/metastores/<metastore-id>/shares/<catalog-name>/schemas/<schema-name>/tables/menu/metadata\nMethod: GET\nHTTP Code: 400\nStatus Line: 400\nBody: { \"error_code\" : \"BAD_REQUEST\", \"message\" : \"\\nTable property\\ndelta.enableDeletionVectors\\nis found in table version: 1.\\nHere are a couple options to proceed:\\n 1. Use DBR version 14.1(14.2 for CDF and streaming) or higher or delta-sharing-spark with version 3.1 or higher and set option (\\\"responseFormat\\\", \\\"delta\\\") to query the table.\\n 2. Contact your provider to ensure the table is shared with full history.\\n[Trace Id: 58bd2d3f99bdcd2f2a302f977074]\"}\n[Trace Id: 58bd2d3f99bdcd2f2a302f977074]\n"

This issue typically arises in environments that use Delta Sharing with a mixture of Databricks Runtime versions.

Cause

The table being accessed has the delta.enableDeletionVectors property enabled, which is not supported by the Databricks Runtime version your client is using, so the sharing server returns a 400 Bad Request error.
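To confirm this is the cause, whoever has access to the source table can inspect its properties. The following is a sketch; the catalog, schema, and table names are placeholders for your own.

```sql
-- Run against the source Delta table (names are placeholders).
-- If deletion vectors are enabled, the output includes a row with
-- key delta.enableDeletionVectors and value true.
SHOW TBLPROPERTIES my_catalog.my_schema.menu;
```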

For more information, review the What are deletion vectors? (AWS | Azure | GCP) documentation.

Solution

If you are using Delta Sharing, either upgrade your clients to a supported Databricks Runtime version (recommended) or disable the enableDeletionVectors property on the source Delta table.

Upgrade Databricks Runtime

Info

You should use Databricks Runtime 14.3 LTS or above if you want to use CDF (Change Data Feed) or streaming.

  1. Stop your cluster.
  2. Change the Databricks Runtime to 14.1 or above.
  3. Restart the cluster.
  4. Create the table.

This approach ensures compatibility with the enableDeletionVectors property.

Disable enableDeletionVectors

If you cannot upgrade your Databricks Runtime version, you can disable the enableDeletionVectors property on the source Delta table. This makes the source table readable by Delta Lake clients that do not support deletion vectors.

  1. Ensure that you have MODIFY permissions on the source Delta table.
  2. Access the table from a cluster running Databricks Runtime 14.1 or above.
  3. Follow the directions in the Drop Delta table features (AWS | Azure | GCP) documentation to disable the enableDeletionVectors property.
  4. Ensure no concurrent write operations are running, then run VACUUM to delete old files.
  5. Access the table from the client.
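The steps above can be sketched as follows. This is an outline only, with a placeholder table name; consult the Drop Delta table features documentation for the full procedure, including the history-retention requirements that apply before the feature can be dropped.

```sql
-- Run from a cluster on DBR 14.1 or above (table name is a placeholder).
-- Stop new deletion vectors from being written.
ALTER TABLE my_catalog.my_schema.menu
  SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'false');

-- Drop the table feature so clients that do not support deletion
-- vectors can read the table.
ALTER TABLE my_catalog.my_schema.menu DROP FEATURE deletionVectors;

-- With no concurrent writes running, remove old files that may still
-- contain deletion vectors.
VACUUM my_catalog.my_schema.menu;
```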