Error when creating a Delta table using the UI and external data in Delta format

Create the table using a notebook instead.

Written by manikandan.ganesan

Last published at: December 13th, 2024

Problem

When you try to use the UI to create a Delta table with an external data source in Delta format, you get the following error. 

CloudFilesIllegalArgumentException: Reading from a Delta table is not supported with this syntax. 


Cause

The UI supports creating Delta tables only from external data in CSV, TSV, JSON, Avro, Parquet, or text file formats. Delta is not a supported source format in the UI.


Solution

Create the Delta table using a notebook command instead.

  1. Open a notebook in your Databricks workspace.
  2. Choose one of the following methods based on the table type you want to create. 


External table type

CREATE TABLE <your-catalog>.<your-schema>.<your-table-name> LOCATION '<your-delta-table-location>';
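For example, with hypothetical catalog, schema, and storage values filled in, the command might look like this (all names and the path are placeholders; substitute your own):

```sql
-- Hypothetical names and path; replace with your own values.
CREATE TABLE main.default.sales_events
LOCATION 'abfss://container@account.dfs.core.windows.net/delta/sales_events';
```

Because the table is external, dropping it later removes only the metastore entry; the Delta files at the location are left in place.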


Managed table type

# Read the existing Delta data from its external location.
df = spark.read.format("delta").load("<your-delta-table-location>")

# Write it as a managed table; mode("overwrite") replaces the table if it already exists.
df.write.mode("overwrite").saveAsTable("<your-catalog>.<your-schema>.<your-table-name>")
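If you prefer to stay in SQL, a CREATE TABLE ... AS SELECT statement that reads the Delta path directly is a sketch of an equivalent way to create a managed table, assuming your workspace permits path-based reads (placeholders as above):

```sql
-- Copies the data at the external Delta location into a new managed table.
CREATE TABLE <your-catalog>.<your-schema>.<your-table-name> AS
SELECT * FROM delta.`<your-delta-table-location>`;
```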


For more information, review the Delta table streaming reads and writes (AWS, Azure, GCP) documentation.