Getting AssertionError when using the withColumns() function in Databricks Runtime 14.3 LTS or above

Use the col() wrapper when calling the withColumns() function.

Written by Tarun Sanjeev

Last published at: July 18th, 2025

Problem

In Databricks Runtime 14.3 LTS or above, you use the withColumns() function on an Apache Spark DataFrame to create new columns that reference existing columns with the same names. The following example code uses the columns id, name, and category.

df1 = df.withColumns(
    [("id", "id"), ("name", "name"), ("category", "category")]
)


You then encounter the following error.

Py4JJavaError: An error occurred while calling o395.sql.
: java.util.NoSuchElementException: key not found: LocationType#214517


Cause

In Databricks Runtime 14.3 LTS or above, the withColumns() function expects a dict that maps each output column name to a Column expression, so column references must be wrapped with the col() function rather than passed as bare strings.
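To see the distinction the runtime enforces, you can compare the types involved. The following minimal illustration assumes an active Spark session, as in a Databricks notebook; a bare column name is a plain Python string, while col() returns a Column expression.

from pyspark.sql.functions import col

# A bare column name is just a Python string...
print(type("id"))

# ...while col() builds a Column expression that withColumns() can use.
print(type(col("id")))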


Solution

Wrap column references with the col() function and pass them as the values of a dict when using the withColumns() function.

from pyspark.sql.functions import col

# Map each output column name (a string) to a Column expression built with col()
df1 = df.withColumns(
    {c: col(c) for c in ["id", "name", "category"]}
)


For more information, refer to the pyspark.sql.functions.col API documentation.
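As a quick end-to-end check, the following minimal sketch applies the same pattern to a small sample DataFrame. It assumes an active SparkSession named spark (as provided in a Databricks notebook); the sample rows and column values are illustrative only.

from pyspark.sql.functions import col

# Illustrative sample data
df = spark.createDataFrame(
    [(1, "widget", "hardware"), (2, "gizmo", "software")],
    ["id", "name", "category"],
)

# Map each output column name (a string) to a Column expression built with col()
df1 = df.withColumns({c: col(c) for c in ["id", "name", "category"]})
df1.show()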


Best practices

  • Regularly review Databricks Runtime release notes for changes that might affect your code.
  • When upgrading to a new Databricks Runtime version, test your existing code to identify and address any compatibility issues; a simple runtime-version check like the sketch after this list can help make that testing explicit. Refer to the Databricks Runtime release notes versions and compatibility (AWS | Azure | GCP) documentation for the version you are upgrading to.
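
As one way to make that testing explicit, the following sketch prints and checks the runtime version at the start of a notebook or job. It assumes the DATABRICKS_RUNTIME_VERSION environment variable is available on the cluster (it typically is on Databricks compute); the tested_runtimes set is a hypothetical list you would maintain for your own code.

import os

# Databricks clusters typically expose the runtime version as an environment variable.
runtime = os.environ.get("DATABRICKS_RUNTIME_VERSION", "unknown")
print(f"Databricks Runtime: {runtime}")

# Hypothetical guard: runtimes this code has been validated against.
tested_runtimes = {"14.3"}
if runtime not in tested_runtimes:
    print(
        f"Warning: not validated on runtime {runtime}; "
        "review the release notes before relying on results."
    )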