Casting BigNumeric values to Decimal(38, 38) fails with NUMERIC_VALUE_OUT_OF_RANGE error

Modify the data type to NUMERIC in BigQuery first.

Written by shubham.bhusate

Last published at: June 9th, 2025

Problem

When integrating BigQuery with Databricks through Lakehouse Federation, casting a BigNumeric value to the Databricks Decimal(38, 38) type fails with the following error.

[NUMERIC_VALUE_OUT_OF_RANGE.WITHOUT_SUGGESTION]

This error arises during federated queries when a BigQuery BigNumeric value (for example, 66000.00000000000000000000000000000000000000) cannot be cast to the Databricks Decimal(38, 38) type because the value exceeds the 38-digit precision limit enforced by Spark.


Cause

When using Lakehouse Federation, BigNumeric values are automatically mapped to Decimal(38, 38) in Databricks. Decimal(38, 38) reserves all 38 digits for the fractional part (scale) of the value, leaving no digits for the integer part.


Any value with a non-zero integer part (for example, 66000.0…) requires integer digits that Decimal(38, 38) cannot represent, so the cast overflows and triggers the error.
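The precision arithmetic can be illustrated with Python's decimal module. This is a minimal sketch, not Spark itself: it models Decimal(38, 38) as "at most 38 significant digits, all reserved for the fraction" and shows why a value with a non-zero integer part cannot fit. The helper name fits_decimal_38_38 is hypothetical, introduced only for this illustration.

```python
from decimal import Decimal, getcontext, InvalidOperation

# Model Spark's Decimal(38, 38): at most 38 significant digits,
# with all 38 reserved for the fractional part.
getcontext().prec = 38
TARGET_SCALE = Decimal(1).scaleb(-38)  # i.e. 1E-38, forcing 38 fractional digits

def fits_decimal_38_38(value: str) -> bool:
    """Return True if the value can be represented as Decimal(38, 38)."""
    try:
        # Quantizing to 38 fractional digits fails if the coefficient
        # would need more than 38 significant digits overall.
        Decimal(value).quantize(TARGET_SCALE)
        return True
    except InvalidOperation:
        return False

print(fits_decimal_38_38("0.12345"))  # True: a fraction-only value fits
print(fits_decimal_38_38("66000.0"))  # False: integer digits push past 38
```

Representing 66000 with 38 fractional digits would require 5 + 38 = 43 significant digits, which exceeds the 38-digit precision limit, which is exactly the overflow that surfaces as NUMERIC_VALUE_OUT_OF_RANGE.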


Solution

Modify the federated query to first cast the BigNumeric column to NUMERIC in BigQuery, then transfer the data to Databricks. NUMERIC has 38 digits of precision with 9 digits of scale, which leaves 29 digits for the integer part.

SELECT 
  CAST(<the-BigNumeric-column> AS NUMERIC) AS <Databricks-column-name> 
FROM 
  `<your-bigquery-project>.<your-dataset>.<your-table>`
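Using the same decimal-module sketch as a model, the NUMERIC type (precision 38, scale 9) has room for the integer part, so the example value from the error now fits. The helper name fits_numeric is hypothetical and only illustrates the precision math; fractional digits beyond 9 would be rounded by the cast.

```python
from decimal import Decimal, getcontext, InvalidOperation

# Model BigQuery's NUMERIC type, Decimal(38, 9): 38 significant digits
# total, 9 for the fraction, leaving 29 for the integer part.
getcontext().prec = 38
NUMERIC_SCALE = Decimal(1).scaleb(-9)  # i.e. 1E-9, forcing 9 fractional digits

def fits_numeric(value: str) -> bool:
    """Return True if the value can be represented as Decimal(38, 9)."""
    try:
        Decimal(value).quantize(NUMERIC_SCALE)
        return True
    except InvalidOperation:
        return False

print(fits_numeric("66000.0"))  # True: 5 integer digits + 9 fractional = 14 <= 38
```

After the cast, the column arrives in Databricks with a type that reserves digits for the integer part, and the federated query no longer overflows.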