Problem
You are running jobs on serverless compute when you get an invalid cast input error message. When you run the same jobs on classic compute, the error does not occur.
Example error message
[CAST_INVALID_INPUT] The value '100.0' of the type "STRING" cannot be cast to "BIGINT" because it is malformed. Correct the value as per the syntax, or change its target type. Use 'try_cast' to tolerate malformed input and return NULL instead.
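You can reproduce the behavior with a single query that matches the value and types in the error message. This is a minimal sketch, assuming ANSI mode is enabled, as it is by default on serverless compute.

```sql
-- Fails with CAST_INVALID_INPUT when spark.sql.ansi.enabled is true,
-- because '100.0' is not a well-formed integral value for BIGINT.
SELECT CAST('100.0' AS BIGINT);
```

With ANSI mode disabled, the same statement truncates the value and returns 100.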
Cause
On serverless compute, spark.sql.ansi.enabled is set to true by default, which enforces strict ANSI SQL compliance. When you start a classic cluster with Databricks Runtime, spark.sql.ansi.enabled is set to false by default. Because ANSI mode rejects malformed cast input instead of silently coercing it, the same query can fail on serverless compute while it succeeds on classic compute.
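To confirm which mode your compute is using, display the current value of the configuration. In Spark SQL, SET with only a key prints that key's value.

```sql
-- Shows the current ANSI mode setting
-- (expected: true on serverless compute, false on classic compute).
SET spark.sql.ansi.enabled;
```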
Solution
Set spark.sql.ansi.enabled to false before running queries on serverless compute to disable strict ANSI SQL compliance.
- Run SET spark.sql.ansi.enabled = false.
- Run your query. A minimal example follows this list.
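The following sketch applies the setting and then reruns the cast from the error message above.

```sql
-- Disable strict ANSI SQL compliance for the current session.
SET spark.sql.ansi.enabled = false;

-- Succeeds and returns 100: the fractional string is truncated
-- instead of raising CAST_INVALID_INPUT.
SELECT CAST('100.0' AS BIGINT);
```

Alternatively, as the error message itself suggests, you can leave ANSI mode enabled and use try_cast, which returns NULL for malformed input instead of failing. Note that for this input try_cast returns NULL rather than 100, so use it only where NULL is an acceptable result.

```sql
-- Returns NULL instead of raising an error for malformed input.
SELECT TRY_CAST('100.0' AS BIGINT);
```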