Error in CREATE TABLE with external Hive metastore

Problem

You are connected to an external MySQL metastore and get an error when you attempt to create a table.

AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException:
MetaException(message:An exception was thrown while adding/validating class(es) : (conn=21)
Column length too big for column 'PARAM_VALUE' (max = 16383); use BLOB or TEXT instead.

Cause

This is a known issue with MySQL 8.0, where the default charset is utf8mb4.

You can confirm this by running the following query against the metastore database:

SELECT default_character_set_name FROM information_schema.SCHEMATA WHERE schema_name = "<database-name>";
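An equivalent check, shown here as a sketch (replace <database-name> with the name of your metastore database):

```sql
-- SHOW CREATE DATABASE prints the database's default character set inline.
SHOW CREATE DATABASE `<database-name>`;
-- On an affected MySQL 8.0 database, the output includes:
--   DEFAULT CHARACTER SET utf8mb4
```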

Solution

You need to update or recreate the database and set the charset to latin1.

Option 1

  • Manually run the CREATE statements in the Hive database, appending DEFAULT CHARSET=latin1 to the end of each CREATE TABLE statement.

    CREATE TABLE `TABLE_PARAMS`
    (
        `TBL_ID` BIGINT NOT NULL,
        `PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
        `PARAM_VALUE` VARCHAR(4000) BINARY NULL,
        CONSTRAINT `TABLE_PARAMS_PK` PRIMARY KEY (`TBL_ID`,`PARAM_KEY`)
    ) ENGINE=INNODB DEFAULT CHARSET=latin1;
    
  • Restart the Hive metastore and repeat until all table creation errors are resolved.

Option 2

  • Set up the database and user accounts.
  • Create the database and run ALTER DATABASE hive CHARACTER SET latin1; before you launch the metastore.

This command sets the default charset for the database, which is applied when the metastore creates its tables.
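The steps above can be sketched as follows, assuming the metastore database is named hive as in the ALTER statement:

```sql
-- Create the metastore database, then change its default charset
-- before launching the metastore, so new tables inherit latin1.
CREATE DATABASE hive;
ALTER DATABASE hive CHARACTER SET latin1;

-- Verify the default charset before starting the metastore.
SELECT default_character_set_name
FROM information_schema.SCHEMATA
WHERE schema_name = 'hive';
```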