SQL access control error when using Snowflake as a data source

Snowflake does not officially support schema as an option; you must use sfSchema.

Written by John.Lourdu

Last published at: January 20th, 2023


The Snowflake Connector for Spark is used to read data from, and write data to, Snowflake while working in Databricks. The connector makes Snowflake look like another Spark data source.

When you try to query Snowflake, you get a SnowflakeSQLException error message.

SnowflakeSQLException: SQL access control error: Insufficient privileges to operate on schema '<SCHEMA>'
  at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowExceptionSub(SnowflakeUtil.java:127)


When you attempted to read or write data from Snowflake (AWS | Azure | GCP), you used schema instead of sfSchema.


snowflake_table = (spark.read
  .format("snowflake")
  .option("dbtable", <table_name>)
  .option("sfUrl", <database_host_url>)
  .option("sfUser", <username>)
  .option("sfPassword", <password>)
  .option("sfDatabase", <database_name>)
  .option("Schema", <schema_name>)
  .option("sfWarehouse", <warehouse_name>)
  .load()
)

Snowflake does not officially support schema as an option.

In some cases, schema is treated as sfSchema, but there is no guarantee that this will happen.
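Because a lowercase schema key may or may not be interpreted as sfSchema, it can help to validate your option keys before passing them to the connector. Below is a minimal, hypothetical helper (not part of the Snowflake connector API) that flags the unsupported key:

```python
# Hypothetical helper: flag the unsupported "schema" option key before it is
# silently ignored or misinterpreted by the connector. This is an illustrative
# sketch, not part of the Snowflake Connector for Spark.
def check_snowflake_options(options):
    """Return a list of warnings for option keys the connector
    does not officially support."""
    warnings = []
    for key in options:
        if key.lower() == "schema":
            warnings.append(
                f"Option '{key}' is not officially supported; use 'sfSchema' instead."
            )
    return warnings

# Example: "Schema" is flagged, while "sfSchema" passes cleanly.
opts = {"sfUrl": "host", "Schema": "PUBLIC", "sfWarehouse": "WH"}
print(check_snowflake_options(opts))
```

Running the check before building the reader makes the mistake visible immediately instead of surfacing later as a SnowflakeSQLException.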


When reading or writing data from Snowflake, you must use sfSchema instead of schema in the Snowflake options.


snowflake_table = (spark.read
  .format("snowflake")
  .option("dbtable", <table_name>)
  .option("sfUrl", <database_host_url>)
  .option("sfUser", <username>)
  .option("sfPassword", <password>)
  .option("sfDatabase", <database_name>)
  .option("sfSchema", <schema_name>)
  .option("sfWarehouse", <warehouse_name>)
  .load()
)
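The write path uses the same options. The following is a minimal sketch, assuming an existing DataFrame df and the same connection values; the helper name and the grouping of options into a dict are illustrative choices, not part of the connector API:

```python
# Sketch: the same connection options, grouped in a dict. All values are
# placeholders; note the key is sfSchema, not schema.
sf_options = {
    "sfUrl": "<database_host_url>",
    "sfUser": "<username>",
    "sfPassword": "<password>",
    "sfDatabase": "<database_name>",
    "sfSchema": "<schema_name>",  # sfSchema, not schema
    "sfWarehouse": "<warehouse_name>",
}

def write_to_snowflake(df, options, table_name, mode="append"):
    """Write a DataFrame to Snowflake via the Spark connector,
    reusing the options dict so sfSchema is set consistently."""
    (df.write
       .format("snowflake")
       .options(**options)
       .option("dbtable", table_name)
       .mode(mode)
       .save())
```

Keeping the options in one dict means a single place to verify that sfSchema (and not schema) is set for both reads and writes.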

Please review the Snowflake Using the Spark Connector documentation for more information.
