Error when trying to access Azure storage account from China region

Rule out common Apache Spark configuration issues and ensure the OAuth endpoint in your Spark configuration points to the China region.

Written by saikumar.divvela

Last published at: January 22nd, 2025

Problem

When trying to access your Azure storage account, or external tables associated with that account, from a China region-based Azure Databricks environment, you receive an error.


`Failure to initialize configuration for storage account xxxxxxxxxxxxxx.dfs.core.chinacloudapi.cn: Invalid configuration value detected for fs.azure.account.key`


Cause

There is an issue with your Apache Spark properties used to configure Azure credentials to access the Azure storage account.


Alternatively, there is an issue with the Spark configuration setting for the OAuth endpoint.


Solution

First, make sure your Spark properties do not have any of the following issues. A quick way to review the current values is shown in the sketch after this list.

  • Typos
  • Incorrect or invalid Spark configurations
  • Expired secrets
  • Missing required permissions for the service principal on the storage account
  • Missing configured external location credentials to access the storage account
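
As a quick sanity check, you can list the Azure storage-related properties currently set on the cluster and compare them with the values you expect. The following is a minimal PySpark sketch, assuming it runs in a Databricks notebook where `spark` is already defined; adjust the prefix if you set the properties at the session level instead of in the cluster Spark config.

```python
# Minimal sketch: print the Azure storage-related Spark properties set on the cluster
# so typos, missing keys, or wrong endpoints are easy to spot.
# Assumes a Databricks notebook where `spark` is already available.
for key, value in spark.sparkContext.getConf().getAll():
    if key.startswith("spark.hadoop.fs.azure."):
        # Avoid echoing account keys or client secrets in clear text.
        redacted = "<redacted>" if "secret" in key or ".key." in key else value
        print(f"{key} = {redacted}")
```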


Then check your Spark configuration setting for the OAuth endpoint and change it to the correct value for the China region:


`spark.hadoop.fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.chinacloudapi.cn https://login.chinacloudapi.cn/<directory-id>/oauth2/v2.0/token`
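
If you configure the credentials at the session level from a notebook instead of in the cluster Spark config, the full set of OAuth properties looks like the following. This is a minimal sketch: the secret scope `my-scope` and secret key `sp-secret` are hypothetical placeholders, as are the angle-bracketed values.

```python
# Minimal sketch of session-scoped OAuth credentials for a China-region ADLS Gen2
# storage account. The secret scope "my-scope" and key "sp-secret" are hypothetical;
# replace the <...> placeholders with your own values.
storage_account = "<storage-account>"
suffix = "dfs.core.chinacloudapi.cn"  # China-region ADLS Gen2 DNS suffix

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.{suffix}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{storage_account}.{suffix}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.{suffix}", "<application-id>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{storage_account}.{suffix}",
    dbutils.secrets.get(scope="my-scope", key="sp-secret"),
)
# The OAuth endpoint must point at the China national cloud, not login.microsoftonline.com.
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.{suffix}",
    "https://login.chinacloudapi.cn/<directory-id>/oauth2/v2.0/token",
)
```

Note that session-level keys omit the `spark.hadoop.` prefix used in the cluster Spark config.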


Preventative measures

Ensure that the Spark configuration settings are correct for the specific Azure region and environment.


Each national cloud authenticates users separately and has its own authentication endpoints. For more information, refer to Microsoft’s National clouds documentation.
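
As a reference, the commonly documented login endpoints and ADLS Gen2 DNS suffixes for the major Azure environments are sketched below. Treat these values as assumptions to verify against Microsoft’s National clouds documentation rather than an authoritative list.

```python
# Reference sketch (verify against Microsoft's National clouds documentation):
# login endpoints and ADLS Gen2 DNS suffixes for common Azure environments.
AZURE_CLOUDS = {
    "public": {"login": "https://login.microsoftonline.com", "dfs_suffix": "dfs.core.windows.net"},
    "china": {"login": "https://login.chinacloudapi.cn", "dfs_suffix": "dfs.core.chinacloudapi.cn"},
    "us_government": {"login": "https://login.microsoftonline.us", "dfs_suffix": "dfs.core.usgovcloudapi.net"},
}

def oauth_endpoint(cloud: str, directory_id: str) -> str:
    """Build the OAuth token endpoint for the given cloud and directory (tenant) ID."""
    return f"{AZURE_CLOUDS[cloud]['login']}/{directory_id}/oauth2/v2.0/token"

# Example: the China-region token endpoint for a placeholder directory ID.
print(oauth_endpoint("china", "<directory-id>"))
```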