'Cluster does not support jobs workload' error during notebook or job run

Use a cluster policy that allows the dbutils.notebook.run API, or run the code directly within a notebook to avoid the API.

Written by girish.sharma

Last published at: December 20th, 2024

Problem

When running notebooks or jobs, you receive an error message stating that the cluster does not support the jobs workload.

com.databricks.WorkflowException: com.databricks.common.client.DatabricksServiceHttpClientException: INVALID_PARAMETER_VALUE: The cluster 0126-082707-xug0zuli does not support jobs workload

Cause

You have a cluster policy that prevents use of the dbutils.notebook.run API, which runs notebooks as ephemeral jobs.

The relevant policy block is:

"workload_type.clients.jobs": {
"type": "fixed",
"value": false
}
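
For context, dbutils.notebook.run is the call that hits this restriction, since it launches the target notebook as an ephemeral job on the cluster. A minimal sketch of such a call; the notebook path and timeout are hypothetical placeholders:

result = dbutils.notebook.run("/Workspace/Users/<user>/child_notebook", 60)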

Solution

If possible, remove or modify the block that restricts the jobs workload in your existing cluster policy. Edit the cluster policy JSON to remove the following code.

"workload_type.clients.jobs": {
"type": "fixed",
"value": false
}
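
If you prefer an explicit setting over deleting the block, you can instead fix the value to true so that clusters created under the policy allow the jobs workload. A minimal sketch of the edited attribute:

"workload_type.clients.jobs": {
  "type": "fixed",
  "value": true
}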

If modifying your existing policy is not feasible, apply a different cluster policy that doesn’t have the restrictive block. This may involve creating a new policy or using a different, existing policy that allows the jobs workload.
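
To find a candidate, you can list the policies in your workspace with the Cluster Policies API and inspect their definitions. A minimal sketch in Python; the workspace URL and token are hypothetical placeholders:

import requests

HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"

# List cluster policies so you can pick one whose definition does not
# fix workload_type.clients.jobs to false.
resp = requests.get(
    f"{HOST}/api/2.0/policies/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
for policy in resp.json().get("policies", []):
    print(policy["policy_id"], policy["name"])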

Alternatively, run the code directly within a notebook to avoid the API.  
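
One way to do this is the %run magic command, which runs another notebook inline on the current cluster rather than as an ephemeral job, so the jobs workload restriction does not apply. The notebook path here is a hypothetical placeholder:

%run /Workspace/Users/<user>/child_notebook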

For more information, refer to the Compute policy definition (AWS | Azure | GCP) documentation.