Run a custom Databricks Runtime on your cluster

Configure your cluster to run a custom Databricks Runtime image via the UI or API.

Written by rakesh.parija

Last published at: September 11th, 2024

Most Databricks customers use production Databricks Runtime releases (AWS | Azure | GCP) for their clusters. In rare cases, however, Databricks Support may ask you to run a custom Databricks Runtime as part of resolving a support ticket.


Warning

Custom Databricks Runtime images are created for specific, short-term fixes and edge cases. If a custom image is appropriate, it will be provided by Databricks Support during case resolution.

Databricks Support cannot provide a custom image on demand. You should NOT open a ticket just to request a custom Databricks Runtime.

This article explains how to start a cluster using a custom Databricks Runtime image after Databricks Support has given you the image name.

Instructions

Use the workspace UI

Follow the steps for your specific browser to add the Custom Spark Version field to the New Cluster menu.

After you have enabled the Custom Spark Version field, you can use it to start a new cluster with the custom Databricks Runtime image provided by Databricks Support.


Info

When following the steps in this article, you will see a warning in your browser's JavaScript console that says:

Do not copy-paste anything here. This can be used to compromise your account.

It is OK to enter the commands listed in this article.

Chrome / Edge

  1. Log in to your Databricks workspace.
  2. Click Compute.
  3. Click All-purpose clusters.
  4. Click Create Cluster.
  5. Press Command+Option+J (Mac) or Control+Shift+J (Windows, Linux, ChromeOS) to open the JavaScript console.
  6. Enter window.prefs.set("enableCustomSparkVersions",true) in the JavaScript console and run the command (see the console sketch after these steps).
  7. Reload the page.
  8. Custom Spark Version now appears in the New Cluster menu.
  9. Enter the custom Databricks Runtime image name that you got from Databricks Support in the Custom Spark Version field.
  10. Continue creating your cluster as normal.
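
For reference, this is the full console interaction covered by steps 6 and 7. The window.prefs.set command comes from this article; location.reload() is the standard JavaScript call for reloading the page.

// Run in the browser's JavaScript console on the Create Cluster page.
window.prefs.set("enableCustomSparkVersions",true)  // enables the hidden Custom Spark Version field
location.reload()  // reloads the page so the field appears

The same two commands work in Firefox and Safari.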

Firefox

  1. Log in to your Databricks workspace.
  2. Click Compute.
  3. Click All-purpose clusters.
  4. Click Create Cluster.
  5. Press Command+Option+K (Mac) or Control+Shift+K (Windows, Linux) to open the JavaScript console.
  6. Enter window.prefs.set("enableCustomSparkVersions",true) in the JavaScript console and run the command.
  7. Reload the page.
  8. Custom Spark Version now appears in the New Cluster menu.
  9. Enter the custom Databricks Runtime image name that you got from Databricks Support in the Custom Spark Version field.
  10. Continue creating your cluster as normal.

Safari

  1. Log in to your Databricks workspace.
  2. Click Compute.
  3. Click All-purpose clusters.
  4. Click Create Cluster.
  5. Press Command+Option+C (Mac) to open the JavaScript console. If the console does not open, enable the Develop menu first (Safari > Settings > Advanced).
  6. Enter window.prefs.set("enableCustomSparkVersions",true) in the JavaScript console and run the command.
  7. Reload the page.
  8. Custom Spark Version now appears in the New Cluster menu.
  9. Enter the custom Databricks Runtime image name that you got from Databricks Support in the Custom Spark Version field.
  10. Continue creating your cluster as normal.

Use the API

When starting a cluster via the API, set the spark_version attribute to the custom image name.

You can use the API to create both interactive clusters and job clusters with a custom Databricks Runtime image.

"spark_version": "custom:<custom-runtime-version-name>"

Example code

This sample code shows the spark_version attribute used within the context of starting a cluster via the API. The node type and aws_attributes in this example are AWS-specific; adjust them for your cloud provider.

%sh

curl -H "Authorization: Bearer <token-id>" -X POST https://<databricks-instance>/api/2.0/clusters/create -d '{
  "cluster_name": "heap",
  "spark_version": "custom:<custom-runtime-version-name>",
  "node_type_id": "r3.xlarge",
  "spark_conf": {
    "spark.speculation": true
  },
  "aws_attributes": {
    "availability": "SPOT",
    "zone_id": "us-west-2a"
  },
  "num_workers": 1,
  "spark_env_vars": {
    "SPARK_DRIVER_MEMORY": "25g"
  }
}'
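
The same attribute works when you create a job cluster. The following is a minimal sketch using the Jobs API 2.0 create endpoint; the job name, notebook path, and worker configuration are illustrative placeholders, not values provided by Databricks Support.

%sh

# Sketch: create a job whose new cluster runs the custom Databricks Runtime image.
# The job name and notebook path below are placeholders.
curl -H "Authorization: Bearer <token-id>" -X POST https://<databricks-instance>/api/2.0/jobs/create -d '{
  "name": "custom-runtime-job",
  "new_cluster": {
    "spark_version": "custom:<custom-runtime-version-name>",
    "node_type_id": "r3.xlarge",
    "num_workers": 1
  },
  "notebook_task": {
    "notebook_path": "/Users/<your-user>/<your-notebook>"
  }
}'

After a cluster is created, you can confirm which image it is running by calling the clusters/get endpoint with the cluster ID returned by the create call; the spark_version field in the response shows the custom image name.

%sh

curl -H "Authorization: Bearer <token-id>" "https://<databricks-instance>/api/2.0/clusters/get?cluster_id=<cluster-id>"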


For more information, please review the Clusters API 2.0 (AWS | Azure | GCP) documentation.