Get Apache Spark config in DBConnect

You can always view the Spark configuration for your cluster by reviewing the cluster details in the workspace.

If you are using DBConnect, you may want to quickly review the current Spark configuration without switching over to the workspace UI.

This example code shows you how to get the current Spark configuration for your cluster by making a Databricks REST API call from DBConnect.

import json

import requests

# Load the cluster ID, personal access token, and workspace URL from the
# DBConnect configuration file (created by `databricks-connect configure`,
# typically in your home directory).
with open("/<path-to-dbconnect-config>/.databricks-connect") as readconfig:
    conf = json.load(readconfig)

CLUSTER_ID = conf["cluster_id"]
TOKEN = conf["token"]
API_URL = conf["host"].rstrip("/")

# Call the Clusters API to get the cluster details.
headers = {"Authorization": "Bearer " + TOKEN}
response = requests.get(
    API_URL + "/api/2.0/clusters/get",
    headers=headers,
    params={"cluster_id": CLUSTER_ID},
)
response.raise_for_status()

# The spark_conf field holds the custom Spark configuration set on the
# cluster; it may be absent if no custom configuration is set.
sparkconf = response.json().get("spark_conf", {})

for config_key, config_value in sparkconf.items():
    print(config_key, config_value)
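
Each key-value pair in the cluster's custom Spark configuration is printed on its own line. The exact keys and values depend on your cluster; output like the following is purely illustrative:

spark.databricks.delta.preview.enabled true
spark.sql.shuffle.partitions 64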

Important

DBConnect only works with supported Databricks Runtime versions. Make sure your cluster is running a supported runtime before you use DBConnect.
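
The same /api/2.0/clusters/get response also reports the cluster's runtime in its spark_version field, so you can check the runtime from DBConnect as well. This is a minimal sketch that reuses the response object from the example above; the version string in the comment is only an example.

# The Clusters API response includes the Databricks Runtime version the
# cluster is running, for example "7.3.x-scala2.12".
runtime_version = response.json().get("spark_version", "unknown")
print("Databricks Runtime:", runtime_version)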