How to import a custom CA certificate

When working with Python, you may need to import a custom CA certificate to avoid connection errors to your endpoints. Without the certificate, requests can fail with an error such as:

ConnectionError: HTTPSConnectionPool(host='my_server_endpoint', port=443): Max retries exceeded with url: /endpoint (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fb73dc3b3d0>: Failed to establish a new connection: [Errno 110] Connection timed out',))

To import one or more custom CA certificates to your Databricks cluster:

  1. Create an init script that adds the entire CA chain and sets the REQUESTS_CA_BUNDLE environment variable.

    In this example, PEM-format CA certificates are added to the file myca.crt in /usr/local/share/ca-certificates/. This file is written by the custom-cert.sh init script.

    dbutils.fs.put("/databricks/init-scripts/custom-cert.sh", """#!/bin/bash

    # Write the full CA chain, in PEM format, to the system certificate directory.
    cat << 'EOF' > /usr/local/share/ca-certificates/myca.crt
    -----BEGIN CERTIFICATE-----
    <CA CHAIN 1 CERTIFICATE CONTENT>
    -----END CERTIFICATE-----
    -----BEGIN CERTIFICATE-----
    <CA CHAIN 2 CERTIFICATE CONTENT>
    -----END CERTIFICATE-----
    EOF

    # Rebuild the system bundle (/etc/ssl/certs/ca-certificates.crt) so it includes the new chain.
    update-ca-certificates

    # Point Python's requests library at the updated system bundle.
    echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh
    """)
    
  2. Attach the init script to the cluster as a cluster-scoped init script.

  3. Restart the cluster.
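After the cluster restarts, the requests library picks up the bundle automatically: when REQUESTS_CA_BUNDLE is set and a session's trust_env is enabled (the default), requests resolves its verify setting to that path. A quick way to confirm the resolution, sketched here by setting the variable in-process (on a configured cluster, spark-env.sh sets it for you, and the URL is a placeholder):

```python
import os
import requests

# Simulate what the init script configures via spark-env.sh
os.environ["REQUESTS_CA_BUNDLE"] = "/etc/ssl/certs/ca-certificates.crt"

session = requests.Session()  # trust_env is True by default
settings = session.merge_environment_settings(
    "https://my_server_endpoint/endpoint", {}, None, None, None
)
print(settings["verify"])  # /etc/ssl/certs/ca-certificates.crt
```

If `verify` still shows True instead of the bundle path, the environment variable was not exported into the Python process, which usually means the init script did not run or the cluster was not restarted.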