When working with Python, you may want to import a custom CA certificate to avoid connection errors such as the following when calling your endpoints:
ConnectionError: HTTPSConnectionPool(host='my_server_endpoint', port=443): Max retries exceeded with url: /endpoint (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fb73dc3b3d0>: Failed to establish a new connection: [Errno 110] Connection timed out',))
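The `requests` library honors the `REQUESTS_CA_BUNDLE` environment variable, and Python's standard library can be pointed at the same bundle explicitly. A minimal sketch, assuming the Debian/Ubuntu system bundle path used later in this article (the endpoint URL is a placeholder):

```python
import os
import ssl

# The system bundle that update-ca-certificates regenerates; the init
# script in this article exports REQUESTS_CA_BUNDLE to this path.
bundle = os.environ.get("REQUESTS_CA_BUNDLE", "/etc/ssl/certs/ca-certificates.crt")

# Build an SSL context that trusts the bundle (fall back to system
# defaults if the file does not exist on this machine).
cafile = bundle if os.path.exists(bundle) else None
ctx = ssl.create_default_context(cafile=cafile)
print(ctx.verify_mode == ssl.CERT_REQUIRED)

# With requests, no explicit context is needed once REQUESTS_CA_BUNDLE
# is set, e.g.: requests.get("https://my_server_endpoint/endpoint")
```

Once the variable is exported cluster-wide, `requests` picks it up automatically with no code changes.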
To import one or more custom CA certificates to your Databricks cluster:
Create an init script that adds the entire CA chain and sets the REQUESTS_CA_BUNDLE environment variable.
In this example, PEM format CA certificates are added to the file myca.crt, which is located at /usr/local/share/ca-certificates/. This file is referenced in the custom-cert.sh init script.
dbutils.fs.put("/databricks/init-scripts/custom-cert.sh", """#!/bin/bash

cat << 'EOF' > /usr/local/share/ca-certificates/myca.crt
-----BEGIN CERTIFICATE-----
<CA CHAIN 1 CERTIFICATE CONTENT>
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
<CA CHAIN 2 CERTIFICATE CONTENT>
-----END CERTIFICATE-----
EOF

update-ca-certificates

echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh
""")
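Because the CA chain file is just PEM certificates concatenated together, a quick sanity check before deploying is to count the certificate blocks. `count_pem_certs` is a hypothetical helper, not part of any Databricks API:

```python
def count_pem_certs(pem_text: str) -> int:
    # Each certificate in a PEM bundle starts with this header line.
    return pem_text.count("-----BEGIN CERTIFICATE-----")

# Placeholder chain matching the structure written by the init script.
chain = """-----BEGIN CERTIFICATE-----
<CA CHAIN 1 CERTIFICATE CONTENT>
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
<CA CHAIN 2 CERTIFICATE CONTENT>
-----END CERTIFICATE-----
"""
print(count_pem_certs(chain))  # 2
```

If the count does not match the number of CAs in your chain, a certificate was likely truncated or pasted incorrectly.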
Attach the init script to the cluster as a cluster-scoped init script.
Restart the cluster.
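After the restart, you can confirm from a notebook that the variable exported in spark-env.sh reached the Python environment. A sketch; on the cluster the expected value is the path set by the init script:

```python
import os

# The init script appended "export REQUESTS_CA_BUNDLE=..." to
# spark-env.sh, so notebook Python processes should see it after restart.
bundle = os.environ.get("REQUESTS_CA_BUNDLE", "(not set)")
print(bundle)  # expect /etc/ssl/certs/ca-certificates.crt on the cluster
```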