Problem
When you use Databricks Asset Bundles (DABs) to deploy artifacts, the artifacts are not uploaded to the target volume as expected.
This issue is not accompanied by an error message. When you run the databricks bundle deploy command, the deployment completes, but the expected path in the volume is not created and no artifacts appear there.
Cause
By default, the workspace.artifact_path property in the DAB YAML file points to the workspace file system. This property indicates where artifacts should be uploaded. Volumes are a separate storage location, but you can edit the workspace.artifact_path property to point to a volume instead.
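To illustrate, here is a minimal sketch of the two destinations. The user, bundle, catalog, schema, and volume names are placeholders, and the default path shown is the typical location derived from the bundle's root path; a Unity Catalog volume path always takes the form /Volumes/<catalog>/<schema>/<volume>.

# Default behavior: artifacts are uploaded to the workspace file system,
# typically under the bundle's root path (placeholder values shown)
workspace:
  artifact_path: /Workspace/Users/<your-user>/.bundle/<your-bundle-name>/dev/artifacts

# Override: artifacts are uploaded to a Unity Catalog volume instead
workspace:
  artifact_path: /Volumes/<catalog>/<schema>/<volume>/artifacts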
Solution
Configure the workspace.artifact_path property to your desired volume path in your databricks.yml file. Replace <databricks-domain> with the domain for your cloud platform:
- AWS: cloud.databricks.com
- Azure: azuredatabricks.net
- GCP: gcp.databricks.com
bundle:
  name: <your-bundle-name>
  description: Upload a test .whl artifact to Volumes via DAB

include:
  - ./artifacts/*

targets:
  dev:
    workspace:
      host: https://<your-workspace>.<databricks-domain>/
      artifact_path: /Volumes/<path-to-your-volume>

artifacts:
  my_whl:
    path: ./<path-to-file-you-want-to-upload-to-a-volume>
Run the databricks bundle deploy command to deploy your artifacts to the configured volume.
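For example, to deploy the dev target shown above and then confirm the files arrived, you can run something like the following. The volume path is a placeholder, and the verification step assumes a Databricks CLI version whose fs commands accept Unity Catalog volume paths via the dbfs:/Volumes scheme.

# Deploy the dev target defined in databricks.yml
databricks bundle deploy -t dev

# Verify the upload by listing the volume path (placeholder path)
databricks fs ls dbfs:/Volumes/<path-to-your-volume>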
For more information, review the “artifact_path” section of the Databricks Asset Bundle configuration (AWS | Azure | GCP) documentation.