Problem
You want to locate audit records for files created or deleted within a specific path in your workspace's external location. When you query the system.access.audit table for the relevant service names and action names, you don't see the expected records.
Cause
Databricks does not currently support file-level audit logging for creation or deletion of individual files in external locations.
Solution
Query the system.access.audit table for the "filesystem" service and the "volumeDelete" action to track when entire volumes are deleted. While this does not provide visibility into file-level actions, it helps track operations at the external location level.
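For example, the following is a minimal sketch of such a query, run from a Databricks notebook. It assumes Unity Catalog system tables are enabled in your workspace; the seven-day window and the selected columns are illustrative.

```python
# Minimal sketch: query the audit system table for volume deletions.
# Assumes Unity Catalog system tables are enabled; the 7-day window
# and column selection are illustrative, not prescriptive.
df = spark.sql("""
    SELECT event_time,
           user_identity.email AS actor,
           request_params
    FROM system.access.audit
    WHERE service_name = 'filesystem'
      AND action_name  = 'volumeDelete'
      AND event_date  >= current_date() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
""")
df.show(truncate=False)
```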
If file-level tracking is critical, consider using external tools or cloud-native services to monitor file activity.
AWS workloads
Use AWS CloudTrail to capture file-level (object-level) events. For more information, review the Enabling CloudTrail event logging for S3 buckets and objects documentation.
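As a sketch of what this setup looks like, the following enables S3 object-level (data event) logging on an existing CloudTrail trail using boto3. The trail name and bucket ARN are placeholders for your own resources.

```python
# Minimal sketch: enable S3 object-level (data event) logging on an
# existing CloudTrail trail. Trail name and bucket ARN are placeholders.
import boto3

cloudtrail = boto3.client("cloudtrail")
cloudtrail.put_event_selectors(
    TrailName="my-trail",  # placeholder: your existing trail
    EventSelectors=[
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [
                {
                    "Type": "AWS::S3::Object",
                    # Trailing slash scopes logging to all objects in the bucket
                    "Values": ["arn:aws:s3:::my-external-location-bucket/"],
                }
            ],
        }
    ],
)
```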
Azure workloads
Use Azure Storage analytics logging to track detailed file operations in Azure storage accounts. For more information, review the Azure Storage analytics logging documentation.
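As a rough sketch, the following enables Storage Analytics logging for a blob service with the azure-storage-blob SDK. The account URL is a placeholder, and the retention period and operation flags are illustrative.

```python
# Minimal sketch: enable Storage Analytics logging for a blob service.
# Account URL is a placeholder; retention and flags are illustrative.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobAnalyticsLogging,
    BlobServiceClient,
    RetentionPolicy,
)

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)
service.set_service_properties(
    analytics_logging=BlobAnalyticsLogging(
        version="1.0",
        read=True,    # log read operations
        write=True,   # log create/write operations
        delete=True,  # log delete operations
        retention_policy=RetentionPolicy(enabled=True, days=7),
    )
)
```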
GCP workloads
Google Cloud Storage (GCS) bucket file-level logging provides detailed insights into access and operations performed on objects stored in a bucket. For more information, review the Configure log buckets documentation.
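Once Data Access audit logs are enabled for Cloud Storage, you can read file-level entries programmatically. The sketch below uses the google-cloud-logging client; it assumes Data Access audit logs are already enabled for the project, and the project ID is a placeholder.

```python
# Minimal sketch: read GCS data access audit log entries.
# Assumes Data Access audit logs are enabled for Cloud Storage
# in the project; the project ID is a placeholder.
from google.cloud import logging

client = logging.Client(project="my-project-id")  # placeholder
log_filter = (
    'resource.type="gcs_bucket" '
    'AND protoPayload.methodName=("storage.objects.create" OR "storage.objects.delete")'
)
for entry in client.list_entries(filter_=log_filter, max_results=50):
    print(entry.timestamp, entry.payload)
```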