You are using AssumeRole to deliver cluster logs to an S3 bucket in another AWS account and you get an access denied error.
AssumeRole cannot be used to deliver cluster logs to an S3 bucket in another account. This is because the log daemon runs on the host machine, not inside the container, and only processes running inside the container have access to the Apache Spark configuration that AssumeRole requires.
You can achieve a similar result in one of two ways:
- Use a cross-account bucket policy that grants write access to the S3 bucket in the other account, and deliver the logs directly to that bucket.
- Mount the target S3 bucket on DBFS using a role that has write permissions, and specify the mount path as the log delivery destination.
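As an illustrative sketch of the first option, a cross-account bucket policy attached to the target bucket might look like the following. The account ID, role name, and bucket name are placeholders you must replace with your own values, and the exact set of actions your log delivery requires may differ.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountClusterLogDelivery",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<source-account-id>:role/<cluster-instance-profile-role>"
      },
      "Action": [
        "s3:PutObject",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::<target-log-bucket>",
        "arn:aws:s3:::<target-log-bucket>/*"
      ]
    }
  ]
}
```

With a policy like this in place, the cluster's instance profile role in the source account can write log objects into the target bucket without AssumeRole.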