Jobs
These articles can help you with your Databricks jobs.
- Distinguish active and dead jobs
- Spark job fails with Driver is temporarily unavailable
- How to delete all jobs using the REST API
- Identify less used jobs
- Job cluster limits on notebook output
- Job fails, but Apache Spark tasks finish
- Job fails due to job rate limit
- Create table in overwrite mode fails when interrupted
- Apache Spark Jobs hang due to non-deterministic custom UDF
- Apache Spark job fails with Failed to parse byte string
- Apache Spark UI shows wrong number of jobs
- Apache Spark job fails with a Connection pool shut down error
- Job fails with atypical errors message
- Apache Spark job fails with maxResultSize exception
- Databricks job fails because library is not installed
- Jobs failing on Databricks Runtime 5.5 LTS with an SQLAlchemy package error
- Job failure due to Azure Data Lake Storage (ADLS) CREATE limits
- Job fails with invalid access token
- How to ensure idempotency for jobs
- Monitor running jobs with a Job Run dashboard
- Streaming job has degraded performance
- Task deserialization time is high
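For the "delete all jobs using the REST API" topic above, the approach generally amounts to paging through the job list endpoint and deleting each job by ID. A minimal sketch follows, assuming the Jobs API 2.1 endpoints `/api/2.1/jobs/list` (paginated via `page_token`/`next_page_token`) and `/api/2.1/jobs/delete`; the `call` parameter is a hypothetical stand-in for an HTTP client configured with your workspace host and personal access token, not part of any Databricks SDK.

```python
def delete_all_jobs(call):
    """Page through /jobs/list and delete every job by job_id.

    `call(method, path, payload)` is assumed to perform the HTTP request
    and return the decoded JSON response as a dict.
    Returns the list of deleted job IDs.
    """
    deleted = []
    page_token = None
    while True:
        payload = {"limit": 25}
        if page_token:
            payload["page_token"] = page_token
        resp = call("GET", "/api/2.1/jobs/list", payload)
        for job in resp.get("jobs", []):
            # Deleting a job also removes its run history, so export
            # anything you need before running this against production.
            call("POST", "/api/2.1/jobs/delete", {"job_id": job["job_id"]})
            deleted.append(job["job_id"])
        page_token = resp.get("next_page_token")
        if not page_token:
            break
    return deleted
```

With the `requests` library, `call` could wrap `requests.request(method, host + path, headers={"Authorization": f"Bearer {token}"}, json=payload).json()`.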
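On the "ensure idempotency for jobs" topic, the Jobs API supports an `idempotency_token` field on run submission: resubmitting with the same token returns the existing run instead of launching a duplicate, which protects against client retries after timeouts. A hedged sketch, where `call` is again a hypothetical HTTP-client wrapper and `run_spec` is your one-time run specification for `/api/2.1/jobs/runs/submit`:

```python
import uuid

def submit_run_idempotent(call, run_spec, token=None):
    """Submit a one-time run with an idempotency token.

    The server deduplicates by token, so retrying this call with the
    same token after a network failure cannot start a second run.
    """
    payload = dict(run_spec)
    payload["idempotency_token"] = token or str(uuid.uuid4())
    return call("POST", "/api/2.1/jobs/runs/submit", payload)
```

The key design point is that the caller, not the server, generates the token, so it must be persisted (or deterministically derived, e.g. from a batch date) before the first attempt for retries to reuse it.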