Content size error when trying to import or export a notebook

Reduce the size of your notebook so it is under 10MB.

Written by parth.sundarka

Last published at: January 16th, 2025

Problem

You are trying to import or export a Databricks notebook when you get a content size error. This can happen when using the API, CLI, or Terraform provider.


content size (xxxx) exceeded the limit 10485760.
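
The error can surface, for example, when exporting through the Workspace API's /api/2.0/workspace/export endpoint. The following is a minimal sketch using Python and the requests library; the host, token, and notebook path are placeholders for your own values.

import requests

# Placeholders for your own workspace URL, personal access token, and
# notebook path.
host = "https://<your-workspace>.cloud.databricks.com"
token = "<personal-access-token>"

response = requests.get(
    f"{host}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {token}"},
    params={"path": "/Users/<user>/large_notebook", "format": "SOURCE"},
)

# A notebook over the 10MB limit returns the content size error in the
# response body instead of the exported content.
print(response.status_code, response.text)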


Cause

There is a size limit of 10MB (10485760 bytes) per notebook. Trying to import or export a notebook larger than 10MB generates an error.


Solution

Reduce the size of your notebook so it is under 10MB.

Ways to reduce the notebook size include:

  • Clear cell outputs. This removes any results stored in the notebook and can quickly lower the size. A sketch of clearing outputs programmatically follows this list.
  • Split large notebooks into multiple smaller notebooks and use %run or other techniques to run those smaller notebooks from a parent notebook; a sketch follows this list. For more information, review the Orchestrate notebooks and modularize code in notebooks (AWS | Azure | GCP) documentation.
  • Remove unnecessary cells. If your notebook contains cells that are not essential, removing them can help reduce the size of your notebook.
  • Consider limiting the number of rows returned by your queries. Add filters to your query to remove unnecessary records from the notebook.
  • Export the notebook as source files such as Python (.py) or Scala (.scala). Source exports do not include cell results, so the exported file is smaller.
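
To clear cell outputs outside the UI, one option is the open source nbformat library, applied to a notebook exported in Jupyter (.ipynb) format. A minimal sketch; the file names are hypothetical.

import nbformat

# Read a notebook that was exported in Jupyter (.ipynb) format.
nb = nbformat.read("large_notebook.ipynb", as_version=4)

for cell in nb.cells:
    # Only code cells carry outputs and execution counts.
    if cell.cell_type == "code":
        cell.outputs = []
        cell.execution_count = None

# Write the stripped copy; it no longer contains any stored results.
nbformat.write(nb, "large_notebook_no_outputs.ipynb")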
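For orchestrating split notebooks, %run must be the only content in its cell; a plain-Python alternative covered in the documentation above is dbutils.notebook.run, which is only available inside a Databricks notebook. A minimal sketch, with hypothetical child notebook paths.

# Runs inside a Databricks notebook, where dbutils is predefined.
for child in ["./load_data", "./transform_data", "./write_results"]:
    # Run each child notebook with a 10-minute timeout; run() returns
    # the child's exit value as a string.
    result = dbutils.notebook.run(child, 600)
    print(child, "->", result)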


If you cannot reduce the size of your notebook below 10MB, reach out to Databricks support for assistance. You will need to share the notebook URL, a description of why the file size cannot be trimmed, and the specific error message.