Problem
While running a notebook on serverless compute, you receive a PySpark assertion error message.
Error: PySparkAssertionError: Received incorrect server side session identifier for request. Please create a new Spark Session to reconnect. (52a6f5e0-3410-4f58-8a4e-ca81a4f41dc0 != 13140493-d7a8-4d33-85fd-044e753afef6)
The error persists even if you try to start a new Apache Spark session.
Cause
This error is part of the crash detection feature in Spark Connect and can occur when the target cluster has restarted or crashed.
The server maintains a session ID, which it sends to the client in every response. When the client receives its first RPC response from the server, it records the server session ID and thereafter throws an error if it ever receives a different one.
If the server crashes, restarts, or otherwise loses its session state, the client sees a new session ID and the error is thrown.
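The client-side check can be sketched in plain Python. This is a simplified illustration of the mechanism, not Spark Connect's actual implementation; the class and method names here are hypothetical.

```python
class SessionValidationError(Exception):
    """Raised when the server's session ID changes mid-conversation."""


class ConnectClientSketch:
    """Simplified sketch of the Spark Connect client's crash detection.

    The real client records the server session ID from the first RPC
    response and rejects any later response carrying a different ID.
    """

    def __init__(self):
        self._server_session_id = None  # unknown until the first response

    def handle_response(self, server_session_id: str):
        if self._server_session_id is None:
            # First RPC response: record the server's session ID.
            self._server_session_id = server_session_id
        elif self._server_session_id != server_session_id:
            # The server restarted or lost state, so the IDs no longer match.
            raise SessionValidationError(
                "Received incorrect server side session identifier "
                f"({self._server_session_id} != {server_session_id})"
            )


client = ConnectClientSketch()
client.handle_response("52a6f5e0")  # first response: ID recorded
client.handle_response("52a6f5e0")  # same ID: accepted
try:
    client.handle_response("13140493")  # server restarted: new ID
except SessionValidationError as err:
    print("error:", err)
```

Because the recorded server session ID lives in the client, creating a new Spark session from the same notebook state does not clear it, which is why the error persists until the notebook is detached and reattached.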
Solution
You must detach and reattach the notebook to serverless compute to reset the client state. To do so in Databricks, follow these steps:
- Click the cluster dropdown menu in the notebook toolbar.
- Hover over the attached cluster in the list to display a side menu.
- Click Detach & re-attach from the menu options.