Runbook

Apache Spark Out of Memory Error incident.


Overview

Apache Spark Out of Memory Error is an incident that occurs when a Spark application cannot allocate enough memory to perform its tasks. This can happen when the host's physical memory is insufficient, when the JVM heap allotted to the driver or executors is too small for the workload, or when a memory leak in the application code gradually exhausts available memory. When this incident occurs, the application may crash with `java.lang.OutOfMemoryError`, become unresponsive, or produce incorrect results. Diagnose and resolve the issue promptly to keep the application running smoothly.

Parameters

Debug

Check available memory
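A quick sketch of this check, assuming a Linux host (the `free` and `vmstat` flags below are Linux-specific):

```shell
# Show total, used, and available memory plus swap, in human-readable units.
free -h

# Paging/swap activity over a few samples adds context; vmstat may be absent
# on minimal images, so failures are tolerated here.
vmstat 1 3 2>/dev/null || true
```

Low `available` memory combined with nonzero swap-in/swap-out columns in `vmstat` suggests the node is under genuine memory pressure rather than just holding page cache.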

Check system logs for any relevant error messages
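When the kernel's OOM killer terminates a process, it records it in the kernel log. A minimal sketch, assuming a systemd-based Linux host (`journalctl` and `dmesg` availability varies by distribution):

```shell
# Kernel OOM-killer messages look like "Out of memory: Killed process <pid>".
dmesg -T 2>/dev/null | grep -i "out of memory" || true

# On systemd hosts, the kernel journal covers the same events with timestamps.
journalctl -k --since "1 hour ago" 2>/dev/null | grep -iE "oom|out of memory" || true
```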

Check Spark logs for any relevant error messages
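A sketch of scanning Spark logs for JVM memory errors. The log directory is an assumption (`$SPARK_HOME/logs` by default here); on a cluster the relevant logs may instead live in YARN container directories or the Spark history server.

```shell
# Search recursively for the two most common JVM OOM signatures.
grep -RniE "java\.lang\.OutOfMemoryError|GC overhead limit exceeded" \
  "${SPARK_HOME:-/opt/spark}/logs" 2>/dev/null || true
```

`Java heap space` failures usually point at undersized executor/driver memory, while `GC overhead limit exceeded` indicates the JVM is spending nearly all its time in garbage collection.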

Check Spark configuration settings
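The memory-related Spark settings worth inspecting are `spark.driver.memory`, `spark.executor.memory`, and the unified-memory tuning knobs `spark.memory.fraction` and `spark.memory.storageFraction`. A sketch, assuming settings live in `spark-defaults.conf` (they may also be passed per job via `spark-submit --conf`):

```shell
# Print memory-related settings from spark-defaults.conf, if present.
grep -E "spark\.(driver|executor)\.memory|spark\.memory\.(fraction|storageFraction)" \
  "${SPARK_HOME:-/opt/spark}/conf/spark-defaults.conf" 2>/dev/null || true
```

No output means the defaults are in effect (e.g. 1g for `spark.executor.memory`), which is often too small for production workloads.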

Check for any running processes that may be consuming memory
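A sketch of listing the heaviest memory consumers, assuming procps `ps` on Linux (the `--sort` flag differs on macOS/BSD):

```shell
# Top 10 processes by resident memory share, plus the header row.
ps aux --sort=-%mem | head -n 11
```

The `RSS` column (resident set size, in KiB) is the most useful figure here; `%MEM` expresses the same value as a share of physical memory.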

Check for any Java processes that may be consuming memory
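Spark drivers and executors run as JVMs, so isolating Java processes narrows the search. A sketch; `jps` requires a JDK on the `PATH`, while plain `ps` works without one:

```shell
# List Java processes with their resident memory (RSS, KiB) and full command line.
# The '[j]ava' bracket trick keeps grep from matching its own process entry.
ps -eo pid,rss,comm,args | grep -i '[j]ava' || true

# With a JDK installed, jps -lv also shows each JVM's -Xmx heap flag:
jps -lv 2>/dev/null || true
```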

Check for any other system or application processes that may be consuming memory
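Memory can also be consumed outside ordinary processes, by page cache, shared memory, and tmpfs mounts. A sketch of accounting for those, assuming a Linux host with `/proc` mounted:

```shell
# Kernel-level memory accounting: total, truly available, cache, shared memory.
grep -E "MemTotal|MemAvailable|Cached|Shmem" /proc/meminfo

# tmpfs filesystems (e.g. /dev/shm, Spark local dirs on RAM disk) count
# against physical memory too.
df -h -t tmpfs 2>/dev/null || true
```

A large gap between `MemTotal` and `MemAvailable` that is not explained by process RSS usually points at cache, tmpfs, or shared-memory usage.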

Repair

Increase the memory available to the application: add physical memory to the server in use, or raise the Spark driver and executor memory settings so the JVM heap matches the workload.
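If adding hardware is not an option, the Spark-side allocation can be raised in configuration. A sketch of a `spark-defaults.conf` fragment; the values are illustrative and should be sized to your data volume and cluster capacity:

```
spark.driver.memory      8g
spark.executor.memory    8g
spark.memory.fraction    0.6
```

The same settings can be applied per job with `spark-submit --driver-memory 8g --executor-memory 8g`. Restart or resubmit the application for the new limits to take effect.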

