
Spark Issue: java.lang.OutOfMemoryError

Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!

Symptom

The job fails with java.lang.OutOfMemoryError (e.g., "Java heap space" or "GC overhead limit exceeded").

Cause

java.lang.OutOfMemoryError is thrown when the JVM cannot allocate new objects because the heap is exhausted (or, in the "GC overhead limit exceeded" variant, because the JVM is spending almost all of its time in garbage collection while reclaiming very little memory).

Solution

Increase executor memory.

:::bash
--executor-memory=20G
:::

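As a fuller sketch, the flag above is passed to spark-submit; the application file name and the 20G value below are placeholder assumptions, to be adjusted for your cluster.

```bash
# Hypothetical spark-submit invocation; my_job.py and the sizes are examples.
# --executor-memory sets the heap of each executor JVM
# (equivalent to the spark.executor.memory configuration property).
spark-submit \
  --executor-memory 20G \
  --driver-memory 4G \
  my_job.py
```

If the error occurs on the driver rather than the executors (e.g., after a large collect), increasing `--driver-memory` is the relevant knob instead.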
Reference:

http://stackoverflow.com/questions/27462061/why-does-spark-fail-with-java-lang-outofmemoryerror-gc-overhead-limit-exceeded