
Spark Issue: Too Many Containers Asked


Error Message

org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Too many containers asked, 16731530.


Possible Causes

"Too many containers asked" is a protection mechanism of the Resource Manager. It might be triggered when dynamic allocation is enabled.

Solutions

Generally speaking, it is a good idea to turn on dynamic allocation. However, there are issues in YARN/Spark which can cause a Spark application to request far too many containers. One simple fix is to restrict the maximum number of executors, e.g., via spark-submit flags:

    ...
    --conf spark.dynamicAllocation.enabled=true \
    --conf spark.dynamicAllocation.maxExecutors=1000 \
    ...
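The same cap can also be set when building the Spark session programmatically. Below is a minimal PySpark sketch, not taken from the original post: the app name is hypothetical, the value 1000 simply mirrors the flag above rather than being a recommended default, and the external shuffle service setting is included because dynamic allocation on YARN typically requires it (newer Spark versions can use shuffle tracking instead).

    from pyspark.sql import SparkSession

    # Sketch: cap dynamic allocation so Spark never asks YARN for an
    # unbounded number of containers. Tune maxExecutors to your cluster.
    spark = (
        SparkSession.builder
        .appName("capped-dynamic-allocation")  # hypothetical app name
        .config("spark.dynamicAllocation.enabled", "true")
        # External shuffle service is typically required for dynamic
        # allocation on YARN (or use shuffle tracking on Spark 3+).
        .config("spark.shuffle.service.enabled", "true")
        .config("spark.dynamicAllocation.maxExecutors", "1000")
        .getOrCreate()
    )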
