Out of memory error

How to overcome out of memory error in Hadoop?

Hi Akriti, most of the time DEB packages take priority over other settings because they install the Hadoop configuration files into /etc/hadoop.
It is best to set the maximum Java heap size for Hadoop in /etc/hadoop/hadoop-env.sh.
This can be done by a simple command:
export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"
Hope this helps you.
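As a quick sketch of how you might apply this, the snippet below appends the heap setting to hadoop-env.sh and verifies it took effect. It uses a temp file as a stand-in for /etc/hadoop/hadoop-env.sh (the usual DEB-package location), so adjust the path for your install:

```shell
# Stand-in for /etc/hadoop/hadoop-env.sh -- replace with the real path
HADOOP_ENV="$(mktemp)"

# Raise the client JVM's maximum heap to 2 GB, keeping any existing options
echo 'export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"' >> "$HADOOP_ENV"

# Source the file and confirm the option is now set
. "$HADOOP_ENV"
echo "$HADOOP_CLIENT_OPTS"
```

After editing the real file, restart your Hadoop daemons (or re-run your client command) so the new heap limit is picked up.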