Tom! I would guess that just giving the NN JVM lots of memory (64GB / 96GB) should be the easiest way.
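For example, the NameNode heap is usually raised via HADOOP_NAMENODE_OPTS in hadoop-env.sh; the sizes below are illustrative, pick whatever your machine can spare:

```shell
# hadoop-env.sh (sketch -- adjust -Xmx to your available RAM)
# Setting -Xms equal to -Xmx avoids heap-resize pauses during the long replay.
export HADOOP_NAMENODE_OPTS="-Xms64g -Xmx64g ${HADOOP_NAMENODE_OPTS}"
```

Once the replay finishes and a checkpoint is written, you can dial the heap back down to a normal size.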
From: Tom Brown <[EMAIL PROTECTED]>
To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
Sent: Wednesday, September 25, 2013 11:29 AM
Subject: Is there any way to partially process HDFS edits?
I have an edits file on my namenode that is 35GB. This is quite a bit larger than it should be: the secondary namenode wasn't running for some time, and HBASE-9648 caused a huge number of additional edits.
The first time I tried to start the namenode, it chewed on the edits for about 4 hours and then ran out of memory. I have since increased the memory available to the namenode (was 512MB, now 2GB) and started the process again.
Is there any way the edits file can be partially processed, so I don't have to re-process the same edits over and over until I can allocate enough memory to finish in one shot?
How long should it take (hours? days?) to process an edits file of that size?
Any help is appreciated!