MapReduce >> mail # user >> Is there any way to partially process HDFS edits?

I have an edits file on my namenode that is 35GB. This is quite a bit
larger than it should be (the secondary namenode wasn't running for some
time, and HBASE-9648 caused a huge number of additional edits).

The first time I tried to start the namenode, it chewed on the edits for
about 4 hours and then ran out of memory. I have increased the memory
available to the namenode (was 512MB, now 2GB) and restarted the process.
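For reference, the heap increase was done along these lines (a sketch of a typical hadoop-env.sh change; the exact `-Xmx` value and surrounding options will vary by installation):

```shell
# hadoop-env.sh -- raise the NameNode JVM heap from its old 512MB setting.
# HADOOP_NAMENODE_OPTS is the standard hook for NameNode-specific JVM flags;
# adjust -Xmx to whatever RAM the host can spare.
export HADOOP_NAMENODE_OPTS="-Xmx2g ${HADOOP_NAMENODE_OPTS}"
```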

Is there any way that the edits file can be partially processed to avoid
having to re-process the same edits over and over until I can allocate
enough memory for it to be done in one shot?

How long should it take (hours? days?) to process an edits file of that
size?
Any help is appreciated!