HBase user mailing list: Why so many unexpected files like partitions_xxxx are created?


Re: Why so many unexpected files like partitions_xxxx are created?
From the stack trace posted I saw:

org.apache.commons.logging.impl.Log4JLogger.error(Log4JLogger.java:257)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.tryAtomicRegionLoad(LoadIncrementalHFiles.java:577)

Assuming 0.94 is in use, line 577 at the tip of the 0.94 branch is:
        LOG.warn("Attempt to bulk load region containing "
            + Bytes.toStringBinary(first) + " into table "

But the following should be the corresponding lines with respect to the stack trace:
    } catch (IOException e) {
      LOG.error("Encountered unrecoverable error from region server", e);
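
To make the mapping concrete, here is a minimal, self-contained Java sketch of
the pattern described above, assuming commons-logging (with log4j as its
backend) on the classpath. This is not the HBase source: the class name, the
stand-in callRegionServer() helper and the main() driver are invented for
illustration. It only shows why Log4JLogger.error(...) appears directly above
tryAtomicRegionLoad(...) in the posted stack trace: the IOException coming back
from the region server call is caught and logged through commons-logging.

import java.io.IOException;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

// Illustration only -- not HBase source.
public class BulkLoadLoggingSketch {
  private static final Log LOG = LogFactory.getLog(BulkLoadLoggingSketch.class);

  // Stand-in for LoadIncrementalHFiles.tryAtomicRegionLoad(); the body is invented.
  static void tryAtomicRegionLoad() throws IOException {
    try {
      callRegionServer(); // stand-in for the real bulk-load RPC to the region server
    } catch (IOException e) {
      // The shape of the line quoted above (LoadIncrementalHFiles.java:577 in 0.94):
      // logging here is what puts Log4JLogger.error() on top of tryAtomicRegionLoad()
      // in the stack trace.
      LOG.error("Encountered unrecoverable error from region server", e);
      throw e; // rethrowing is part of the sketch, not necessarily what HBase does
    }
  }

  // Hypothetical helper that simulates the region server failing the request.
  static void callRegionServer() throws IOException {
    throw new IOException("simulated region server failure");
  }

  public static void main(String[] args) {
    try {
      tryAtomicRegionLoad();
    } catch (IOException expected) {
      // The interesting output is the ERROR line and stack trace printed by log4j.
    }
  }
}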

Tao:
Can you check the LoadIncrementalHFiles log to see what error came back from
the region server?

As Jieshan said, checking the region server log should reveal something.

Cheers
On Tue, Dec 17, 2013 at 10:40 PM, Bijieshan <[EMAIL PROTECTED]> wrote:

> It seems LoadIncrementalHFiles is still running. Can you run "jstack" on one
> RegionServer process as well?
>
> Which version are you using?
>
> Jieshan.
> -----Original Message-----
> From: Tao Xiao [mailto:[EMAIL PROTECTED]]
> Sent: Wednesday, December 18, 2013 1:49 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Why so many unexpected files like partitions_xxxx are created?
>
> I ran jstack on one such process and saw the following output in the
> terminal. I guess this tells us that the processes started by the
> "LoadIncrementalHFiles" command never exit. Why didn't they exit after they
> finished running?
>
> ... ...
> ... ...
>
> "LoadIncrementalHFiles-0.LruBlockCache.EvictionThread" daemon prio=10
> tid=0x000000004129c000 nid=0x2186 in Object.wait() [0x00007f53f3665000]
>    java.lang.Thread.State: WAITING (on object monitor)
>     at java.lang.Object.wait(Native Method)
>     - waiting on <0x000000075fcf3370> (a org.apache.hadoop.hbase.io.hfile.LruBlockCache$EvictionThread)
>     at java.lang.Object.wait(Object.java:485)
>     at org.apache.hadoop.hbase.io.hfile.LruBlockCache$EvictionThread.run(LruBlockCache.java:631)
>     - locked <0x000000075fcf3370> (a org.apache.hadoop.hbase.io.hfile.LruBlockCache$EvictionThread)
>     at java.lang.Thread.run(Thread.java:662)
>
>    Locked ownable synchronizers:
>     - None
>
> "LoadIncrementalHFiles-3" prio=10 tid=0x00007f540ca55800 nid=0x2185
> runnable [0x00007f53f3765000]
>    java.lang.Thread.State: RUNNABLE
>     at java.io.FileOutputStream.writeBytes(Native Method)
>     at java.io.FileOutputStream.write(FileOutputStream.java:282)
>     at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
>     - locked <0x0000000763e5af70> (a java.io.BufferedOutputStream)
>     at java.io.PrintStream.write(PrintStream.java:430)
>     - locked <0x0000000763d5b670> (a java.io.PrintStream)
>     at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:202)
>     at sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:263)
>     at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:106)
>     - locked <0x0000000763d6c6d0> (a java.io.OutputStreamWriter)
>     at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:116)
>     at java.io.OutputStreamWriter.write(OutputStreamWriter.java:203)
>     at java.io.Writer.write(Writer.java:140)
>     at org.apache.log4j.helpers.QuietWriter.write(QuietWriter.java:48)
>     at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:317)
>     at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
>     at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
>     - locked <0x0000000763d5fb90> (a org.apache.log4j.ConsoleAppender)
>     at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
>     at org.apache.log4j.Category.callAppenders(Category.java:206)
>     - locked <0x0000000763d65fe8> (a org.apache.log4j.spi.RootLogger)
>     at org.apache.log4j.Category.forcedLog(Category.java:391)
>     at org.apache.log4j.Category.log(Category.java:856)
>     at org.apache.commons.logging.impl.Log4JLogger.error(Log4JLogger.java