HBase, mail # user - HBase master fails at deleting a file but won't tell me which one and retries forever


Chip Salzenberg 2013-06-16, 17:44
Re: HBase master fails at deleting a file but won't tell me which one and retries forever
Ted Yu 2013-06-16, 20:19
Here is related code:

        return fs.delete(dir, false);
      } catch (IOException ioe) {
        lastIOE = ioe;
        if (!fs.exists(dir)) return true;
        // dir is there, retry deleting after some time.
        sleepBeforeRetry("Delete File", i + 1);

It looks like the dir parameter should have been included in the log message.
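As an illustration of that point, here is a minimal, self-contained sketch (plain Java NIO, no HBase dependency; the name deleteWithRetries and the message format are hypothetical) of the same retry pattern with the target path included in the message, so the operator can see which file is stuck:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class DeleteWithRetries {
    static final int MAX_RETRIES = 10;
    static final long BASE_SLEEP_MS = 1000;

    // Hypothetical sketch: same retry shape as the HBase excerpt above,
    // but the failing path is part of every log line.
    static boolean deleteWithRetries(Path dir) throws InterruptedException {
        IOException lastIOE = null;
        for (int i = 0; i < MAX_RETRIES; i++) {
            try {
                return Files.deleteIfExists(dir);
            } catch (IOException ioe) {
                lastIOE = ioe;
                // If the path is already gone, treat the delete as done.
                if (!Files.exists(dir)) return true;
                // Name the path and the cause, not just "Delete File".
                System.out.println("Delete File " + dir + " failed (attempt "
                        + (i + 1) + "): " + ioe + "; sleeping "
                        + BASE_SLEEP_MS * (i + 1) + " ms");
                Thread.sleep(BASE_SLEEP_MS * (i + 1));
            }
        }
        System.out.println("Delete File " + dir
                + ": retries exhausted; last error: " + lastIOE);
        return false;
    }

    public static void main(String[] args) throws Exception {
        // Demo: create a temp file and delete it; succeeds on the first try.
        Path tmp = Files.createTempFile("delete-demo", ".tmp");
        System.out.println(deleteWithRetries(tmp)); // prints "true"
    }
}
```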

Did you find any IOException near those entries in the master log?

Cheers

On Sun, Jun 16, 2013 at 10:44 AM, Chip Salzenberg <[EMAIL PROTECTED]> wrote:

> My cluster's master log contains this infinite repetition:
>
> 2013-06-16 10:38:13,651 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 1
> 2013-06-16 10:38:14,654 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 2
> 2013-06-16 10:38:16,657 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 3
> 2013-06-16 10:38:19,659 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 4
> 2013-06-16 10:38:23,662 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 5
> 2013-06-16 10:38:28,665 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 6
> 2013-06-16 10:38:34,668 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 7
> 2013-06-16 10:38:41,671 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 8
> 2013-06-16 10:38:49,673 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 9
> 2013-06-16 10:38:58,676 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 10
> 2013-06-16 10:39:08,681 WARN org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, retries exhausted
> 2013-06-16 10:39:13,879 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 1
> 2013-06-16 10:39:14,882 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 2
> 2013-06-16 10:39:16,885 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 3
> 2013-06-16 10:39:19,888 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 4
> 2013-06-16 10:39:23,891 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 5
> 2013-06-16 10:39:28,894 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 6
> 2013-06-16 10:39:34,897 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 7
> 2013-06-16 10:39:41,900 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 8
> 2013-06-16 10:39:49,903 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 9
> 2013-06-16 10:39:58,905 INFO org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, sleeping 1000 times 10
> 2013-06-16 10:40:08,910 WARN org.apache.hadoop.hbase.HBaseFileSystem:
> Delete File, retries exhausted
>
> What is it trying to do?  And if it keeps saying "retries exhausted", why
> does it keep retrying?  This is
> hbase-0.94.2+218-1.cdh4.2.1.p0.8~lucid-cdh4.2
>
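For what it's worth, the intervals between the quoted log entries (1 s, 2 s, 3 s, ...) are consistent with a linear backoff where attempt i sleeps i × 1000 ms, so one full cycle of 10 retries spends 55 seconds before "retries exhausted" is logged and the caller starts over. A small sketch of that arithmetic (the class name is illustrative, not from HBase):

```java
public class BackoffSchedule {
    public static void main(String[] args) {
        final long baseMs = 1000; // matches the "sleeping 1000" in the log
        long totalMs = 0;
        for (int attempt = 1; attempt <= 10; attempt++) {
            totalMs += baseMs * attempt;
            System.out.println("attempt " + attempt + ": sleep "
                    + baseMs * attempt + " ms");
        }
        // 1000 + 2000 + ... + 10000 = 55000 ms per cycle
        System.out.println("total per cycle: " + totalMs + " ms");
    }
}
```

That matches the timestamps: each cycle runs from e.g. 10:38:13 to the 10:39:08 "retries exhausted" line, about 55 seconds, before the next cycle begins.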
Jean-Marc Spaggiari 2013-06-17, 12:10