Re: Need your help with Hadoop
You'd probably want to recheck your configuration of dfs.data.dir on
node16 (perhaps it's overriding the usual default) to see if it includes
more directories than normal. Those directories may also all be on the
same disks; the DN counts space via du/df on each configured directory,
so the reported capacity can grow that way.

Also, please direct usage questions to the [EMAIL PROTECTED] community
list, which I've included in my response :)
On Tue, Mar 19, 2013 at 5:40 PM, 姚吉龙 <[EMAIL PROTECTED]> wrote:

> Hi
>
> I am new to the Hadoop platform, and I really need your help.
> We have 32 datanodes available, but we find that the Configured
> Capacity differs among these datanodes even though the hardware is the
> same. I wonder why node16's capacity is so much bigger than the
> others', and which factor or directory determines the capacity of each
> datanode.
>
>
> I will appreciate your kind help; this problem has puzzled me for a
> long time.
>
> BRs
> Geelong
>
> --
> From Good To Great
>

--
Harsh J
Harsh J 2013-03-22, 05:04