RE: Total Space Available on Hadoop Cluster Or Hadoop version of "df".
Jonathan Gray 2010-10-03, 04:32
There is a ton of documentation available for Hadoop (including books).
Best place to start is the wiki: http://wiki.apache.org/hadoop/
On your specific issue, you need to configure Hadoop to tell it which directories to use for storing data.
The configuration parameter is 'dfs.data.dir', and it takes a comma-delimited list of directories to use for data storage.
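A minimal sketch of the relevant configuration, assuming the standard hdfs-site.xml layout used by Hadoop 0.20.x; the mount points /data1 and /data2 are illustrative placeholders, not paths from this thread:

```xml
<?xml version="1.0"?>
<!-- hdfs-site.xml: tells each DataNode which local directories to use
     for block storage. The paths below are placeholders. -->
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <!-- comma-delimited list; the DataNode spreads blocks across all of them -->
    <value>/data1/hadoop/dfs/data,/data2/hadoop/dfs/data</value>
  </property>
</configuration>
```

After restarting the DataNodes, `hadoop dfsadmin -report` (as Glenn suggests further down the thread) should report the combined capacity of all listed directories.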
> -----Original Message-----
> From: rahul [mailto:[EMAIL PROTECTED]]
> Sent: Saturday, October 02, 2010 9:53 AM
> To: [EMAIL PROTECTED]
> Subject: Re: Total Space Available on Hadoop Cluster Or Hadoop version
> of "df".
> Hi Marcos,
> Same thing is happening for me as well.
> I have multiple disks mounted on my system, but by default, when I
> formatted, HDFS used only the disk on which the Hadoop binary is
> present.
> Is there a way to format all the drives mounted on my system?
> So can we control in some way which drives or locations are used
> for HDFS?
> On Oct 2, 2010, at 7:39 AM, Marcos Pinto wrote:
> > I got the same problem; I remember it was something related to
> > partitioning.
> > For example, I created a hadoop user, so HDFS took the partition
> > closest to that user.
> > I don't remember exactly, but it was something like that. I hope it
> > helps you in some way.
> > On Sat, Oct 2, 2010 at 2:13 AM, Glenn Gore <[EMAIL PROTECTED]> wrote:
> >> hadoop dfsadmin -report
> >> Regards
> >> Glenn
> >> -----Original Message-----
> >> From: rahul [mailto:[EMAIL PROTECTED]]
> >> Sent: Sat 10/2/2010 2:27 PM
> >> To: [EMAIL PROTECTED]
> >> Subject: Total Space Available on Hadoop Cluster Or Hadoop version
> >> of "df".
> >> Hi,
> >> I am using Hadoop version 0.20.2 for data processing, with a
> >> cluster set up on two nodes.
> >> And I am continuously adding more space to the nodes.
> >> Can somebody let me know how to get the total space available on
> >> the Hadoop cluster using the command line,
> >> or
> >> a Hadoop equivalent of the Unix "df" command?
> >> Any input is helpful.
> >> Thanks
> >> Rahul