Rita 2012-05-11, 10:42
Michael Segel 2012-05-11, 11:29
Re: large machine configuration
Rita 2012-05-11, 11:51
Most of the operations I do with MR are exporting and importing tables.
Does that still require a lot of memory, and does it help to allocate more
memory for jobs like that?
Yes, I have 12 cores also. Are there any HDFS/MR/HBase tuning tips for this?
BTW, 64GB is a lot for us :-)
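To make the 4GB-per-core rule of thumb from the thread concrete, here is a back-of-the-envelope memory budget for a 12-core, 64GB node. The daemon heap sizes and OS headroom are illustrative assumptions, not measured values:

```python
# Back-of-the-envelope memory budget for one 12-core, 64 GB Hadoop/HBase node.
# The 4 GB-per-core figure and the 8 GB HBase heap follow the thread; the
# daemon sizes and OS headroom are illustrative assumptions.

total_ram_gb = 64
cores = 12

mr_budget_gb = 4 * cores   # ~4 GB per core reserved for MapReduce tasks
hbase_heap_gb = 8          # RegionServer heap, as suggested in the thread
datanode_gb = 1            # DataNode daemon heap (assumed)
tasktracker_gb = 1         # TaskTracker daemon heap (assumed)
os_headroom_gb = 4         # OS / page cache headroom (assumed)

committed = (mr_budget_gb + hbase_heap_gb + datanode_gb
             + tasktracker_gb + os_headroom_gb)

print(f"MapReduce budget: {mr_budget_gb} GB")
print(f"Total committed:  {committed} GB of {total_ram_gb} GB")
```

With all 12 task slots at 4GB this commits 62 of 64GB, leaving almost nothing spare, which is why on a box like this you would normally configure fewer or smaller task slots rather than one per core.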
On Fri, May 11, 2012 at 7:29 AM, Michael Segel <[EMAIL PROTECTED]> wrote:
> Funny, but this is part of a talk that I submitted to Strata....
> 64GB and HBase isn't necessarily a 'large machine'.
> If you're running with 12 cores, you're talking about a minimum of 48GB just
> for M/R.
> (4GB a core is a good rule of thumb.)
> Depending on what you want to do, you could set aside 8GB of heap and tune
> that, but even that might not be enough...
> On May 11, 2012, at 5:42 AM, Rita wrote:
> > Hello,
> > While looking at,
> > I noticed the large machine configuration section still isn't completed.
> > "Unfortunately", I am running on a large machine which has 64GB of memory,
> > so I would need some help tuning my HBase/Hadoop instance for
> > maximum performance. Can someone please shed light on what I should look
> > into?
> > --
> > --- Get your facts first, then you can distort them as you please.--
--- Get your facts first, then you can distort them as you please.--
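As a concrete starting point for the heap sizes discussed in this thread, a Hadoop 1.x / HBase-era configuration sketch might look like the following. Every value here is an illustrative assumption to be tuned for the workload, not a tested recommendation:

```shell
# hbase-env.sh: 8 GB RegionServer heap, as suggested in the thread.
export HBASE_HEAPSIZE=8192

# hadoop-env.sh: heap for Hadoop daemons (DataNode/TaskTracker); 1 GB assumed.
export HADOOP_HEAPSIZE=1024

# In mapred-site.xml (not shown): give each task child JVM ~4 GB via
#   mapred.child.java.opts = -Xmx4096m
# and size mapred.tasktracker.map.tasks.maximum /
# mapred.tasktracker.reduce.tasks.maximum so that
# (map slots + reduce slots) * 4 GB stays comfortably under total RAM.
```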