Hi Harsh,

I need to take better care of my eyes: I misread 1.2.0 as 1.0.2, which is why
I said upgrade. Sorry.
On Tue, Jun 4, 2013 at 9:46 AM, Harsh J <[EMAIL PROTECTED]> wrote:

> Azuryy,
>
> 1.1.2 < 1.2.0. It's not an upgrade you're suggesting there. If you feel
> there's been a regression, can you comment on the JIRA?
>
> On Tue, Jun 4, 2013 at 6:57 AM, Azuryy Yu <[EMAIL PROTECTED]> wrote:
> > Yes. Hadoop 1.1.2 was released on Jan. 31st; just download it.
> >
> >
> > On Tue, Jun 4, 2013 at 6:33 AM, Lanati, Matteo <[EMAIL PROTECTED]> wrote:
> >>
> >> Hi Azuryy,
> >>
> >> thanks for the update. Sorry for the silly question, but where can I
> >> download the patched version?
> >> If I look at the closest mirror (i.e.
> >> http://mirror.netcologne.de/apache.org/hadoop/common/), I can see that
> >> the Hadoop 1.1.2 version was last updated on Jan. 31st.
> >> Thanks in advance,
> >>
> >> Matteo
> >>
> >> PS: just to confirm, I tried a minimal Hadoop 1.2.0 setup without
> >> any security, and the problem is still there.
> >>
> >> On Jun 3, 2013, at 3:02 PM, Azuryy Yu <[EMAIL PROTECTED]> wrote:
> >>
> >> > Can you upgrade to 1.1.2, which is also a stable release and fixes
> >> > the bug you are facing now?
> >> >
> >> > --Send from my Sony mobile.
> >> >
> >> > On Jun 2, 2013 3:23 AM, "Shahab Yunus" <[EMAIL PROTECTED]> wrote:
> >> > Thanks, Harsh, for the reply. I was confused too about why security
> >> > is causing this.
> >> >
> >> > Regards,
> >> > Shahab
> >> >
> >> >
> >> > On Sat, Jun 1, 2013 at 12:43 PM, Harsh J <[EMAIL PROTECTED]> wrote:
> >> > Shahab - I see he has mentioned generally that security is enabled
> >> > (but not that it happens iff security is enabled), and the issue here
> >> > doesn't have anything to do with security really.
> >> >
> >> > Azuryy - Let's discuss the code issues on the JIRA (instead of here) or
> >> > on the mapreduce-dev lists.
> >> >
> >> > On Sat, Jun 1, 2013 at 10:05 PM, Shahab Yunus <[EMAIL PROTECTED]> wrote:
> >> > > Hi Harsh,
> >> > >
> >> > > Quick question though: why do you think it only happens if the OP
> >> > > 'uses security' as he mentioned?
> >> > >
> >> > > Regards,
> >> > > Shahab
> >> > >
> >> > >
> >> > > On Sat, Jun 1, 2013 at 11:49 AM, Harsh J <[EMAIL PROTECTED]> wrote:
> >> > >>
> >> > >> It does smell like a bug, as that number you get is simply
> >> > >> Long.MAX_VALUE, or 8 exbibytes.
> >> > >>
> >> > >> Looking at the sources, this turns out to be a rather funny Java
> >> > >> issue (there's a divide by zero happening, and [1] explains the
> >> > >> Long.MAX_VALUE return in such a case). I've logged a bug report for
> >> > >> this at https://issues.apache.org/jira/browse/MAPREDUCE-5288 with a
> >> > >> reproducible case.
> >> > >>
> >> > >> Does this happen consistently for you?
> >> > >>
> >> > >> [1] http://docs.oracle.com/javase/6/docs/api/java/lang/Math.html#round(double)
> >> > >>
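
To see how a divide by zero can surface as 9223372036854775807: dividing a
positive double by 0.0 in Java yields positive infinity rather than throwing,
and, per [1] above, Math.round(double) clamps positive infinity to
Long.MAX_VALUE. A minimal standalone sketch of that behavior (the variable
names and numbers are illustrative only, not taken from the Hadoop sources):

    public class RoundClampDemo {
        public static void main(String[] args) {
            // Floating-point division by zero does not throw in Java; it yields infinity.
            double completedInputSize = 0.0;                  // hypothetical zero denominator
            double estimate = 600000.0 / completedInputSize;  // +Infinity (numerator is illustrative)

            // Math.round(double) clamps positive infinity to Long.MAX_VALUE (see [1]).
            long rounded = Math.round(estimate);

            System.out.println(estimate);                     // Infinity
            System.out.println(rounded);                      // 9223372036854775807
            System.out.println(rounded == Long.MAX_VALUE);    // true
        }
    }

An integer division by zero would have thrown an ArithmeticException instead;
the silent clamp to Long.MAX_VALUE only happens on the double path.
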
> >> > >> On Sat, Jun 1, 2013 at 7:27 PM, Lanati, Matteo <[EMAIL PROTECTED]> wrote:
> >> > >> > Hi all,
> >> > >> >
> >> > >> > I stumbled upon this problem as well while trying to run the
> >> > >> > default wordcount shipped with Hadoop 1.2.0. My testbed is made up
> >> > >> > of 2 virtual machines: Debian 7, Oracle Java 7, 2 GB RAM, 25 GB
> >> > >> > hard disk. One node is used as JT+NN, the other as TT+DN. Security
> >> > >> > is enabled. The input file is about 600 kB and the error is
> >> > >> >
> >> > >> > 2013-06-01 12:22:51,999 WARN org.apache.hadoop.mapred.JobInProgress: No room for map task. Node 10.156.120.49 has 22854692864 bytes free; but we expect map to take 9223372036854775807
> >> > >> >
> >> > >> > The logfile is attached, together with the configuration files.
> >> > >> > The version I'm using is
> >> > >> >
> >> > >> > Hadoop 1.2.0
> >> > >> > Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2
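
To make the symptom concrete: with a per-map estimate of Long.MAX_VALUE, a
free-space comparison of the kind the "No room for map task" warning above
reports can never succeed on any node. A hypothetical sketch of such a check
(illustrative names; not the actual JobInProgress code):

    public class NoRoomForMapDemo {
        // Hypothetical helper, named for illustration only.
        static boolean hasRoomForMap(long freeBytesOnNode, long estimatedMapOutputBytes) {
            return freeBytesOnNode >= estimatedMapOutputBytes;
        }

        public static void main(String[] args) {
            long freeBytes = 22854692864L;     // free space reported in the warning above
            long estimate = Long.MAX_VALUE;    // 9223372036854775807, the bogus per-map estimate

            // Prints false; no node can ever satisfy the check, so no map task is scheduled.
            System.out.println(hasRoomForMap(freeBytes, estimate));
        }
    }
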