Subject: Re: Problems

Hi Jean-Marc,

The Linux machine on which I am attempting to get Hadoop running is
actually Linux running in a VM partition. The VM partition had 2 GB of RAM
when I first encountered the problem. The allocation has since been bumped
up to 6 GB, but the problem persists, i.e.

bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'

still crashes out as before.

Is there a minimum RAM size requirement?
Will Hadoop run correctly on Linux in a VM partition?
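
For reference, Standalone mode runs the whole example in a single local
JVM, so either allocation should normally be ample, and running Hadoop
inside a VM is common. One quick check is whether the guest actually sees
the new allocation (a sketch):

free -m       # "Mem:" total should be roughly 6000 MB after the bump
ulimit -v     # any virtual-memory cap that could constrain the JVM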

I had attempted to run Hadoop in Pseudo-Distributed Operation mode, which
involved modifying the conf/core-site.xml, conf/hdfs-site.xml and
conf/mapred-site.xml files as per the Quick Start instructions. I also
formatted a new distributed filesystem as per the instructions. To re-test
in Standalone mode with 6 GB of RAM, I reversed the changes to the above
three .xml files in conf/. However, I don't see a way to back out the
distributed filesystem. Will the existence of this distributed filesystem
interfere with my Standalone tests?
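
With the default configuration, the formatted filesystem is just files
under hadoop.tmp.dir on the local disk, and Standalone mode reads and
writes the local filesystem directly, so it should not interfere. To back
it out anyway, a minimal sketch (assuming hadoop.tmp.dir and dfs.name.dir
were never overridden, so the data landed under the 1.0.x default of
/tmp/hadoop-<username>):

bin/stop-all.sh            # stop any daemons left from the pseudo-distributed run
rm -rf /tmp/hadoop-$USER   # remove the formatted namenode/datanode data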


Sean Hudson

-----Original Message-----
From: Jean-Marc Spaggiari
Sent: Friday, January 18, 2013 3:24 PM
Subject: Re: Problems

Hi Sean,

It's strange; you should not be hitting that. I have seen the same kind of
issues on a desktop with memory errors. Can you install memtest86 and fully
test your memory (one pass is enough) to make sure you don't have issues on
that side?
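
memtest86 normally runs from boot media rather than inside the running OS;
if rebooting the VM is awkward, memtester is a userland alternative that
locks and exercises a block of RAM in place. A minimal sketch (package
names assume a Debian-style distro; size and pass count are illustrative):

sudo apt-get install memtester
sudo memtester 1024M 1     # lock and test 1 GB of RAM for one pass

Note that inside a VM either test only exercises guest memory as mapped by
the hypervisor, so a host-level pass may also be worthwhile.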

2013/1/18, Sean Hudson <[EMAIL PROTECTED]>:
> Leo,
>         I downloaded the suggested 1.6.0_32 Java version to my home
> directory, but I am still experiencing the same problem (see error below).
> The only thing that I have set in my hadoop-env.sh file is the JAVA_HOME
> environment variable. I have also tried it with the Java directory added
> to the PATH:
> export JAVA_HOME=/home/shu/jre1.6.0_32
> export PATH=$PATH:/home/shu/jre1.6.0_32
> Every other environment variable is defaulted.
> Just to clarify, I have tried this in Local Standalone mode and also in
> Pseudo-Distributed Mode with the same result.
> Frustrating to say the least,
> Sean Hudson
> shu@meath-nua:~/hadoop-1.0.4> bin/hadoop jar hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+'
> #
> # A fatal error has been detected by the Java Runtime Environment:
> #
> #  SIGFPE (0x8) at pc=0xb7fc51fb, pid=23112, tid=3075554208
> #
> # JRE version: 6.0_32-b05
> # Java VM: Java HotSpot(TM) Client VM (20.7-b02 mixed mode, sharing
> linux-x86 )
> # Problematic frame:
> # C  [ld-linux.so.2+0x91fb]  double+0xab
> #
> # An error report file with more information is saved as:
> # /home/shu/hadoop-1.0.4/hs_err_pid23112.log
> #
> # If you would like to submit a bug report, please visit:
> #   http://java.sun.com/webapps/bugreport/crash.jsp
> # The crash happened outside the Java Virtual Machine in native code.
> # See problematic frame for where to report the bug.
> #
> Aborted
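
The problematic frame above is in ld-linux.so.2, the dynamic loader, so
the JVM appears to be dying before it even gets going; that points at the
JRE build or a 32-/64-bit mismatch rather than at Hadoop itself. A few
diagnostic commands worth running (a sketch; the paths are the ones from
this thread):

which java
/home/shu/jre1.6.0_32/bin/java -version    # does the new JRE start on its own?
file /home/shu/jre1.6.0_32/bin/java        # 32- or 64-bit executable?
file /lib/ld-linux.so.2                    # loader architecture, for comparison
less /home/shu/hadoop-1.0.4/hs_err_pid23112.log   # full crash details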
> -----Original Message-----
> From: Leo Leung
> Sent: Thursday, January 17, 2013 6:46 PM
> Subject: RE: Problems
> Use Sun/Oracle 1.6.0_32+; the build should be 20.7-b02 or later.
> 1.7 causes failures and, AFAIK, is not supported, but you are free to try
> the latest version and report back.
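
To be sure Hadoop picks up that JRE rather than the system default, the
usual place to pin it in Hadoop 1.x is conf/hadoop-env.sh; a minimal
sketch (the install path is just an example):

# in conf/hadoop-env.sh
export JAVA_HOME=/home/shu/jre1.6.0_32

# then verify which JVM the scripts will actually invoke
$JAVA_HOME/bin/java -version   # should report 1.6.0_32, build 20.7-b02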
> -----Original Message-----
> From: Sean Hudson [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, January 17, 2013 6:57 AM
> Subject: Re: Problems
> Hi,
>       My Java version is
> java version "1.6.0_25"
> Java(TM) SE Runtime Environment (build 1.6.0_25-b06)
> Java HotSpot(TM) Client VM (build 20.0-b11, mixed mode, sharing)
> Would you advise obtaining a later Java version?
> Sean
> -----Original Message-----
> From: Jean-Marc Spaggiari
> Sent: Thursday, January 17, 2013 2:52 PM
> Subject: Re: Problems
> Hi Sean,
> This is an issue with your JVM, not related to Hadoop.
> Which JVM are you using, and can you try with the latest from Sun?