Will you be able to run memtest86 on this VM? Maybe it's an issue
with the way the VM is managing the memory?
I ran HBase+Hadoop on a desktop with only 1.5 GB, so you should not have
any issue with 6 GB.
I don't think the issue you are facing is related to Hadoop. Can you
try to run a simple Java application in your JVM? Something which will
use a lot of memory, and see if it works?
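Something along these lines would do (just a rough sketch, the class
name is made up):

  import java.util.ArrayList;
  import java.util.List;

  // Allocates 64 MB chunks until the heap runs out. A clean
  // OutOfMemoryError here is fine; a native crash (SIGFPE, etc.)
  // would point at the JVM or the VM host rather than Hadoop.
  public class MemoryStress {
      public static void main(String[] args) {
          List<byte[]> chunks = new ArrayList<byte[]>();
          try {
              while (true) {
                  chunks.add(new byte[64 * 1024 * 1024]);
                  System.out.println("Allocated " + (chunks.size() * 64) + " MB");
              }
          } catch (OutOfMemoryError e) {
              chunks.clear();
              System.out.println("Heap exhausted cleanly, JVM looks OK");
          }
      }
  }

Compile and run it with something like:
  javac MemoryStress.java && java -Xmx4g MemoryStress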
2013/1/22, Sean Hudson <[EMAIL PROTECTED]>:
> Hi Jean-Marc,
> The Linux machine on which I am attempting to get
> Hadoop running is actually Linux in a VM partition. This VM
> partition had 2 GB of RAM when I first encountered the problem. The RAM
> allocation has since been bumped up to 6 GB, but the problem persists:
> bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+' still
> crashes out as before.
> Is there a minimum RAM size requirement?
> Will Hadoop run correctly on Linux in a VM partition?
> I had attempted to run Hadoop in Pseudo-Distributed
> Operation mode and this included modifying the conf/core-site.xml,
> conf/hdfs-site.xml and the conf/mapred-site.xml files as per the Quick Start
> instructions. I also formatted a new distributed-filesystem as per the
> instructions. To re-test in Standalone mode with 6 GB of RAM, I reversed
> the changes to the above three .xml files in /conf. However, I don't see a
> way to back out the distributed filesystem. Will the existence of this
> distributed filesystem interfere with my Standalone tests?
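> For reference, the three Quick Start edits I reversed were the standard
> single-node properties (values as given in the Hadoop 1.0.4 docs):
>
>   conf/core-site.xml:    fs.default.name    = hdfs://localhost:9000
>   conf/hdfs-site.xml:    dfs.replication    = 1
>   conf/mapred-site.xml:  mapred.job.tracker = localhost:9001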
> Sean Hudson
> -----Original Message-----
> From: Jean-Marc Spaggiari
> Sent: Friday, January 18, 2013 3:24 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Problems
> Hi Sean,
> It's strange. You should not be hitting that. I faced the same kind of
> issues on a desktop with memory errors. Can you install memtest86 and fully
> test your memory (one pass is enough) to make sure you don't have
> issues on that side?
> 2013/1/18, Sean Hudson <[EMAIL PROTECTED]>:
>> I downloaded the suggested 1.6.0_32 Java version to my home
>> directory, but I am still experiencing the same problem (see error below).
>> The only thing that I have set in my hadoop-env.sh file is the JAVA_HOME
>> environment variable. I have also tried it with the Java directory added
>> to my PATH:
>> export JAVA_HOME=/home/shu/jre1.6.0_32
>> export PATH=$PATH:/home/shu/jre1.6.0_32
>> Every other environment variable is defaulted.
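>> (The PATH line above omits the JRE's bin subdirectory; the conventional
>> form, if it matters, would be:
>>   export PATH=$PATH:/home/shu/jre1.6.0_32/bin
>> though as far as I know bin/hadoop only consults JAVA_HOME.)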
>> Just to clarify, I have tried this in Local Standalone mode and also in
>> Pseudo-Distributed Mode with the same result.
>> Frustrating to say the least,
>> Sean Hudson
>> shu@meath-nua:~/hadoop-1.0.4> bin/hadoop jar hadoop-examples-1.0.4.jar
>> grep input output 'dfs[a-z.]+'
>> # A fatal error has been detected by the Java Runtime Environment:
>> # SIGFPE (0x8) at pc=0xb7fc51fb, pid=23112, tid=3075554208
>> # JRE version: 6.0_32-b05
>> # Java VM: Java HotSpot(TM) Client VM (20.7-b02 mixed mode, sharing linux-x86)
>> # Problematic frame:
>> # C [ld-linux.so.2+0x91fb] double+0xab
>> # An error report file with more information is saved as:
>> # /home/shu/hadoop-1.0.4/hs_err_pid23112.log
>> # If you would like to submit a bug report, please visit:
>> # http://java.sun.com/webapps/bugreport/crash.jsp
>> # The crash happened outside the Java Virtual Machine in native code.
>> # See problematic frame for where to report the bug.
>> -----Original Message-----
>> From: Leo Leung
>> Sent: Thursday, January 17, 2013 6:46 PM
>> To: [EMAIL PROTECTED]
>> Subject: RE: Problems
>> Use Sun/Oracle 1.6.0_32+; the build should be 20.7-b02+.
>> 1.7 causes failures and, AFAIK, is not supported, but you are free to try
>> the latest version and report back.
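>> To confirm which build you have, run java -version; it should report
>> something like:
>>   java version "1.6.0_32"
>>   Java(TM) SE Runtime Environment (build 1.6.0_32-b05)
>>   Java HotSpot(TM) Client VM (build 20.7-b02, mixed mode, sharing)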
>> -----Original Message-----
>> From: Sean Hudson [mailto:[EMAIL PROTECTED]]
>> Sent: Thursday, January 17, 2013 6:57 AM