Hadoop >> mail # user >> Re: using hadoop on zLinux (Linux on S390)


Re: using hadoop on zLinux (Linux on S390)
Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop?

Answer:
I didn't compile the package; I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html), and there was no mention of compiling the code first.
By the way, I am using the binary version I downloaded from the official download site, which I assume is already compiled.
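For reference, the setup I followed boils down to roughly the following. This is a hedged sketch, not the exact commands from the docs: the install path matches the one in my error output below, but the JAVA_HOME path is an illustrative guess for an IBM JDK on SLES.

```shell
# Sketch of the single-node binary setup (no compilation step involved).
# /opt/flume_hadoop matches the path in the namenode output below;
# the JAVA_HOME value is an assumed location for the IBM s390x JDK.
cd /opt/flume_hadoop
tar -xzf hadoop-1.1.0.tar.gz           # unpack the pre-built binary release
cd hadoop-1.1.0
export JAVA_HOME=/usr/lib64/jvm/java-s390x-60   # also set in conf/hadoop-env.sh
bin/hadoop version                      # sanity check the unpacked binaries
bin/hadoop namenode -format             # then format the namenode
```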

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile
-------- Original Message --------
> Date: Tue, 11 Dec 2012 08:56:24 -0600
> From: Kumar Ravi <[EMAIL PROTECTED]>
> To: [EMAIL PROTECTED]
> Subject: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
>
>  I have a couple of questions for you:
>
> 1. What version and vendor of JDK did you use to compile and package
> hadoop?
>
> 2. What version and vendor of JVM are you running? You can type java
> -version from the console to see this.
>
> Thanks,
> Kumar
>
> Kumar Ravi
> IBM Linux Technology Center
>
>
>
>
> From:
> "Emile Kao" <[EMAIL PROTECTED]>
> To:
> [EMAIL PROTECTED],
> Date:
> 12/11/2012 08:51 AM
> Subject:
> Re: using hadoop on zLinux (Linux on S390)
>
>
> No, this is the general available version...
>
> -------- Original Message --------
> > Date: Tue, 11 Dec 2012 08:31:57 -0600
> > From: Michael Segel <[EMAIL PROTECTED]>
> > To: [EMAIL PROTECTED]
> > Subject: Re: using hadoop on zLinux (Linux on S390)
>
> > Well, on the surface....
> >
> > It looks like its either a missing class, or you don't have your class
> > path set up right.
> >
> > I'm assuming you got this version of Hadoop from IBM, so I would suggest
> > contacting their support and opening up a ticket.
> >
> >
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <[EMAIL PROTECTED]> wrote:
> >
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > >
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes: com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)