HBase dev mailing list: Exception while using HBase trunk with hadoop - 2.0.3


Thread messages:
ramkrishna vasudevan  2013-02-21, 15:24
Ted Yu  2013-02-21, 17:00
ramkrishna vasudevan  2013-02-22, 02:26
ramkrishna vasudevan  2013-02-22, 03:35
Ted Yu  2013-02-21, 15:42
ramkrishna vasudevan  2013-02-21, 15:47
Anoop John  2013-02-21, 15:48
Ted Yu  2013-02-21, 15:54
ramkrishna vasudevan  2013-02-22, 04:42
Ted Yu  2013-02-22, 04:56
Ted Yu  2013-02-22, 05:18
ramkrishna vasudevan  2013-02-22, 05:36
Re: Exception while using HBase trunk with hadoop - 2.0.3
hadoop-2.0.3-alpha is the HDFS version that I am running.

Regards
Ram

On Fri, Feb 22, 2013 at 10:26 AM, Ted Yu <[EMAIL PROTECTED]> wrote:

> This indicates that the hadoop 2.0 that HBase was built with lags behind the
> binary running as the NameNode.
>
> Cheers
>
> On Thu, Feb 21, 2013 at 8:42 PM, ramkrishna vasudevan <[EMAIL PROTECTED]> wrote:
>
> > During this time NN says
> >
> > Incorrect header or version mismatch from 127.0.0.1:34789 got version 7 expected version 8
> >
> > Regards
> > Ram
> >
> >
> > On Thu, Feb 21, 2013 at 8:54 PM, ramkrishna vasudevan <[EMAIL PROTECTED]> wrote:
> >
> > > Hi Devs
> > >
> > > I tried to run HBase current trunk snapshot with Hadoop 2.0.3 alpha.
> > >
> > > I got the following exception:
> > >
> > > java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "ram/10.239.47.144"; destination host is: "localhost":9000;
> > >  at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
> > >  at org.apache.hadoop.ipc.Client.call(Client.java:1168)
> > >  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
> > >  at $Proxy10.setSafeMode(Unknown Source)
> > >  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >  at java.lang.reflect.Method.invoke(Method.java:597)
> > >  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
> > >  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
> > >  at $Proxy10.setSafeMode(Unknown Source)
> > >  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:514)
> > >  at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:1896)
> > >  at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:660)
> > >  at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:261)
> > >  at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:650)
> > >  at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:389)
> > >  at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:147)
> > >  at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:131)
> > >  at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:654)
> > >  at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:476)
> > >  at java.lang.Thread.run(Thread.java:662)
> > > Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
> > >  at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
> > >  at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
> > >  at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
> > >  at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
> > >  at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:886)
> > >  at org.apache.hadoop.ipc.Client$Connection.run(Client.java:817)
> > > 2013-02-20 20:44:01,928 INFO org.apache.hadoop.hbase.master.HMaster: Aborting
> > >
> > > I checked whether something similar had already been raised on the dev list.  Could
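
For anyone hitting the same error: one way to confirm the mismatch Ted describes is to print the Hadoop version that is actually on HBase's classpath and compare it with the version the NameNode reports (for example via the hadoop version command). The snippet below is only a sketch; the class name VersionCheck is made up for illustration, while org.apache.hadoop.util.VersionInfo is the standard Hadoop utility it relies on.

import org.apache.hadoop.util.VersionInfo;

// Minimal sketch: prints the Hadoop version and build revision found on the
// classpath this program is launched with. Run it using HBase's classpath
// (for example, the output of the "hbase classpath" command) and compare the
// result with the version the NameNode is running.
public class VersionCheck {
    public static void main(String[] args) {
        System.out.println("Hadoop version on classpath: " + VersionInfo.getVersion());
        System.out.println("Build revision: " + VersionInfo.getRevision());
    }
}

An older Hadoop client on HBase's classpath would fit the NameNode log quoted above (it got version 7 when it expected version 8); in that case, rebuilding HBase against the same Hadoop release that the NameNode runs is the fix Ted's reply points to.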
Further replies in this thread:
ramkrishna vasudevan  2013-02-22, 04:27
Seth Yang  2013-06-13, 06:05