HBase >> mail # user >> Exception : ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper


Re: Exception : ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper
Your classpath is probably wrong. Run "bin/hbase classpath" and grep
the output for the Hadoop jars. Make sure you only have the ones for
the version you want to use.

J-D
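
A quick way to run that check (a minimal sketch, assuming a standard HBase
layout and that the commands are run from the HBase install directory) is
something like:

    # list every hadoop jar that ends up on HBase's classpath
    $ bin/hbase classpath | tr ':' '\n' | grep hadoop

    # also check which hadoop jars HBase ships in its lib/ directory
    $ ls lib/ | grep hadoop

If jars from more than one Hadoop version show up (for example a bundled
hadoop-core jar that does not match the 1.1.1 install), remove or replace
the extras so only the 1.1.1 jars remain on the classpath.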

On Tue, Feb 26, 2013 at 8:20 AM, tarang dawer <[EMAIL PROTECTED]> wrote:
> Hi
> I am trying to use HBase 0.94.2 with Hadoop 1.1.1, but my MapReduce job
> fails with the following exception:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
>     at org.apache.hadoop.hbase.ipc.HBaseClient.createConnection(HBaseClient.java:281)
>     at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1137)
>     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:1000)
>     at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
>     at $Proxy5.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:183)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:335)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:312)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:364)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:236)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1313)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1269)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1256)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:965)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:860)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:962)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:864)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:821)
>     at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
>     at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
>     at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
>     at org.apache.hadoop.hbase.mapreduce.TableInputFormat.setConf(TableInputFormat.java:96)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1021)
>     at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
>     at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>     at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
>     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)