HBase >> mail # dev >> 0.95 Error in Connecting


Re: 0.95 Error in Connecting
On Mon, Sep 9, 2013 at 6:19 PM, David Williams <[EMAIL PROTECTED]> wrote:

>
> Hi all,
>
> I am working on an API demo that talks to HBase.  Today I upgraded to 0.95
> to get access to the hbase-client 0.95 libraries.
>
The 0.95 poms have not had the cleanup that is in the 0.96.0RC pom.  You
should use the latter.  See this note for the location of the staging
repositories:
http://search-hadoop.com/m/7W1PfyzHy51
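
To pull the RC artifacts, you would add a repositories section to your pom
along these lines (a sketch only: the id and url below are placeholders, not
the real staging repository -- take the actual location from the note linked
above):

```xml
<!-- Placeholder staging repo entry; substitute the real staging URL -->
<repositories>
  <repository>
    <id>apache-hbase-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachehbase-XXXX/</url>
  </repository>
</repositories>
```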

>  I unpacked the 0.95 binaries on my system and started HBase.  I logged
> into the HBase shell, checked status, etc.  Then I added the client libs
> for hadoop 1.2.1 and hbase 0.95 to my pom.xml and ran a unit test which
> checks whether I can read and write a simple test value to a table, which
> I created beforehand.  The output is a stack trace and some timeouts.  The
> ip addresses correspond to my machine on the local network.  It then repeats
> this on the command line.  What should I try next?  My goal is simply to
> programmatically read and write to a local HBase on Mac OS X running in
> pseudo-distributed mode.
>
>
>
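As an aside, pseudo-distributed mode on a single machine usually means an
hbase-site.xml along these lines (a sketch; the HDFS address assumes a local
namenode on port 9000 -- match it to the fs.default.name of your hadoop 1.2.1
install):

```xml
<!-- conf/hbase-site.xml: minimal pseudo-distributed setup (sketch) -->
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
```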
Here is a pom I made testing downstreamers' experience.  It may be of help
(the versions, etc., may have changed since -- be warned):

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="
http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.hbase.downstreamer</groupId>
  <artifactId>hbase-downstreamer</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>hbase-downstreamer</name>
  <url>https://github.com/saintstack/hbase-downstreamer</url>
  <properties>
    <hbase.version>0.95.2-hadoop1-SNAPSHOT</hbase.version>
    <hadoop.version>1.1.2</hadoop.version>
  </properties>
  <dependencies>
    <!--START OF TEST SCOPE-->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <!--hbase-hadoop-compat comes in transitively, but we need the test-jar
        too, and test-jars do not come in transitively.  Ditto for
        hbase-hadoopX-compat.
     -->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-hadoop-compat</artifactId>
      <version>${hbase.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-hadoop1-compat</artifactId>
      <version>${hbase.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
    <!--This has the actual HBaseTestingUtility in it.
      -->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-server</artifactId>
      <version>${hbase.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
    <!--The hbase-server test jar has a dependency on this test jar.
        Seemingly in Maven, test-scoped dependencies are not included
        transitively; at least, that is how I read the table in the
        O'Reilly Maven book and the Sonatype docs on transitive
        includes.  The entry above is a direct include (with test
        scope); this one would only come in as a test-scoped
        transitive, hence the explicit entry.
      -->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-common</artifactId>
      <version>${hbase.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
    <!--We need the hadoop test jar.  It has the mini DFS cluster in it.
      It is not transitively included.
      -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-test</artifactId>
      <version>${hadoop.version}</version>
      <scope>test</scope>
    </dependency>
    <!--We need the hbase-server jar itself (not just its test jar)
      to run hbase servers at test time.
      -->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-server</artifactId>
      <type>jar</type>
      <version>${hbase.version}</version>
      <scope>test</scope>
    </dependency>
    <!--END OF TEST SCOPE-->

    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>${hbase.version}</version>
    </dependency>
  </dependencies>
</project>
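
The kind of simple read/write check described above would look something like
this against the 0.95 client API (a sketch; the table 'test' with family 'cf'
and qualifier 'q' are hypothetical and assumed created beforehand in the
shell, e.g. create 'test', 'cf'):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class SmokeTest {
  public static void main(String[] args) throws Exception {
    // Picks up hbase-site.xml from the classpath; the zookeeper
    // quorum etc. come from there.
    Configuration conf = HBaseConfiguration.create();
    // Table 'test' with family 'cf' assumed created beforehand.
    HTable table = new HTable(conf, "test");
    try {
      // Write one cell...
      Put put = new Put(Bytes.toBytes("row1"));
      put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value1"));
      table.put(put);

      // ...and read it back.
      Result result = table.get(new Get(Bytes.toBytes("row1")));
      byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"));
      System.out.println("Read back: " + Bytes.toString(value));
    } finally {
      table.close();
    }
  }
}
```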

On the exceptions, what Nicolas said.  EOFException usually means some mess
in your filesystem, i.e. files that do not make sense (be warned that if you
are running against the local filesystem, the Hadoop implementation of the
local filesystem does not support sync, so an unclean shutdown can make for
'interesting' state).  Was the hbase.rootdir clean when you started?

St.Ack