Runs in Eclipse but not from jar
I am running HBase 0.94.2 on 6 servers, with Zookeeper 3.4.5 running on 3 of them.  HBase works from its shell and from within Eclipse, but not as a jar file.  When I run within Eclipse, I can verify that it worked by using HBase shell commands (such as scan).

I seem to have 2 separate problems.

Problem 1: when I create a jar file from Eclipse it won't run at all:
ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable -classpath "/home/ngc/hbase-0.94.2/*"
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
      at HBase.CreateBiTable.run(CreateBiTable.java:26)
[line 26 is: Configuration conf = HBaseConfiguration.create();]
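
One thing I am wondering about: I suspect the -classpath after the class name is just handed to my program as an argument rather than to the JVM, so the HBase jars may never make it onto the classpath at all.  If that is the case, I assume the usual fix is something like the following (paths are from my install, and I have not confirmed that "bin/hbase classpath" is the recommended way to do this):

ngc@hadoop1:~/hadoop-1.0.4$ export HADOOP_CLASSPATH=$(/home/ngc/hbase-0.94.2/bin/hbase classpath)
ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase.CreateBiTable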

Problem 2: when I create a "runnable" jar file from Eclipse it communicates with Zookeeper but then dies with:
Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: \ufffd
      [EMAIL PROTECTED],60000,1353949574468
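
My guess (unverified) is that the runnable jar bundles a different HBase or Hadoop version than the one on the cluster, since the garbled value looks like the client could not parse what the master wrote into Zookeeper.  To see what Eclipse actually packaged, I assume something like this would list the bundled HBase jars and classes (the jar name here is just what I exported it as):

ngc@hadoop1:~/eclipse$ unzip -l CreateBiTableRunnable.jar | grep -i hbase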

I'd prefer to use a regular jar (5 KB) rather than a runnable jar (100 MB).  But I assume that if I fix Problem 1 then it will proceed until it crashes with Problem 2.

Thanks in advance for any suggestions --- Alan.

-----------------------------
CLASSPATH
ngc@hadoop1:~/hadoop-1.0.4$ env | grep CLASSPATH
CLASSPATH=/home/ngc/hadoop-1.0.4:/home/ngc/hbase-0.94.2/bin:/home/ngc/zookeeper-3.4.5/bin:/home/ngc/accumulo-1.3.5-incubating
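
I notice these entries are the bin directories rather than the jars themselves.  I assume that to pick up the HBase classes the classpath would need the actual HBase jar, its conf directory, and its lib jars, something along these lines (exact paths from my install, not verified to be sufficient):

export CLASSPATH=$CLASSPATH:/home/ngc/hbase-0.94.2/hbase-0.94.2.jar:/home/ngc/hbase-0.94.2/conf:/home/ngc/hbase-0.94.2/lib/*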

-----------------------------
HBASE PROGRAM
package HBase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class CreateBiTable extends Configured implements Tool {
    public static String TableName = new String("BiIPTable");
    public static String cf = "cf";  // column family
    public static String c1 = "c1";  // column1

    public static void main(String[] args) throws Exception {
        long startTime = System.currentTimeMillis();
        int res = ToolRunner.run(new Configuration(), new CreateBiTable(), args);
        double duration = (System.currentTimeMillis() - startTime) / 1000.0;
        System.out.println(">>>> Job Finished in " + duration + " seconds");
        System.exit(res);
    }

    public int run(String[] arg0) throws Exception {
        Configuration conf = HBaseConfiguration.create();
//      System.out.println("Configuration created");
        System.out.println("\t" + conf.toString());
        HBaseAdmin admin = new HBaseAdmin(conf);
//      System.out.println("\t" + admin.toString());
        if (admin.tableExists(TableName)) {
            // Disable and delete the table if it exists
            admin.disableTable(TableName);
            admin.deleteTable(TableName);
            System.out.println(TableName + " exists so deleted");
        }
        // Create table
        HTableDescriptor htd = new HTableDescriptor(TableName);
        HColumnDescriptor hcd = new HColumnDescriptor(cf);
        htd.addFamily(hcd);
        admin.createTable(htd);
        System.out.println("Table created: " + htd);
        // Does the table exist now?
        if (admin.tableExists(TableName))
            System.out.println(TableName + " creation succeeded");
        else
            System.out.println(TableName + " creation failed");
        return 0;
    }
}

-----------------------------
OUTPUT FROM RUNNING WITHIN ECLIPSE
            Configuration: core-default.xml, core-site.xml, hbase-default.xml, hbase-site.xml
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ngc/hadoop-1.0.4/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:host.name=hadoop1.aj.c2fse.northgrum.com
12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_25
12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.home=/home/ngc/jdk1.6.0_25/jre
12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/ngc/AlanSpace/HadoopPrograms/bin:/home/ngc/hadoop-1.0.4/hadoop-core-1.0.4.jar:/home/ngc/zookeeper-3.4.5/zookeeper-3.4.5.jar:/home/ngc/JavaLibraries/Jama/Jama-1.0.2.jar:/home/ngc/AlansOpenCVStuff/core.jar:/home/ngc/OpenCV-2.2.0/javacv/javacpp.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-linux-x86.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-linux-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-macosx-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-windows-x86.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-windows-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv.jar:/home/ngc/OpenCV-2.2.0/lib:/home/ngc/javafaces/lib/colt.jar:/home/ngc/AlansOpenCVStuff/commons-math3-3.0/commons-math3-3.0.jar:/home/ngc/AlansOpenCVStuff/commons-math3-3.0/commons-math3-3.0-javadoc.jar:/home/ngc/Downloads/jtransforms-2.4.jar:/home/ngc/mahout-distribution-0.7/mahout-core-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-core-0.7-job.jar:/home/ngc/mahout-distribution-0.7/mahout-integration-0.7.jar:/home/ngc/hbase-0.94.2/hbase-0.94.2.jar:/home/ngc/mahout-distribution-0.7/mahout-math-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7-job.jar:/home