HBase >> mail # user >> org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: No matching handler for protocol MyTestProtocol in region


Re: org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: No matching handler for protocol MyTestProtocol in region
You used the createTable() API, but the log said:

Table already exists

On Jul 12, 2013, at 2:22 AM, ch huang <[EMAIL PROTECTED]> wrote:

> hi, all:
>         I have spent all day on this problem and am now totally exhausted;
> I hope someone can help me.
>
> I coded my own endpoint; the logic is simple: run a scan over some region
> with a filter and count the matching records.
> I do not want my endpoint to run on every region; I just need it to work on
> my test table's regions. I compiled and packed MyTestProtocol and
> MyTestEndpoint into a jar,
> put the jar into HDFS, wrote that info into the HTableDescriptor, and used
> it to create the test table.
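[A note on the registration scheme described above: in the 0.94-era API the coprocessor table attribute is a pipe-delimited string, roughly "jar-path|class-name|priority". A minimal, self-contained sketch of building and splitting such a spec string (the class name and priority value here are illustrative placeholders, not taken from the post):

```java
public class CoprocessorSpec {
    // Assumption: the COPROCESSOR$1 attribute value is a pipe-delimited
    // triple "jarPath|className|priority", split on '|' at load time.
    public static String build(String jarPath, String className, int priority) {
        return jarPath + "|" + className + "|" + priority;
    }

    public static String[] parse(String spec) {
        // '|' is a regex metacharacter, so it must be escaped for split()
        return spec.split("\\|");
    }

    public static void main(String[] args) {
        String spec = build("hdfs://192.168.10.22:9000/alex/test.jar",
                            "MyTestEndpoint", 1001);
        String[] parts = parse(spec);
        System.out.println(parts[1]); // prints "MyTestEndpoint"
    }
}
```

One implication: the class-name field must name the endpoint implementation that the region server should load, which is worth double-checking against the code below. -ed.]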
>
> My testing code:
>
> import java.io.IOException;
> import java.util.Map;
>
> import org.apache.commons.net.bsd.RExecClient;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hbase.Coprocessor.*;
> import org.apache.hadoop.hbase.Coprocessor;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.HColumnDescriptor;
> import org.apache.hadoop.hbase.HTableDescriptor;
> import org.apache.hadoop.hbase.client.HBaseAdmin;
> import org.apache.hadoop.hbase.client.HTable;
> import org.apache.hadoop.hbase.client.Put;
> import org.apache.hadoop.hbase.client.Scan;
> import org.apache.hadoop.hbase.client.coprocessor.Batch;
> import org.apache.hadoop.hbase.filter.CompareFilter;
> import org.apache.hadoop.hbase.filter.Filter;
> import org.apache.hadoop.hbase.filter.RegexStringComparator;
> import org.apache.hadoop.hbase.filter.ValueFilter;
> import org.apache.hadoop.hbase.util.Bytes;
>
> public class TestMyCo {
> /**
>  * @param args
>  */
> public static void main(String[] args) throws IOException {
>  // TODO Auto-generated method stub
>  Configuration conf = HBaseConfiguration.create();
>  conf.addResource("hbase-site.xml");
>  String tableName = "mytest";
>
>  HBaseAdmin admin = new HBaseAdmin(conf);
>       if (admin.tableExists(tableName)) {
>             System.out.println("table already exists! drop it\n");
>             admin.disableTable(tableName);
>             admin.deleteTable(tableName);
>         }
>
>  final Scan scan = new Scan();
>  scan.addColumn("myfl".getBytes(), "myqf".getBytes());
>  final Filter filter = new ValueFilter(CompareFilter.CompareOp.EQUAL,new
> RegexStringComparator(".*\\.5"));
>  HTableDescriptor htd = new HTableDescriptor();
>  HColumnDescriptor hcd = new HColumnDescriptor("myfl".getBytes());
>  htd.addFamily(hcd);
>  htd.setName(tableName.getBytes());
>  Path path = new Path("hdfs://192.168.10.22:9000/alex/test.jar");
>  System.out.println(": " + path.toString() + "|"
>    + TestMyCo.class.getCanonicalName() + "|" + Coprocessor.PRIORITY_USER);
>
>  htd.setValue("COPROCESSOR$1", path.toString()+"|"
>    + TestMyCo.class.getCanonicalName()+"|"+Coprocessor.PRIORITY_USER);
>
>       admin.createTable(htd);
>       HTable table = new HTable(conf,tableName);
>       Put put = new Put(Bytes.toBytes("row1"));
>       put.add("myfl".getBytes(), "myqf".getBytes(), "myv.5".getBytes());
>       table.put(put);
>  try{
>        Map<byte[],Long> results = table.coprocessorExec(MyTestProtocol.class, null, null,
>          new Batch.Call<MyTestProtocol, Long>() {
>                  public Long call(MyTestProtocol mycheck) throws
> IOException {
>                   return mycheck.myFilter(scan, filter);
>                  }
>    });
>        for(Map.Entry<byte[], Long> entry : results.entrySet()){
>         System.out.println("find : " + Bytes.toString(entry.getKey())
> + " : " + entry.getValue());
>        }
>
>     }catch(Throwable throwable){
>      throwable.printStackTrace();
>     }
> }
> }
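[A side note on the filter in the code above: RegexStringComparator uses standard java.util.regex semantics, so the pattern ".*\\.5" matches any value ending in a literal ".5" — which the inserted value "myv.5" does. The same behavior can be checked with plain Java, independent of HBase:

```java
import java.util.regex.Pattern;

public class RegexCheck {
    public static void main(String[] args) {
        // Same pattern string as the RegexStringComparator in the post
        Pattern p = Pattern.compile(".*\\.5");
        System.out.println(p.matcher("myv.5").matches()); // true: ends in literal ".5"
        System.out.println(p.matcher("myv25").matches()); // false: '2' is not a literal dot
    }
}
```

So if the endpoint were reached, the filter itself should count row1; the exception indicates the coprocessor was never loaded for the region, not a filter problem. -ed.]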
>
> HBase error info:
>
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
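[Incidentally, the log4j warnings at the top of the quoted output only mean no log4j configuration was found on the client classpath; they are unrelated to the coprocessor error. A generic log4j 1.2 console configuration (a standard sketch, not specific to this setup) silences them:

```properties
# log4j.properties - minimal console logging for the client classpath
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} - %m%n
```

-ed.]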