HBase >> mail # user >> Is "synchronized" required?


Thread so far:
Bing Li, 2013-02-04 20:20
Harsh J, 2013-02-04 20:21
Ted Yu, 2013-02-04 20:25
Bing Li, 2013-02-04 20:32
Haijia Zhou, 2013-02-04 20:42
Adrien Mogenet, 2013-02-04 21:13
Nicolas Liochon, 2013-02-04 21:31
Bing Li, 2013-02-04 22:40
Nicolas Liochon, 2013-02-04 22:49
Bing Li, 2013-02-05 16:54
Re: Is "synchronized" required?
Are you sharing this.rankTable between threads? HTable is not thread safe.

-- Lars
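
[Editor's note: since HTable is not thread-safe, the usual alternatives to "synchronized" are one HTable per thread (e.g. held in a ThreadLocal) or an HTablePool. A minimal sketch of the per-thread pattern follows; FakeTable is a hypothetical stand-in for HTable, not an HBase class, so the sketch runs without an HBase cluster.]

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class PerThreadTable {
    // Stand-in for a non-thread-safe client object such as HTable.
    static class FakeTable {
        static final AtomicInteger created = new AtomicInteger();
        FakeTable() { created.incrementAndGet(); }
        int scanOnce() { return 1; } // pretend this issues a scan
    }

    // Each thread lazily gets its own instance; no sharing, so no
    // "synchronized" is needed around table operations.
    static final ThreadLocal<FakeTable> TABLE =
            ThreadLocal.withInitial(FakeTable::new);

    // Runs nTasks scan tasks on nThreads threads and reports how many
    // table instances were created (at most one per worker thread).
    public static int runWorkers(int nThreads, int nTasks) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        for (int i = 0; i < nTasks; i++) {
            pool.submit(() -> TABLE.get().scanOnce());
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return FakeTable.created.get();
    }
}
```

The point of the sketch: 100 tasks on 4 threads still create at most 4 instances, so per-thread construction is cheap relative to sharing one unsafe instance.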

________________________________
 From: Bing Li <[EMAIL PROTECTED]>
To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>; user <[EMAIL PROTECTED]>
Sent: Tuesday, February 5, 2013 8:54 AM
Subject: Re: Is "synchronized" required?
 
Dear all,

After removing "synchronized" from the writing method, I get the
following exceptions when reading. Before the removal, there were no
such exceptions.

Could you help me figure out how to solve this?

Thanks so much!

Best wishes,
Bing

     [java] Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.ipc.HBaseClient$Connection run
     [java] WARNING: Unexpected exception receiving call responses
     [java] java.lang.NullPointerException
     [java]     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
     [java]     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
     [java] Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.client.ScannerCallable close
     [java] WARNING: Ignore, probably already closed
     [java] java.io.IOException: Call to greatfreeweb/127.0.1.1:60020 failed on local exception: java.io.IOException: Unexpected exception receiving call responses
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:934)
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:903)
     [java]     at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
     [java]     at $Proxy6.close(Unknown Source)
     [java]     at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:112)
     [java]     at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:74)
     [java]     at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:39)
     [java]     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1325)
     [java]     at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:1167)
     [java]     at org.apache.hadoop.hbase.client.HTable$ClientScanner.next(HTable.java:1296)
     [java]     at org.apache.hadoop.hbase.client.HTable$ClientScanner$1.hasNext(HTable.java:1356)
     [java]     at com.greatfree.hbase.rank.NodeRankRetriever.LoadNodeGroupNodeRankRowKeys(NodeRankRetriever.java:348)
     [java]     at com.greatfree.ranking.PersistNodeGroupNodeRanksThread.run(PersistNodeGroupNodeRanksThread.java:29)
     [java]     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
     [java]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
     [java]     at java.lang.Thread.run(Thread.java:662)
     [java] Caused by: java.io.IOException: Unexpected exception receiving call responses
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:509)
     [java] Caused by: java.lang.NullPointerException
     [java]     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
     [java]     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
     [java]     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
The code that causes the exceptions is as follows.

        public Set<String> LoadNodeGroupNodeRankRowKeys(String hostNodeKey, String groupKey, int timingScale)
        {
                List<Filter> nodeGroupFilterList = new ArrayList<Filter>();

                SingleColumnValueFilter hostNodeKeyFilter = new SingleColumnValueFilter(
                        RankStructure.NODE_GROUP_NODE_RANK_FAMILY,
                        RankStructure.NODE_GROUP_NODE_RANK_HOST_NODE_KEY_COLUMN,
                        CompareFilter.CompareOp.EQUAL, new SubstringComparator(hostNodeKey));
                hostNodeKeyFilter.setFilterIfMissing(true);
                nodeGroupFilterList.add(hostNodeKeyFilter);

                SingleColumnValueFilter groupKeyFilter = new SingleColumnValueFilter(
                        RankStructure.NODE_GROUP_NODE_RANK_FAMILY,
                        RankStructure.NODE_GROUP_NODE_RANK_GROUP_KEY_COLUMN,
                        CompareFilter.CompareOp.EQUAL, new SubstringComparator(groupKey));
                groupKeyFilter.setFilterIfMissing(true);
                nodeGroupFilterList.add(groupKeyFilter);

                SingleColumnValueFilter timingScaleFilter = new SingleColumnValueFilter(
                        RankStructure.NODE_GROUP_NODE_RANK_FAMILY,
                        RankStructure.NODE_GROUP_NODE_RANK_TIMING_SCALE_COLUMN,
                        CompareFilter.CompareOp.EQUAL, new BinaryComparator(Bytes.toBytes(timingScale)));
                timingScaleFilter.setFilterIfMissing(true);
                nodeGroupFilterList.add(timingScaleFilter);

                FilterList nodeGroupFilter = new FilterList(nodeGroupFilterList);
                Scan scan = new Scan();
                scan.setFilter(nodeGroupFilter);
                scan.setCaching(Parameters.CACHING_SIZE);
                scan.setBatch(Parameters.BATCHING_SIZE);

                Set<String> rowKeySet = Sets.newHashSet();
                try
                {
                        ResultScanner scanner = this.rankTable.getScanner(scan);
                        for (Result result : scanner)          // <---- EXCEPTIONS are raised at this line.
                        {
                                for (KeyValue kv : result.raw())
                                {
                                        rowKeySet.add(Bytes.toString(kv.getRow()));
                                        break;
                                }
                        }
                        scanner.close();
                }
                catch (IOException e)
                {
                        e.printStackTrace();
                }
                return rowKeySet;
        }
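
[Editor's note: a secondary issue in the snippet above is that scanner.close() sits on the success path only; if next() throws, as in the trace, the scanner leaks. Closing in a finally block avoids that. A sketch of the pattern follows; FakeScanner is a hypothetical stand-in for ResultScanner, not an HBase class, so the sketch runs without a cluster.]

```java
import java.util.Iterator;
import java.util.List;

public class ScannerCleanup {
    // Stand-in for ResultScanner: iterable rows that must be closed.
    static class FakeScanner implements Iterable<String> {
        final List<String> rows;
        final boolean failMidway;
        boolean closed = false;

        FakeScanner(List<String> rows, boolean failMidway) {
            this.rows = rows;
            this.failMidway = failMidway;
        }

        public Iterator<String> iterator() {
            final Iterator<String> it = rows.iterator();
            return new Iterator<String>() {
                public boolean hasNext() { return it.hasNext(); }
                public String next() {
                    // Simulates the RPC-level failure seen in the trace.
                    if (failMidway) throw new RuntimeException("connection lost");
                    return it.next();
                }
            };
        }

        void close() { closed = true; }
    }

    // close() runs in finally, so a mid-scan exception cannot leak the scanner.
    static boolean scanSafely(FakeScanner scanner) {
        try {
            for (String row : scanner) {
                // consume the row here
            }
            return true;
        } catch (RuntimeException e) {
            return false; // scan failed, but cleanup still happens below
        } finally {
            scanner.close();
        }
    }
}
```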
Later replies:
Bing Li, 2013-02-07 08:10
lars hofhansl, 2013-02-07 17:24
Bing Li, 2013-02-06 06:36
Adrien Mogenet, 2013-02-06 07:45
lars hofhansl, 2013-02-06 07:44
Bing Li, 2013-02-06 10:31
lars hofhansl, 2013-02-06 18:54