
HBase >> mail # user >> errors after upgrade


Re: errors after upgrade
No custom code. And I did enable RPC logging to see what might be
wrong, but nothing is showing that would be considered an error.  It's
happening on two of our clusters that run different functions: one
uses Thrift, and the other REST.  From the looks of the stack trace,
it seems like a low-level Java error.   One of our clusters runs scans
without filters, and the other does just PUTs and GETs; both get the errors.
We can add some more debug code into the source and try it if you can
suggest a way to produce more debugging info.

Thanks.

-jack

On Mon, Nov 14, 2011 at 2:27 PM, Stack <[EMAIL PROTECTED]> wrote:
> On Wed, Nov 9, 2011 at 7:24 PM, Jack Levin <[EMAIL PROTECTED]> wrote:
>> Hey guys, I am getting those errors after moving into 0.90.4:
>>
>
> You have custom code on the server side, Jack?  A filter or something?
>
> You could turn on RPC logging.  It could give you more clues on what
> is messing up.  You could turn it on on a single node in the UI w/o
> having to restart the node; see the 'Log Level' servlet... it's along the
> top of the UI.  Set the class
> log4j.logger.org.apache.hadoop.ipc.HBaseServer to DEBUG level.  It'll
> spew a bunch of logs and hopefully you can see what's off.  You can
> disable it again similarly.
>
> St.Ack
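
The logging change Stack describes can be made two ways; a minimal sketch, assuming the stock region-server web UI port for the 0.90.x line (60030) and a placeholder hostname:

```properties
# One node, no restart: the 'Log Level' servlet along the top of the web UI
# accepts the logger name and level as query parameters, e.g.
#   http://regionserver.example.com:60030/logLevel?log=org.apache.hadoop.ipc.HBaseServer&level=DEBUG
# Persistent alternative: add this line to conf/log4j.properties and restart the node.
log4j.logger.org.apache.hadoop.ipc.HBaseServer=DEBUG
```

Setting the same logger back to INFO through the servlet turns the extra output off again.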
>
>> 2011-11-09 19:22:51,220 ERROR
>> org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
>> java.io.EOFException
>>        at java.io.DataInputStream.readInt(DataInputStream.java:375)
>>        at org.apache.hadoop.hbase.client.Get.readFields(Get.java:377)
>>        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
>>        at org.apache.hadoop.hbase.ipc.HBaseRPC$Invocation.readFields(HBaseRPC.java:127)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:978)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:946)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:522)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:316)
>>        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>        at java.lang.Thread.run(Thread.java:662)
>> 2011-11-09 19:22:51,220 WARN org.apache.hadoop.ipc.HBaseServer: IPC
>> Server listener on 60020: readAndProcess threw exception
>> java.io.IOException: Error in readFields. Count of bytes read: 0
>> java.io.IOException: Error in readFields
>>        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:524)
>>        at org.apache.hadoop.hbase.ipc.HBaseRPC$Invocation.readFields(HBaseRPC.java:127)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:978)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:946)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:522)
>>        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:316)
>>        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>        at java.lang.Thread.run(Thread.java:662)
>> Caused by: java.io.EOFException
>>        at java.io.DataInputStream.readInt(DataInputStream.java:375)
>>        at org.apache.hadoop.hbase.client.Get.readFields(Get.java:377)
>>        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
>>        ... 8 more
>>
>>
>> I would be really sad if this were a case of reading a row and
>> getting zero bytes.  Perhaps it's an exception for a query when a row
>> does not exist?
>>
>> -Jack
>>
>
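
The EOFException at the top of the trace is the generic java.io behavior when DataInputStream tries to read a fixed-width field from a stream that has already ended, which points at a truncated or half-written RPC payload rather than at a missing row (a Get on a nonexistent row normally comes back as an empty Result, not an EOF). A standalone sketch of the mechanism, with an illustrative class name not taken from the thread:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

// DataInputStream.readInt() needs 4 bytes; if the underlying stream ends
// first, it throws EOFException -- the same exception Get.readFields()
// surfaces in the trace when the incoming bytes run out mid-field.
public class TruncatedFieldDemo {
    public static void main(String[] args) throws IOException {
        byte[] truncated = {0x00, 0x01};  // only 2 of the 4 bytes an int needs
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(truncated));
        try {
            in.readInt();
            System.out.println("read succeeded");
        } catch (EOFException e) {
            // prints "EOFException: stream ended mid-field"
            System.out.println("EOFException: stream ended mid-field");
        }
    }
}
```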