Re: Avro RPC: Python to Java isn't working for me...
Hi Atin,

Thanks for the response. Yes, I understand I could use HTTPServer on the
Java side and things would work. However, I'm after a solution where the
Java side can still use the NettyServer.
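
For concreteness, the Java side I'd like to keep looks roughly like this
(a sketch only, with Mail and MailImpl standing in for the interface
generated from the quickstart schema and my implementation of it):

    import java.net.InetSocketAddress;

    import org.apache.avro.ipc.NettyServer;
    import org.apache.avro.ipc.Server;
    import org.apache.avro.ipc.specific.SpecificResponder;

    public class MailNettyServer {
      public static void main(String[] args) throws Exception {
        // Wrap the implementation of the generated Mail interface in a
        // responder and expose it over the Netty transport.
        Server server = new NettyServer(
            new SpecificResponder(Mail.class, new MailImpl()),
            new InetSocketAddress(65111));
        server.start();
        System.out.println("Netty server listening on port " + server.getPort());
        Thread.currentThread().join();  // block so the server keeps running
      }
    }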

Cheers,

Stefan
On Wed, May 22, 2013 at 4:11 AM, Atin Sood <[EMAIL PROTECTED]> wrote:

> You can try looking into something that I wrote as an example
>
>
> https://github.com/atinsood/HESDataAnalyticsFinalProject/tree/master/javaXPython
>
> https://github.com/atinsood/HESDataAnalyticsFinalProject#javaxpython
>
> --
> Atin Sood
> Sent with Sparrow <http://www.sparrowmailapp.com/?sig>
>
> On Tuesday, May 21, 2013 at 11:18 PM, Stefan Krawczyk wrote:
>
> Hi,
>
> I am trying to use Avro RPC to have a Python client talk to a Java
> server, using the avro-rpc-quickstart <https://github.com/phunt/avro-rpc-quickstart>
> on GitHub as a base (I made sure the Avro version being pulled in was 1.7.4).
> However, when I get my Python client to talk to the Java server, I see this
> error:
>
> 2013-05-20 19:38:32,512 (pool-5-thread-2) [WARN - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.exceptionCaught(NettyServer.java:201)] Unexpected exception from downstream.
> org.apache.avro.AvroRuntimeException: Excessively large list allocation request detected: 539959368 items! Connection closed.
>     at org.apache.avro.ipc.NettyTransportCodec$NettyFrameDecoder.decodePackHeader(NettyTransportCodec.java:167)
>     at org.apache.avro.ipc.NettyTransportCodec$NettyFrameDecoder.decode(NettyTransportCodec.java:139)
>     at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:286)
>     at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:208)
>     at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
>     at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
>     at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:94)
>     at org.jboss.netty.channel.socket.nio.AbstractNioWorker.processSelectedKeys(AbstractNioWorker.java:364)
>     at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:238)
>     at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:38)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:722)
>
> From digging around on the web, I understand this is a NettyTransceiver
> issue: the Python client isn't using it, because it uses the
> HTTPTransceiver instead.
>
> I was wondering what my options are for moving forward, other than
> getting the Java server to use the HTTP transport (HttpServer) instead?
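>
> (For reference, the HTTP fallback I'm referring to would look roughly like
> this on the Java side, with Mail and MailImpl standing in for the
> quickstart-generated interface and my implementation of it; a sketch only,
> inside a main method with the obvious imports:
>
>     Server server = new HttpServer(
>         new SpecificResponder(Mail.class, new MailImpl()), 65111);
>     server.start();  // the Python HTTPTransceiver can then send requests here
>
> where HttpServer is org.apache.avro.ipc.HttpServer. I'd rather avoid this
> and keep NettyServer.)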
>
> Apologies if I have overlooked something that points out what I can do.
>
> Cheers,
>
> Stefan
>
>
>