Flume >> mail # user >> Sinking data to a Node.js server/listener


Re: Sinking data to a Node.js server/listener
Should we assume that structured data is being transferred through Flume, and
that's why Avro is mentioned? One easy solution I can see would be to use a
syslog output that points to your Node.js instance (rather than a real syslog
endpoint).

Regards

Bertrand

On Sun, Feb 17, 2013 at 5:50 AM, Erik Bertrand <[EMAIL PROTECTED]> wrote:

> I'm looking to get Flume data transferred to a Node.js server listening on
> a specific port.  I'm not interested in actually storing the data anywhere;
> the server simply uses it to display "current activity" and doesn't need to
> persist it.  I'm transferring a very small amount of data - just a series
> of IP addresses, actually.  So I'd like to keep the architecture simple,
> too.
>
> At first I was thinking I could use the Avro sink to send the data
> directly to Node.js using a dnode <https://npmjs.org/package/dnode> server
> object (i.e. RPC), but there seems to be more to it than the basic setup.
>  I'm just not sure how to configure the Node.js side to understand the Avro
> sink RPC request (or if that's even possible).
>
> I've been looking at creating a custom sink to do this; I've not written
> one before, much less written anything in Java, so that'd be new to me.
>  Any pointers?
>
> Erik
>
>
--
Bertrand Dechoux
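Erik's requirement above - an in-memory "current activity" view with no persistence - could be sketched on the Node.js side like this. The function name, capacity, and eviction policy are illustrative assumptions, not from the thread:

```javascript
// Hypothetical helper: keep only the most recently seen IPs in memory
// for a "current activity" display, with no persistence at all.
function createActivityTracker(maxEntries) {
  const seen = []; // ordered oldest -> newest
  return {
    // Record an IP, moving it to the newest position if already present,
    // and evicting the oldest entry once maxEntries is exceeded.
    record(ip) {
      const i = seen.indexOf(ip);
      if (i !== -1) seen.splice(i, 1);
      seen.push(ip);
      if (seen.length > maxEntries) seen.shift();
    },
    // Return the current activity, newest first.
    current() {
      return seen.slice().reverse();
    },
  };
}
```

A listener callback (whatever the transport ends up being) would just call `tracker.record(ip)`, and the UI would poll `tracker.current()`.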