Hi,

I was wondering if anyone could assist me with this issue. I am looking to
create a customized HBase event serializer very similar to the splitting
event serializer described in this blog post:
https://blogs.apache.org/flume/entry/streaming_data_into_apache_hbase

I am very familiar with the concept the source code illustrates; what I
would like to know is:

1) How do I develop my own custom HBase event serializer? Which libraries
would I have to extend/import? The source code does not show which package
it belongs to or which dependencies are needed to compile it successfully.

2) Where would I place my deliverable class files?
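For question 2, my current guess is based on the plugins.d convention I have seen mentioned in the Flume user guide; the paths and jar name below are placeholders for my setup, so please tell me if this layout is wrong:

```shell
# Guess at the plugins.d layout for a custom component
# (FLUME_HOME and the jar name are stand-ins, not my real paths):
FLUME_HOME=/tmp/flume-demo
PLUGIN_DIR="$FLUME_HOME/plugins.d/hbase-splitting-serializer"

mkdir -p "$PLUGIN_DIR/lib"
# This jar would contain my compiled serializer class.
touch "$PLUGIN_DIR/lib/my-splitting-serializer.jar"

ls "$PLUGIN_DIR/lib"
```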

3) Do I need to change any environment variables, such as $CLASSPATH, for
my Flume agents to work with my custom serializer?
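For question 3, in case I skip the plugins.d route, my guess is that the jar would go on the classpath via conf/flume-env.sh, and that the agent config would then reference the serializer by its fully qualified class name (e.g. a line like `a1.sinks.k1.serializer = com.example.flume.SplittingSerializer`, where that class name is hypothetical). Is something like this what is expected?

```shell
# conf/flume-env.sh -- my guess at the classpath tweak; the jar path
# is a placeholder for wherever I deploy my serializer jar:
FLUME_CLASSPATH="/opt/flume/custom/my-splitting-serializer.jar"
export FLUME_CLASSPATH
echo "$FLUME_CLASSPATH"
```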

If anyone has prior experience creating this sort of splitting event
serializer, I would really appreciate your assistance.

As I said, I understand what the code is trying to achieve; what I am
looking for is guidance on HOW to integrate it into my existing Flume
environment.

Thanks in advance.
