Re: HDFS escape sequence and use of timestamp header
On Wed, Aug 22, 2012 at 2:16 PM, Kathleen Ting <[EMAIL PROTECTED]> wrote:

> Hi Mohit,
>
> You can configure a timestamp interceptor onto your source as follows:
>
> agent.sources.src-0.interceptors.ts.type = TIMESTAMP
>
> Once the timestamp interceptor is in place, you can use, in sinks, the
> timestamp it writes. Here is an example of using the timestamp in an
> HDFS sink:
>
> agent.sinks.sink-0.hdfs.filePrefix = FlumeData.%Y-%m-%d
>
> Here is more info about the timestamp interceptor:
> http://flume.apache.org/FlumeUserGuide.html#timestamp-interceptor
>
Thanks, this helps. The type specified in the example is a fully qualified
class name, but the table says to use TIMESTAMP as the type. I guess the
example is not correct and I should just use TIMESTAMP.
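
For reference, putting the two pieces together, here is a minimal sketch of the
relevant configuration (assuming the same agent / src-0 / sink-0 / ts names used
above; the interceptors list line is needed so the named interceptor is actually
attached, and the hdfs.path value is only an illustrative placeholder):

    # Attach the timestamp interceptor to the source
    agent.sources.src-0.interceptors = ts
    agent.sources.src-0.interceptors.ts.type = TIMESTAMP

    # Use the timestamp header in the HDFS sink's escape sequences
    # (the path below is just a placeholder)
    agent.sinks.sink-0.hdfs.path = hdfs://namenode/flume/events/%Y-%m-%d
    agent.sinks.sink-0.hdfs.filePrefix = FlumeData.%Y-%m-%d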

> Regards, Kathleen
>
> On Wed, Aug 22, 2012 at 11:50 AM, Mohit Anchlia <[EMAIL PROTECTED]> wrote:
> > I see this JIRA https://issues.apache.org/jira/browse/FLUME-1215 but how do
> > I take advantage of it? I am using 1.2.0 but %Y %m doesn't work. I just get
> > number format exceptions.
> >
> >
> > On Tue, Aug 21, 2012 at 5:52 PM, Mohit Anchlia <[EMAIL PROTECTED]>
> > wrote:
> >>
> >> I am using flume-ng 1.2.0 and I need to use the %Y%M%D escape sequences. Do I
> >> need to write a custom interceptor? Could you please point me to an
> >> example? Currently my AvroClient looks like this:
> >>
> >>
> >> public void sendDataToFlume(String data) {
> >>     // Create the Flume event object from the payload
> >>     Event event = EventBuilder.withBody(data, Charset.forName("UTF-8"));
> >>
> >>     // Attach a "host" header to the event
> >>     Map<String, String> headers = new HashMap<String, String>();
> >>     headers.put("host", hostName);
> >>     event.setHeaders(headers);
> >>
> >>     try {
> >>         rpcClient.append(event);
> >>     } catch (EventDeliveryException e) {
> >>         // Re-establish the connection if delivery fails
> >>         connect();
> >>     }
> >> }
> >
> >
>
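
As an alternative to the interceptor: the HDFS sink escape sequences are resolved
from a "timestamp" header holding epoch milliseconds, and the number format
exceptions usually indicate that this header is missing. Here is a minimal sketch
of setting the header in the client itself (assuming the same hostName, rpcClient,
and connect() members as in the snippet above, inside the same client class):

    // Imports used by the snippet
    import java.nio.charset.Charset;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.flume.Event;
    import org.apache.flume.EventDeliveryException;
    import org.apache.flume.event.EventBuilder;

    public void sendDataToFlume(String data) {
        Event event = EventBuilder.withBody(data, Charset.forName("UTF-8"));

        Map<String, String> headers = new HashMap<String, String>();
        headers.put("host", hostName);
        // Epoch milliseconds as a string -- this is what %Y/%m/%d are parsed from
        headers.put("timestamp", Long.toString(System.currentTimeMillis()));
        event.setHeaders(headers);

        try {
            rpcClient.append(event);
        } catch (EventDeliveryException e) {
            // Re-establish the connection if delivery fails
            connect();
        }
    }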