


Re: About Mine Flume Demand
hi Yanzhi,
    May I guess that your requirement is "sending the original files to HDFS,
partitioned by path"? If my understanding is correct, a possible solution
looks like this:
    When you get an Event from the source (the log files), you can set the
log file name, or another identifier such as the folder, in the Event
header. The HDFS path can then be set with a pattern like
'/topfolder/${your log file folder/name}/', so that logs from different
files end up in different HDFS files.
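    For example, here is a minimal agent configuration sketch. The agent,
source, channel, sink, and header names below are placeholders, not from
this thread; the HDFS sink replaces the %{logFileName} escape with the
value of that event header:

  # Hypothetical agent 'agent1'; component names are placeholders.
  agent1.sources = logsrc
  agent1.channels = ch1
  agent1.sinks = hdfssink

  # The (custom) source must put the log file name into the
  # 'logFileName' event header.
  agent1.sources.logsrc.channels = ch1
  agent1.channels.ch1.type = memory

  agent1.sinks.hdfssink.type = hdfs
  agent1.sinks.hdfssink.channel = ch1
  # %{logFileName} is replaced by the value of that event header, so
  # events from different log files land in different HDFS directories.
  agent1.sinks.hdfssink.hdfs.path = hdfs://namenode:8020/topfolder/%{logFileName}
  agent1.sinks.hdfssink.hdfs.fileType = DataStream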
    Would that satisfy your requirement?

-Regards
Denny Ye

2012/8/2 荆棘鸟 <[EMAIL PROTECTED]>

>   Hello everyone,
>   I am going to write a custom Flume source, but I have a small problem. I
> want to monitor a source path, but that path contains many different log
> files. I would like to merge log files of the same kind, while not merging
> different log files, and finally, through the sink's configuration, send
> these log files to HDFS. I would like to discuss a good approach to this
> requirement with everyone.
> Thanks very much!
> My name: Yanzhi Liu.
>
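
For the custom source Yanzhi describes, a minimal Java sketch of attaching
the source file name to each event; the class and header name are
hypothetical, and the file-monitoring logic and error handling are omitted:

  import java.nio.charset.StandardCharsets;
  import java.util.HashMap;
  import java.util.Map;

  import org.apache.flume.Event;
  import org.apache.flume.event.EventBuilder;

  public class LogFileEventFactory {
      // Hypothetical helper: wrap one log line in a Flume Event and record
      // which log file it came from in the 'logFileName' header, matching
      // the %{logFileName} escape used in the HDFS sink path above.
      public static Event toEvent(String logLine, String logFileName) {
          Map<String, String> headers = new HashMap<>();
          headers.put("logFileName", logFileName);
          return EventBuilder.withBody(logLine, StandardCharsets.UTF_8, headers);
      }
  }

Inside a custom source extending AbstractSource, such events would then be
handed to the channel via getChannelProcessor().processEvent(event).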