

Re: using tail to HDFS sink
Most people in the US are asleep at this time so I wouldn't expect a
fast response. People will respond when they have time and are able to.

As to your problem: as I said earlier, unless you've changed your
config since then, you don't have a host header defined. The logs don't
seem to show anything wrong, so I suspect it may be writing data to
HDFS without replacing %{host} with your actual host. Remove the
%{host} and replace it with a hardcoded string; if that works, you know
that's the problem. Try using the HostInterceptor to add a host
header:
http://people.apache.org/~juhanic/flume-docs/FlumeUserGuide.html#host-interceptor
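
For reference, a rough sketch of what that could look like in your
agent config (the source name "tailSource" below is just a placeholder
for whatever your tail source is actually called):

    # Attach the host interceptor to the source so every event carries a
    # "host" header that the sink's %{host} escape can pick up.
    # "tailSource" is a placeholder; use your real source name.
    agent2.sources.tailSource.interceptors = hostInt
    agent2.sources.tailSource.interceptors.hostInt.type = host
    # name of the header to set; "host" is also the default
    agent2.sources.tailSource.interceptors.hostInt.hostHeader = host
    # false = record the hostname rather than the IP address
    agent2.sources.tailSource.interceptors.hostInt.useIP = false

With a host header on every event, the %{host} escape in the sink path
should resolve instead of coming through empty.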

There is also a pair of quotation marks in your path string... is that
intended? agent2.sinks.HDFS.hdfs.path =
hdfs://10.5.114.110:54310/user/flume/'%{host}'
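
If those quotes weren't intentional, I'd expect the path line to look
more like this (the commented-out line is the hardcoded test I
mentioned above, using the machine name from your shell prompt purely
as an example):

    # path with the stray quotes removed
    agent2.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:54310/user/flume/%{host}
    # hardcoded variant for testing, no header substitution involved:
    # agent2.sinks.HDFS.hdfs.path = hdfs://10.5.114.110:54310/user/flume/md-trngpoc1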

I can't remember what happens when you try to use a header that doesn't
exist... it probably gets replaced by an empty string, since it's not
throwing exceptions.
Try checking HDFS under /user/flume and see if there is a poorly named
directory there with your data.
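
Something along these lines should show whether the events ended up
under an oddly named directory (a literal %{host}, an empty segment, or
the quoted variant):

    # recursively list whatever Flume has written under /user/flume
    hadoop fs -lsr /user/flume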

On 07/12/2012 06:41 PM, prabhu k wrote:
> Can anyone please respond to the issue below?
>
> On Thu, Jul 12, 2012 at 12:57 PM, prabhu k <[EMAIL PROTECTED]> wrote:
>
>     As per Mohammad's suggestion, I have run the following:
>
>     root@md-trngpoc1:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT#
>     bin/flume-ng agent -n agent2 -c /conf -f conf/agent2.conf
>
>
>     Info: Including Hadoop libraries found via
>     (/usr/local/hadoop_dir/hadoop/bin/hadoop) for HDFS access
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-api-1.4.3.jar
>     from classpath
>     Info: Excluding
>     /usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from
>     classpath
>     + exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
>     '/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop_dir/hadoop/libexec/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/local/hadoop_dir/hadoop/libexec/..:/usr/local/hadoop_dir/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'