Re: flume-ng agent startup problem
Jagadish Bihani 2012-08-11, 09:09
Hi

In my case Flume is not transferring data to HDFS (my Hadoop version is
0.20.1), and it doesn't show any error even in DEBUG log mode.
It works fine for the other sinks.

Is there any known compatibility problem with Hadoop 0.20.1, or can a
particular Hadoop version cause this kind of problem?
(I know it's an old version, but it is on a production machine and can't be
upgraded as of now...)
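
A quick way to check that the 0.20.1 core jar on the classpath actually
bundles the HDFS client classes the sink needs (a sketch; the jar path is
the one from my startup command, and the class name is just the usual HDFS
client entry point I expect to find in that jar):

unzip -l /MachineLearning/OTFA/hadoop-0.20.1-cluster1/hadoop-0.20.1-core.jar \
  | grep 'org/apache/hadoop/hdfs/DistributedFileSystem'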

Details of the configuration and log records are in the mail quoted below.

Thanks,
Jagadish

On 08/10/2012 03:30 PM, Jagadish Bihani wrote:
> Hi
>
> Thanks all for the inputs. After the initial problem I was able to
> start Flume, except in one scenario in which I use HDFS as the sink.
>
> I have a production machine with hadoop-0.20.1 installed, and I have
> installed the latest Flume, 1.2.0.
> It works fine for all the configurations I tried, except when the HDFS
> sink is used.
>
> Test:
> ---------
> I used a netcat listener as the agent's source and HDFS as the sink.
> Then I start the agent, with DEBUG logging enabled, using the command
> "bin/flume-ng agent -n agent1 -c conf -f conf/flume_hdfs.conf --classpath
> /MachineLearning/OTFA/hadoop-0.20.1-cluster1/hadoop-0.20.1-core.jar
> -Dflume.root.logger=DEBUG,console"
> I don't get any error or exception. I use "/usr/sbin/lsof -i:<port_no>"
> to check whether the source is actually bound to its port, and it returns
> nothing. But when I use a file sink instead of the HDFS sink and run lsof,
> it correctly shows the port on which the source is listening.
> So when the HDFS sink is used, even the source part of the agent doesn't
> work, no exception is thrown, and nothing is written to HDFS.
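>
> For reference, the port check looks like this (a sketch; 44444 is just a
> hypothetical port for illustration, substitute whatever port the source is
> configured to listen on):
>
> /usr/sbin/lsof -i:44444        # no output when the HDFS sink is configured
> netstat -tlnp | grep 44444     # same check via netstat; shows the java
>                                # process only when the file_roll sink is used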
>
> P.S. I have checked the user and permission details on HDFS; they are fine.
>
> I have run Flume on my other machines with different versions of Hadoop
> (0.23 and 1.0), and the HDFS sink works properly there.
> Does Flume support hadoop-0.20.1, or is there something I am missing?
>
> This is my Configuration:
> -----------------------------------------
> agent1.sources = sequencer
> agent1.sinks  =hdfsSink fileSink
> agent1.sinks  =fileSink
> agent1.channels =memoryChannel fileChannel
>
> agent1.sources.sequencer.type=seq
>
> agent1.sinks.hdfsSink.type=hdfs
> agent1.sinks.hdfsSink.hdfs.path=hdfs://MLNameNode2001:54310/flume
>
> agent1.sinks.fileSink.type=file_roll
> agent1.sinks.fileSink.sink.directory=/home/hadoop/flume/output
>
>
> agent1.channels.memoryChannel.type=memory
> agent1.channels.memoryChannel.capacity=10000
> agent1.channels.memoryChannel.transactionCapacity=100
>
>
> agent1.channels.fileChannel.type=file
> agent1.channels.fileChannel.checkpointDir=/home/hadoop/flume/channel/checkpointDir
> agent1.channels.fileChannel.dataDirs=/home/hadoop/flume/channel/dataDir
>
> agent1.sources.sequencer.channels = fileChannel
> agent1.sinks.hdfsSink.channel = fileChannel
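>
> A minimal single-sink variant of the same configuration (a sketch,
> assuming only the HDFS sink path is being tested; all values are taken
> from the config above):
>
> agent1.sources = sequencer
> agent1.sinks = hdfsSink
> agent1.channels = fileChannel
>
> agent1.sources.sequencer.type=seq
> agent1.sources.sequencer.channels = fileChannel
>
> agent1.sinks.hdfsSink.type=hdfs
> agent1.sinks.hdfsSink.hdfs.path=hdfs://MLNameNode2001:54310/flume
> agent1.sinks.hdfsSink.channel = fileChannel
>
> agent1.channels.fileChannel.type=file
> agent1.channels.fileChannel.checkpointDir=/home/hadoop/flume/channel/checkpointDir
> agent1.channels.fileChannel.dataDirs=/home/hadoop/flume/channel/dataDir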
>
> This is the log which I get:
> ----------------------------------------------------------
>
> bin/flume-ng agent -n agent1 -c conf -f conf/flume_hdfs.conf
> --classpath /MachineLearning/OTFA/hadoop-0.20.1-cluster1/hadoop
> -0.20.1-core.jar -Dflume.root.logger=DEBUG,console
> + exec /usr/java/jdk1.6.0_12/bin/java -Xmx20m
> -Dflume.root.logger=DEBUG,console -cp
> '/home/hadoop/flume/apache-flume-1.2.0/conf:/home/hadoop/flume/apache-flume-1.2.0/li
> b/*:/MachineLearning/OTFA/hadoop-0.20.1-cluster1/hadoop-0.20.1-core.jar'
> -Djava.library.path= org.apache.flume.node.Application -n agent1 -f
> conf/flume_hdfs.conf
> 2012-08-10 10:56:50,604 (main) [INFO -
> org.apache.flume.lifecycle.LifecycleSupervisor.start(LifecycleSupervisor.java:67)]
> Starting lifecycle supervisor 1
> 2012-08-10 10:56:50,607 (main) [INFO -
> org.apache.flume.node.FlumeNode.start(FlumeNode.java:54)] Flume node
> starting - agent1
> 2012-08-10 10:56:50,611 (lifecycleSupervisor-1-2) [INFO -
> org.apache.flume.conf.file.AbstractFileConfigurationProvider.start(AbstractFileConfigurationProvider.java:67)]
>  Configuration provider starting