Flume >> mail # user >> Unable to setup HDFS sink


Nitin Pawar 2013-01-14, 07:07
Re: Unable to setup HDFS sink
The correct value maps to fs.default.name in your core-site.xml, so whatever
value you have there is the one you need to use for the Flume HDFS sink.
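
A minimal sketch of that correspondence (the host and port are assumptions,
taken from the quoted messages below; use whatever your own core-site.xml
actually declares). If core-site.xml contains:

    <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:9100</value>
    </property>

then the sink path should be built on the same authority:

    agent1.sinks.hdfssink.hdfs.path = hdfs://localhost:9100/flume/events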
On Mon, Jan 14, 2013 at 12:37 PM, Nitin Pawar <[EMAIL PROTECTED]> wrote:

> That's a JobTracker URI.
>
> There should be a setting in your hdfs-site.xml or core-site.xml that
> looks like hdfs://localhost:9100/
>
> You need to use that value.
> On Jan 14, 2013 12:34 PM, "Vikram Kulkarni" <[EMAIL PROTECTED]>
> wrote:
>
>> I was able to write using the same HDFS conf from a different sink.
>> Also, I can open the MapReduce administration page successfully at
>> http://localhost:50030/jobtracker.jsp. So that should indicate that the
>> HDFS path below is valid, right? Any other way to check?
>>
>> Thanks.
>>
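
One direct way to check is to list the filesystem over the same RPC endpoint
the sink would use. A sketch, assuming the Hadoop client is on the PATH and
that 9100 is the port from fs.default.name (the 50030 page only proves the
JobTracker web UI is up, not that HDFS answers RPC there):

    # Succeeds only if the URI points at a live NameNode RPC endpoint
    hadoop fs -ls hdfs://localhost:9100/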
>> On 1/13/13 10:57 PM, "Alexander Alten-Lorenz" <[EMAIL PROTECTED]>
>> wrote:
>>
>> >Hi,
>> >
>> >Check your HDFS cluster; it's not responding on localhost/127.0.0.1:50030
>> >
>> >- Alex
>> >
>> >On Jan 14, 2013, at 7:43 AM, Vikram Kulkarni <[EMAIL PROTECTED]>
>> >wrote:
>> >
>> >> I am trying to set up an HDFS sink for an HTTPSource, but I get the
>> >>following exception when I try to send a simple JSON event. I am also
>> >>using a logger sink, and I can clearly see the event output to the
>> >>console window, but it fails to write to HDFS. I have also, in a
>> >>separate conf file, successfully written to an HDFS sink.
>> >>
>> >> Thanks,
>> >> Vikram
>> >>
>> >> Exception:
>> >> [WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:456)] HDFS IO error
>> >> java.io.IOException: Call to localhost/127.0.0.1:50030 failed on local exception: java.io.EOFException
>> >> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1144)
>> >>
>> >> My conf file is as follows:
>> >> # flume-httphdfs.conf: A single-node Flume with Http Source and hdfs
>> >>sink configuration
>> >>
>> >> # Name the components on this agent
>> >> agent1.sources = r1
>> >> agent1.channels = c1
>> >>
>> >> # Describe/configure the source
>> >> agent1.sources.r1.type = org.apache.flume.source.http.HTTPSource
>> >> agent1.sources.r1.port = 5140
>> >> agent1.sources.r1.handler = org.apache.flume.source.http.JSONHandler
>> >> agent1.sources.r1.handler.nickname = random props
>> >>
>> >> # Describe the sink
>> >> agent1.sinks = logsink hdfssink
>> >> agent1.sinks.logsink.type = logger
>> >>
>> >> agent1.sinks.hdfssink.type = hdfs
>> >> agent1.sinks.hdfssink.hdfs.path = hdfs://localhost:50030/flume/events
>> >> agent1.sinks.hdfssink.hdfs.file.Type = DataStream
>> >>
>> >> # Use a channel which buffers events in memory
>> >> agent1.channels.c1.type = memory
>> >> agent1.channels.c1.capacity = 1000
>> >> agent1.channels.c1.transactionCapacity = 100
>> >>
>> >> # Bind the source and sink to the channel
>> >> agent1.sources.r1.channels = c1
>> >> agent1.sinks.logsink.channel = c1
>> >> agent1.sinks.hdfssink.channel = c1
>> >>
>> >>
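
As an aside, a test event for that HTTPSource can be posted from the command
line. This is a sketch, assuming the conf above is running and that the
default JSONHandler format applies (a JSON array of events, each with
optional headers and a body):

    # Post one JSON event to the HTTPSource listening on port 5140
    curl -X POST -H 'Content-Type: application/json' \
         -d '[{"headers": {"source": "test"}, "body": "hello flume"}]' \
         http://localhost:5140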
>> >
>> >--
>> >Alexander Alten-Lorenz
>> >http://mapredit.blogspot.com
>> >German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>> >
>>
>>
--
Nitin Pawar
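
Putting the thread's advice together, a corrected sink section would look
something like the sketch below. The port 9100 is an assumption; substitute
the authority from your own fs.default.name. Note also that the Flume
property is hdfs.fileType, not hdfs.file.Type as in the original conf:

    # Point the sink at the NameNode RPC endpoint from fs.default.name,
    # not the JobTracker web UI port 50030
    agent1.sinks.hdfssink.type = hdfs
    agent1.sinks.hdfssink.hdfs.path = hdfs://localhost:9100/flume/events
    # hdfs.fileType is the recognized property name
    agent1.sinks.hdfssink.hdfs.fileType = DataStream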