Flume user mailing list: Re: error writing to collectorSink on mac os x


Abhi Dubey 2012-12-02, 02:01
Alexander Alten-Lorenz 2012-11-30, 07:17
Re: error writing to collectorSink on mac os x
Alex,

Thanks for the quick reply. I'm afraid I don't understand which config file you are asking for. Do you mean the config file I used to set up the flume source and sink? If so, I just used the web interface http://localhost:35871/flumemaster.jsp to set the config.
I used source: text("/etc/services")
I used Sink: collectorSink("hdfs://localhost:8020/user/abhi/flume/","test file")

This throws the error that I wrote in the previous message. I also tried the collector sink with a different target - writing to the local file system.
source: text("/etc/services")
Sink: collectorSink("file:///Users/abhi/","testfile")
It throws the same error.

But if I use
source: text("/etc/services")
sink: console

It works. It also works if I use a text sink.
source: text("/etc/services")
sink: text("services.copy")
Can you tell me once again which config file you need to look at?
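
For reference, here is roughly what those mappings would look like if entered through the flume shell instead of the web UI (just a sketch; the logical node name is taken from the log further down and the shell port is an assumption on my part, since I only used the web interface):

# sketch only: node name "new-host-2.home" and port 35873 are assumptions
flume shell -c localhost:35873
exec config new-host-2.home 'text("/etc/services")' 'collectorSink("hdfs://localhost:8020/user/abhi/flume/","test file")'
exec config new-host-2.home 'text("/etc/services")' 'collectorSink("file:///Users/abhi/","testfile")'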

As per the homebrew formulas, hadoop is version 1.0.4 and flume is version 0.9.4-cdh3u2. After installing flume, I replaced the jar file with the one from the hadoop install.
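
The swap itself was basically this (a sketch only; the Homebrew Cellar paths below are assumptions and may differ on another machine):

# sketch only: all Cellar paths are assumptions
ls /usr/local/Cellar/flume/0.9.4-cdh3u2/libexec/lib/hadoop-core-*.jar
ls /usr/local/Cellar/hadoop/1.0.4/libexec/hadoop-core-1.0.4.jar
# remove the hadoop-core jar that shipped with flume, then copy in the 1.0.4 jar
rm /usr/local/Cellar/flume/0.9.4-cdh3u2/libexec/lib/hadoop-core-*.jar
cp /usr/local/Cellar/hadoop/1.0.4/libexec/hadoop-core-1.0.4.jar \
   /usr/local/Cellar/flume/0.9.4-cdh3u2/libexec/lib/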
Thanks,

Abhi
On Nov 29, 2012, at 11:17 PM, Alexander Alten-Lorenz <[EMAIL PROTECTED]> wrote:

> Hi,
>
> Can you connect to your HDFS instance? Attach the config file for further debugging.
> Btw, what a homebrew-installed HDFS / Flume / whatever contains depends on the recipes homebrew uses. Please check this first.
>
> Best
> - Alex
>
>
> On Nov 30, 2012, at 6:50 AM, Abhi Dubey <[EMAIL PROTECTED]> wrote:
>
>>> Hi,
>>>
>>> I am running hadoop in pseudo-distributed mode on Mac OS X 10.8.2. I installed flume (via homebrew) and I can write to the console and the local file system using the text sink. Hadoop was also installed via homebrew.
>>>
>>> But if I use collectorSink with either a file:/// target or an hdfs:// target, I get an error. The error is the same for both types of targets.
>>>
>>>
>>> 2012-11-29 20:22:47,182 [logicalNode new-host-2.home-21] INFO debug.InsistentAppendDecorator: append attempt 6 failed, backoff (60000ms): failure to login
>>> 2012-11-29 20:23:47,181 [pool-8-thread-1] INFO hdfs.EscapedCustomDfsSink: Opening file:///Users/abhi/flume/testfile20121129-202317290-0800.1354249397290310000.00000037
>>> 2012-11-29 20:23:47,183 [logicalNode new-host-2.home-21] INFO debug.StubbornAppendSink: append failed on event 'new-host-2.home [INFO Thu Nov 29 20:21:44 PST 2012] #' with error: failure to login
>>> 2012-11-29 20:23:47,183 [logicalNode new-host-2.home-21] INFO rolling.RollSink: closing RollSink 'escapedCustomDfs("file:///Users/abhi/flume","testfile%{rolltag}" )'
>>> 2012-11-29 20:23:47,183 [logicalNode new-host-2.home-21] INFO rolling.RollSink: opening RollSink  'escapedCustomDfs("file:///Users/abhi/flume","testfile%{rolltag}" )'
>>> 2012-11-29 20:23:47,184 [logicalNode new-host-2.home-21] INFO debug.InsistentOpenDecorator: Opened MaskDecorator on try 0
>>> 2012-11-29 20:23:47,185 [pool-9-thread-1] INFO hdfs.EscapedCustomDfsSink: Opening file:///Users/abhi/flume/testfile20121129-202347184-0800.1354249427184178000.00000021
>>> 2012-11-29 20:23:47,192 [logicalNode new-host-2.home-21] INFO debug.InsistentAppendDecorator: append attempt 7 failed, backoff (60000ms): failure to login
>>>
>>> I am not sure what is going on. The permissions on the directory are 777. Can anyone help with this error?
>>>
>>>
>>> Thanks,
>>>
>>> Abhi
>>
>
> --
> Alexander Alten-Lorenz
> http://mapredit.blogspot.com
> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>
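
(Regarding Alex's question above about connecting to the HDFS instance: one quick check, assuming the stock Hadoop CLI is on the PATH, is

hadoop fs -ls hdfs://localhost:8020/user/abhi/flume/

which should list the target directory without errors if the namenode is reachable.)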

Alexander Alten-Lorenz 2012-11-30, 08:28