Flume >> mail # user >> Broken Pipe error


mardan Khan 2012-07-25, 14:33
Mohammad Tariq 2012-07-25, 14:43
mardan Khan 2012-07-25, 18:49
Re: Broken Pipe error
It would be better to uninstall Cloudera's distribution first. It
seems Flume is trying to contact "hadoopmr.brunel.ac.uk" instead of
"134.83.35.24". Also add the "hadoop.tmp.dir" property to your
"core-site.xml" file and the "dfs.data.dir" and "dfs.name.dir" properties
to "hdfs-site.xml", else you'll lose your data and metadata on
each restart, as the values of these properties default to the "/tmp" dir.
Regards,
    Mohammad Tariq
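For reference, the properties mentioned above would look roughly like this; the paths are placeholders, not from the thread, and should point at durable local directories:

```xml
<!-- core-site.xml: keep Hadoop's working data out of /tmp -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>   <!-- placeholder path -->
</property>

<!-- hdfs-site.xml: persistent NameNode metadata and DataNode blocks -->
<property>
  <name>dfs.name.dir</name>
  <value>/app/hadoop/name</value>  <!-- placeholder path -->
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/app/hadoop/data</value>  <!-- placeholder path -->
</property>
```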
On Thu, Jul 26, 2012 at 12:19 AM, mardan Khan <[EMAIL PROTECTED]> wrote:
> Tariq
>
> Let me tell you that I am using Hadoop 0.20.0, which I have downloaded
> from the Apache website. I did not configure the Hadoop that was automatically
> downloaded with CDH4.
> I think my Hadoop is accessible.
>
> The configuration files are as follows:
>
>
> core-site.xml
>
> <configuration>
> <property>
>     <name>fs.default.name</name>
>     <value>hdfs://134.83.35.24:9000</value>
>   </property>
> </configuration>
>
>
> hdfs-site.xml
>
>
> <configuration>
>
>  <property>
>     <name>dfs.replication</name>
>     <value>1</value>
>   </property>
>
> </configuration>
>
>
>
> mapred-site.xml
>
> <configuration>
>
> <property>
>     <name>mapred.job.tracker</name>
>     <value>134.83.35.24:9001</value>
>   </property>
> </configuration>
>
>
>
>
> Flume configuration
>
> agent.sources = avro-AppSrv-source
> agent.sinks = hdfs-Cluster1-sink
> agent.channels = mem-channel-1
> # set channel for sources, sinks
> # properties of avro-AppSrv-source
> agent.sources.avro-AppSrv-source.type = SEQ
>
> agent.sources.avro-AppSrv-source.bind = localhost
> agent.sources.avro-AppSrv-source.port = 10000
>
> agent.sources.avro-AppSrv-source.channels = mem-channel-1
>
> # properties of mem-channel-1
> agent.channels.mem-channel-1.type = memory
> agent.channels.mem-channel-1.capacity = 1000
> agent.channels.mem-channel-1.transactionCapacity = 100
> # properties of hdfs-Cluster1-sink
> agent.sinks.hdfs-Cluster1-sink.type = hdfs
>
> agent.sinks.hdfs-Cluster1-sink.channel = mem-channel-1
> agent.sinks.hdfs-Cluster1-sink.hdfs.path = hdfs://134.83.35.24:9000/flume
> agent.sinks.hdfs-Cluster1-sink.hdfs.rollInterval = 30
> agent.sinks.hdfs-Cluster1-sink.hdfs.rollSize = 1024
> agent.sinks.hdfs-Cluster1-sink.hdfs.batchSize = 1
> agent.sinks.hdfs-Cluster1-sink.hdfs.fileType = DataStream
> agent.sinks.hdfs-Cluster1-sink.hdfs.writeFormat = Writable
>
>
>
> Please, any suggestions?
>
>
> Thanks
>
>
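The two failure modes discussed in the thread (the NameNode host not resolving, or its port not being reachable) can be checked quickly from the client machine. A minimal sketch, assuming Python 3 is available; "localhost" below stands in for the NameNode host, which in this thread would be 134.83.35.24 (or hadoopmr.brunel.ac.uk) on port 9000:

```python
import socket

def can_resolve(host: str) -> bool:
    """Return True if `host` resolves to an IP address via DNS/hosts file."""
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Replace "localhost"/9000 with the NameNode host and RPC port from
# fs.default.name, e.g. can_connect("134.83.35.24", 9000).
print(can_resolve("localhost"), can_connect("localhost", 9000))
```

If resolution fails for the machine's own hostname, that would also explain the "java.net.UnknownHostException" seen later in the thread.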
> On Wed, Jul 25, 2012 at 3:43 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>> Hello mardan,
>>
>>        It seems the host where your NameNode is running is not
>> reachable, or you are trying to reach some other host. Could you please
>> show us your conf file?
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Wed, Jul 25, 2012 at 8:03 PM, mardan Khan <[EMAIL PROTECTED]> wrote:
>> > Hi,
>> >
>> > I am getting the following warning and error messages
>> >
>> >
>> >
>> > Warning: Unexpected error reading responses on connection
>> > Thread[IPC Client (421539177) connection to hadoopmr/134.83.35.24:9000
>> > from
>> > root,5,main]
>> >
>> >
>> > Error: Broken Pipe
>> >
>> >
>> > 12/07/25 15:26:26 ERROR hdfs.HDFSEventSink: close on
>> > hdfs://hadoopmr.brunel.ac.uk:9000/flume/FlumeData; called
>> > org.apache.flume.sink.hdfs.HDFSEventSink$3@1d162212
>> > java.io.IOException: Failed on local exception: java.io.IOException:
>> > Broken
>> > pipe; Host Details : local host is: "java.net.UnknownHostException:
>> > brunel:
>> > brunel"; destination host is: "hadoopmr.brunel.ac.uk":9000;
>> >     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:765)
>> >     at org.apache.hadoop.ipc.Client.call(Client.java:1165)
>> >     at
>> >
>> > org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
>> >     at $Proxy9.getFileInfo(Unknown Source)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at
>> >
>> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >     at
>> >
>> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
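The key line in the trace is `local host is: "java.net.UnknownHostException: brunel"`: the Flume client machine cannot resolve its own hostname ("brunel"), which is what triggers the broken-pipe failure. One common fix is a hosts-file entry mapping both the local hostname and the NameNode host; a sketch, where the IPs shown are taken from the thread and the local IP is a placeholder that must match the actual machine:

```
# /etc/hosts on the Flume client machine
127.0.0.1     localhost
134.83.35.24  hadoopmr.brunel.ac.uk hadoopmr
# plus an entry for the machine's own hostname, e.g.:
# <local-ip>  brunel
```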