It seems like your pipes mapper is exiting before it has consumed all of its input. Did you check the task logs on the web UI?
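For what it's worth, the usual fix for the "binary bytes as the input split" problem quoted below is to let the Java side do the record reading/writing so the C++ binary gets parsed key/value pairs instead of raw split bytes. A rough sketch of the invocation (paths, the program name, and the property names follow the 0.20-era pipes examples; treat them as placeholders and adjust to your build):

```shell
# Sketch only: run a pipes job with the Java record reader/writer enabled,
# so the C++ program is fed text key/value pairs rather than binary splits.
# All paths and the program name below are placeholders.
hadoop pipes \
  -D hadoop.pipes.java.recordreader=true \
  -D hadoop.pipes.java.recordwriter=true \
  -input /user/you/kmeans-input \
  -output /user/you/kmeans-output \
  -program /user/you/bin/kmeans
```

Also, the per-attempt stderr file under logs/userlogs (the truncated path at the end of your log) is usually where the C++ side's actual abort message ends up, so that is worth reading before anything else.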
On Nov 5, 2013, at 7:25 AM, Basu,Indrashish wrote:
> Can anyone kindly assist with this?
> On Mon, 04 Nov 2013 10:23:23 -0500, Basu,Indrashish wrote:
>> Hi All,
>> Any update on the post below?
>> I came across an old post regarding the same issue. It explains the
>> solution as: "The nopipe example needs more documentation. It
>> assumes that it is run with the InputFormat from
>> WordCountInputFormat.java, which has a very specific input split
>> format. By running with a TextInputFormat, it will send binary bytes
>> as the input split and won't work right. The nopipe example should
>> probably be recoded to use libhdfs too, but that is more
>> complicated to get running as a unit test. Also note that since the
>> C++ example is using local file reads, it will only work on a cluster
>> if you have nfs or something working across the cluster."
>> Could someone shed some more light on the above explanation and
>> elaborate on what exactly needs to be done?
>> To mention, I am trying to run a sample KMeans algorithm on a GPU
>> using Hadoop.
>> Thanks in advance.
>> On Thu, 31 Oct 2013 20:00:10 -0400, Basu,Indrashish wrote:
>>> I am trying to run a sample Hadoop GPU source code (kmeans algorithm)
>>> on an ARM processor and am getting the error below. Can anyone please
>>> shed some light on this?
>>> rmr: cannot remove output: No such file or directory.
>>> 13/10/31 13:43:12 WARN mapred.JobClient: No job jar file set. User
>>> classes may not be found. See JobConf(Class) or
>>> 13/10/31 13:43:12 INFO mapred.FileInputFormat: Total input paths to
>>> process : 1
>>> 13/10/31 13:43:13 INFO mapred.JobClient: Running job: job_201310311320_0001
>>> 13/10/31 13:43:14 INFO mapred.JobClient: map 0% reduce 0%
>>> 13/10/31 13:43:39 INFO mapred.JobClient: Task Id :
>>> attempt_201310311320_0001_m_000000_0, Status : FAILED
>>> java.io.IOException: pipe child exception
>>> at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:191)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:363)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>>> at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>> Caused by: java.net.SocketException: Broken pipe
>>> at java.net.SocketOutputStream.socketWrite0(Native Method)
>>> at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113)
>>> at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
>>> at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
>>> at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
>>> at java.io.DataOutputStream.write(DataOutputStream.java:107)
>>> ... 3 more
>>> attempt_201310311320_0001_m_000000_0: cmd: [bash, -c, exec
>>> '0' < /dev/null 1>>
>>> 2>> /usr/local/hadoop/hadoop-gpu-0.20.1/bin/../logs/userlogs/
> Indrashish Basu
> Graduate Student
> Department of Electrical and Computer Engineering
> University of Florida