MapReduce, mail # user - Job fails while re attempting the task in multiple outputs case


Re: Job fails while re attempting the task in multiple outputs case
Harsh J 2013-12-30, 14:12
Are you using the MultipleOutputs class shipped with Apache Hadoop, or
one of your own?

If it's the latter, please take a look at the gotchas to watch out for,
described at http://wiki.apache.org/hadoop/FAQ#Can_I_write_create.2Fwrite-to_hdfs_files_directly_from_map.2Freduce_tasks.3F
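The gotcha behind that FAQ entry is that retried attempts of the same task must not write directly to the same final HDFS path; Hadoop's output committer sidesteps this by giving each attempt a private work directory (under _temporary) and promoting only the committed attempt's files to the final, attempt-free names. A minimal, stdlib-only sketch of that idea (the class and directory layout here are illustrative, not Hadoop's actual API):

```java
import java.nio.file.*;

// Sketch: attempt-scoped work directories avoid the "file exists"
// collision when a task is retried. Directory names mimic, but are
// not produced by, Hadoop's FileOutputCommitter.
public class AttemptWorkDirDemo {
    // Each attempt writes under _temporary/<attemptId>/, never to the
    // final output path directly.
    static Path workFile(Path outputDir, String attemptId, String name) {
        return outputDir.resolve("_temporary").resolve(attemptId).resolve(name);
    }

    static boolean demo() throws Exception {
        Path out = Files.createTempDirectory("job-output");
        String name = "o_2013_12_29_03-r-00008.gz";

        // Attempt 0 writes its output, then "fails" before commit.
        Path a0 = workFile(out, "attempt_201312162255_60465_r_000008_0", name);
        Files.createDirectories(a0.getParent());
        Files.write(a0, new byte[]{1});

        // Attempt 1 retries: its work path is distinct, so creating the
        // same file name cannot collide with attempt 0's leftovers.
        Path a1 = workFile(out, "attempt_201312162255_60465_r_000008_1", name);
        Files.createDirectories(a1.getParent());
        Files.write(a1, new byte[]{2});

        // On success, the committer promotes the winning attempt's file
        // to the final, attempt-free name exactly once.
        Files.move(a1, out.resolve(name));
        return Files.exists(out.resolve(name));
    }

    public static void main(String[] args) throws Exception {
        System.out.println("committed: " + demo());
    }
}
```

Code that bypasses this scheme and creates files at the final path itself (as a hand-rolled "multiple outputs" often does) hits exactly the failed-to-create exception below on the second attempt.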

On Mon, Dec 30, 2013 at 4:22 PM, AnilKumar B <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I am using multiple outputs in our job, so whenever a reduce task fails,
> all of its subsequent task attempts fail with a file-exists exception.
>
>
> The output file name should also include the task attempt id, right? But
> it only includes the task id. Is this a bug, or is something wrong on my
> side?
>
> Where should I look in the source code? I went through
> FileOutputFormat$getTaskOutputPath(), but it only considers the task
> id.
>
>
> Exception Trace:
> 13/12/29 09:13:00 INFO mapred.JobClient: Task Id : attempt_201312162255_60465_r_000008_0, Status : FAILED
> 13/12/29 09:14:42 WARN mapred.JobClient: Error reading task output http://localhost:50050/tasklog?plaintext=true&attemptid=attempt_201312162255_60465_r_000008_0&filter=stdout
> 13/12/29 09:14:42 WARN mapred.JobClient: Error reading task output http://localhost:50050/tasklog?plaintext=true&attemptid=attempt_201312162255_60465_r_000008_0&filter=stderr
> 13/12/29 09:15:04 INFO mapred.JobClient:  map 100% reduce 93%
> 13/12/29 09:15:23 INFO mapred.JobClient:  map 100% reduce 96%
> 13/12/29 09:17:31 INFO mapred.JobClient:  map 100% reduce 97%
> 13/12/29 09:19:34 INFO mapred.JobClient: Task Id : attempt_201312162255_60465_r_000008_1, Status : FAILED
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: failed to create /x/y/z/2013/12/29/04/o_2013_12_29_03-r-00008.gz on client 10.103.10.31 either because the filename is invalid or the file exists
>                 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1672)
>                 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1599)
>                 at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:732)
>                 at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:711)
>                 at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
>                 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>                 at java.lang.reflect.Method.invoke(Method.java:597)
>                 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
>                 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1448)
>                 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1444)
>                 at java.security.AccessController.doPrivileged(Native Method)
>                 at javax.security.auth.Subject.doAs(Subject.java:396)
>                 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>                 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1442)
>
>                 at org.apache.hadoop.ipc.Client.call(Client.java:1118)
>                 at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>                 at $Proxy7.create(Unknown Source)
>                 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>                 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>                 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>                 at java.lang.reflect.Method.invoke(Method.java:597)
>                 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
>                 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
>                 at $Proxy7.create(Unknown Source)
>                 at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:3753)

Harsh J