

Ravi Chandran 2013-02-07, 18:41
Harsh J 2013-02-07, 18:49
Re: Secondary Sort example error
unsubscribe

On 08/02/13 12:19 AM, "Harsh J" <[EMAIL PROTECTED]> wrote:

>The JIRA https://issues.apache.org/jira/browse/MAPREDUCE-2584 should
>help such cases, if what I speculated above is indeed the case.
>
>On Fri, Feb 8, 2013 at 12:16 AM, Harsh J <[EMAIL PROTECTED]> wrote:
>> Thanks, I managed to correlate the proper line numbers.
>>
>> Are you using some form of custom serialization in your job code? That
>> is, are your keys non-Writable types of some other kind? The
>> specific NPE is arising from the SerializationFactory not being able
>> to find a serializer for your map-output key class. You may want to
>> look in that direction, or share your code for the list to spot it
>> instead.
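
For illustration, a map-output key the framework can serialize out of the box is one that implements WritableComparable, since the default io.serializations setting covers Writable types. A minimal sketch of such a composite key for a name sort follows; the class and field names are hypothetical and not taken from the thread:

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

// Hypothetical composite key for a last-name/first-name secondary sort.
public class NameKey implements WritableComparable<NameKey> {
  private String lastName = "";
  private String firstName = "";

  public NameKey() {}  // no-arg constructor required for deserialization

  public NameKey(String lastName, String firstName) {
    this.lastName = lastName;
    this.firstName = firstName;
  }

  @Override
  public void write(DataOutput out) throws IOException {
    out.writeUTF(lastName);
    out.writeUTF(firstName);
  }

  @Override
  public void readFields(DataInput in) throws IOException {
    lastName = in.readUTF();
    firstName = in.readUTF();
  }

  @Override
  public int compareTo(NameKey other) {
    // Full sort order: last name first, then first name.
    int cmp = lastName.compareTo(other.lastName);
    return cmp != 0 ? cmp : firstName.compareTo(other.firstName);
  }

  @Override
  public int hashCode() {
    return lastName.hashCode();  // based on the natural key only
  }

  @Override
  public boolean equals(Object o) {
    if (!(o instanceof NameKey)) return false;
    NameKey other = (NameKey) o;
    return lastName.equals(other.lastName) && firstName.equals(other.firstName);
  }

  public String getLastName() {
    return lastName;
  }
}

If the job instead relies on a non-Writable key type, a matching Serialization class has to be listed in io.serializations, which appears to be the situation Harsh is describing.
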
>>
>> On Fri, Feb 8, 2013 at 12:11 AM, Ravi Chandran
>> <[EMAIL PROTECTED]> wrote:
>>> hi,
>>>
>>> It is Hadoop 2.0.0-cdh4.1.1. The whole output is given below:
>>>
>>> Hadoop 2.0.0-cdh4.1.1
>>> Subversion file:///data/1/jenkins/workspace/generic-package-centos32-6/topdir/BUILD/hadoop-2.0.0-cdh4.1.1/src/hadoop-common-project/hadoop-common
>>> -r 581959ba23e4af85afd8db98b7687662fe9c5f20
>>>
>>> On Fri, Feb 8, 2013 at 12:04 AM, Harsh J <[EMAIL PROTECTED]> wrote:
>>>>
>>>> Hey Ravi,
>>>>
>>>> What version of Hadoop is this exactly? (Type and send output of
>>>> "hadoop version" if unsure)
>>>>
>>>> On Thu, Feb 7, 2013 at 11:55 PM, Ravi Chandran
>>>> <[EMAIL PROTECTED]> wrote:
>>>> > Hi,
>>>> >
>>>> > I am trying to do name sorting using secondary sort. I have a
>>>> > working example which I am taking as a reference, but I am getting a
>>>> > null pointer error in the MapTask class. I am not able to locate the
>>>> > reason, as the logic to create the custom object from a given file
>>>> > has already been tested through a separate Java class.
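
For reference, a secondary sort of this shape on the old mapred API (which the runOldMapper frames in the stack trace below indicate this job is using) usually pairs a composite key with a partitioner and a grouping comparator, so records are partitioned and grouped on the natural key while being sorted on the full key. A minimal sketch, reusing the hypothetical NameKey from the sketch earlier on this page; the Text value type and the class names are assumptions, not taken from Ravi's code:

import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.Partitioner;

// Hypothetical partitioner: route records by the natural key (last name) only.
public class NamePartitioner implements Partitioner<NameKey, Text> {
  @Override
  public void configure(JobConf job) {}

  @Override
  public int getPartition(NameKey key, Text value, int numPartitions) {
    return (key.getLastName().hashCode() & Integer.MAX_VALUE) % numPartitions;
  }
}

// Hypothetical grouping comparator: group reduce input by last name only,
// so all first names for one last name arrive in a single reduce() call.
class NameGroupingComparator extends WritableComparator {
  protected NameGroupingComparator() {
    super(NameKey.class, true);
  }

  @Override
  public int compare(WritableComparable a, WritableComparable b) {
    return ((NameKey) a).getLastName().compareTo(((NameKey) b).getLastName());
  }
}

With the old API these pieces are wired up through JobConf.setMapOutputKeyClass(NameKey.class), JobConf.setPartitionerClass(NamePartitioner.class), and JobConf.setOutputValueGroupingComparator(NameGroupingComparator.class).
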
>>>> > I am getting this error:
>>>> >
>>>> > 13/02/07 12:23:42 WARN snappy.LoadSnappy: Snappy native library is available
>>>> > 13/02/07 12:23:42 INFO snappy.LoadSnappy: Snappy native library loaded
>>>> > 13/02/07 12:23:42 INFO mapred.FileInputFormat: Total input paths to process : 1
>>>> > 13/02/07 12:23:43 INFO mapred.JobClient: Running job: job_201301301056_0014
>>>> > 13/02/07 12:23:44 INFO mapred.JobClient:  map 0% reduce 0%
>>>> > 13/02/07 12:23:56 INFO mapred.JobClient: Task Id : attempt_201301301056_0014_m_000000_0, Status : FAILED
>>>> > java.lang.NullPointerException
>>>> >  at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:814)
>>>> >  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:385)
>>>> >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
>>>> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>> >  at java.security.AccessController.doPrivileged(Native Method)
>>>> >  at javax.security.auth.Subject.doAs(Subject.java:396)
>>>> >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>>>> >  at org.apache.hadoop.mapred.Child.main(Child.java:262)
>>>> > 13/02/07 12:23:57 INFO mapred.JobClient: Task Id : attempt_201301301056_0014_m_000001_0, Status : FAILED
>>>> > java.lang.NullPointerException
>>>> >  at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:814)
>>>> >  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:385)
>>>> >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
>>>> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>> >  at java.security.AccessController.doPrivileged(Native Method)
>>>> >  at javax.security.auth.Subject.doAs(Subject.java:396)
>>>> >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>>>> >  at org.apache.hadoop.mapred.Child.main(Child.java:262)
>>>> >
>>>> > I am giving the Mapper code below:
>>>> >
>>>> > import java.io.IOException;
>>>> > import org.apache.hadoop.io.LongWritable;