Hadoop, mail # user - java.lang.RuntimeException: java.io.EOFException at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:103)


Re: java.lang.RuntimeException: java.io.EOFException at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:103)
Ted Yu 2010-09-30, 17:42
Line 84 is empty.
Line 83 is:
          out.writeUTF(query_id);

Please send the stack trace that corresponds to your attachment.

From previous discussion:
In the very beginning of readFields(), clear all available fields (lists,
primitives, etc.).
The best way to do that is to create a clearFields() function that will
be called both from "readFields()" and from the empty constructor.
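
For illustration, here is a minimal sketch of that pattern, assuming a Writable with the two String fields shown in the quoted code below (the class name MsReadSketch is hypothetical, and resetting to empty strings is just one reasonable choice):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

// Hypothetical Writable illustrating the clearFields() pattern:
// both the no-arg constructor and readFields() reset every field,
// so a reused instance never carries values left over from a
// previously deserialized record.
public class MsReadSketch implements Writable {

    private String query_id;
    private String record;

    public MsReadSketch() {
        clearFields();
    }

    // Reset every field to a known empty state.
    private void clearFields() {
        query_id = "";
        record = "";
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        clearFields();          // clear before reading the next record
        query_id = in.readUTF();
        record = in.readUTF();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        // write() must emit the fields in exactly the order readFields() reads them
        out.writeUTF(query_id);
        out.writeUTF(record);
    }
}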

On Thu, Sep 30, 2010 at 10:16 AM, Tali K <[EMAIL PROTECTED]> wrote:

>  *You are right, there is no readInt; I have only 2 String fields in
> MsRead. Here are the lines; I'll also send both files in the attachment.
> Thanks in advance for your help.*
>
> @Override
>         public void readFields(DataInput in) throws IOException {
>
>           query_id = in.readUTF();
>           record = in.readUTF();
>
>     }
>     @Override
>      public void write(DataOutput out) throws IOException {
>           out.writeUTF(query_id);
>
>           out.writeUTF(record);
>
>      }
>
>
>
>
>
>     public static class FirstComparator extends WritableComparator {
>
>         private static final Text.Comparator TEXT_COMPARATOR = new Text.Comparator();
>
>         public FirstComparator() {
>           super(MsRead.class);
>         }
>
>         @Override
>         public int compare(byte[] b1, int s1, int l1,
>                            byte[] b2, int s2, int l2) {
>
>           try {
>             int firstL1 = WritableUtils.decodeVIntSize(b1[s1]) + readVInt(b1, s1);
>             int firstL2 = WritableUtils.decodeVIntSize(b2[s2]) + readVInt(b2, s2);
>             return TEXT_COMPARATOR.compare(b1, s1, firstL1, b2, s2, firstL2);
>           } catch (IOException e) {
>             throw new IllegalArgumentException(e);
>           }
>         }
>
>         @Override
>         public int compare(WritableComparable a, WritableComparable b) {
>           if (a instanceof MsRead && b instanceof MsRead) {
>
>             //System.err.println("COMPARE " + ((MsRead)a).getType() + "\t" + ((MsRead)b).getType() + "\t"
>             //    + (((MsRead) a).toString().compareTo(((MsRead) b).toString())));
>             return (((MsRead) a).toString().compareTo(((MsRead) b).toString()));
>
>           }
>           return super.compare(a, b);
>         }
>
>
>       }
>
>         @Override
>         public int compareTo(MsRead o) {
>          return this.toString().compareTo(o.toString());
>       }
>         @Override
>         public boolean equals(Object right) {
>             if (right instanceof MsRead )
>             {
>                 return (query_id.equals(((MsRead)right).query_id));
>             }
>             else
>                 return false;
>         }
>         @Override
>         public int hashCode() {
>             return query_id.hashCode() ;
>         }
>
> > Date: Wed, 29 Sep 2010 22:27:15 -0700
> > Subject: Re: java.lang.RuntimeException: java.io.EOFException at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:103)
> > From: [EMAIL PROTECTED]
> > To: [EMAIL PROTECTED]
>
> >
> > Your MsRead.readFields() doesn't contain readInt().
> > Can you show us the lines around line 84 of MsRead.java?
> >
> > On Wed, Sep 29, 2010 at 2:44 PM, Tali K <[EMAIL PROTECTED]> wrote:
> >
> > >
> > > HI All,
> > >
> > > I am getting this exception on a cluster (10 nodes) when I am running a
> > > simple Hadoop map/reduce job.
> > > I don't get this exception while running it on my desktop in Hadoop's
> > > pseudo-distributed mode.
> > > Can somebody help? I would really appreciate it.
> > >
> > >
> > > 10/09/29 14:28:34 INFO mapred.JobClient: map 100% reduce 30%
> > > 10/09/29 14:28:36 INFO mapred.JobClient: Task Id :
> > > attempt_201009291306_0004_r_000000_0, Status : FAILED
> > > java.lang.RuntimeException: java.io.EOFException
> > > at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:103)
> > > at org.apache.hadoop.mapred.Merger$MergeQueue.lessThan(Merger.java:373)