Hadoop >> mail # user >> Hadoop-MapReduce
Re: Hadoop-MapReduce
Hi,

In both the driver class and my Mapper class I have used
org.apache.hadoop.mapreduce.lib,

and in the XmlInputFormat.java class I have also used
org.apache.hadoop.mapreduce.lib,

but I am still getting this error.

Please suggest.

Thanks in advance

Ranjini

On Tue, Dec 17, 2013 at 2:07 PM, Shekhar Sharma <[EMAIL PROTECTED]> wrote:

> Hello Ranjini,
> This error comes when you mix and match the newer and older APIs.
>
> You might have written your program using the newer API while the XML
> input format is using the older API.
> The older API has the package structure org.apache.hadoop.mapred
>
> The newer API has the package structure
> org.apache.hadoop.mapreduce.lib
>
> Check XmlInputFormat.java to see which package's FileInputFormat
> it uses...
>
>
> Regards,
> Som Shekhar Sharma
> +91-8197243810
>
>
> On Tue, Dec 17, 2013 at 12:55 PM, Ranjini Rathinam
>  <[EMAIL PROTECTED]> wrote:
> > Hi,
> >
> > I am using Hadoop 0.20.
> >
> > While executing the XmlInputFormat class
> > I am getting the error:
> >
> > "Error: Found class org.apache.hadoop.mapreduce.TaskAttemptContext, but
> > interface was expected."
> >
> > Please suggest how to fix the error.
> >
> > Thanks in advance.
> >
> > Ranjini
> >
> > On Wed, Dec 11, 2013 at 12:30 PM, Ranjini Rathinam <
> [EMAIL PROTECTED]>
> > wrote:
> >>
> >> hi,
> >>
> >> I have fixed the error and the code is running fine, but this code just
> >> splits out part of the tag.
> >>
> >> I want to convert it into text format so that I can load it into tables
> >> in HBase and Hive.
> >>
> >> I have used the DOM parser, but this parser takes a File object, whereas
> >> HDFS uses FileSystem.
> >>
> >> E.g.,
> >>
> >> File fXmlFile = new File("D:/elango/test.xml");
> >> DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
> >> DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
> >> Document doc = dBuilder.parse(fXmlFile);
> >>
> >>
> >> This cannot be used with HDFS, because an HDFS path is accessed through
> >> FileSystem.
> >>
> >> I kindly request you to please suggest a fix for the above issue.
> >>
> >> Thanks in advance
> >>
> >> Ranjini R
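To answer the question above: DocumentBuilder.parse also accepts an InputStream, and Hadoop's FileSystem.open returns one, so the DOM parser does not need a File at all. A minimal sketch of the idea; the Hadoop calls are shown in comments, and the runnable part parses from an in-memory stream (the path and element names are taken from the examples in this thread):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class HdfsDomSketch {
    public static void main(String[] args) throws Exception {
        // With Hadoop on the classpath, the stream would come from HDFS:
        //   FileSystem fs = FileSystem.get(conf);
        //   InputStream in = fs.open(new Path("/user/ranjini/test.xml"));
        // Here an in-memory stream keeps the sketch self-contained.
        String xml = "<person><fname>x</fname><lname>y</lname></person>";
        InputStream in =
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8));

        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(in);  // parse(InputStream), not parse(File)
        System.out.println(doc.getDocumentElement()
                .getElementsByTagName("fname").item(0).getTextContent());
    }
}
```

The only change from the File-based snippet quoted above is swapping `dBuilder.parse(fXmlFile)` for `dBuilder.parse(in)` with the stream obtained from `fs.open(...)`.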
> >>
> >>
> >>
> >>
> >> On Tue, Dec 10, 2013 at 11:07 AM, Ranjini Rathinam
> >> <[EMAIL PROTECTED]> wrote:
> >>>
> >>>
> >>>
> >>> ---------- Forwarded message ----------
> >>> From: Shekhar Sharma <[EMAIL PROTECTED]>
> >>> Date: Mon, Dec 9, 2013 at 10:23 PM
> >>> Subject: Re: Hadoop-MapReduce
> >>> To: [EMAIL PROTECTED]
> >>> Cc: [EMAIL PROTECTED]
> >>>
> >>>
> >>> It does work; I have used it long back.
> >>>
> >>> BTW, if it is not working, write a custom input format and implement
> >>> your own record reader. That would be far easier than breaking your
> >>> head over other people's code.
> >>>
> >>> Break your problem into steps:
> >>>
> >>> (1) First, the XML data is multi-line, meaning multiple lines make up
> >>> a single record. A record for you might be
> >>>
> >>> <person>
> >>>  <fname>x</fname>
> >>>   <lname>y</lname>
> >>> </person>
> >>>
> >>> (2) Implement a record reader that looks for the starting and
> >>> ending person tags (check out how RecordReader.java is written)
> >>>
> >>> (3) Once you have the contents between the starting and ending tags,
> >>> you can use an XML parser to parse them into a Java object and
> >>> form your own key-value pairs (custom key and custom value)
> >>>
> >>>
> >>> Hope you have enough pointers to write the code.
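The tag-scanning core of step (2) amounts to collecting every substring between a start tag and its matching end tag. A self-contained sketch of just that logic (the class and method names are illustrative; a real Hadoop RecordReader would additionally track byte offsets and split boundaries):

```java
import java.util.ArrayList;
import java.util.List;

public class TagScanner {
    // Collect every substring spanning startTag..endTag, inclusive.
    // This is the per-split core of an XML record reader.
    public static List<String> records(String data, String startTag, String endTag) {
        List<String> out = new ArrayList<>();
        int from = 0;
        while (true) {
            int s = data.indexOf(startTag, from);
            if (s < 0) break;                         // no more records
            int e = data.indexOf(endTag, s + startTag.length());
            if (e < 0) break;                         // unclosed record: stop
            out.add(data.substring(s, e + endTag.length()));
            from = e + endTag.length();               // resume after this record
        }
        return out;
    }

    public static void main(String[] args) {
        String xml = "<people><person><fname>x</fname></person>"
                   + "<person><fname>y</fname></person></people>";
        System.out.println(records(xml, "<person>", "</person>").size()); // 2
    }
}
```

Each string this yields would become the value of one mapper input record, which step (3) then hands to an XML parser.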
> >>>
> >>>
> >>> Regards,
> >>> Som Shekhar Sharma
> >>> +91-8197243810
> >>>
> >>>
> >>> On Mon, Dec 9, 2013 at 6:30 PM, Ranjini Rathinam <
> [EMAIL PROTECTED]>
> >>> wrote:
> >>> > Hi Subroto Sanyal,
> >>> >
> >>> > The link provided about XML does not work. The class written,
> >>> > XmlContent, is not allowed in the XmlInputFormat.
> >>> >
> >>> > I request your help: has someone already coded this scenario? I need
> >>> > working code.
> >>> >
> >>>