Re: Error in configuring object
Shreya,
       Like Harsh mentioned, it is really straightforward. Here are a few points to keep in mind:

1. By default, the separator between key and value in the output file (TextOutputFormat) is the tab character, whereas CSV is comma-separated. So you may need to change the separator to a comma in your MapReduce output:

mapred.textoutputformat.separator=','

2. Often your key and your value are each made up of multiple fields. Use a common separator between those fields within the key and within the value (make it a comma if you need a comma-delimited CSV), and use the same separator between key and value in the output file, as mentioned in point 1.

NOTE: Analyze your output data and make sure the separator never appears inside your data. Choose a different separator if your data already contains commas.

If points 1 and 2 are satisfied, just follow Harsh's reply; with no further reformatting your CSV will be ready for further processing. A small sketch of both points follows.
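A minimal sketch of how points 1 and 2 might fit together, using the old mapred API as in the code below. The class names (CsvJob, CsvMapper) and the assumption of a tab-delimited, three-field input are hypothetical:

import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.*;

public class CsvJob {

    // Point 2: join the fields that form the value with the same separator,
    // so each output line becomes one CSV record: key,field1,field2
    public static class CsvMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {
        public void map(LongWritable offset, Text line,
                        OutputCollector<Text, Text> out, Reporter reporter)
                throws IOException {
            String[] fields = line.toString().split("\t");
            out.collect(new Text(fields[0]),
                        new Text(fields[1] + "," + fields[2]));
        }
    }

    public static void main(String[] args) throws IOException {
        JobConf conf = new JobConf(CsvJob.class);
        // Point 1: make TextOutputFormat join key and value with a comma
        // instead of the default tab.
        conf.set("mapred.textoutputformat.separator", ",");
        conf.setMapperClass(CsvMapper.class);
        conf.setNumReduceTasks(0); // map output goes straight to the output file
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
    }
}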

Hope it helps

Regards
Bejoy K S

-----Original Message-----
From: Harsh J <[EMAIL PROTECTED]>
Date: Thu, 15 Sep 2011 15:43:08
To: <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Error in configuring object

Hello Shreya,

Looks like the tasks are failing when they try to get the Mapper class
(Sentiment_Analysis_Twitter_mapper.class) instantiated. Possibly something
you might be doing in its empty constructor? Hard to tell without looking at
the mapper class since that doesn't seem to be attached.
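For reference, a bare-bones old-API mapper skeleton that keeps the no-arg constructor empty and does its setup in configure() might look like the sketch below. The class name matches the one in the thread, but the body is only a hypothetical outline, not the actual class under discussion:

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.*;

public class Sentiment_Analysis_Twitter_mapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

    // No explicit constructor: the implicit no-arg constructor does nothing,
    // so instantiation via ReflectionUtils.newInstance() cannot fail here.

    @Override
    public void configure(JobConf job) {
        // ReflectionUtils.setJobConf() calls configure() right after
        // instantiation; an exception thrown here is what gets wrapped as
        // "Error in configuring object". Read distributed-cache files and
        // job parameters here, guarding against missing files.
    }

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, Text> output, Reporter reporter)
            throws IOException {
        // per-record work goes here
    }
}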

On Thu, Sep 15, 2011 at 2:37 PM, <[EMAIL PROTECTED]> wrote:

> Hi,
>
> Main method looks like below:
>
>     public static void main(String[] args) {
>
>         JobConf conf = new JobConf(Sentiment_Analysis_Twitter_Driver.class);
>         sLogger.setLevel(Level.ALL);
>         conf.setOutputKeyClass(Text.class);
>         conf.setOutputValueClass(Text.class);
>
>         try {
>             cacheStopWordList(conf);
>             cachePositiveWordList(conf);
>             cacheNegativeWordList(conf);
>             cacheSearchList(conf);
>         } catch (IOException e1) {
>             // TODO Auto-generated catch block
>             sLogger.error("########copying error");
>             e1.printStackTrace();
>         }
>
>         sLogger.debug("***copied All");
>
>         // TODO: specify input and output DIRECTORIES (not files)
>         FileInputFormat.setInputPaths(conf, new Path("/home/hadoop/sa/input"));
>         FileOutputFormat.setOutputPath(conf, new Path("/home/hadoop/sa/output"));
>
>         // TODO: specify a mapper
>         conf.setMapperClass(Sentiment_Analysis_Twitter_mapper.class);
>         sLogger.debug("***Set All");
>
>         try {
>             JobClient.runJob(conf); // Error comes here
>         } catch (Exception e) {
>             sLogger.error("########error calling run method" + e.getMessage());
>             e.printStackTrace();
>         }
>     }
>
>
> I am trying to run an MR job and get the following error:
>
> java.lang.RuntimeException: Error in configuring object
>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:431)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:371)

Harsh J