MapReduce user mailing list >> Re: map phase does not read intermediate results with SequenceFileInputFormat


Dieter De Witte 2013-10-25, 10:16
Re: map phase does not read intermediate results with SequenceFileInputFormat
Yes, thanks for the reply.
On Fri, Oct 25, 2013 at 1:46 PM, Dieter De Witte <[EMAIL PROTECTED]> wrote:

> The question is also on Stack Overflow; the problem, I think, is that she divides by
> zero in the second mapper. (The logs show that both jobs have a data
> flow.)
>
>
> 2013/10/25 Robin East <[EMAIL PROTECTED]>
>
>> Hi
>>
>> Are you sure job1 created output where you expected it and in the format
>> you expect? Have you tested job2 on its own with some hand-crafted test
>> data? Is the output from job1 consistent with your hand-crafted test data
>> for job2?
>>
>> regards
>> On 25 Oct 2013, at 06:46, Anseh Danesh <[EMAIL PROTECTED]> wrote:
>>
>> Hi all.
>>
>> I have a MapReduce program with two jobs. The second job's key and value
>> come from the first job's output, but I think the second map does not get the
>> result from the first job. In other words, my second job does not read
>> the output of my first job. What should I do?
>>
>> here is the code:
>>
>> public class dewpoint extends Configured implements Tool
>> {
>>   private static final Logger logger = LoggerFactory.getLogger(dewpoint.class);
>>
>> static final String KEYSPACE = "weather";
>> static final String COLUMN_FAMILY = "user";
>> private static final String OUTPUT_PATH1 = "/tmp/intermediate1";
>> private static final String OUTPUT_PATH2 = "/tmp/intermediate2";
>> private static final String OUTPUT_PATH3 = "/tmp/intermediate3";
>> private static final String INPUT_PATH1 = "/tmp/intermediate1";
>>
>> public static void main(String[] args) throws Exception
>> {
>>
>>     ToolRunner.run(new Configuration(), new dewpoint(), args);
>>     System.exit(0);
>> }
>>
>> ///////////////////////////////////////////////////////////
>>
>> public static class dpmap1 extends Mapper<Map<String, ByteBuffer>, Map<FloatWritable, ByteBuffer>, Text, DoubleWritable>
>> {
>>     DoubleWritable val1 = new DoubleWritable();
>>     Text word = new Text();
>>     String date;
>>     float temp;
>>     public void map(Map<String, ByteBuffer> keys, Map<FloatWritable, ByteBuffer> columns, Context context) throws IOException, InterruptedException
>>     {
>>
>>          for (Entry<String, ByteBuffer> key : keys.entrySet())
>>          {
>>              //System.out.println(key.getKey());
>>              if (!"date".equals(key.getKey()))
>>                  continue;
>>              date = ByteBufferUtil.string(key.getValue());
>>              word.set(date);
>>          }
>>
>>
>>         for (Entry<FloatWritable, ByteBuffer> column : columns.entrySet())
>>         {
>>             if (!"temprature".equals(column.getKey()))
>>                 continue;
>>             temp = ByteBufferUtil.toFloat(column.getValue());
>>             val1.set(temp);
>>             //System.out.println(temp);
>>        }
>>         context.write(word, val1);
>>     }
>> }
>>
>> ///////////////////////////////////////////////////////////
>>
>> public static class dpred1 extends Reducer<Text, DoubleWritable, Text, DoubleWritable>
>> {
>>    public void reduce(Text key, Iterable<DoubleWritable> values, Context context) throws IOException, InterruptedException
>>     {
>>         double beta = 17.62;
>>         double landa = 243.12;
>>         DoubleWritable result1 = new DoubleWritable();
>>         DoubleWritable result2 = new DoubleWritable();
>>          for (DoubleWritable val : values){
>>          //  System.out.println(val.get());
>>            beta *= val.get();
>>            landa+=val.get();
>>            }
>>          result1.set(beta);
>>          result2.set(landa);
>>
>>          context.write(key, result1);
>>          context.write(key, result2);
>>      }
>> }
>> ///////////////////////////////////////////////////////////
>>
>> public static class dpmap2 extends Mapper <Text, DoubleWritable, Text, DoubleWritable>{
>>
>>     Text key2 = new Text();
>>     double temp1, temp2 =0;
>>
>>     public void map(Text key, Iterable<DoubleWritable> values, Context context) throws IOException, InterruptedException {