MapReduce, mail # user - Why doesn't it print anything into the part-00000 file


Re: Why doesn't it print anything into the part-00000 file
Steve Lewis 2012-01-31, 16:09
1. Run locally where the problem is small; until you can run a small problem
locally, do not move to the cluster.
2. Make sure the mapper writes something; set a breakpoint at the write call.
3. Make sure the reducer reads something.
4. Add write calls in the reducer's setup and cleanup methods until something
is written in between (a sketch follows below).
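
A minimal sketch of item 4, assuming the newer org.apache.hadoop.mapreduce API
(class and marker names are illustrative, not the code from the job in question):

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Reducer instrumented with marker writes in setup() and cleanup().
public class ProbeReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

  @Override
  protected void setup(Context context) throws IOException, InterruptedException {
    // If this marker never appears in part-00000, the reducer never ran at all.
    context.write(new Text("__reducer_setup__"), new IntWritable(0));
  }

  @Override
  protected void reduce(Text key, Iterable<IntWritable> values, Context context)
      throws IOException, InterruptedException {
    int sum = 0;
    for (IntWritable v : values) {
      sum += v.get();
    }
    // Anything written here, between the two markers, came from real map output.
    context.write(key, new IntWritable(sum));
  }

  @Override
  protected void cleanup(Context context) throws IOException, InterruptedException {
    // If only the two markers show up, the reducer received no records from the mappers.
    context.write(new Text("__reducer_cleanup__"), new IntWritable(0));
  }
}
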
On Jan 31, 2012 3:39 AM, "Luiz Antonio Falaguasta Barbosa" <
[EMAIL PROTECTED]> wrote:

> Ronald,
>
> It is the directory where _SUCCESS and part-00000 are generated, but they
> are empty. In Eclipse, I put the following in 'Run Configurations', in the
> Arguments tab:
>
> input.small out.ivory.small (the file containing the words and the
> directory where Hadoop should write the results, respectively)
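>
> (For illustration only: a minimal driver wiring those two arguments into a
> job with the old mapred API seen in the logs below; the class name is
> hypothetical.)
>
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.IntWritable;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapred.FileInputFormat;
> import org.apache.hadoop.mapred.FileOutputFormat;
> import org.apache.hadoop.mapred.JobClient;
> import org.apache.hadoop.mapred.JobConf;
>
> public class WordJobDriver {
>   public static void main(String[] args) throws Exception {
>     JobConf conf = new JobConf(WordJobDriver.class);
>     conf.setJobName("word-job");
>     conf.setOutputKeyClass(Text.class);
>     conf.setOutputValueClass(IntWritable.class);
>     // conf.setMapperClass(...) and conf.setReducerClass(...) go here.
>     FileInputFormat.setInputPaths(conf, new Path(args[0]));   // input.small
>     FileOutputFormat.setOutputPath(conf, new Path(args[1]));  // out.ivory.small
>     JobClient.runJob(conf);
>   }
> }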
>
> I'll appreciate any idea you could give me.
>
> Thanks so far, and thanks in advance!
>
> Regards,
>
> Luiz
>
> 2012/1/30 Ronald Petty <[EMAIL PROTECTED]>
>
>> Luiz,
>>
>> What is in this file? hdfs://10.22.1.2:54310/user/hadoop/out.ivory.small
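>>
>> For instance (assuming the hadoop client is on the PATH):
>>
>> hadoop fs -ls hdfs://10.22.1.2:54310/user/hadoop/out.ivory.small
>> hadoop fs -cat hdfs://10.22.1.2:54310/user/hadoop/out.ivory.small/part-00000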
>>
>> Kindest regards.
>>
>> Ron
>>
>> On Mon, Jan 30, 2012 at 7:24 AM, Luiz Antonio Falaguasta Barbosa <
>> [EMAIL PROTECTED]> wrote:
>>
>>> Hi Ronald,
>>>
>>> I didn't try to run it locally. I used a cluster at the university where
>>> I study.
>>>
>>> The Eclipse console shows the following:
>>>
>>> 2012-01-28 10:34:54.450 java[22689:1903] Unable to load realm info from
>>> SCDynamicStore
>>>
>>> 12/01/28 10:34:58 INFO mapred.FileInputFormat: Total input paths to
>>> process : 2
>>>
>>> 12/01/28 10:34:58 INFO mapred.JobClient: Running job: job_local_0001
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: numReduceTasks: 1
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: io.sort.mb = 100
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: data buffer = 79691776/99614720
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: record buffer = 262144/327680
>>>
>>> 12/01/28 10:34:59 INFO mapred.JobClient:  map 0% reduce 0%
>>>
>>> 12/01/28 10:35:05 INFO mapred.LocalJobRunner: hdfs://
>>> 10.22.1.2:54310/user/hadoop/input.small/CHANGES.txt:0+412413
>>>
>>> 12/01/28 10:35:27 INFO mapred.MapTask: Starting flush of map output
>>>
>>> 12/01/28 10:35:27 INFO mapred.Task: Task:attempt_local_0001_m_000000_0
>>> is done. And is in the process of commiting
>>>
>>> 12/01/28 10:35:29 INFO mapred.LocalJobRunner: hdfs://
>>> 10.22.1.2:54310/user/hadoop/input.small/CHANGES.txt:0+412413
>>>
>>> 12/01/28 10:35:29 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0'
>>> done.
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: numReduceTasks: 1
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: io.sort.mb = 100
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: data buffer = 79691776/99614720
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: record buffer = 262144/327680
>>>
>>> 12/01/28 10:35:29 INFO mapred.JobClient:  map 100% reduce 0%
>>>
>>> 12/01/28 10:35:30 INFO mapred.MapTask: Starting flush of map output
>>>
>>> 12/01/28 10:35:30 INFO mapred.Task: Task:attempt_local_0001_m_000001_0
>>> is done. And is in the process of commiting
>>>
>>> 12/01/28 10:35:32 INFO mapred.LocalJobRunner: hdfs://
>>> 10.22.1.2:54310/user/hadoop/input.small/CETEMPublico.small:0+3801
>>>
>>> 12/01/28 10:35:32 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0'
>>> done.
>>>
>>> 12/01/28 10:35:32 INFO mapred.LocalJobRunner:
>>>
>>> 12/01/28 10:35:32 INFO mapred.Merger: Merging 2 sorted segments
>>>
>>> 12/01/28 10:35:32 INFO mapred.Merger: Down to the last merge-pass, with
>>> 0 segments left of total size: 0 bytes
>>>
>>> 12/01/28 10:35:32 INFO mapred.LocalJobRunner:
>>>
>>> 12/01/28 10:35:33 INFO mapred.Task: Task:attempt_local_0001_r_000000_0
>>> is done. And is in the process of commiting
>>>
>>> 12/01/28 10:35:33 INFO mapred.LocalJobRunner:
>>>
>>> 12/01/28 10:35:33 INFO mapred.Task: Task attempt_local_0001_r_000000_0
>>> is allowed to commit now
>>>
>>> 12/01/28 10:35:35 INFO mapred.FileOutputCommitter: Saved output of task
>>> 'attempt_local_0001_r_000000_0' to hdfs://
>>> 10.22.1.2:54310/user/hadoop/out.ivory.small