MapReduce >> mail # user >> Job History files location of 2.0.4


Re: Job History files location of 2.0.4
Thanks for letting us know the solution.

Regards,
Shahab
On Mon, Jun 10, 2013 at 3:40 PM, Boyu Zhang <[EMAIL PROTECTED]> wrote:

> I solved the problem: in my case, it was because I had not started the job
> history server daemon. After starting it, the history logs are generated in
> two places: the HADOOP_MAPRED_LOG_DIR directory, and the /tmp folder in hdfs.
>
> Hope this is helpful for whoever else has the same problem. By the way,
> the documentation for hadoop 2.0.x is out of date, which creates some
> trouble in figuring out where the conf/logs are stored and where to set
> them.
>
> Thanks,
> Boyu
>
>
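For anyone hitting the same issue, the daemon in question can be started with the stock Hadoop 2.x scripts. A minimal sketch, assuming a standard $HADOOP_HOME layout and default ports; verify the script path and property values against your own distribution:

```shell
# Start the MapReduce JobHistory server (Hadoop 2.x sbin layout assumed)
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver

# By default the server's RPC endpoint is port 10020
# (mapreduce.jobhistory.address in mapred-site.xml) and its
# web UI is port 19888 (mapreduce.jobhistory.webapp.address),
# where per-task-attempt timing can be browsed.
```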
> On Fri, Jun 7, 2013 at 2:39 PM, Boyu Zhang <[EMAIL PROTECTED]> wrote:
>
>> I used a directory that is local to every slave node: export
>> HADOOP_LOG_DIR="/scratch/$USER/$PBS_JOBID/hadoop-$USER/log".
>>
>> I did not change the "hadoop.job.history.user.location" property; I
>> thought that if I left it alone, the job history would be stored in hdfs
>> under the output/_logs dir.
>>
>> Then after the job completes, I copied back the logs to the server.
>>
>> Thanks a lot,
>> Boyu
>>
>>
>> On Fri, Jun 7, 2013 at 2:32 PM, Shahab Yunus <[EMAIL PROTECTED]> wrote:
>>
>>> What value do you have for hadoop.log.dir property?
>>>
>>>
>>> On Fri, Jun 7, 2013 at 5:20 PM, Boyu Zhang <[EMAIL PROTECTED]> wrote:
>>>
>>>> Hi Shahab,
>>>>
>>>>
>>>> How old were they?
>>>>>
>>>> They are new; I did the copy automatically in a script, right after the
>>>> job completed.
>>>>
>>>> I am assuming they were from the jobs run on the older version, right?
>>>>>
>>>> I ran the job using hadoop version 2.0.4, if this is what you mean.
>>>>
>>>>> Or are you looking for the logs of the new jobs that you are running
>>>>> after the upgrade?
>>>>>
>>>> I am looking for the new job's logs. I did a fresh install of hadoop
>>>> 2.0.4, then ran the job, then copied back the entire hdfs directory.
>>>>
>>>>
>>>>> What about the local file systems? Are the logs there still?
>>>>>
>>>> Where should I find the logs (job logs, not daemon logs) in the local
>>>> file systems?
>>>>
>>>> Thanks,
>>>> Boyu
>>>>
>>>>
>>>>>
>>>>> Regards,
>>>>> Shahab
>>>>>
>>>>>
>>>>> On Fri, Jun 7, 2013 at 4:56 PM, Boyu Zhang <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> Thanks Shahab,
>>>>>>
>>>>>> I saw the link, but it is not the case for me. I copied everything
>>>>>> from hdfs ($HADOOP_HOME/bin/hdfs dfs -copyToLocal / $local_dir), but
>>>>>> did not see the logs.
>>>>>>
>>>>>> Did it work for you?
>>>>>>
>>>>>> Thanks,
>>>>>> Boyu
>>>>>>
>>>>>>
>>>>>> On Fri, Jun 7, 2013 at 1:52 PM, Shahab Yunus <[EMAIL PROTECTED]> wrote:
>>>>>>
>>>>>>> See this:
>>>>>>>
>>>>>>> http://mail-archives.apache.org/mod_mbox/hadoop-common-user/201302.mbox/%[EMAIL PROTECTED]%3E
>>>>>>>
>>>>>>> Regards,
>>>>>>> Shahab
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Jun 7, 2013 at 4:33 PM, Boyu Zhang <[EMAIL PROTECTED]> wrote:
>>>>>>>
>>>>>>>> Dear All,
>>>>>>>>
>>>>>>>> I recently moved from Hadoop 0.20.2 to 2.0.4, and I am trying to
>>>>>>>> find the old job history files (they used to be in hdfs, under
>>>>>>>> output/_logs/history); they record detailed time information for
>>>>>>>> every task attempt.
>>>>>>>>
>>>>>>>> But now they are not on hdfs anymore; I copied the entire "/" from
>>>>>>>> hdfs to my local dir, but am not able to find this location.
>>>>>>>>
>>>>>>>> Could anyone give any advice on where the files are?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Boyu
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
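To summarize the thread: in Hadoop 2.x the per-job history (.jhist) files no longer land under output/_logs/history; once the JobHistory server is running, they are written to its "done" directories in HDFS. A sketch of where to look, assuming the default values of mapreduce.jobhistory.done-dir and mapreduce.jobhistory.intermediate-done-dir (which derive from the /tmp/hadoop-yarn/staging default of yarn.app.mapreduce.am.staging-dir); check mapred-site.xml if these were overridden:

```shell
# Finished jobs' .jhist files (default done-dir location):
hdfs dfs -ls -R /tmp/hadoop-yarn/staging/history/done

# Jobs not yet moved by the history server:
hdfs dfs -ls -R /tmp/hadoop-yarn/staging/history/done_intermediate

# Copy the history files back to the local filesystem,
# as the poster was doing with -copyToLocal:
hdfs dfs -copyToLocal /tmp/hadoop-yarn/staging/history/done ./job-history
```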