Pig >> mail # user >> Failure to run pig jobs using HbaseStorage in Oozie


Thread:
  Praveen Bysani      2013-03-14, 09:28
  Rohini Palaniswamy  2013-03-14, 20:43
  Praveen Bysani      2013-03-15, 04:19
  Rohini Palaniswamy  2013-03-15, 15:08
  Praveen Bysani      2013-03-19, 04:16
Re: Failure to run pig jobs using HbaseStorage in Oozie
The Oozie pig launcher log cannot have an empty stdout. Can you rerun your
Oozie workflow and check what the stack trace is in the pig launcher
stdout/stderr log?

Regards,
Rohini
On Mon, Mar 18, 2013 at 9:16 PM, Praveen Bysani <[EMAIL PROTECTED]> wrote:

> Hi,
>
> When I checked in the JobTracker UI, the job is in the retired section and I
> cannot retrieve any log:
>
> Problem accessing /jobdetailshistory.jsp. Reason:
>
>     File
> /var/log/hadoop-0.20-mapreduce/history/done/server.epicoders.com_1362996434042_/2013/03/14/000000/job_201303111007_0042_1363246172007_hadoopuser_PigLatin%3Ahbasetable.pig
> does not exist
>
> Caused by:
>
> java.io.FileNotFoundException: File
>
> /var/log/hadoop-0.20-mapreduce/history/done/server.epicoders.com_1362996434042_/2013/03/14/000000/job_201303111007_0042_1363246172007_hadoopuser_PigLatin%3Ahbasetable.pig
> does not exist
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:468)
>
>
> But when I checked the logs manually at
> /var/log/hadoop-mapreduce.0.2/userlogs/<job_dir> for a similar job, the
> stderr and stdout are empty and the syslog has no exceptions/errors.
>
> On 15 March 2013 23:08, Rohini Palaniswamy <[EMAIL PROTECTED]>
> wrote:
>
> > PIG-3206 fixed an issue when the hbase cluster was secure. You must be
> > facing a different issue. We need to see the stack trace in the hadoop job
> > log for the actual error. Click on your workflow in the Oozie UI and it
> > will show the tracker URL of the actual hadoop job. Click on it and look
> > at the syslog of the map task.
> > On Mar 14, 2013 9:20 PM, "Praveen Bysani" <[EMAIL PROTECTED]> wrote:
> >
> > > Hi,
> > >
> > > Following is the configuration from my core-site and hbase-site.xml,
> > >
> > >   <property>
> > >     <name>hadoop.security.authentication</name>
> > >     <value>simple</value>
> > >   </property>
> > >   <property>
> > >     <name>hadoop.rpc.protection</name>
> > >     <value>authentication</value>
> > >   </property>
> > >   <property>
> > >     <name>hadoop.security.auth_to_local</name>
> > >     <value>DEFAULT</value>
> > >   </property>
> > >
> > > So I guess I may not be using a secure hadoop/hbase. I am not sure what
> > > you meant by the log of the pig launcher job of hadoop Oozie. Do you mean
> > > the log in the JobTracker for this job id?
> > >
> > >
> > > On 15 March 2013 04:43, Rohini Palaniswamy <[EMAIL PROTECTED]>
> > > wrote:
> > >
> > > > Hi Praveen,
> > > >    Are you running a secure cluster - secure hadoop and hbase? Can you
> > > > check what the stack trace is in the pig launcher job log of Hadoop
> > > > Oozie?
> > > >
> > > > Regards,
> > > > Rohini
> > > >
> > > >
> > > > On Thu, Mar 14, 2013 at 2:28 AM, Praveen Bysani <[EMAIL PROTECTED]> wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > I am trying to run a simple pig script that uses the HBaseStorage
> > > > > class to load data from an hbase table. The pig script runs perfectly
> > > > > fine when run standalone in mapreduce mode. But when I submit it as an
> > > > > action in an oozie workflow, the job always fails. The oozie job log
> > > > > for that workflow gives the following errors, which are not very
> > > > > useful:
> > > > >
> > > > > JOB[0000004-130312101540251-oozie-oozi-W]
> > > > > ACTION[0000004-130312101540251-oozie-oozi-W@pig-node] action completed,
> > > > > external ID [job_201303111007_0043]
> > > > > 2013-03-14 08:41:22,658 WARN
> > > > > org.apache.oozie.action.hadoop.PigActionExecutor: USER[hadoopuser]
> > > > > GROUP[-] TOKEN[] APP[pig-wf] JOB[0000004-130312101540251-oozie-oozi-W]
> > > > > ACTION[0000004-130312101540251-oozie-oozi-W@pig-node] Launcher ERROR,
> > > > > reason: Main class [org.apache.oozie.action.hadoop.PigMain], exit code [2]
> > > > > 2013-03-14 08:41:22,833 INFO
> > > > > org.apache.oozie.command.wf.ActionEndXCommand: USER[hadoopuser]
> > > > > GROUP[-] TOKEN[] APP[pig-wf]
> > > > > JOB[0000004-130312101540251-oozie-oozi-W]
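
For reference, a minimal sketch of the kind of script under discussion: a Pig
Latin load from an HBase table via HBaseStorage. The table name 'mytable', the
column family 'cf', and the column names are hypothetical placeholders, not
taken from this thread.

    -- Minimal sketch (hypothetical table and columns): load the row key plus
    -- two columns from an HBase table and print them.
    events = LOAD 'hbase://mytable'
             USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
                 'cf:col1 cf:col2', '-loadKey true')
             AS (rowkey:chararray, col1:chararray, col2:chararray);
    DUMP events;

Run directly in mapreduce mode such a script can succeed, while the same script
submitted as an Oozie pig action fails with the launcher error shown above,
which is the situation described in this thread.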