Jeff Yuan 2013-03-20, 21:00
Cheolsoo Park 2013-03-21, 17:13
Rohini Palaniswamy 2013-03-22, 20:41
I just want to capture the stdout and stderr streams into files for
each job. I did some work with Hive in the past, and Hive lets you get
the stdout and stderr streams for each job. I thought that's what the
ExecJob interface provides, but I guess the concrete implementation
isn't there yet.
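[A rough sketch of the workaround being described: until ExecJob's getSTDOut/getSTDError are implemented, the client process can redirect System.out and System.err to per-job files around the batch execution. This only captures what the local JVM writes, not logs from remote Hadoop tasks. The PigServer call is commented out as an assumption; everything else is plain JDK.]

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PrintStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class CaptureStreams {

    // Redirect System.out/System.err to the given files while `body`
    // runs, then restore the original streams.
    static void captureTo(String outFile, String errFile, Runnable body)
            throws IOException {
        PrintStream oldOut = System.out;
        PrintStream oldErr = System.err;
        try (PrintStream out = new PrintStream(new FileOutputStream(outFile));
             PrintStream err = new PrintStream(new FileOutputStream(errFile))) {
            System.setOut(out);
            System.setErr(err);
            body.run();
        } finally {
            System.setOut(oldOut);
            System.setErr(oldErr);
        }
    }

    public static void main(String[] args) throws IOException {
        captureTo("job1.out", "job1.err", () -> {
            // In real use this would wrap the Pig call, e.g.
            // pigServer.executeBatch();  // hypothetical: needs a configured PigServer
            System.out.println("stdout from job");
            System.err.println("stderr from job");
        });
        // The captured streams are now in job1.out / job1.err.
        System.out.println(Files.readAllLines(Paths.get("job1.out")).get(0));
    }
}
```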
I'll look into writing a log4j appender, thanks Rohini.
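[For the log4j route mentioned above, a custom appender class may not even be needed: a configuration fragment can route Pig's logger to a file. A sketch in log4j 1.x properties syntax — the appender name and file path are made up, and the logger category assumes Pig's classes log under org.apache.pig:]

```properties
# Route Pig's log output to a file (log4j 1.x syntax; names are examples)
log4j.logger.org.apache.pig=INFO, PIGFILE
log4j.appender.PIGFILE=org.apache.log4j.FileAppender
log4j.appender.PIGFILE.File=/tmp/pig-job.log
log4j.appender.PIGFILE.Append=false
log4j.appender.PIGFILE.layout=org.apache.log4j.PatternLayout
log4j.appender.PIGFILE.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
```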
On Fri, Mar 22, 2013 at 1:41 PM, Rohini Palaniswamy
<[EMAIL PROTECTED]> wrote:
> Not sure what exactly you are trying to capture, but one workaround I
> can think of is writing your own log4j appender and capturing the log
> output.
> On Thu, Mar 21, 2013 at 10:13 AM, Cheolsoo Park <[EMAIL PROTECTED]>wrote:
>> Hi Jeff,
>> You're right that those methods in HJob.java throw an
>> UnsupportedOperationException for now. I think they are simply not
>> implemented yet. We probably should implement them.
>> On Wed, Mar 20, 2013 at 2:00 PM, Jeff Yuan <[EMAIL PROTECTED]> wrote:
>> > Is there an interface to get the standard out and standard error
>> > streams for a pig execution? I'm using the Java interface and directly
>> > calling PigServer.executeBatch() for example and getting back
>> > List<ExecJob>. The ExecJob interface declares getSTDOut and
>> > getSTDError methods, but any call to them results in an
>> > UnsupportedOperationException.
>> > Thanks,
>> > Jeff