Pig, mail # user - using hdfs shell commands in a pig script


Re: using hdfs shell commands in a pig script
Panshul Whisper 2013-03-06, 21:16
Thanks, this worked and really helped.
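
For reference, here is roughly what the end of the script looks like now.
This is only a minimal sketch of the pattern; the relation name and paths
are placeholders:

    -- STORE writes one part-* file per reducer into the output directory
    STORE final_data INTO '/user/hadoop/output' USING PigStorage(',');

    -- fs runs a Hadoop FsShell command from within the script;
    -- -getmerge concatenates the part-* files into a single file on the
    -- local filesystem (not back into HDFS)
    fs -getmerge /user/hadoop/output /tmp/merged_output.csv;

Note that -getmerge writes its result to the local filesystem, so if the
merged file needs to live in HDFS you still have to copy it back (for
example with fs -copyFromLocal).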

Regards,
On Wed, Mar 6, 2013 at 3:58 PM, shashwat shriparv <[EMAIL PROTECTED]> wrote:

> Check out this link
>
> http://pig.apache.org/docs/r0.9.1/cmds.html
>
>
>
> ∞
> Shashwat Shriparv
>
>
>
> On Wed, Mar 6, 2013 at 8:23 PM, Harsha <[EMAIL PROTECTED]> wrote:
>
> > Hi Panshul,
> >         You can write "fs -getmerge source destination;" in your pig
> > script.
> > Thanks,
> > Harsha
> >
> >
> > On Wednesday, March 6, 2013 at 3:33 AM, Panshul Whisper wrote:
> >
> > > Hello,
> > >
> > > Is it possible to use hadoop fs commands in a pig script?
> > >
> > > What I want to do, exactly, is this: at the end of my pig script,
> > > after the store command has written its output, I want the pig script
> > > to merge the multiple output files it creates.
> > >
> > > So either I can try and execute the hadoop fs -getmerge command in the
> > > pig script, or is there some other way to do it in pig?
> > >
> > > I got this idea since grunt supports hadoop fs commands, so I thought
> > > maybe it is possible to somehow execute them from within a pig script
> > > file.
> > >
> > > Any suggestions are welcome.
> > >
> > > Thanking You,
> > >
> > > --
> > > Regards,
> > > Ouch Whisper
> > > 010101010101
> > >
> > >
> >
> >
> >
>

--
Regards,
Ouch Whisper
010101010101