

RE: pig ship tar files
Thanks, but I'm still not quite clear on how to do it.

"...One way to work around this limitation is to tar all the dependencies into a tar file that accurately reflects the structure needed on the compute nodes, then have a wrapper for your script that un-tars the dependencies prior to execution.",

Can you show an example of how to do it?
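
To make my question concrete, here is my rough guess at how it would look; the file names (nltk.tar.gz, wrapper.sh, my_udf.py) are just placeholders I made up:

    -- Pig script: ship the tarball plus the wrapper and the real script
    DEFINE nltk_stream `wrapper.sh` SHIP('wrapper.sh', 'my_udf.py', 'nltk.tar.gz');

    raw       = LOAD 'input' AS (line:chararray);
    processed = STREAM raw THROUGH nltk_stream AS (result:chararray);
    STORE processed INTO 'output';

    #!/bin/bash
    # wrapper.sh: runs on the compute node; shipped files land in the
    # task's working directory, so un-tar the dependencies first
    tar -xzf nltk.tar.gz
    export PYTHONPATH=$PWD:$PYTHONPATH
    exec python my_udf.py   # reads stdin, writes stdout, as STREAM expects

Is that roughly the right approach, or am I missing something?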

Thanks.

-----Original Message-----
From: Alan Gates [mailto:[EMAIL PROTECTED]]
Sent: Thursday, December 20, 2012 10:57 AM
To: [EMAIL PROTECTED]
Subject: Re: pig ship tar files

See http://pig.apache.org/docs/r0.10.0/basic.html#define-udfs especially the section on SHIP.

Alan.

On Dec 20, 2012, at 10:01 AM, Danfeng Li wrote:

> I've read a lot about how Pig can ship a tar file and untar it before execution. However, I couldn't find any example. Can someone provide one?
>
> What I would like to do is ship a Python module, such as nltk, for my streaming job.
>
> Thanks.
>
> Dan
>
>