Pig user mailing list: How can I load external files within jython UDF?


Young Ng 2012-12-09, 20:52
RE: How can I load external files within jython UDF?
If you ship the file explicitly, you can use this syntax from there. It will pack it with the job jar and make sure it is in the working directory wherever the job runs. Be careful of shipping very large files; if you find yourself shipping fixed, very large files, it is probably better to refactor your logic into multiple top-level Pig statements on data loaded from HDFS.
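
(For illustration only, here is a minimal sketch of one way to make a small file visible in each task's working directory so a relative-path open() works in the UDF: adding it to the Hadoop distributed cache from the Pig script. The HDFS path, symlink name, and UDF module/alias below (myudfs.py, myudfs.test) are made up for the example; this is one mechanism with the same effect as shipping, not necessarily the exact syntax referred to above.)

-- Sketch only: the HDFS path, symlink name, and UDF alias are hypothetical.
-- Adding the file to the distributed cache with a '#' symlink makes ./test.txt
-- appear in each task's working directory, so open('test.txt') in the UDF resolves.
set mapred.cache.files 'hdfs:///user/hduser/test.txt#test.txt';
set mapred.create.symlink 'yes';

REGISTER 'myudfs.py' USING jython AS myudfs;

A = LOAD 'input' AS (line:chararray);
B = FOREACH A GENERATE myudfs.test();   -- the UDF reads test.txt by relative path
STORE B INTO 'output';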
________________________________
From: Young Ng
Sent: 12/9/2012 12:53 PM
To: [EMAIL PROTECTED]
Subject: How can I load external files within jython UDF?

Hi,

I am trying to load some external resources within my Jython UDF functions, e.g.:

@outputSchema(....)
def test():
    f = open('test.txt')
    text = f.read()
    f.close()
    return text

I have placed 'test.txt' in both the working folder and on HDFS, and I got the following error:
   IOError: (2, 'No such file or directory', 'test.txt')

I have also tried printing out Jython's working path with os.getcwd(); below is what I got:
  /home/hduser/tmp/mapred/local/taskTracker/hduser/jobcache/job_201212080111_0007/attempt_201212080111_0007_m_000000_0/work
  ....

I suspect that I can use an absolute path within the UDF, but how can I transfer the external resources to
the other Hadoop datanodes?
Thanks,
Young Wu

Young Ng 2012-12-09, 23:18