I am trying to load some external resources within my Jython UDF functions, e.g.:
f = open('test.txt')
text = f.read()
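For reference, here is a minimal self-contained version of what I am doing; the file name and contents are placeholders, and in the real job the file is not created by the script:

```python
import os

# placeholder: create the resource file locally so this snippet runs on its own;
# in the actual UDF the file is expected to already exist on the node
with open('test.txt', 'w') as f:
    f.write('hello from test.txt')

# this mirrors the read inside my UDF
with open('test.txt') as f:
    text = f.read()

# checking where the process is actually looking for relative paths
print(os.getcwd())
print(text)
```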
I have placed 'test.txt' in both the working folder and on HDFS, and I get the following error:
IOError: (2, 'No such file or directory', 'test.txt')
I have also tried to print out Jython's working directory with os.getcwd(); below is what I got:
I suspect that I could use an absolute path within the UDF, but how can I transfer the external resources to
the other Hadoop datanodes?