Keith Wiley 2012-08-01, 20:33
Dhruv 2012-08-01, 22:34
Keith Wiley 2012-08-01, 23:37
Jim Donofrio 2012-08-02, 00:57
It's just easier that way. I don't have to link in any hadoop libraries or bring in any other hadoop-related code; it keeps the two environments fundamentally separated. I suppose I could wrap hadoop into the exterior code, and I'll consider it, but I don't really like the idea — er, I do kinda like the idea of keeping my various worlds separate, and I don't want the program to be very dependent on hadoop. Simply removing the call that exec's it is a lot easier than gutting hadoop code and linked .jars.
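For reference, the exec-the-script approach described above looks roughly like the sketch below (the hadoop binary path and file arguments are placeholders, not from the thread). One caveat worth noting: `ProcessBuilder`, like `Runtime.exec`, has historically fork()ed the parent JVM on Linux, which is a common source of exactly the kind of sporadic, memory-dependent failures mentioned later in this thread when the parent heap is large.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class HadoopExec {
    // Build the command line for the external hadoop wrapper script.
    // The binary path ("/usr/local/hadoop/bin/hadoop" below) is a
    // placeholder; it depends on the local install.
    static List<String> buildCommand(String hadoopBin, String... fsArgs) {
        List<String> cmd = new ArrayList<>();
        cmd.add(hadoopBin);
        cmd.add("fs");
        cmd.addAll(Arrays.asList(fsArgs));
        return cmd;
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("/usr/local/hadoop/bin/hadoop",
                                        "-put", "local.txt", "/user/keith/");
        System.out.println(cmd);

        // Launching it (requires a real hadoop install, so shown commented):
        // Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        // try (java.io.BufferedReader r = new java.io.BufferedReader(
        //         new java.io.InputStreamReader(p.getInputStream()))) {
        //     String line;
        //     while ((line = r.readLine()) != null) System.out.println(line);
        // }
        // int rc = p.waitFor();
    }
}
```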
I'll take a look at it, maybe there's a way to do that with relative ease.
On Aug 1, 2012, at 17:57 , Jim Donofrio wrote:
> Why would you call the hadoop script? Why not just call the part of the hadoop shell API you are trying to use directly from Java?
> On 08/01/2012 07:37 PM, Keith Wiley wrote:
>> Hmmm, at first glance that does appear to be similar to my situation. I'll have to delve through it in detail to see if it squarely addresses (and fixes) my problem. Mine is sporadic, and I suspect it depends on the current memory situation (it isn't a deterministic, guaranteed failure). I am not sure whether that is true of the Stack Overflow question you referenced... but it is certainly worth reading over.
>> On Aug 1, 2012, at 15:34 , Dhruv wrote:
>>> Is this related?
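Jim's suggestion above — calling the Hadoop entry point in-process instead of exec'ing the wrapper script — can be sketched roughly as follows. The `hadoop fs <args>` command ultimately dispatches to `org.apache.hadoop.fs.FsShell`, so (assuming `hadoop-common` is on the classpath) the same operation can be invoked via `ToolRunner`. The argv translation below is plain Java; the Hadoop call itself is shown commented since it needs the Hadoop jars.

```java
import java.util.Arrays;

public class InProcessHadoop {
    // Translate a "hadoop fs ..." command line into the argv that
    // FsShell expects: everything after the "hadoop fs" prefix.
    static String[] fsShellArgs(String... hadoopCmd) {
        if (hadoopCmd.length < 2
                || !hadoopCmd[0].equals("hadoop")
                || !hadoopCmd[1].equals("fs")) {
            throw new IllegalArgumentException("expected: hadoop fs <args>");
        }
        return Arrays.copyOfRange(hadoopCmd, 2, hadoopCmd.length);
    }

    public static void main(String[] args) throws Exception {
        String[] argv = fsShellArgs("hadoop", "fs", "-ls", "/user/keith");
        System.out.println(Arrays.toString(argv));

        // With hadoop-common on the classpath, the in-process equivalent is:
        //   import org.apache.hadoop.conf.Configuration;
        //   import org.apache.hadoop.fs.FsShell;
        //   import org.apache.hadoop.util.ToolRunner;
        //   int rc = ToolRunner.run(new Configuration(), new FsShell(), argv);
    }
}
```

The tradeoff Keith describes still applies: this couples the program to the Hadoop jars at build time, whereas the exec approach keeps the dependency confined to a single removable call.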
Keith Wiley [EMAIL PROTECTED] keithwiley.com music.keithwiley.com
"Luminous beings are we, not this crude matter."