I was actually planning to file a bug for this. I think Hive is supposed to remove the temporary Hadoop results from this directory, but it doesn't seem to do that anymore. For now I use a workaround that simply removes anything there older than x days, but that still means the output of really large jobs sits around for x days.
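The age-based workaround described above can be sketched as a cron-able shell snippet. Note the scratch path and the 7-day retention threshold below are assumptions for illustration; check the actual value of hive.exec.scratchdir in your hive-site.xml.

```shell
#!/bin/sh
# Hypothetical scratch dir location -- replace with your hive.exec.scratchdir value.
SCRATCH_DIR="/tmp/hive-$(whoami)"
DAYS=7   # retention threshold in days; tune to how long your jobs run

# Delete scratch entries not modified in the last $DAYS days.
# -mindepth 1 keeps the scratch dir itself; -depth removes contents before parents.
find "$SCRATCH_DIR" -mindepth 1 -depth -mtime +"$DAYS" -exec rm -rf {} \; 2>/dev/null
```

Running this from cron once a day bounds how long stale job results linger, though large intermediate files will still occupy space until they cross the age threshold.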
From: shangan [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, August 24, 2010 10:36 AM
Subject: scratch space grows too fast
What is the parameter "hive.exec.scratchdir" for? Does it store the job plan?
The space used by this directory is growing too fast, and I don't understand why files under it are not removed after the job finishes. Do they serve some other purpose?