I assume you do not wish to use the DistributedCache (or an HDFS
location for the DistributedCache), which is the ideal way to ship
jars to your jobs.
You can place your jars onto the TaskTracker (TT) classpaths by
putting them at an arbitrary location such as /opt/jars, and editing
each TT's hadoop-env.sh to extend HADOOP_CLASSPATH with this extra
location. This still requires administrative configuration edits, plus
a service restart, each time you want to add a new jar or change a
jar. With the DistributedCache, neither is required.
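To sketch the hadoop-env.sh approach, the edit would look roughly like
this (the /opt/jars path and myextra.jar name are just examples, not
anything from your setup):

```shell
# In hadoop-env.sh on every TaskTracker node.
# Append the extra location to whatever HADOOP_CLASSPATH already holds;
# myextra.jar is a placeholder for your own library jar.
export HADOOP_CLASSPATH="${HADOOP_CLASSPATH}:/opt/jars/myextra.jar"
```

A TT restart is then needed to pick up the change. By contrast, if your
driver goes through ToolRunner/GenericOptionsParser, a per-job
-libjars option (which ships the jars via the DistributedCache) avoids
both the config edit and the restart, e.g.
hadoop jar myjob.jar MyDriver -libjars /opt/jars/myextra.jar <args>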
On Fri, Aug 9, 2013 at 6:59 AM, Sanjeev Verma <[EMAIL PROTECTED]> wrote:
> On 08/08/2013 09:23 PM, John Hancock wrote:
>> Where else might one put .jar files that a map/reduce job will need?
> Why do you need an alternative location? Is there a constraint on being able
> to place your library jars under $HADOOP_HOME/lib?