Hive, mail # user - Exception when hive submits M/R jobs


Sam William 2012-01-31, 19:50
Re: Exception when hive submits M/R jobs
Sam William 2012-02-01, 22:52
I have resolved this, so I'll share what the issue was.
I had set HIVE_AUX_JARS_PATH in my hive-env.sh as

  HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar

Because HIVE_AUX_JARS_PATH started out empty, the resulting list began with a comma, and that empty first entry was causing the exception.
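For illustration, here is a minimal shell sketch (assuming HIVE_HOME=/opt/hive, which is just a placeholder, and HIVE_AUX_JARS_PATH unset) showing the stray leading comma the original line produces:

  HIVE_HOME=/opt/hive
  unset HIVE_AUX_JARS_PATH
  echo "$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar"
  # prints ",/opt/hive/lib/jar1.jar" -- the empty entry before the comma
  # is what ends up in new Path(""), hence the IllegalArgumentException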

The following fix made it work

# Only prepend the existing value when it is non-empty, so the jar list
# never starts with a stray comma.
if [ -z "$HIVE_AUX_JARS_PATH" ]; then
  HIVE_AUX_JARS_PATH=$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
else
  HIVE_AUX_JARS_PATH=$HIVE_AUX_JARS_PATH,$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
fi
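An equivalent, more compact form (just a sketch using standard ${VAR:+...} shell parameter expansion; same jar list) avoids the if/else:

  AUX_JARS=$HIVE_HOME/lib/jar1.jar,$HIVE_HOME/lib/jar2.jar,$HIVE_HOME/lib/jar3.jar
  # Prepend the existing value plus a comma only when it is already non-empty.
  HIVE_AUX_JARS_PATH=${HIVE_AUX_JARS_PATH:+$HIVE_AUX_JARS_PATH,}$AUX_JARS
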
Thanks,
Sam
On Jan 31, 2012, at 11:50 AM, Sam William wrote:

>
> I have a new Hive installation. I'm able to create tables and do select * queries from them. But as soon as I try to execute a query that would involve a Hadoop M/R job, I get this exception:
>
>
>
> java.lang.IllegalArgumentException: Can not create a Path from an empty string
>        at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
>        at org.apache.hadoop.fs.Path.<init>(Path.java:90)
>        at org.apache.hadoop.fs.Path.<init>(Path.java:50)
>        at org.apache.hadoop.mapred.JobClient.copyRemoteFiles(JobClient.java:608)
>        at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:713)
>        at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:637)
>        at org.apache.hadoop.mapred.JobClient.access$300(JobClient.java:170)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:848)
>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:396)
>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
>        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
>
>
>
> The table is pretty simple. It is an external table on HDFS and does not have any partitions. Any idea why this could be happening?
>
>
>
> Thanks,
> Sam William
> [EMAIL PROTECTED]
>
>
>

Sam William
[EMAIL PROTECTED]