Accumulo >> mail # dev >> hadoop classpath causing an exception (sub-command not defined?)

Re: hadoop classpath causing an exception (sub-command not defined?)
It looks to me like the change of Nov 21, 2012 added the 'hadoop
classpath' call to the accumulo script.

  ACCUMULO-708 initial implementation of VFS class loader …
  git-svn-id: https://svn.apache.org/repos/asf/accumulo/trunk@1412398
  Dave Marion authored 23 days ago

Could the classpath sub-command be part of a newer version (>0.20.2) of hadoop?
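If that is the case, one way to keep the accumulo script working on older Hadoop releases would be to probe for the sub-command and fall back to building the classpath by hand. This is only a sketch, not the actual accumulo script; the fallback glob over `$HADOOP_HOME/lib` is an assumption about a typical 0.20.2 layout:

```shell
#!/bin/sh
# Sketch of a guarded 'hadoop classpath' call. Assumes HADOOP_HOME
# is set; the jar locations in the fallback branch are assumptions
# about the 0.20.2 tarball layout, not taken from the real script.
if "$HADOOP_HOME/bin/hadoop" classpath >/dev/null 2>&1; then
    # Newer Hadoop: let it report its own classpath.
    HADOOP_CLASSPATH=$("$HADOOP_HOME/bin/hadoop" classpath)
else
    # Older Hadoop (e.g. 0.20.2): assemble conf dir plus jars ourselves.
    HADOOP_CLASSPATH="$HADOOP_HOME/conf"
    for jar in "$HADOOP_HOME"/hadoop-*.jar "$HADOOP_HOME"/lib/*.jar; do
        [ -f "$jar" ] && HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$jar"
    done
fi
echo "$HADOOP_CLASSPATH"
```

The probe works because on 0.20.2 the bogus `classpath` argument makes the JVM exit non-zero, so the `if` falls through to the manual branch.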

On Fri, Dec 14, 2012 at 12:18 AM, John Vines <[EMAIL PROTECTED]> wrote:
> I didn't think hadoop had a classpath argument, just Accumulo.
> Sent from my phone, please pardon the typos and brevity.
> On Dec 13, 2012 10:43 PM, "David Medinets" <[EMAIL PROTECTED]> wrote:
>> I am at a loss to explain what I am seeing. I have installed Accumulo
>> many times without a hitch. But today, I am running into a problem
>> getting the hadoop classpath.
>> $ /usr/local/hadoop/bin/hadoop
>> Usage: hadoop [--config confdir] COMMAND
>> where COMMAND is one of:
>>   namenode -format     format the DFS filesystem
>>   secondarynamenode    run the DFS secondary namenode
>>   namenode             run the DFS namenode
>>   datanode             run a DFS datanode
>>   dfsadmin             run a DFS admin client
>>   mradmin              run a Map-Reduce admin client
>>   fsck                 run a DFS filesystem checking utility
>>   fs                   run a generic filesystem user client
>>   balancer             run a cluster balancing utility
>>   jobtracker           run the MapReduce job Tracker node
>>   pipes                run a Pipes job
>>   tasktracker          run a MapReduce task Tracker node
>>   job                  manipulate MapReduce jobs
>>   queue                get information regarding JobQueues
>>   version              print the version
>>   jar <jar>            run a jar file
>>   distcp <srcurl> <desturl> copy file or directories recursively
>>   archive -archiveName NAME <src>* <dest> create a hadoop archive
>>   daemonlog            get/set the log level for each daemon
>>  or
>>   CLASSNAME            run the class named CLASSNAME
>> Most commands print help when invoked w/o parameters.
>> I am using the following version of hadoop:
>> $ hadoop version
>> Hadoop 0.20.2
>> Subversion
>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> -r 911707
>> Compiled by chrisdo on Fri Feb 19 08:07:34 UTC 2010
>> Inside the accumulo script is the line:
>> HADOOP_CLASSPATH=`$HADOOP_HOME/bin/hadoop classpath`
>> This line results in the following exception:
>> $ $HADOOP_HOME/bin/hadoop classpath
>> Exception in thread "main" java.lang.NoClassDefFoundError: classpath
>> Caused by: java.lang.ClassNotFoundException: classpath
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> Could not find the main class: classpath. Program will exit.
>> Am I missing something basic? What?
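The stack trace is consistent with how the 0.20.x `bin/hadoop` launcher dispatches sub-commands: known names map to Hadoop classes, and anything unrecognized falls through as a fully qualified class name handed to `java`. A simplified sketch of that dispatch (not the actual script, and the class names shown are illustrative):

```shell
#!/bin/sh
# Simplified sketch of the 0.20.x bin/hadoop dispatch: known
# sub-commands map to Hadoop classes, anything else is treated as
# a class name. An unknown word like 'classpath' therefore reaches
# the JVM as a class to load, producing ClassNotFoundException.
hadoop_dispatch() {
    case "$1" in
        namenode) echo org.apache.hadoop.hdfs.server.namenode.NameNode ;;
        datanode) echo org.apache.hadoop.hdfs.server.datanode.DataNode ;;
        fs)       echo org.apache.hadoop.fs.FsShell ;;
        *)        echo "$1" ;;   # unknown sub-command becomes CLASSNAME
    esac
}
hadoop_dispatch classpath
```

So on 0.20.2 `hadoop classpath` is not rejected with a usage message; it is launched as `java ... classpath`, which is exactly the NoClassDefFoundError shown above.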