Flume, mail # user - Flume on CDH 4.5: java.lang.NoClassDefFoundError when starting agent


Re: Flume on CDH 4.5: java.lang.NoClassDefFoundError when starting agent
Brock Noland 2014-01-24, 16:13
I wonder if FLUME_HOME is defined...
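[Editor's note: the `flume-ng` wrapper builds its classpath from the jars under `$FLUME_HOME/lib`, so an unset or wrong `FLUME_HOME` is a plausible cause of the `NoClassDefFoundError` below. A minimal check might look like this; the parcel path is an assumption for a CDH parcel install, not taken from the thread.]

```shell
# Hypothetical quick check: flume-ng resolves its own jars relative to
# FLUME_HOME, so verify it is set and actually contains the Flume jars.
echo "FLUME_HOME=${FLUME_HOME:-<unset>}"

# Fall back to the usual CDH parcel location (assumed path) if unset.
FLUME_LIB="${FLUME_HOME:-/opt/cloudera/parcels/CDH/lib/flume-ng}/lib"

# flume-ng-core contains org.apache.flume.tools.GetJavaProperty;
# if this jar is missing from the directory, the class cannot load.
ls "$FLUME_LIB"/flume-ng-core-*.jar 2>/dev/null \
  || echo "flume-ng-core jar not found under $FLUME_LIB"
```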
On Thu, Jan 23, 2014 at 8:04 PM, ed <[EMAIL PROTECTED]> wrote:

> I've been using Flume with CDH 4.5.0 and it has been working without any
> problems.  Today, all of a sudden, I'm getting what I think is some sort of
> classpath error when I try to start my agent from the command line.
>  Running the default agent from Cloudera Manager seems to work fine.
>
> Info: Including Hadoop libraries found via (/usr/bin/hadoop) for HDFS
>> access
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/flume/tools/GetJavaProperty
>> Caused by: java.lang.ClassNotFoundException:
>> org.apache.flume.tools.GetJavaProperty
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>
>
>
> I don't recall changing anything with regard to classpaths.  I'm trying to
> run the agent directly on the master node (namenode, secondarynamenode,
> hdfs gateway, mapreduce gateway), but I'm running it from the command line
> (versus from Cloudera Manager).
>
> which hadoop
>
> /usr/bin/hadoop
>
>
> which java
>
> /usr/bin/java
>
>
> hadoop classpath
>
>
>> /etc/hadoop/conf:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/./:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/lib/*:/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop/libexec/../../hadoop-0.20-mapreduce/.//*
>
>
> Running normal hadoop HDFS commands like "hadoop fs -ls" works just fine
> without having to specify the HDFS host.  I tried redeploying all the
> client configs and even restarting all the services, to no avail.
>  Previously I had no issues running Flume from the command line, so I know I
> must have messed up a setting somewhere.
>
> Does anyone have any ideas as to what would be causing this issue?  Thank
> you!
>
> Best Regards,
>
> Ed
>

--
Apache MRUnit - Unit testing MapReduce - http://mrunit.apache.org
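
[Editor's note: because `flume-ng` is a plain shell script, tracing its execution shows exactly which classpath it assembles before the JVM starts, which can pinpoint where the Flume jars drop out. A hedged sketch; the agent name and config paths are placeholders, not values from this thread.]

```shell
# Trace the flume-ng wrapper to see how it builds the classpath.
# "a1" and the /etc/flume-ng/conf paths are illustrative placeholders.
bash -x /usr/bin/flume-ng agent \
  --name a1 \
  --conf /etc/flume-ng/conf \
  --conf-file /etc/flume-ng/conf/flume.conf 2>&1 \
  | grep -i classpath
```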