Sqoop, mail # user - Exception of import-tool with Sqoop 1.4.3 and Hadoop 1.1.1


Re: Exception of import-tool with Sqoop 1.4.3 and Hadoop 1.1.1
Jarek Jarcec Cecho 2013-07-12, 15:27
Hi Sam,
The exception java.lang.IncompatibleClassChangeError: Found class but interface was expected is very common when code compiled against Hadoop 1 is run on Hadoop 2, or vice versa. You seem to be correctly using -Dhadoopversion=100 to build Sqoop against Hadoop 1; however, the ant target tar won't recompile the sources if they are already compiled (even if they were compiled against the wrong Hadoop version). I would therefore suggest running the following command instead:

  ant clean tar -Dhadoopversion=100

Jarcec
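The class-versus-interface mismatch behind this error can be illustrated with a small, self-contained sketch. In Hadoop 1, org.apache.hadoop.mapreduce.JobContext is a class, while in Hadoop 2 it became an interface, so bytecode compiled against one shape fails to link against the other. The nested types below are illustrative stand-ins, not real Hadoop classes:

```java
// Illustrative stand-ins only: in Hadoop 1, JobContext is a class; in
// Hadoop 2 it is an interface. Bytecode compiled against one shape fails
// to link against the other with IncompatibleClassChangeError.
public class InterfaceCheck {
    public static class JobContextV1 {}   // Hadoop 1 shape: a class
    public interface JobContextV2 {}      // Hadoop 2 shape: an interface

    // Report the "shape" of a type the same way the JVM sees it at link time.
    public static String shape(Class<?> c) {
        return c.isInterface() ? "interface" : "class";
    }

    public static void main(String[] args) {
        System.out.println("Hadoop 1 JobContext stand-in: " + shape(JobContextV1.class));
        System.out.println("Hadoop 2 JobContext stand-in: " + shape(JobContextV2.class));
    }
}
```

Running the same isInterface() check via Class.forName("org.apache.hadoop.mapreduce.JobContext") on a live classpath would tell you which Hadoop generation the jars you are actually loading come from.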

On Fri, Jul 12, 2013 at 03:31:59PM +0800, sam liu wrote:
> Hi Experts,
>
> I built the sqoop project to generate sqoop-1.4.3.bin__hadoop-1.0.0.tar.gz
> using the command 'ant -Dhadoopversion=100 tar'. Before that, I cleaned up
> my Ivy repository.
>
> However, using the generated sqoop-1.4.4 project, I still encountered the
> IncompatibleClassChangeError below. My command was 'sqoop import
> --connect jdbc:db2://hostname:50000/SAMPLE --table DB2ADMIN.DB2TEST_TBL
> --username user --password passwrd --target-dir /tmp/DB2TEST_TBL --split-by id'.
>
> Any comments? Thanks!
>
> 13/07/11 23:17:31 INFO mapred.JobClient: Cleaning up the staging area hdfs://127.0.0.1:9010/home/temp/hadoop/mapred/staging/root/.staging/job_201307112228_0013
> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected
>         at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:53)
>         at com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
>         at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:121)
>         at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>         at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
>         at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:545)
>         at org.apache.sqoop.manager.Db2Manager.importTable(Db2Manager.java:64)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>
>
>
>
> --
>
> Sam Liu