Export not working (Sqoop user mailing list)


Thread:
  Sarath - 2012-10-25, 13:53
  Jarek Jarcec Cecho - 2012-10-25, 16:05
  Sarath - 2012-10-26, 05:00
  Jarek Jarcec Cecho - 2012-10-26, 14:36

Re: Export not working
Sure, will check that part. But I'm still stuck making export work. I'm
facing issues exporting data to an Oracle database.

The issue is with the Timestamp fields. Sqoop-generated code uses
java.sql.Timestamp, which expects the date field value in a particular
format. But in our data on Hadoop, the format of the date field is not
guaranteed, as the data is placed in this location by multiple sources.

I even tried putting the Hadoop data in the format expected by Timestamp,
but it still complains with an IllegalArgumentException.
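
For concreteness: java.sql.Timestamp.valueOf() accepts only the JDBC
escape format, yyyy-[m]m-[d]d hh:mm:ss[.f...], and throws
IllegalArgumentException for everything else, which matches what I'm
seeing. A minimal sketch (the sample values are made up):

    import java.sql.Timestamp;

    public class TimestampFormatDemo {
        public static void main(String[] args) {
            // The JDBC escape format parses fine.
            System.out.println(Timestamp.valueOf("2012-10-25 13:53:00.0"));

            // Any other layout, e.g. dd-MM-yyyy, is rejected at parse time.
            try {
                Timestamp.valueOf("25-10-2012 13:53:00");
            } catch (IllegalArgumentException e) {
                System.out.println("rejected: " + e.getMessage());
            }
        }
    }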

Is there any workaround?

~Sarath.

On Friday 26 October 2012 08:06 PM, Jarek Jarcec Cecho wrote:
> Hi Sarath,
> I'm glad that Sqoop has started working for you.
>
> The internet advice about the parameter -Dhadoopversion=100 should indeed do the trick, so I'm not sure what went wrong with your build. Maybe some previously compiled classes were found and not recompiled; would you mind trying "ant clean package -Dhadoopversion=100" to see if that helps?
>
> Jarcec
>
> On Fri, Oct 26, 2012 at 10:30:06AM +0530, Sarath wrote:
>> Thanks Jarcec. I downloaded the binary artifact and it's working now.
>>
>> I actually built my previous Sqoop binary from source using the
>> option -Dhadoopversion=100 (since my Hadoop version was 1.0.3) after
>> reading some blogs on the net. Not sure why it was still giving me
>> that exception.
>>
>> Sarath.
>>
>> On Thursday 25 October 2012 09:35 PM, Jarek Jarcec Cecho wrote:
>>> Hi Sarath,
>>> this exception is very typical when someone mixes incompatible Hadoop binaries and applications (for example, Sqoop compiled for Hadoop 2 running on Hadoop 1). Would you mind checking that you've downloaded the appropriate binary distribution for your cluster? You have to use the binary artifact sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz for Hadoop 1.0.3.
>>>
>>> Jarcec
>>>
>>> On Thu, Oct 25, 2012 at 07:23:49PM +0530, Sarath wrote:
>>>> Hi,
>>>>
>>>> I'm new to Sqoop. I have Sqoop 1.4.2 with Hadoop 1.0.3. I have both
>>>> the Hadoop and Sqoop home environment variables set.
>>>>
>>>> I'm trying to export a file on HDFS to a table in an Oracle database.
>>>> I included all the required parameters inside a file and then ran:
>>>>
>>>>     sqoop --options-file export_params
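>>>>
>>>> (For reference, the options file follows Sqoop's documented layout:
>>>> one option or value per line, with lines starting with # treated as
>>>> comments. The contents below are made-up placeholders, not the
>>>> actual file:)
>>>>
>>>>     # export_params (hypothetical contents)
>>>>     export
>>>>     --connect
>>>>     jdbc:oracle:thin:@//dbhost:1521/ORCL
>>>>     --username
>>>>     scott
>>>>     --table
>>>>     MY_TABLE
>>>>     --export-dir
>>>>     /user/sarath/input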
>>>>
>>>> I got the below exception:
>>>>
>>>>     Exception in thread "main" java.lang.IncompatibleClassChangeError:
>>>>     Found class org.apache.hadoop.mapreduce.JobContext, but interface
>>>>     was expected
>>>>         at org.apache.sqoop.mapreduce.ExportOutputFormat.checkOutputSpecs(ExportOutputFormat.java:57)
>>>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:887)
>>>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>         ...
>>>>
>>>> Is there anything more to be configured?
>>>>
>>>> Regards,
>>>> Sarath.
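
To make the root cause concrete: in the Hadoop 1.x line,
org.apache.hadoop.mapreduce.JobContext is a class, while in the 0.23/2.x
line it became an interface, so bytecode compiled against one and run
against the other fails exactly as in the trace above. A minimal
diagnostic sketch (the class name JobContextCheck is made up; it assumes
the Hadoop jars are on the classpath):

    public class JobContextCheck {
        public static void main(String[] args) throws ClassNotFoundException {
            // Load whichever JobContext the runtime classpath actually provides.
            Class<?> jc = Class.forName("org.apache.hadoop.mapreduce.JobContext");
            // Hadoop 1.x ships it as a class; Hadoop 0.23/2.x as an interface.
            System.out.println(jc.getName() + " is "
                    + (jc.isInterface() ? "an interface (Hadoop 2.x-style jars)"
                                        : "a class (Hadoop 1.x-style jars)"));
        }
    }

Running this against the jars Sqoop sees at runtime shows which flavor is
on the classpath, and therefore which Sqoop binary artifact matches the
cluster.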

Follow-up in thread:
  Jarek Jarcec Cecho - 2012-10-26, 15:05