Sqoop >> mail # user >> sqoop-export with sequence files doesn't work.


Re: sqoop-export with sequence files doesn't work.
Ah I believe you're correct. Was this data imported with Sqoop? If so, does
the table you're exporting to differ from the table you imported from?
On Thu, Aug 15, 2013 at 11:38 PM, Deepak Konidena <[EMAIL PROTECTED]>wrote:

> Does sqoop-export support the --as-sequencefile option? I know sqoop-import does.
>
>
> -Deepak
>
>
>
> On Thu, Aug 15, 2013 at 11:34 PM, Abraham Elmahrek <[EMAIL PROTECTED]>wrote:
>
>> Hey There,
>>
>> I believe you're missing the --as-sequencefile directive!
>>
>> -Abe
>>
>>
>> On Thu, Aug 15, 2013 at 7:16 PM, Deepak Konidena <[EMAIL PROTECTED]>wrote:
>>
>>> Hi,
>>>
>>> I have a sequence file with both (key,value) as
>>> org.apache.hadoop.io.Text
>>>
>>> I am trying to export the data into a mysql table with (key,value)
>>> mapped to (varchar, blob), since the value is pretty big, and I get the
>>> following error:
>>>
>>> (command) - sqoop export -m "1" --connect
>>> "jdbc:mysql://<host>:3306/database" --username "sqoop" --password
>>> "sqooppwd" --table "tablename"  --export-dir "/path/to/sequencefile"
>>> --verbose
>>>
>>> java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast
>>> to org.apache.hadoop.io.LongWritable
>>>     at
>>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
>>>     at
>>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
>>>     at
>>> org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
>>>     at
>>> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:461)
>>>     at
>>> org.apache.hadoop.mapreduce.task.MapContextImpl.getCurrentKey(MapContextImpl.java:66)
>>>     at
>>> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.getCurrentKey(WrappedMapper.java:75)
>>>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
>>>     at
>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>
>>> The export works fine when I create a text file like so,
>>>
>>> <key,value1,value2,value3>
>>>
>>> and upload it to hdfs using -copyFromLocal
>>>
>>> But it's only with sequence files that it doesn't seem to work. Any
>>> thoughts?
>>>
>>> Thanks,
>>> Deepak
>>>
>>>
>>
>
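One way to sketch the text-file workaround Deepak describes, without hand-writing the intermediate file: `hdfs dfs -text` can decode a SequenceFile with Writable keys and values and print it as tab-separated `key<TAB>value` lines, which sqoop-export can then parse as plain text. Paths, table, and credentials below are hypothetical placeholders from the thread, not a verified recipe:

```shell
# Hypothetical paths throughout. 'hdfs dfs -text' decodes the SequenceFile
# (Text key and value here) and emits key<TAB>value lines; 'hdfs dfs -put -'
# reads that stream from stdin and writes it back into HDFS as plain text.
hdfs dfs -text /path/to/sequencefile/part-* \
  | hdfs dfs -put - /tmp/export-as-text/part-00000

# Export the plain-text copy instead of the SequenceFile, telling Sqoop
# that fields are tab-delimited so (key, value) map to the two columns.
sqoop export -m 1 \
  --connect "jdbc:mysql://<host>:3306/database" \
  --username sqoop --password sqooppwd \
  --table tablename \
  --export-dir /tmp/export-as-text \
  --input-fields-terminated-by '\t'
```

This sidesteps the ClassCastException in the trace above, which comes from the export job's record reader expecting LongWritable offsets from a text input rather than the SequenceFile's Text keys.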