Sqoop user mailing list: sqoop-export with sequence files doesn't work


+ Deepak Konidena 2013-08-16, 02:16
+ Abraham Elmahrek 2013-08-16, 06:34
+ Deepak Konidena 2013-08-16, 06:38
+ Abraham Elmahrek 2013-08-16, 16:26
- Re: sqoop-export with sequence files doesn't work.
I've run into this problem as well. I ended up copying the table into a
non-SequenceFile table just so I could sqoop it out (something along the
lines of CREATE TABLE nonSeqTbl LIKE seqTbl; INSERT OVERWRITE TABLE nonSeqTbl
SELECT * FROM seqTbl;).
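
A minimal sketch of that staging-table workaround, assuming the data lives in a Hive table; the table names, warehouse path, and connection details below are illustrative. The CTAS makes the switch to a text format explicit, since CREATE TABLE ... LIKE would keep the SequenceFile storage format:

  # Copy the SequenceFile-backed table into a plain-text table (illustrative names)
  hive -e "CREATE TABLE nonSeqTbl STORED AS TEXTFILE AS SELECT * FROM seqTbl;"

  # Export the text copy; Hive text tables default to Ctrl-A (\001) field delimiters
  sqoop export \
    --connect "jdbc:mysql://<host>:3306/database" \
    --username sqoop --password sqooppwd \
    --table tablename \
    --export-dir /user/hive/warehouse/nonseqtbl \
    --input-fields-terminated-by '\001'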

Is there a plan to allow sqoop-exporting of sequence file tables?

Krishna
On 16 August 2013 17:26, Abraham Elmahrek <[EMAIL PROTECTED]> wrote:

> Ah I believe you're correct. Was this data imported with Sqoop? If so,
> does the table you're exporting to differ from the table you imported from?
>
>
> On Thu, Aug 15, 2013 at 11:38 PM, Deepak Konidena <[EMAIL PROTECTED]> wrote:
>
>> Does sqoop-export support the --as-sequencefile option? I know sqoop-import does.
>>
>>
>> -Deepak
>>
>>
>>
>> On Thu, Aug 15, 2013 at 11:34 PM, Abraham Elmahrek <[EMAIL PROTECTED]> wrote:
>>
>>> Hey There,
>>>
>>> I believe you're missing the --as-sequencefile directive!
>>>
>>> -Abe
>>>
>>>
>>> On Thu, Aug 15, 2013 at 7:16 PM, Deepak Konidena <[EMAIL PROTECTED]> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have a sequence file with both (key,value) as
>>>> org.apache.hadoop.io.Text
>>>>
>>>> I am trying to export the data into a MySQL table with (key,value)
>>>> mapped to (varchar, blob), since the value is pretty big, and I get the
>>>> following error:
>>>>
>>>> (command) - sqoop export -m "1" -connect
>>>> "jdbc:mysql://<host>:3306/database" --username "sqoop" --password
>>>> "sqooppwd" --table "tablename"  --export-dir "/path/to/sequencefile"
>>>> --verbose
>>>>
>>>> java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast
>>>> to org.apache.hadoop.io.LongWritable
>>>>     at
>>>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
>>>>     at
>>>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
>>>>     at
>>>> org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
>>>>     at
>>>> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:461)
>>>>     at
>>>> org.apache.hadoop.mapreduce.task.MapContextImpl.getCurrentKey(MapContextImpl.java:66)
>>>>     at
>>>> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.getCurrentKey(WrappedMapper.java:75)
>>>>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
>>>>     at
>>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>>
>>>> The export works fine when I create a text file like so,
>>>>
>>>> <key,value1,value2,value3>
>>>>
>>>> and upload it to HDFS using -copyFromLocal.
>>>>
>>>> But it's only with sequence files that it doesn't seem to work. Any
>>>> thoughts?
>>>>
>>>> Thanks,
>>>> Deepak
>>>>
>>>>
>>>
>>
>
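
For reference, a sketch of the text-file route described in the quoted message, assuming both key and value really are org.apache.hadoop.io.Text: hadoop fs -text can decode a SequenceFile into tab-separated key/value lines, so the text copy can be produced directly on HDFS instead of building it locally. The paths, connection details, and two-column mapping below are illustrative:

  # Decode the SequenceFile to tab-separated text and stage it in a new directory
  hadoop fs -text /path/to/sequencefile/part-* | hadoop fs -put - /path/to/textcopy/part-00000

  # Export the text copy, telling Sqoop the fields are tab-delimited
  sqoop export \
    --connect "jdbc:mysql://<host>:3306/database" \
    --username sqoop --password sqooppwd \
    --table tablename \
    --export-dir /path/to/textcopy \
    --input-fields-terminated-by '\t'
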
+ Venkat Ranganathan 2013-08-16, 17:14
+ Deepak Konidena 2013-08-16, 17:28
+ Venkat Ranganathan 2013-08-16, 18:04
+ Jarek Jarcec Cecho 2013-08-24, 23:07