Vaibhav V Nirkhe 2013-10-09, 12:49
Abraham Elmahrek 2013-10-09, 17:04
Vaibhav V Nirkhe 2013-10-09, 17:12
Re: Issue in Sqoop Export from HDFS (HBase data) to MySQL
Abraham Elmahrek 2013-10-09, 17:29
Ah I see! Sorry, I didn't understand that!
HBase should dump data into a SequenceFile, I believe. You might have to
transform that into another format (perhaps with Hive or Pig?) and then
export the data.
Alternatively, you might be able to upgrade and use the HCatalog integration.
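[Editor's sketch of the Hive route suggested above, assuming the source HBase table is named `esr` with a column family `cf` holding `CUSTOMER_ID` and `MONTH` — all of these names are illustrative assumptions, not taken from the thread:]

```shell
# Expose the HBase table to Hive via the HBase storage handler, then
# dump the needed columns as delimited text on HDFS. Table, column
# family, and path names are assumptions; adjust to the real schema.
hive -e "
CREATE EXTERNAL TABLE esr_data (rowkey STRING, customer_id STRING, month STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:CUSTOMER_ID,cf:MONTH')
TBLPROPERTIES ('hbase.table.name' = 'esr');
INSERT OVERWRITE DIRECTORY '/user/hduser/esr_text'
SELECT customer_id, month FROM esr_data;
"

# The directory now holds plain text using Hive's default \001 field
# delimiter, which Sqoop must be told about explicitly.
sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P \
  --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH \
  --export-dir /user/hduser/esr_text \
  --input-fields-terminated-by '\001' -m 1
```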
On Wed, Oct 9, 2013 at 10:12 AM, Vaibhav V Nirkhe <
[EMAIL PROTECTED]> wrote:
> Hi Abraham,
> Thanks! I have done exactly that: I exported the data from HBase to
> HDFS, and the exported file is placed at */user/hduser/esr_data*, but I am
> still getting this exception. Please let me know what is wrong below.
> One thing I did observe is that the file exported from HBase seems to
> contain serialized objects instead of TSV data, but I don't know how to
> get .tsv format out of the HBase export.
> Thanks and regards,
> Vaibhav Nirkhe
> *From:* Abraham Elmahrek [[EMAIL PROTECTED]]
> *Sent:* Wednesday, October 09, 2013 10:34 PM
> *To:* [EMAIL PROTECTED]
> *Subject:* Re: Issue in Sqoop Export from HDFS(Hbase data) to MySql
> HBase exporting is currently not supported in Sqoop.
> What you can do is export the HBase data into HDFS first, then use Sqoop
> to transfer it into MySQL.
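[As a hedged illustration of that first step: HBase ships a bundled Export MapReduce job; the table name and output path below are assumptions for illustration. Note that its output is not delimited text, which matters for the Sqoop step:]

```shell
# Dump an HBase table to HDFS using HBase's bundled Export job.
# "esr" and the output path are illustrative assumptions.
hbase org.apache.hadoop.hbase.mapreduce.Export esr /user/hduser/esr_data

# Caveat: this writes a SequenceFile of serialized HBase objects
# (ImmutableBytesWritable keys, Result values), which Sqoop's
# text-based export cannot read directly.
```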
> On Wed, Oct 9, 2013 at 5:49 AM, Vaibhav V Nirkhe <
> [EMAIL PROTECTED]> wrote:
>> Hi,
>> I am using Sqoop 1.4.3 on Hadoop 1.2.1 and am trying to export
>> HBase data placed in HDFS to MySQL; however, I am getting the following
>> ClassCastException:
>> I am using the following command:
>> sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P
>> --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH --export-dir
>> /user/hduser/esr_data --verbose -m 1
>> I guess Sqoop is trying to read each record by its key and is not able
>> to cast the key:
>> org.apache.hadoop.hbase.io.ImmutableBytesWritable cannot be cast to
>> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:415)
>> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> I don't understand why the key is always expected to be LongWritable
>> here. Please suggest a fix as soon as possible.
>> Thanks in advance,