Re: Error while exporting table data from hive to Oracle through Sqoop
abhijeet gaikwad 2013-03-06, 06:41
It seems you're violating a unique key constraint in the second task attempt,
which is expected if some data was already committed by Sqoop in the first
attempt. This is an issue with Sqoop!
From the exception in the first attempt, it looks like there is some issue
when the auto-generated class (BTTN_BKP.java in your case) tries to parse the
data. Can you validate the data being inserted (not the unique constraint,
but possibly invalid data in some column of the input files)?
If the data is huge, here is a way of narrowing down your search for the
problematic row: run Sqoop with just one mapper (-m 1). Make sure you have
one file that contains the whole data. If there are multiple files, the only
ways I can think of right now are to run Sqoop once per file or to merge the
data into one file. I am not sure if this info is enough for you to control
the number of mappers, but the motive here is to make sure only one map task
is created for the job.
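The merge step above can be sketched like this. This is only an illustration
with made-up local paths and sample rows; on HDFS you would likely use
`hadoop fs -getmerge <dir> <local-file>` to achieve the same thing:

```shell
# Hypothetical sketch: collapse multiple part files into one before exporting.
# Paths and sample rows below are placeholders, not from the original thread.
mkdir -p /tmp/bttn_parts
printf '1\0012013\n' > /tmp/bttn_parts/part-00000   # sample rows, \001-delimited
printf '2\0012014\n' > /tmp/bttn_parts/part-00001
# Concatenate all part files into a single input file:
cat /tmp/bttn_parts/part-* > /tmp/bttn_all.txt
wc -l < /tmp/bttn_all.txt
```

With one merged file and -m 1, only a single map task touches the table, so
the failure point is easier to locate.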
After running Sqoop and getting that exception, look at the inserted data in
the table; from it you can figure out which was the next batch that Sqoop was
trying to insert!
Try and find which is the erroneous row - usually a transaction would contain
10,000 rows (rows per insert statement (100) * no. of insert statements per
transaction (100)) before a commit is fired. So, check the next 10,000 rows!
If you want to narrow this down further, set this in your command line:
$ sqoop export -Dsqoop.export.records.per.statement=1
-Dsqoop.export.statements.per.transaction=1 --connect ...
This will make sure a commit is fired after the insertion of every row. I
haven't tried this with Oracle, but I was able to set the batch size using
these options.
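As a quick sanity check on the batch arithmetic above (assuming the defaults
of 100 rows per statement and 100 statements per transaction):

```shell
# Rows committed per transaction with the assumed defaults:
# 100 rows/statement * 100 statements/transaction
echo $((100 * 100))    # prints 10000
```

So with the defaults, a failed transaction can leave you scanning up to
10,000 rows; setting both properties to 1 shrinks that window to a single row.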
Let us hope this helps you find the invalid data values, if any! If there is
no invalid data, I would suggest continuing this discussion on the sqoop
user/sqoop dev mailing lists; you're still posting via the hive user list.
On Wed, Mar 6, 2013 at 11:14 AM, Ajit Kumar Shreevastava <
[EMAIL PROTECTED]> wrote:
> Hi Abhijeet,
> Data is fine. Initially, the map-reduce job ran to 48% and then failed. After that, it tried to load the same data again, and that retry caused the unique constraint error.
> Ajit Kumar Shreevastava
> abhijeet gaikwad <[EMAIL PROTECTED]> wrote:
> + sqoop user
> The answer is in your exception! Check your data, you're hitting a unique key
> On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <
> [EMAIL PROTECTED]> wrote:
> Hi All,
> I am facing the following issue while exporting a table from Hive to Oracle.
> Importing tables from Oracle to Hive and HDFS works fine. Please let me
> know where I am going wrong. I am pasting my screen output here.
> [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER --table
> BTTN_BKP --export-dir /home/hadoop/user/hive/warehouse/bttn -P --verbose
> -m 1 --input-fields-terminated-by '\001'
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> 13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> Enter password:
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory:
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> 13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme:
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new
> connection cache.
> 13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> 13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation