Then, I suggest:
1) Increase the per-user session limit on your RAC to be equal to or greater than your number of map tasks
2) Suppose you want 100 map tasks: write a little Java program to
generate 100 files, each containing a single line (the content can be anything,
e.g. the task id)
3) Define a STEP (rows per task) shared by all maps; each map task then selects STEP records from Oracle
4) Put all the generated files into HDFS
5) Create a map-only job (no reduce here). Each map task reads its task id
from its input file and executes a range query such as "SELECT * FROM (SELECT
t.*, ROWNUM rn FROM your_table t) WHERE rn > (maptask_id - 1) * STEP AND rn <=
maptask_id * STEP" (note that a plain "WHERE ROWNUM >= n" never matches for
n > 1 in Oracle, so ROWNUM has to be materialised in a subquery first).
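The steps above could be sketched roughly like this (NUM_TASKS, STEP, and the table name "mytable" are illustrative assumptions, not taken from the thread):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SplitSeed {
    static final int NUM_TASKS = 100;        // one seed file per map task
    static final long STEP = 50_000_000L;    // 5 billion rows / 100 tasks

    // Step 2: generate one single-line seed file per map task; here the
    // line carries the task id so each mapper knows which slice to pull.
    static void writeSeedFiles(Path dir) throws IOException {
        Files.createDirectories(dir);
        for (int i = 1; i <= NUM_TASKS; i++) {
            Files.write(dir.resolve("seed-" + i + ".txt"),
                        (i + "\n").getBytes());
        }
    }

    // Step 5: build the per-task range query. A plain "WHERE ROWNUM >= x"
    // never matches for x > 1 in Oracle, so ROWNUM is materialised as a
    // column ("rn") in a subquery and the range is applied to that.
    static String buildQuery(int taskId) {
        long lo = (taskId - 1) * STEP + 1;
        long hi = taskId * STEP;
        return "SELECT * FROM (SELECT t.*, ROWNUM rn FROM mytable t) "
             + "WHERE rn BETWEEN " + lo + " AND " + hi;
    }

    public static void main(String[] args) throws IOException {
        writeSeedFiles(Paths.get("seeds"));
        System.out.println(buildQuery(1));
        System.out.println(buildQuery(2));
    }
}
```

Each mapper would then read the task id from its one-line input split, run buildQuery(taskId) over JDBC, and write the rows out (e.g. as HFiles or Puts into HBase).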
On Fri, Dec 14, 2012 at 11:36 AM, Mehmet Simsek <[EMAIL PROTECTED]> wrote:
> Hello Azuryy,
> Oracle is in RAC. Because the SQL statement runs too long, the datanode
> cannot report its statistics to the namenode, so the namenode throws an
> exception like "cannot respond from data node in 600 sec..." in the Sqoop tool.
> Is there another solution to this problem?
> On 14 Ara 2012, at 05:11, Azuryy Yu <[EMAIL PROTECTED]> wrote:
> > Hello Mehmet,
> > What about your Oracle: a single node, or RAC?
> > On Thu, Dec 13, 2012 at 11:46 PM, Mehmet Simsek <[EMAIL PROTECTED]> wrote:
> >> Hi, I want to load 5 billion rows from an Oracle table into an HBase
> >> table. Which technique is best for this bulk loading?
> >> Thanks
> >> M.S