HDFS >> mail # user >> ISSUE :Hadoop with HANA using sqoop


Re: ISSUE :Hadoop with HANA using sqoop
Hi Samir,

The query

"SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0"

is first executed by Sqoop to fetch the table metadata (column names and types). The WHERE 1=0 predicate guarantees it returns no rows, so only the result-set structure comes back.

The actual data fetch happens later, as individual queries issued by each map task; each one is a sub-query of the whole input query, bounded to that task's split of the data.
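The same two-step pattern can be illustrated with any SQL connection. The sketch below uses Python's sqlite3 in place of the real HANA JDBC connection, with a made-up hana_training table standing in for the source; it shows the WHERE 1=0 probe yielding column metadata but zero rows, then a bounded sub-query of the kind each map task would run over its own split:

```python
import sqlite3

# Hypothetical stand-in for the source table; Sqoop would go through
# the real JDBC connection to SAP HANA instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hana_training (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO hana_training VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

# Metadata probe: WHERE 1=0 matches nothing, but the cursor still
# describes the columns of the result set.
cur = conn.execute("SELECT t.* FROM hana_training AS t WHERE 1=0")
columns = [d[0] for d in cur.description]
print(columns)          # ['id', 'name'] -- structure only
print(cur.fetchall())   # [] -- zero rows transferred

# Data fetch: each map task runs the input query bounded to its own
# slice of the split column (rowid here, purely for illustration).
rows = conn.execute(
    "SELECT t.* FROM hana_training AS t WHERE rowid >= 1 AND rowid < 3"
).fetchall()
print(rows)             # [(1, 'a'), (2, 'b')] -- this task's split
```

With -m 1, as in the command below, there is only one task, so the single data query covers the entire table.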

Regards
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: Alexander Alten-Lorenz <[EMAIL PROTECTED]>
Date: Thu, 21 Feb 2013 07:58:13
To: [EMAIL PROTECTED]<[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: ISSUE :Hadoop with HANA using sqoop

Hey Samir,

Since you've already posted this to the CDH users list, please follow up there.

Cheers
 Alex

On Feb 21, 2013, at 7:49 AM, samir das mohapatra <[EMAIL PROTECTED]> wrote:

> Harsh,
>     I copied the whole log and pasted it here; it looks like it only shows "Caused by: com.sap".
> And one thing I did not get is why it is running "SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0". The WHERE 1=0 means no rows are returned, but in the database we do have records.
>
>
> Error:
>
> hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver   --table  hgopalan.hana_training  -m  1 --username hgopalan     --password Adobe_23  --target-dir  /input/training
> 13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
> 13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
> Note: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
> 13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of hgopalan.hana_training
> 13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
> 13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/20 22:38:06 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_0, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).