

Kyle B 2013-05-03, 19:11
Jarek Jarcec Cecho 2013-05-03, 21:41
Kyle B 2013-05-03, 21:52
Re: Sqoop 2 Import
Hi Kyle,
I'm having trouble finding that precise version. It does not seem to be available on the Apache archive [1], and I did not find any branch or tag for it on the GitHub mirror [2]. Would you mind sharing with us where you downloaded it?

Jarcec

Links:
1: http://archive.apache.org/dist/hadoop/common/
2: https://github.com/apache/hadoop-common
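
A quick way to pin down the exact build string, assuming the hadoop command on the PATH belongs to the cluster in question, is Hadoop's own version subcommand:

$ hadoop version
# prints the release string (for example "Hadoop 2.0.2-alpha") together with
# the source revision and build details of the binaries actually installed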

On Fri, May 03, 2013 at 02:52:00PM -0700, Kyle B wrote:
> Hi Jarcec,
>
> I'm currently running Hadoop 2.0.2.1-alpha. So it should have that fix in
> theory.
>
> -Kyle
>
> On Fri, May 3, 2013 at 2:41 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
>
> > Hi Kyle,
> > what Hadoop version are you using?
> >
> > Based on the exception it seems that you're using YARN, which suggests
> > Hadoop 2. I hit a similar NullPointerException during release candidate
> > testing on Hadoop 2.0.0-alpha. There is a known Hadoop bug, HADOOP-8726 [1],
> > that causes this. It was fixed in 2.0.2-alpha, so I'm wondering if
> > you're using 2.0.0 or 2.0.1 by any chance?
> >
> > Jarcec
> >
> > Links:
> > 1: https://issues.apache.org/jira/browse/HADOOP-8726
> >
> > On Fri, May 03, 2013 at 12:11:05PM -0700, Kyle B wrote:
> > > Hello,
> > >
> > > I am migrating from Sqoop 1.4.3 to Sqoop 2, and am kind of stumbling my
> > > way through the intro stuff. I'd like to import data from a MySQL database
> > > into HDFS, and a simple import seems to be failing in 2, which works fine
> > > in 1.
> > >
> > > - From (1.4.3) -
> > > sqoop import --connect jdbc:mysql://127.0.0.1:3306/db --username kyle -P
> > > --table kyle_table --target-dir /user/kyle/table
> > >
> > > - To (1.99.2) -
> > > sqoop:000> create job --xid 1 --type import
> > > Creating job for connection with id 1
> > > Please fill following values to create new job object
> > > Name: test
> > >
> > > Database configuration
> > >
> > > Schema name:
> > > Table name: kyle_table
> > > Table SQL statement:
> > > Table column names: *
> > > Partition column name:
> > > Boundary query:
> > >
> > > Output configuration
> > >
> > > Storage type:
> > >   0 : HDFS
> > > Choose: 0
> > > Output format:
> > >   0 : TEXT_FILE
> > >   1 : SEQUENCE_FILE
> > > Choose: 0
> > > Output directory: /user/kyle/table2
> > >
> > > Throttling resources
> > >
> > > Extractors:
> > > Loaders:
> > > New job was successfully created with validation status FINE  and
> > > persistent id 12
> > > sqoop:000> submission start --jid 12
> > > Submission details
> > > Job id: 12
> > > Status: BOOTING
> > > Creation date: 2013-05-03 11:38:01 MST
> > > Last update date: 2013-05-03 11:38:01 MST
> > > External Id: job_1367275490217_0075
> > >         http://server:8088/proxy/application_1367275490217_0075/
> > > Progress: Progress is not available
> > > sqoop:000> submission status --jid 12
> > > Exception has occurred during processing command
> > > Server has returned exception: Exception: java.lang.NullPointerException
> > > Message:
> > >
> > >
> > > --
> > > In both cases, I see the jobs made their way to the job history, and on
> > > the first, the output was saved to HDFS. On the second, I just have a blank
> > > /user/kyle/table2 folder, and the task failed in Hadoop.
> > >
> > >
> > >  - Hadoop Logs -
> > >  2013-05-03 11:38:10,172 WARN [main] org.apache.hadoop.mapred.YarnChild:
> > > Exception running child : java.lang.NullPointerException
> > >  at java.lang.String.<init>(Unknown Source)
> > >  at org.apache.sqoop.job.mr.ConfigurationUtils.loadConfiguration(ConfigurationUtils.java:77)
> > >  at org.apache.sqoop.job.mr.ConfigurationUtils.getConnectorConnection(ConfigurationUtils.java:38)
> > >  at org.apache.sqoop.job.mr.SqoopMapper.run(SqoopMapper.java:69)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:726)
> > >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
> > >  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
> > >  at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Unknown Source)
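
For anyone following the same migration, a rough Sqoop 2 (1.99.2) equivalent of the single Sqoop 1 command quoted above is sketched here; the connector id (1), connection id (1) and job id (12) are taken from this thread and will differ on other installs, and the interactive prompts are abbreviated:

$ sqoop.sh client                            # start the Sqoop 2 shell (script location varies by install)
sqoop:000> create connection --cid 1         # cid = id of the generic JDBC connector
    JDBC Driver Class: com.mysql.jdbc.Driver
    JDBC Connection String: jdbc:mysql://127.0.0.1:3306/db
    Username: kyle
    Password: ********
sqoop:000> create job --xid 1 --type import  # xid = id of the connection created above
    Table name: kyle_table
    Output directory: /user/kyle/table2
sqoop:000> submission start --jid 12
sqoop:000> submission status --jid 12
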
Kyle B 2013-05-06, 19:01
Jarek Jarcec Cecho 2013-05-08, 01:39