Thread:
- Ibrahim Yakti, 2013-01-14 15:38
- Jarek Jarcec Cecho, 2013-01-15 06:07
- Ibrahim Yakti, 2013-01-15 10:27
- Jarek Jarcec Cecho, 2013-01-15 10:41
Re: Installing sqoop on Amazon EMR
Thanks Jarcec, it seems it was a permission issue for one of the trackers; I
granted access to '%' on the MySQL server and it worked like a charm.
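
For anyone hitting the same problem: the wildcard grant described above usually looks something like the following sketch. The user name, password, and database are placeholders, not the actual values from this setup:

```sql
-- Hypothetical names; substitute your own. Run on the MySQL server.
-- The '%' host wildcard allows connections from any host, which covers
-- all of the cluster's TaskTracker IPs.
GRANT ALL PRIVILEGES ON mydb.* TO 'sqoop_user'@'%' IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;
```

In production, a grant scoped to the cluster's subnet (e.g. '10.0.%') is safer than the '%' wildcard.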

Although I am working on a single-node EMR cluster, it seems to use more than
one IP: connections from some IPs succeed while others time out.
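
Since only some source IPs were reaching MySQL, a quick per-node reachability check can help narrow this down. A minimal sketch, assuming a placeholder host name; note this only tests that the TCP port is open, not that the MySQL credentials work:

```python
import socket


def can_reach(host: str, port: int = 3306, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Hypothetical MySQL host; run this on every node in the cluster.
    print(can_reach("mysql-host.example.com"))
```

If the port is reachable from every node but Sqoop still fails, the problem is more likely the MySQL user/host grants than networking.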

Thanks.
--
Ibrahim
On Tue, Jan 15, 2013 at 1:41 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:

> Hi Ibrahim,
> based on your current exception, it seems that you correctly resolved the
> previous exception about the Hadoop incompatibility.
>
> > java.lang.RuntimeException: java.lang.RuntimeException:
> > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications
> > link failure
>
> This exception is raised when the MySQL JDBC driver can't create a connection
> to your MySQL box. Sqoop requires direct access to your MySQL box not only
> from the node where you're executing Sqoop, but also from all TaskTracker
> nodes in your cluster. Would you mind checking that you can connect to your
> MySQL box from all nodes? I would start by checking that the user on the
> MySQL side is correctly defined. Additional information can be found in the
> Sqoop troubleshooting guide [1].
>
> Jarcec
>
> Links:
> 1:
> http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_mysql_connection_failure
>
> On Tue, Jan 15, 2013 at 01:27:14PM +0300, Ibrahim Yakti wrote:
> > Hello Jarek,
> >
> > I have tried that and I got the following error:
> >
> > 13/01/15 09:49:17 INFO mapred.JobClient: Task Id :
> > attempt_201212160928_0044_m_000000_0, Status : FAILED
> > java.lang.RuntimeException: java.lang.RuntimeException:
> > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications
> > link failure
> >
> > The last packet sent successfully to the server was 0 milliseconds ago.
> The
> > driver has not received any packets from the server.
> >         at
> >
> org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
> >         at
> > org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
> >         at
> >
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> >         at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:730)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:375)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.lang.RuntimeException:
> > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications
> > link failure
> >
> >
> >
> > When I checked the processlist on the MySQL server, I got the following:
> >
> > >      Id: 10006
> > >    User: DB_USER
> > >    Host: DB_HOST
> > >      db: DB
> > > Command: Sleep
> > >    Time: 10
> > >   State:
> > >    Info: NULL
> >
> >
> >
> >
> > I have removed all the old Sqoop files and downloaded a fresh copy; I used
> > "ant clean package -Dhadoopversion=100".
> >
> > Moreover, I have tried to compile it with -Dhadoopversion=20 and
> > -Dhadoopversion=23, but I am still getting errors, with no success.
> >
> >
> >
> > --
> > Ibrahim
> >
> >
> > On Tue, Jan 15, 2013 at 9:07 AM, Jarek Jarcec Cecho
> > <[EMAIL PROTECTED]> wrote:
> >
> > > Hi Ibrahim,
> > > how did you compile Sqoop? Would you mind trying the following
> > > command?
> > >
> > > ant clean package -Dhadoopversion=100
> > >
> > > * clean - remove any previous compilation outputs, to ensure that no
> > > files compiled for a different Hadoop version are left behind
> > > * package - compile and create a package in the build/ directory
> > > * -Dhadoopversion=100 - compile all files for Hadoop 1.0.x
> > >
> > > Jarcec
> > >
> > > On Mon, Jan 14, 2013 at 06:38:36PM +0300, Ibrahim Yakti wrote:
> > > > Hello,
> > > >
>