Re: Installing sqoop on Amazon EMR
Hello Jarek,

I have tried that and I got the following error:

13/01/15 09:49:17 INFO mapred.JobClient: Task Id : attempt_201212160928_0044_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:730)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:375)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

When I checked the processlist on the MySQL server, I got the following:

>      Id: 10006
>    User: DB_USER
>    Host: DB_HOST
>      db: DB
> Command: Sleep
>    Time: 10
>   State:
>    Info: NULL
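
A connection does show up in the processlist, which I read as the node where I launch the job being able to reach MySQL. As a quick sanity check I run something like the following (DB_HOST, DB and DB_USER are placeholders for my real values); as far as I understand, the same host and port also have to be reachable from every task node, since the CommunicationsException above is thrown inside the map task:

sqoop eval \
  --connect jdbc:mysql://DB_HOST/DB \
  --username DB_USER -P \
  --query "SELECT 1"
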
I have removed all the old Sqoop files, downloaded a fresh copy, and built it with
"ant clean package -Dhadoopversion=100".

I have also tried compiling with -Dhadoopversion=20 and -Dhadoopversion=23 ...
still getting errors, with no success.
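
For reference, the rough sequence I am running on the EMR master is below. The SQOOP_HOME path is just how it looks in my setup (it may differ depending on where ant puts the package under build/), and the MySQL JDBC driver is copied into Sqoop's lib/ directory by hand since Sqoop does not ship it:

ant clean package -Dhadoopversion=100

# hypothetical paths -- adjust to the actual package directory under build/
export HADOOP_HOME=/home/hadoop
export SQOOP_HOME=/home/hadoop/sqoop-1.4.2.bin__hadoop-1.0.0
export PATH=$SQOOP_HOME/bin:$PATH

# the MySQL connector has to be on Sqoop's classpath
cp mysql-connector-java-*.jar $SQOOP_HOME/lib/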

--
Ibrahim
On Tue, Jan 15, 2013 at 9:07 AM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:

> Hi Ibrahim,
> how did you compile Sqoop? Would you mind trying the following
> command?
>
> ant clean package -Dhadoopversion=100
>
> * clean - removes any previous compilation outputs, to ensure that no files
> compiled for a different Hadoop version are left behind
> * package - compiles and creates the package in the build/ directory
> * -Dhadoopversion=100 - compiles all files for Hadoop 1.0.x
>
> Jarcec
>
> On Mon, Jan 14, 2013 at 06:38:36PM +0300, Ibrahim Yakti wrote:
> > Hello,
> >
> > I am trying to install Sqoop on an EMR instance (Hadoop 1.0.3). I tried to
> > compile it from source (1.4.2), but I am getting this error:
> >
> > *Exception in thread "main" java.lang.IncompatibleClassChangeError: Found
> > class org.apache.hadoop.mapreduce.JobContext, but interface was expected*
> >
> > I did what the FAQ page (https://cwiki.apache.org/confluence/display/SQOOP/FAQ)
> > suggests, but I still get the same error.
> >
> > I tried other versions as well, all with the same error.
> >
> > The only tutorial I was able to find was this:
> > http://blog.kylemulka.com/2012/04/how-to-install-sqoop-on-amazon-elastic-map-reduce-emr/
> > but as I said, I am not using S3, and I need to run it on the same server.
> >
> >
> > Any suggestions on how to install it on Amazon EMR?
> >
> > Thanks.
> >
> > --
> > Ibrahim
>