Sqoop >> mail # user >> Need help and tips for the following issue: No data gets exported from hadoop to mysql using sqoop.


Re: Need help and tips for the following issue: No data gets exported from hadoop to mysql using sqoop.
I'm glad that you've managed to resolve the issue. Happy sqooping!

Jarcec

On Wed, Oct 17, 2012 at 05:53:24PM -0400, Matthieu Labour wrote:
> Jarcec
> My bad, my mistake. Amazon attaches the slave and the master to two different
> security groups, and both had to be added to the Amazon RDS security group.
> It is working now.
> Thank you for your help
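For anyone hitting the same wall: the fix amounts to allowing MySQL traffic from
both EMR security groups (master and slave) in the security group that fronts the
RDS instance. A minimal sketch using the VPC-style AWS CLI; every group ID below is
a placeholder, and a classic (non-VPC) RDS DB security group would instead be
updated with the aws rds authorize-db-security-group-ingress command:

    # Placeholders: sg-rds0000 guards the RDS instance,
    # sg-emrmaster / sg-emrslave are the EMR-created groups.
    aws ec2 authorize-security-group-ingress \
        --group-id sg-rds0000 --protocol tcp --port 3306 \
        --source-group sg-emrmaster
    aws ec2 authorize-security-group-ingress \
        --group-id sg-rds0000 --protocol tcp --port 3306 \
        --source-group sg-emrslave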
>
> On Mon, Oct 15, 2012 at 5:40 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
>
> > Hi Matt,
> > thanks for getting back to me with the actual task log. I'm adding the Sqoop
> > user mailing list back into the loop so that others might jump in. I have,
> > however, removed the entire log to prevent disclosure of any sensitive data.
> >
> > The log contained only the original Connection exception, with no further
> > wrapped exceptions that would help in finding the cause. I would recommend
> > doing the following:
> >
> > Connect to each of your slave nodes (ssh) and try to connect to your MySQL
> > box, e.g. something like
> >
> > mysql -h mysql.server -u myuser -pmypassword database
> >
> > It should work from each node. If it doesn't (and I'm expecting that it
> > won't), then there are likely firewall issues or other networking problems
> > that you will have to solve.
> >
> > Jarcec
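A quick way to run this check across every node in one go, assuming passwordless
SSH to the slaves; the hostnames and credentials are placeholders matching the
example above:

    for node in slave1.example.com slave2.example.com; do
      ssh hadoop@"$node" \
        "mysql -h mysql.server -u myuser -pmypassword -e 'SELECT 1' database" \
        >/dev/null && echo "$node: OK" || echo "$node: CANNOT REACH MYSQL"
    done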
> >
> > On Mon, Oct 15, 2012 at 12:34:08PM -0400, Matthieu Labour wrote:
> > > Hi Jarcec
> > > Please find enclosed the screenshot taken via the Hadoop web interface:
> > >
> > > http://docs.amazonwebservices.com/ElasticMapReduce/latest/DeveloperGuide/UsingtheHadoopUserInterface.html
> > > I am sending an email directly to you as the log might contain some info
> > > that I would rather not have on the web / mailing list.
> > > The trace is the same as the one I can see when I ssh into the master node
> > > and explore the logs under /mnt/var/log/hadoop/steps/ ...
> > > If you tell me the best place to add some logging in Sqoop, then I can
> > > recompile and rerun.
> > > The bizarre thing is that the select seems to work.
> > > Cheers
> > > Matthieu
> > >
> >
> > === SENSITIVE CONTENT REMOVED ===
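For anyone digging through the same directory on an EMR master node, a one-liner
(a sketch; the step-ID subdirectory varies per job) that surfaces the wrapped
exceptions Jarcec is asking about:

    grep -Ri "exception" /mnt/var/log/hadoop/steps/ | head -n 40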
> > >
> > > On Thu, Oct 11, 2012 at 11:38 AM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
> > >
> > > > Hi sir,
> > > > I'm sorry, but it's hard to help without the actual task log that should
> > > > contain more details about the exception. I was able to dig up the following
> > > > Amazon documentation that deals with getting to the Hadoop Web UI. Would you
> > > > mind trying it and seeing if you can reach the map task log?
> > > >
> > > > http://docs.amazonwebservices.com/ElasticMapReduce/latest/DeveloperGuide/UsingtheHadoopUserInterface.html
> > > >
> > > > Jarcec
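One common way to reach the Hadoop web UI on EMR is to tunnel the master node's
UI port over SSH. A hedged sketch; the key file, master public DNS name, and port
(9100 was the JobTracker UI on the Hadoop 1.x EMR AMIs) all depend on your cluster:

    ssh -i mykey.pem -N -L 9100:localhost:9100 hadoop@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
    # then open http://localhost:9100/ and drill into the failed map attempt's logs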
> > > >
> > > > On Thu, Oct 11, 2012 at 10:39:38AM -0400, Matthieu Labour wrote:
> > > > > Jarcec
> > > > > Thank you for your reply.
> > > > > I have a hard time believing that this is a JDBC connection issue,
> > > > > because when I execute the sqoop export command it successfully runs
> > > > > Executing SQL statement: SELECT t.* FROM `ml_ys_log_gmt_test` AS t LIMIT 1,
> > > > > and if I change the password in the sqoop export command then I
> > > > > get java.sql.SQLException: Access denied for user.
> > > > > So sqoop export seems to be able to reach the SQL machine with that
> > > > > username and password.
> > > > > I will use PostgreSQL for now as it works for me!
> > > > > Thank you for your help
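The apparent contradiction is most likely explained by where each query runs:
Sqoop issues the SELECT ... LIMIT 1 metadata check from the node that launches
the job, while the actual INSERT statements run inside map tasks on the slave
nodes, which is exactly the path Jarcec suspects is blocked. For reference, a
sketch of the export invocation under discussion; the connect string, credentials,
export directory, and delimiter are placeholders:

    sqoop export \
      --connect jdbc:mysql://mysql.server/database \
      --username myuser --password mypassword \
      --table ml_ys_log_gmt_test \
      --export-dir /user/hadoop/ml_ys_log_gmt_test \
      --input-fields-terminated-by '\t'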
> > > > >
> > > > >
> > > > > On Wed, Oct 10, 2012 at 7:58 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
> > > > >
> > > > > > Hi sir,
> > > > > > I actually have zero experience with Amazon services, so I'm afraid
> > > > > > I can't help you much in navigating to the map task logs. Usually on a
> > > > > > normal Hadoop cluster there is a service called "JobTracker" that serves
> > > > > > as the central place for MapReduce jobs. I'm expecting that you should be
> > > > > > able to find this web service or something similar somewhere. You
> > > > > > should see the jobs executed by Hadoop there, and you also should be able to get to