Sqoop user mailing list: Sqoop 1.4.x incremental job with hdfs error


Re: Sqoop 1.4.x incremental job with hdfs error
Hi Anthony,
would you mind sharing with us the entire Sqoop log from a run executed with the --verbose parameter, including the full exception stack trace?

Jarcec
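
For reference, a minimal way to capture such a log is to re-run the saved job with --verbose and tee the output to a file. This is only a sketch: the log path is an example, and the job name job_import_0 is taken from the creation command quoted below.

  # Re-run the saved job with verbose logging and keep a copy of the full
  # output (stdout and stderr), including any exception stack trace.
  sqoop job --verbose --exec job_import_0 2>&1 | tee /tmp/sqoop_job_import_0.log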

On Wed, Jul 03, 2013 at 12:34:27AM +0900, corbacho anthony wrote:
> Hi.
>
> If I run this Sqoop import as a plain import (no job) and remove the
> incremental option, I can use --target-dir with HDFS.
>
> In fact, I noticed that as long as I don't use any incremental option, I can
> use --target-dir with HDFS (job or normal import).
> On Jul 2, 2013 11:47 PM, "Jarek Jarcec Cecho" <[EMAIL PROTECTED]> wrote:
>
> > Hi sir,
> > Sqoop requires the Hadoop configuration files to be available on the machine
> > where you run Sqoop. I'm wondering if the config files from machine "C" (the
> > HDFS gateway, I suppose) are also available on machine "A" where Sqoop is running.
> >
> > Jarcec
> >
> > On Tue, Jul 02, 2013 at 07:02:41PM +0900, corbacho anthony wrote:
> > > Hi!
> > >
> > > I am trying to create a Sqoop job with the incremental option.
> > > I want to save the data into my HDFS, so I use the option --target-dir,
> > > but Sqoop throws an error: tool.ImportTool: Imported Failed: Wrong FS:
> > > hdfs://my.hdfs.com:54310/job_import_incrt, expected: file:///
> > >
> > > My sqoop job:
> > > sqoop job --verbose --create job_import_0 -- import \
> > >   --connect jdbc:mysql://db.mysql.com:3306/DB --table TABLE_TEST \
> > >   --target-dir hdfs://my.hdfs.com:54310/db_import \
> > >   --username xxx --password xxx \
> > >   --incremental append --check-column id --last-value 1
> > >
> > > I run Sqoop on machine A, I have the sqoop-metastore on machine B, and my
> > > HDFS on machine C.
> > >
> > > What should I do to "force Sqoop" to save the data into my HDFS and not onto
> > > my local machine?
> > >
> > > PS: If I change --target-dir to a local directory, it works like a charm.
> > >
> > > Thank you
> > > Anthony
> >
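
On the configuration point above: the "Wrong FS: hdfs://... expected: file:///" error usually means the client on machine A has no HDFS configuration on its classpath, so Sqoop falls back to the local filesystem as the default. A minimal sketch of how to check and fix this, assuming a typical /etc/hadoop/conf layout (the path is an assumption) and that the client config from machine "C" can be copied over:

  # Point Sqoop/Hadoop at a client configuration directory (path is an example).
  export HADOOP_CONF_DIR=/etc/hadoop/conf

  # core-site.xml must name the namenode as the default filesystem; if
  # fs.default.name / fs.defaultFS is missing or set to file:///, Sqoop
  # resolves --target-dir against the local FS and reports the "Wrong FS" error.
  grep -A 1 'fs.default' "$HADOOP_CONF_DIR/core-site.xml"

  # Copying core-site.xml and hdfs-site.xml from the HDFS gateway (machine "C")
  # into $HADOOP_CONF_DIR on machine "A" makes hdfs://my.hdfs.com:54310 the
  # default filesystem, so --target-dir can point at HDFS.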