Sqoop, mail # user - Sqoop 1.4.x incremental job with hdfs error


corbacho anthony 2013-07-02, 10:02
Re: Sqoop 1.4.x incremental job with hdfs error
Jarek Jarcec Cecho 2013-07-02, 14:45
Hi sir,
Sqoop requires the Hadoop configuration files to be available on the machine where you run Sqoop. I'm wondering if the config files from machine "C" (the HDFS gateway, I suppose) are also available on machine "A", where Sqoop is running.

Jarcec
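
A minimal sketch of the setup being described, assuming a Hadoop client is installed on machine "A" and that its configuration lives under /etc/hadoop/conf (that path, the scp step, and the "user" account are assumptions; the hostname and port come from the thread):

  # Copy the cluster configuration from the HDFS gateway (machine C) to machine A
  scp user@my.hdfs.com:/etc/hadoop/conf/core-site.xml /etc/hadoop/conf/
  scp user@my.hdfs.com:/etc/hadoop/conf/hdfs-site.xml /etc/hadoop/conf/

  # Point the Hadoop/Sqoop client at that configuration before running sqoop
  export HADOOP_CONF_DIR=/etc/hadoop/conf

  # core-site.xml should name the cluster as the default filesystem, e.g.
  # (fs.default.name is the Hadoop 1.x property; newer releases use fs.defaultFS):
  #   <property>
  #     <name>fs.default.name</name>
  #     <value>hdfs://my.hdfs.com:54310</value>
  #   </property>

With the cluster set as the default filesystem, Sqoop no longer falls back to the local filesystem, which is what the "expected: file:///" part of the error indicates.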

On Tue, Jul 02, 2013 at 07:02:41PM +0900, corbacho anthony wrote:
> Hi!
>
> I am trying to create a Sqoop job with the incremental option.
> I want to save the output into my HDFS, so I use the --target-dir option,
> but Sqoop throws me an error: tool.ImportTool: Imported Failed: Wrong FS:
> hdfs://my.hdfs.com:54310/job_import_incrt, expected: file:///
>
> My sqoop job:
> sqoop job --verbose --create job_import_0 -- import
>   --connect jdbc:mysql://db.mysql.com:3306/DB --table TABLE_TEST
>   --target-dir hdfs://my.hdfs.com:54310/db_import
>   --username xxx --password xxx
>   --incremental append --check-column id --last-value 1
>
> I run Sqoop on machine A, the sqoop-metastore is on machine B, and my
> HDFS is on machine C.
>
> What should I do to "force Sqoop" to save it into my HDFS and not on my
> local machine?
>
> PS: If I change --target-dir to a local directory, it works like a charm.
>
> Thank you
> Anthony
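
Once the configuration is visible on machine A, a quick sanity check (a sketch; the host, port, and job name are taken from the commands above) is to confirm that the Hadoop client resolves the cluster rather than the local filesystem, and then re-run the saved job:

  # Should list the cluster root, not the local filesystem
  hadoop fs -ls hdfs://my.hdfs.com:54310/

  # Re-execute the saved incremental import
  sqoop job --exec job_import_0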
corbacho anthony 2013-07-02, 15:34
Jarek Jarcec Cecho 2013-07-04, 00:34
Dave Speer 2013-07-02, 12:45