Re: Sqoop 1.4.2 checkout from trunk (installation problem) -sqoop 1.4.1 incompatible with MSSQL Server Connector
Cheolsoo Park 2012-06-28, 21:19
Hi Victor,

> 12/06/28 13:33:16 INFO mapreduce.Cluster: Failed to use
> org.apache.hadoop.mapred.LocalClientProtocolProvider due to error: Invalid
> "mapreduce.jobtracker.address" configuration value for LocalJobRunner :
> "hadooptest-01.mydomain:8021"
> 12/06/28 13:33:16 ERROR security.UserGroupInformation:
> PriviledgedActionException as:victor.sanchez (auth:SIMPLE)
> cause:java.io.IOException: Cannot initialize Cluster. Please check your
> configuration for mapreduce.framework.name and the correspond server
> addresses.
> 12/06/28 13:33:16 ERROR tool.ImportTool: Encountered IOException running
> import job: java.io.IOException: Cannot initialize Cluster. Please check
> your configuration for mapreduce.framework.name and the correspond server
> addresses.
The exception is thrown because Sqoop assumes local mode while Hadoop is
configured in cluster mode. Given that you are indeed running Hadoop in
cluster mode, the question is why Sqoop assumes local mode.

Would you mind providing the content of the following config files that CM4
generated for you in /etc/hadoop/conf?

   - core-site.xml
   - hdfs-site.xml
   - mapred-site.xml

The Apache mailing list strips attachments, so you will have to
copy-and-paste their contents into an email.
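For context, a cluster-mode client normally sees something like the following in mapred-site.xml. This is only an illustrative sketch, not actual CM4 output; the hostname and port are copied from the log above, and `classic` is the MRv1 (JobTracker) value, while YARN clusters would use `yarn` instead:

```xml
<!-- Illustrative mapred-site.xml fragment; not actual CM4 output. -->
<configuration>
  <!-- If this is absent or "local", clients such as Sqoop try
       LocalClientProtocolProvider first, as in the error above. -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>classic</value>  <!-- "yarn" on MRv2/YARN clusters -->
  </property>
  <!-- JobTracker address; hostname/port copied from the log above. -->
  <property>
    <name>mapreduce.jobtracker.address</name>
    <value>hadooptest-01.mydomain:8021</value>
  </property>
</configuration>
```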

Thanks a lot,
Cheolsoo

On Thu, Jun 28, 2012 at 2:49 AM, Victor Sanchez
<[EMAIL PROTECTED]> wrote:

> Hi Cheolsoo,
>
> Well, as you mentioned, there was
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory inside
> /etc/sqoop/conf/managers.d/
>
> I removed it, and now I can actually connect and list the tables, but …
>
> $ sqoop list-tables --connect
> 'jdbc:sqlserver://hadooptest01;username=victor;password=victor;database=hadoopSQL01'
>
> 12/06/28 13:33:38 INFO manager.SqlManager: Using default fetchSize of 1000
>
> Table1
> Table2
> Table3
> …
>
> If I try to import, I run into another issue.
>
> $ sqoop import --connect
> 'jdbc:sqlserver://hadooptest01;username=victor;password=victor;database=hadoopSQL01'
> --table Table1 --target-dir /test/Table1
>
> 12/06/28 13:33:07 INFO manager.SqlManager: Using default fetchSize of 1000
>
> 12/06/28 13:33:07 INFO tool.CodeGenTool: Beginning code generation
>
> 12/06/28 13:33:08 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM Table1 AS t WHERE 1=0
>
> 12/06/28 13:33:08 INFO orm.CompilationManager: HADOOP_HOME is
> /usr/lib/hadoop
>
> Note: /tmp/sqoop-victor.sanchez/compile/5567c0bfbd9fd8af0ab8b0715c2245d3/
> Table1.java uses or overrides a deprecated API.
>
> Note: Recompile with -Xlint:deprecation for details.
>
> 12/06/28 13:33:12 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-victor.sanchez/compile/5567c0bfbd9fd8af0ab8b0715c2245d3/Table1.jar
>
> 12/06/28 13:33:13 INFO mapreduce.ImportJobBase: Beginning import of Table1
>
> 12/06/28 13:33:13 WARN conf.Configuration: mapred.job.tracker is
> deprecated. Instead, use mapreduce.jobtracker.address
>
> 12/06/28 13:33:14 WARN conf.Configuration: mapred.jar is deprecated.
> Instead, use mapreduce.job.jar
>
> 12/06/28 13:33:16 WARN conf.Configuration: mapred.map.tasks is deprecated.
> Instead, use mapreduce.job.maps
>
> 12/06/28 13:33:16 INFO mapreduce.Cluster: Failed to use
> org.apache.hadoop.mapred.LocalClientProtocolProvider due to error: Invalid
> "mapreduce.jobtracker.address" configuration value for LocalJobRunner :
> "hadooptest-01.mydomain:8021"
>
> 12/06/28 13:33:16 ERROR security.UserGroupInformation:
> PriviledgedActionException as:victor.sanchez (auth:SIMPLE)
> cause:java.io.IOException: Cannot initialize Cluster. Please check your
> configuration for mapreduce.framework.name and the correspond server
> addresses.
>
> 12/06/28 13:33:16 ERROR tool.ImportTool: Encountered IOException running
> import job: java.io.IOException: Cannot initialize Cluster. Please check
> your configuration for mapreduce.framework.name and the correspond server