Sqoop >> mail # user >> Issue against Hadoop-2.0.4-alpha


Re: Issue against Hadoop-2.0.4-alpha
Hi Sam,
thank you for sharing all the details. As a next step, I would suggest exploring the classpath that Sqoop will end up having (for example, using jinfo) and verifying whether the HDFS jars are available on it.

Jarcec
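Jarcec's jinfo suggestion might look like the sketch below. Everything in it is illustrative, not part of the original thread: the jps/awk way of locating the Sqoop PID and the jar names are assumptions.

```shell
# Sketch only: find a running Sqoop JVM and check its effective classpath
# for a hadoop-hdfs jar. Assumes the JDK's jps/jinfo tools are on PATH
# and that exactly one matching JVM is running.
pid=$(jps -l | awk '/[Ss]qoop/ {print $1; exit}')

# jinfo -sysprops prints the JVM's system properties; pull out
# java.class.path and list any hadoop-hdfs entries on it.
jinfo -sysprops "$pid" \
  | sed -n 's/^java\.class\.path *= *//p' \
  | tr ':' '\n' \
  | grep 'hadoop-hdfs'
```

If the last command prints nothing, the HDFS jars never made it onto Sqoop's effective classpath, which would be consistent with the "FileSystem ... can't be found" exception discussed below.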

On Mon, Jul 22, 2013 at 12:08:08AM -0700, Abraham Elmahrek wrote:
> Sam,
>
> Have you tried removing the trailing '/' in the value of fs.default.name?
> Also, I believe that property is deprecated and should be replaced by fs.defaultFS.
>
> -Abe
>
>
> On Sun, Jul 21, 2013 at 10:42 PM, sam liu <[EMAIL PROTECTED]> wrote:
>
> > Hi Jarek,
> >
> > The Sqoop import tool still fails on my Hadoop 2.x cluster; my core-site.xml
> > is quoted below. Can you help take a look at it?
> >
> > Thanks!
> >
> >
> > 2013/7/19 sam liu <[EMAIL PROTECTED]>
> >
> >> Hi Jarek,
> >>
> >> Yes, the hdfs-site.xml file is in ${HADOOP_HOME}/etc/hadoop. I also added the
> >> Hadoop-related jars to the CLASSPATH:
> >> '/home/hadoop-2.0.3-alpha/share/hadoop/common/hadoop-common-2.0.3-alpha.jar:/home/hadoop-2.0.3-alpha/share/hadoop/mapreduce/*.jar:/home/hadoop-2.0.3-alpha/share/hadoop/yarn/*.jar:/home/hadoop-2.0.3-alpha/share/hadoop/hdfs/hadoop-hdfs-2.0.3-alpha.jar:...',
> >> but the Sqoop import tool still failed with the same exception.
> >>
> >> Below is my core-site.xml:
> >> <configuration>
> >>   <property>
> >>     <name>fs.default.name</name>
> >>     <value>hdfs://namenode_hostname:9010/</value>
> >>   </property>
> >>
> >>   <property>
> >>      <name>hadoop.tmp.dir</name>
> >>      <value>/home/temp/hadoop/core_temp</value>
> >>   </property>
> >>
> >> </configuration>
> >>
> >>
> >> Thanks!
> >>
> >>
> >> 2013/7/17 Jarek Jarcec Cecho <[EMAIL PROTECTED]>
> >>
> >>> Hi Sam,
> >>> thank you for sharing the details. I'm assuming that the hdfs-site.xml file
> >>> is in ${HADOOP_HOME}/etc/hadoop, is that correct? Since hdfs-site.xml mostly
> >>> contains server-side HDFS configuration, I would be more interested in the
> >>> content of the core-site.xml file. I would also suggest exploring the
> >>> classpath that Sqoop will end up having (for example, using jinfo) and
> >>> verifying whether the HDFS jars are available on it.
> >>>
> >>> Jarcec
> >>>
> >>> On Wed, Jul 17, 2013 at 10:54:05AM +0800, sam liu wrote:
> >>> > Hi Jarek,
> >>> >
> >>> > Below are my configurations:
> >>> >
> >>> > 1) Env Parameters:
> >>> > export HADOOP_HOME=/opt/hadoop-2.0.3-alpha
> >>> > export PATH=$HADOOP_HOME/bin:$PATH
> >>> > export PATH=$HADOOP_HOME/sbin:$PATH
> >>> > export HADOOP_MAPRED_HOME=${HADOOP_HOME}
> >>> > export HADOOP_COMMON_HOME=${HADOOP_HOME}
> >>> > export HADOOP_HDFS_HOME=${HADOOP_HOME}
> >>> > export YARN_HOME=${HADOOP_HOME}
> >>> > export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
> >>> > export HDFS_CONF_DIR=${HADOOP_HOME}/etc/hadoop
> >>> > export YARN_CONF_DIR=${HADOOP_HOME}/etc/hadoop
> >>> >
> >>> > 2) hdfs-site.xml:
> >>> > <?xml version="1.0"?>
> >>> > <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> >>> >
> >>> > <configuration>
> >>> >
> >>> >   <property>
> >>> >     <name>dfs.replication</name>
> >>> >     <value>1</value>
> >>> >   </property>
> >>> >
> >>> >   <property>
> >>> >     <name>dfs.name.dir</name>
> >>> >     <value>/home/temp/hadoop/dfs_name_dir</value>
> >>> >   </property>
> >>> >
> >>> >   <property>
> >>> >     <name>dfs.data.dir</name>
> >>> >     <value>/home/temp/hadoop/dfs_data_dir</value>
> >>> >   </property>
> >>> >
> >>> >   <property>
> >>> >     <name>dfs.webhdfs.enabled</name>
> >>> >     <value>true</value>
> >>> >   </property>
> >>> > </configuration>
> >>> >
> >>> >
> >>> >
> >>> >
> >>> > 2013/7/17 Jarek Jarcec Cecho <[EMAIL PROTECTED]>
> >>> >
> >>> > > Hi sir,
> >>> > > the exception suggests that the FileSystem implementation for your
> >>> > > default FS can't be found. I would check the HDFS configuration to
> >>> > > ensure that it's configured properly, and that Sqoop is correctly
> >>> > > picking up both the configuration and all the HDFS libraries.
> >>> > >
> >>> > > Jarcec
> >>> > >
> >>> > > On Tue, Jul 16, 2013 at 11:29:07AM +0800, sam liu wrote:
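For reference, applying Abe's two suggestions (dropping the trailing '/' and switching to the non-deprecated key) to the core-site.xml posted earlier in the thread would give something like the sketch below. The hostname and port are the placeholders from the original file, not verified values.

```xml
<configuration>
  <property>
    <!-- fs.default.name is deprecated in Hadoop 2.x in favor of fs.defaultFS -->
    <name>fs.defaultFS</name>
    <!-- no trailing '/' after the port -->
    <value>hdfs://namenode_hostname:9010</value>
  </property>

  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/temp/hadoop/core_temp</value>
  </property>
</configuration>
```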