Hadoop >> mail # user
Re: Hadoop Debian Package
Yes, it is asking for file:/// instead of hdfs://. Just check whether it is
picking up its configuration from some other location...


Shashwat Shriparv
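One quick way to act on the advice above is to inspect which properties a given site file actually defines. This is a minimal sketch, not part of the thread: it parses an inline copy of the poster's core-site.xml (on a real install you would point `ET.parse()` at the file under the conf dir, e.g. the `/usr/shar/hadoop/templates/conf` path mentioned below).

```python
import xml.etree.ElementTree as ET

# Inline copy of the core-site.xml from the original message; on a real
# installation, read the file Hadoop is actually loading instead.
CORE_SITE = """<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>"""

def site_properties(xml_text):
    """Return {name: value} for every <property> in a Hadoop site file."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value")
            for p in root.findall("property")}

props = site_properties(CORE_SITE)
print(props)

# If neither fs.defaultFS nor fs.default.name resolves to an hdfs:// URI,
# Hadoop falls back to the built-in default file:/// -- which matches the
# "Does not contain a valid host:port authority: file:///" errors below.
uri = props.get("fs.defaultFS") or props.get("fs.default.name", "file:///")
print("effective default FS:", uri)
```

If the printed URI is `file:///` even though the edited file looks correct, Hadoop is reading its configuration from a different directory than the one that was edited.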

On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <[EMAIL PROTECTED]> wrote:

> Hi,
>
> What is the version of Hadoop you use?
>
> Try using fs.defaultFS instead of fs.default.name (see the list of all
> the deprecated properties here:
>
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
> ).
> I remember I once had a similar error message and it was due to the
> change in properties names.
>
> Regards,
>
> Sourygna
>
> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
> <[EMAIL PROTECTED]> wrote:
> > Hi to all users of Hadoop,
> >
> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I may not have
> > configured it correctly. The conf dir is under templates in /usr/shar/hadoop.
> > I edited the core-site.xml and mapred-site.xml files to give
> > <property>
> > <name>fs.default.name</name>
> > <value>hdfs://localhost:9000</value>
> > </property>
> > and for mapred
> > <property>
> > <name>mapred.job.tracker</name>
> > <value>localhost:9001</value>
> > </property>
> >
> > but I get these errors; I assume the problem is that Hadoop cannot read
> > the configuration file.
> > I changed hadoop-env.sh to
> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> > but that doesn't solve the problem.
> >
> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
> >     at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
> >
> > ________________________________
> >
> > FATAL org.apache.hadoop.mapred.JobTracker:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
> >     at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
> >     at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
> >     at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
> >     at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1883)
> >     at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:312)
> >     at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:303)
> >     at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:298)
> >     at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4791)
> >
> > ________________________________
> >
> > ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
> >     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:265)
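Following Sourygna's suggestion earlier in the thread, a minimal core-site.xml using the non-deprecated property name would look like this. This is a sketch, not a message from the thread; the hdfs://localhost:9000 value is taken from the original question.

```xml
<configuration>
  <property>
    <!-- fs.defaultFS replaces the deprecated fs.default.name -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

This file only helps if it sits in the conf directory Hadoop actually reads (e.g. the one HADOOP_CONF_DIR points at), which is the other failure mode discussed above.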