Re: Hadoop Debian Package
Try
find / -type f -iname "*site.xml"
It will show you wherever those files are.
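To see which of those files actually sets fs.default.name, something like this can help (a rough sketch; /etc/hadoop and /usr/share/hadoop are just the usual Debian-package locations, adjust them to your install):

grep -rl "fs.default.name" /etc/hadoop /usr/share/hadoop 2>/dev/null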


Shashwat Shriparv

On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <[EMAIL PROTECTED]> wrote:

> The problem is that I tried to make Hadoop read the configuration file by changing
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
> but I think Hadoop doesn't get the configuration from this dir. I tried and
> searched the system for a conf dir; the only one is this one, which I changed.
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <[EMAIL PROTECTED]> wrote:
>
>> Yes, it is asking for file:/// instead of hdfs://; just check whether it is
>> taking its configuration from some other location...
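>> A quick way to see which conf dir the scripts actually pick up is to look at
>> the classpath the hadoop wrapper builds (a rough sketch, assuming your
>> version's hadoop script supports the classpath subcommand; the conf dir
>> normally comes first):
>>
>> hadoop classpath | tr ':' '\n' | grep -i conf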
>>
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <[EMAIL PROTECTED]> wrote:
>>
>>> Hi,
>>>
>>> What is the version of Hadoop you use?
>>>
>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>> the deprecated properties here:
>>>
>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html
>>> ).
>>> I remember I once had a similar error message and it was due to the
>>> change in properties names.
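>>> For example, with the new key the core-site.xml entry would look like this
>>> (a minimal sketch, reusing the hdfs://localhost:9000 value from your mail):
>>>
>>> <property>
>>>   <name>fs.defaultFS</name>
>>>   <value>hdfs://localhost:9000</value>
>>> </property>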
>>>
>>> Regards,
>>>
>>> Sourygna
>>>
>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani <[EMAIL PROTECTED]> wrote:
>>> > Hi to all users of Hadoop,
>>> >
>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I might not have
>>> > configured it right. The conf dir is under templates in /usr/shar/hadoop. I
>>> > edited the core-site.xml and mapred-site.xml files to give
>>> > <property>
>>> >   <name>fs.default.name</name>
>>> >   <value>hdfs://localhost:9000</value>
>>> > </property>
>>> > and for mapred
>>> > <property>
>>> >   <name>mapred.job.tracker</name>
>>> >   <value>localhost:9001</value>
>>> > </property>
>>> >
>>> > but I get these errors; I assume there is a problem and Hadoop cannot read
>>> > the configuration file.
>>> > I changed hadoop-env.sh to
>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>> > but it doesn't solve the problem.
>>> >
>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:201)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:231)
>>> >   at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:225)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:347)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
>>> >   at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
>>> >
>>> > ________________________________
>>> >
>>> > FATAL org.apache.hadoop.mapred.JobTracker:
>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
>>> >   at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
>>> >   at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2312)
>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2070)
>>> >   at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889) at