Re: Hadoop Debian Package
You can do that using these commands:

sudo gedit ~/.bashrc

Then go to the end of the file and add this line:
export HADOOP_HOME=/YOUR_FULL_HADOOP_PATH

After that, run this to apply the changes:
source ~/.bashrc

To check it:
echo $HADOOP_HOME

This will permanently set your HADOOP_HOME.
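
For instance, the same steps as a single non-interactive sketch (the
/usr/lib/hadoop path here is just a placeholder; substitute your real
Hadoop directory):

echo 'export HADOOP_HOME=/usr/lib/hadoop' >> ~/.bashrc   # append the export to your shell config
source ~/.bashrc                                         # reload the config in the current shell
echo $HADOOP_HOME                                        # verify it is set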

HTH
Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com
On Sun, Mar 17, 2013 at 11:46 PM, Mohammad Alkahtani
<[EMAIL PROTECTED]> wrote:

> Hi Tariq, could you please tell me how to set HADOOP_HOME? I don't
> find it in hadoop-env.sh.
>
> Thank you, Shashwat.
> This is the output; it is already configured, but Hadoop doesn't read the
> configuration from here.
>
> /usr/share/maven-repo/org/apache/commons/commons-parent/22/commons-parent-22-site.xml
> /usr/share/maven-repo/org/apache/commons/commons-parent/debian/commons-parent-debian-site.xml
> /usr/share/maven-repo/org/apache/apache/10/apache-10-site.xml
> /usr/share/maven-repo/org/apache/apache/debian/apache-debian-site.xml
> /usr/share/compiz/composite.xml
> /usr/share/hadoop/templates/conf/mapred-site.xml
> /usr/share/hadoop/templates/conf/core-site.xml
> /usr/share/hadoop/templates/conf/hdfs-site.xml
>
> Mohammad Alkahtani
> P.O.Box 102275
> Riyadh 11675
> Saudi Arabia
> mobile: 00966 555 33 1717
>
>
> On Sun, Mar 17, 2013 at 9:07 PM, shashwat shriparv <
> [EMAIL PROTECTED]> wrote:
>
>> Try:
>> find / -type f -iname "*site.xml"
>> It will show you wherever those files are.
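>>
>> (If scanning the whole filesystem is slow or noisy, a narrower sketch,
>> assuming the configs live somewhere under /etc or /usr, that also
>> discards permission errors:
>> find /etc /usr -type f -iname "*site.xml" 2>/dev/null
>> )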
>>
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Sun, Mar 17, 2013 at 11:34 PM, Mohammad Alkahtani <
>> [EMAIL PROTECTED]> wrote:
>>
>>> The problem is that I tried to make Hadoop read the configuration file by changing
>>> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>> but I think Hadoop doesn't get the configuration from this dir. I tried
>>> and searched the system for a conf dir; the only dir is this one, which I
>>> changed.
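>>>
>>> (One way to test whether that directory is actually being read, a sketch
>>> assuming the templates path above, is to pass the conf dir explicitly:
>>> hadoop --config /usr/shar/hadoop/templates/conf fs -ls /
>>> If the file:/// error goes away with this, the variable is simply not
>>> being picked up by the startup scripts.)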
>>>
>>> Mohammad Alkahtani
>>> P.O.Box 102275
>>> Riyadh 11675
>>> Saudi Arabia
>>> mobile: 00966 555 33 1717
>>>
>>>
>>> On Sun, Mar 17, 2013 at 8:57 PM, shashwat shriparv <
>>> [EMAIL PROTECTED]> wrote:
>>>
>>>> Yes, it is asking for file:/// instead of hdfs://. Just check whether it
>>>> is picking up the configuration from some other location...
>>>>
>>>>
>>>>
>>>> ∞
>>>> Shashwat Shriparv
>>>>
>>>>
>>>>
>>>> On Sun, Mar 17, 2013 at 11:07 PM, Luangsay Sourygna <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> What is the version of Hadoop you use?
>>>>>
>>>>> Try using fs.defaultFS instead of fs.default.name (see the list of all
>>>>> the deprecated properties here:
>>>>> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/DeprecatedProperties.html).
>>>>> I remember I once had a similar error message and it was due to the
>>>>> change in properties names.
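>>>>>
>>>>> (For example, with the newer property name, the core-site.xml entry
>>>>> would look something like this; a sketch reusing the localhost:9000
>>>>> value from your original mail:
>>>>>
>>>>> <property>
>>>>>   <name>fs.defaultFS</name>
>>>>>   <value>hdfs://localhost:9000</value>
>>>>> </property>
>>>>> )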
>>>>>
>>>>> Regards,
>>>>>
>>>>> Sourygna
>>>>>
>>>>> On Sun, Mar 17, 2013 at 2:32 PM, Mohammad Alkahtani
>>>>> <[EMAIL PROTECTED]> wrote:
>>>>> > Hi to all users of Hadoop,
>>>>> >
>>>>> > I installed Hadoop from the .deb file on Ubuntu 12.04, but I could not
>>>>> > configure it right. The conf dir is under templates in /usr/shar/hadoop.
>>>>> > I edited the core-site.xml and mapred-site.xml files to give
>>>>> > <property>
>>>>> > <name>fs.default.name</name>
>>>>> > <value>hdfs://localhost:9000</value>
>>>>> > </property>
>>>>> > and for mapred
>>>>> > <property>
>>>>> > <name>mapred.job.tracker</name>
>>>>> > <value>localhost:9001</value>
>>>>> > </property>
>>>>> >
>>>>> > but I get these errors; I assume that there is a problem and Hadoop
>>>>> > cannot read the configuration file.
>>>>> > I changed hadoop-env.sh to
>>>>> > export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/usr/shar/hadoop/templates/conf"}
>>>>> > but it doesn't solve the problem.
>>>>> >
>>>>> > ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>>> > java.lang.IllegalArgumentException: Does not contain a valid host:port
>>>>> > authority: file:/// at