MapReduce >> mail # user >> about replication


Re: about replication
OK, we'll start fresh. Could you please show me your latest config files?

BTW, are your daemons running fine? Use jps to verify that.
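The daemon check suggested above can be sketched as follows (a minimal sketch: jps ships with the JDK, and the guard lets the snippet degrade gracefully on a machine without it):

```shell
# Sketch: verify the Hadoop daemons are running with jps (ships with the JDK).
# A healthy pseudo-distributed Hadoop 1.x setup typically lists NameNode,
# DataNode, SecondaryNameNode, JobTracker and TaskTracker.
if command -v jps >/dev/null 2>&1; then
  daemons=$(jps)                  # lines like "1234 NameNode", "2345 DataNode"
  result="checked"
else
  daemons=""
  result="jps-not-found"
fi
echo "$result"
```

If any daemon is missing from the jps output, its log file under the Hadoop logs directory is the first place to look.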

Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:

> I have created the directories "wksp_data" and "wksp_name" on both the
> datanode and the namenode,
> made the corresponding changes in the "hdfs-site.xml" file,
> formatted the namenode,
> and started the DFS.
>
> But I am still not able to browse the file system through the web browser;
> please refer below.
>
> Is anything still missing?
> Please suggest.
>
> [image: Inline image 1]
>
>
> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>
>> Do these directories need to be created on all datanodes and namenodes?
>> Further, does hdfs-site.xml need to be updated on both datanodes and
>> namenodes for these new directories?
>>
>> regards
>>
>>
>>
>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>>> Create two directories manually, corresponding to the values of the
>>> dfs.name.dir and dfs.data.dir properties, and change their permissions
>>> to 755. When you start pushing data into HDFS, the data will go inside
>>> the directory specified by dfs.data.dir, and the associated metadata
>>> will go inside dfs.name.dir. Remember, you store data in HDFS, but it
>>> eventually gets stored on your local/native FS; you just cannot see
>>> this data directly on your local/native FS.
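The two steps above can be sketched like this, using the directory names mentioned earlier in the thread (wksp_name and wksp_data; creating them relative to the current directory is an assumption for illustration, adjust the paths to your setup):

```shell
# Sketch: create the dfs.name.dir and dfs.data.dir directories and give
# them 755 permissions, as described above. Directory names follow the
# thread; place them wherever your hdfs-site.xml points.
mkdir -p wksp_name wksp_data
chmod 755 wksp_name wksp_data
ls -ld wksp_name wksp_data      # sanity check: both should show drwxr-xr-x
```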
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>>>
>>>> Thanks. However, I need this to work on a Windows environment as a
>>>> project requirement; I will work on Linux later.
>>>>
>>>> So, at this stage, is c:\\wksp the HDFS file system, or do I need to
>>>> create it from the command line?
>>>>
>>>> Please suggest.
>>>>
>>>> regards,
>>>>
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Hello Irfan,
>>>>>
>>>>> Sorry for being unresponsive; I got stuck with some important work.
>>>>>
>>>>> The HDFS web UI doesn't provide the ability to create files or
>>>>> directories. You can browse HDFS, view files, download files, etc.,
>>>>> but operations like create, move, and copy are not supported.
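Since the web UI is read-only, directory creation has to go through the shell. A sketch with Hadoop 1.x syntax, guarded so it is a no-op on a machine without hadoop on the PATH (/wksp is the path used elsewhere in this thread):

```shell
# Sketch: create a directory in HDFS from the command line, since the
# web UI only supports browsing. "hadoop dfs" is the Hadoop 1.x form
# used throughout this thread.
if command -v hadoop >/dev/null 2>&1; then
  hadoop dfs -mkdir /wksp        # create /wksp inside HDFS
  hadoop dfs -ls /               # confirm it is there
  result="created"
else
  result="hadoop-not-found"      # e.g. when trying the sketch off-cluster
fi
echo "$result"
```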
>>>>>
>>>>> These values look fine to me.
>>>>>
>>>>> One suggestion though: try getting a Linux machine (if possible), or
>>>>> at least use a VM. I personally feel that using Hadoop on Windows is
>>>>> always messy.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> Thanks. When I browse the file system, I get the following; I
>>>>>> haven't seen any "make directory" option there.
>>>>>>
>>>>>> Do I need to create it from the command line?
>>>>>> Further, I have put the following entries in the hdfs-site.xml file.
>>>>>> Are they correct?
>>>>>>
>>>>>> <property>
>>>>>>   <name>dfs.data.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>> <property>
>>>>>>   <name>dfs.name.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
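One caveat on the entries above: they point dfs.name.dir and dfs.data.dir at the same directory, so datanode blocks and namenode metadata would land in the same place. A sketch with the two separated (the directory names follow the wksp_name/wksp_data convention used later in this thread; the exact paths are an assumption):

```xml
<!-- Sketch: separate directories for namenode metadata and datanode blocks.
     Paths are illustrative; they must match directories that actually exist. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```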
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <[EMAIL PROTECTED]> wrote:
>>>>>>
>>>>>>> *This is where you are going wrong:*
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
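The failures above are a local-path problem: -copyFromLocal reads the source from the local file system first, and both paths point at a file that is not there. A guarded sketch of the check (the .tar.gz path is taken verbatim from the messages above; the hadoop branch only runs where the CLI is available):

```shell
# Sketch: verify the local source file exists before handing it to HDFS,
# which is exactly the error the copyFromLocal output above reports.
src=/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz
if [ ! -f "$src" ]; then
  result="missing-source"        # the case hit in the thread
elif command -v hadoop >/dev/null 2>&1; then
  hadoop dfs -copyFromLocal "$src" /wksp
  result="copied"
else
  result="hadoop-not-found"
fi
echo "$result"
```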