MapReduce, mail # user - Re: about replication


Irfan Sayed 2013-08-06, 06:50
manish dunani 2013-08-06, 07:10
Irfan Sayed 2013-08-05, 06:53
Mohammad Tariq 2013-08-05, 09:33
Irfan Sayed 2013-08-05, 11:16
Mohammad Tariq 2013-08-05, 11:48
Irfan Sayed 2013-08-05, 11:59
manish dunani 2013-08-05, 12:30
Irfan Sayed 2013-08-06, 04:59
manish dunani 2013-08-06, 06:10
Mohammad Tariq 2013-08-06, 20:22
Mohammad Tariq 2013-08-06, 11:49
Irfan Sayed 2013-08-06, 11:56
Re: about replication
Mohammad Tariq 2013-08-06, 12:00
Manually create two directories corresponding to the values of the
dfs.name.dir and dfs.data.dir properties, and change their permissions to
755. When you start pushing data into HDFS, the data will go into the
directory specified by dfs.data.dir and the associated metadata into
dfs.name.dir. Remember: you store data in HDFS, but it is ultimately stored
on your local/native FS. You just cannot read that data directly from your
local/native FS.
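On a Unix-like setup (Cygwin gives you a similar shell on Windows), the steps above can be sketched as follows; the paths are placeholders, not the values from this thread:

```shell
# Placeholders: substitute the actual values of dfs.name.dir and dfs.data.dir.
NAME_DIR=/tmp/hadoop/dfs/name
DATA_DIR=/tmp/hadoop/dfs/data

# Create both directories and set 755 permissions, as described above.
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
```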

Warm Regards,
Tariq
cloudfront.blogspot.com
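The hdfs-site.xml quoted later in this thread points both dfs.name.dir and dfs.data.dir at the same directory (c:\\wksp). Following the advice above, a layout with two separate directories might look like this; the paths are illustrative only:

```
<property>
  <name>dfs.name.dir</name>
  <value>c:\\hadoop\\dfs\\name</value>  <!-- illustrative path -->
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\hadoop\\dfs\\data</value>  <!-- illustrative path -->
</property>
```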
On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:

> Thanks. However, I need this to be working on Windows as a project
> requirement; I will add/work on Linux later.
>
> So, at this stage, is c:\\wksp the HDFS file system, or do I need to
> create it from the command line?
>
> please suggest
>
> regards,
>
>
>
> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> Hello Irfan,
>>
>> Sorry for being unresponsive. I got stuck with some important work.
>>
>> The HDFS web UI doesn't provide the ability to create files or
>> directories. You can browse HDFS, view files, download files, etc., but
>> operations like create, move, and copy are not supported.
>>
>> These values look fine to me.
>>
>> One suggestion though: try getting a Linux machine if possible, or at
>> least use a VM. I personally feel that using Hadoop on Windows is always
>> messy.
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>>
>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>>
>>> Thanks. When I browse the file system, I get the following, and I
>>> haven't seen any "make directory" option there.
>>>
>>> Do I need to create it from the command line?
>>> Further, I have the following entries in the hdfs-site.xml file. Are
>>> they correct?
>>>
>>> <property>
>>>   <name>dfs.data.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>> <property>
>>>   <name>dfs.name.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>>
>>> please suggest
>>>
>>>
>>> [image: Inline image 1]
>>>
>>>
>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <[EMAIL PROTECTED]>wrote:
>>>
>>>> *This is where you are going wrong:*
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>
>>>> Because both of the paths you wrote are local paths. And you do not
>>>> need to copy Hadoop itself into HDFS; Hadoop is already running.
>>>>
>>>> Just check in the browser after starting your single-node cluster:
>>>>
>>>> localhost:50070
>>>>
>>>> then follow the "Browse the filesystem" link.
>>>>
>>>> If there is no directory there, create one; that is your HDFS
>>>> directory. Then copy a text file into it (there is no need to copy
>>>> Hadoop itself), because that text file holds the data you are going to
>>>> process. That is what Hadoop is for. You need to be clear about this
>>>> first; otherwise it will not work.
>>>>
>>>> *Try this:*
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>> /hdfs/directory/path
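One way to avoid the "does not exist" errors shown above is to confirm the local source path before invoking hadoop. A minimal sketch, with an illustrative path (the real hadoop invocation is left commented out, since it needs a running cluster):

```shell
# Illustrative local path; substitute your real file.
SRC=/tmp/example-input.txt
printf 'sample data\n' > "$SRC"    # stand-in for the real local file

# Only hand the path to hadoop once it is known to exist locally.
if [ -f "$SRC" ]; then
    echo "local source exists: $SRC"
    # ./bin/hadoop dfs -copyFromLocal "$SRC" /wksp
else
    echo "local source missing: $SRC" >&2
    exit 1
fi
```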
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <[EMAIL PROTECTED]>wrote:
>>>>
>>>>> Thanks. Yes, I am a newbie, but I need a Windows setup.
>>>>>
>>>>> Let me refer to the doc and link which you sent, but I need this to
>>>>> be working.
>>>>> Can you please help?
>>>>>
>>>>> regards
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>
Irfan Sayed 2013-08-20, 09:26
Mohammad Tariq 2013-08-20, 09:53
Irfan Sayed 2013-08-20, 10:44
Mohammad Tariq 2013-08-21, 11:58
Irfan Sayed 2013-08-22, 04:22
Arpit Agarwal 2013-08-22, 04:29
Irfan Sayed 2013-08-22, 05:05
Arpit Agarwal 2013-08-22, 05:56
Irfan Sayed 2013-08-22, 06:19
Irfan Sayed 2013-08-22, 11:31
Mohammad Tariq 2013-08-22, 14:26
Irfan Sayed 2013-08-23, 04:15
Olivier Renault 2013-08-23, 07:27
Irfan Sayed 2013-08-23, 09:54
Olivier Renault 2013-08-23, 10:40
Olivier Renault 2013-09-03, 07:28
Irfan Sayed 2013-09-05, 10:33