Hadoop, mail # user - Re: about replication


Re: about replication
Mohammad Tariq 2013-08-23, 07:38
Are you running the DN on both machines? Could you please show me your DN
logs?

Also, consider Olivier's suggestion. It's definitely a better option.

Warm Regards,
Tariq
cloudfront.blogspot.com
On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault
<[EMAIL PROTECTED]> wrote:

> Irfu,
>
> If you want to quickly get Hadoop running on the Windows platform, you may
> want to try our distribution for Windows. You will be able to find the MSI
> on our website.
>
> Regards
> Olivier
> On 23 Aug 2013 05:15, "Irfan Sayed" <[EMAIL PROTECTED]> wrote:
>
>> Thanks.
>> OK, I think I need to change the plan here.
>> Let me create two environments: 1) entirely Windows, 2) entirely Unix.
>>
>> On Windows I still have to try Hadoop and see how it works;
>> on Unix, it is already known to work fine.
>>
>> So, on Windows, here is the setup:
>>
>> namenode : Windows 2012 R2
>> datanode : Windows 2012 R2
>>
>> Now, the exact problem is:
>> 1: the datanode is not getting started
>> 2: replication: if I put any file/folder on any datanode, it should get
>> replicated to all other available datanodes
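
As background for point 2: in HDFS a client writes files through the namenode,
and the namenode places block replicas on the available datanodes according to
dfs.replication; files are not copied from one datanode's local disk to the
others. A minimal sketch of how both points could be checked from the command
line, assuming a working Hadoop 1.x client on the PATH (the file path below is
hypothetical):

    # list live datanodes; a datanode that did not start will be missing here
    hadoop dfsadmin -report

    # put a file into HDFS, then check how many replicas each block received
    hadoop fs -put sample.txt /user/irfan/sample.txt
    hadoop fsck /user/irfan/sample.txt -files -blocks -locations
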
>>
>> regards
>>
>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>>> Seriously?? You are planning to develop something using Hadoop on
>>> Windows? Not a good idea. Anyway, could you please show me your log files?
>>> I also need some additional info:
>>> - The exact problem you are facing right now
>>> - Your cluster summary (no. of nodes, etc.)
>>> - Your latest configuration files
>>> - Your /etc/hosts file
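
A minimal sketch of how that information could be gathered on a Hadoop 1.x
installation (the $HADOOP_HOME/conf paths are assumptions; adjust them to the
actual install, and on Windows the hosts file lives under
C:\Windows\System32\drivers\etc\hosts):

    # cluster summary: live/dead datanodes, capacity, number of nodes
    hadoop dfsadmin -report

    # current configuration files
    cat $HADOOP_HOME/conf/core-site.xml $HADOOP_HOME/conf/hdfs-site.xml

    # hosts file on Linux
    cat /etc/hosts
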
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>>>
>>>> OK, thanks.
>>>> Now I need to start with the all-Windows setup first, as our product will
>>>> be based on Windows.
>>>> So please tell me how to resolve the issue:
>>>>
>>>> the datanode is not starting. Please suggest.
>>>>
>>>> regards,
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing that.
>>>>> But it is not a very wise setup.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <[EMAIL PROTECTED]> wrote:
>>>>>>
>>>>>>> Thanks.
>>>>>>> Can I have a setup like this: the namenode on Linux (the flavour may be
>>>>>>> RHEL, CentOS, Ubuntu, etc.) and the datanodes a combination of any OS
>>>>>>> (Windows, Linux, Unix, etc.)?
>>>>>>>
>>>>>>> However, my doubt is: since the file systems of the two systems (Windows
>>>>>>> and Linux) are different, can datanodes from these systems not be part of
>>>>>>> a single cluster? Do I have to make a separate Windows cluster and a
>>>>>>> separate Unix cluster?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>> [EMAIL PROTECTED]> wrote:
>>>>>>>
>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>> progress for native Windows support; however, there are no official releases
>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>> Linux if you are new to it.
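
A minimal sketch of getting a release running on Linux in pseudo-distributed
mode, assuming a Hadoop 1.x tarball (the version number is an assumption) and
that core-site.xml/hdfs-site.xml have already been filled in per the
quickstart documentation:

    # unpack the release
    tar -xzf hadoop-1.2.1.tar.gz && cd hadoop-1.2.1

    # format the namenode's storage directory, then start HDFS
    bin/hadoop namenode -format
    bin/start-dfs.sh

    # verify that the datanode registered with the namenode
    bin/hadoop dfsadmin -report
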
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <[EMAIL PROTECTED]
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Thanks.
>>>>>>>>> Here is what I did:
>>>>>>>>> I stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>> command,
>>>>>>>>> then deleted all the pid files for the namenodes and datanodes.
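
For reference, a sketch of those two steps as shell commands, assuming the
default pid location of /tmp (Hadoop writes hadoop-<user>-<daemon>.pid there
unless HADOOP_PID_DIR is set; under Cygwin the path may differ):

    # stop the HDFS daemons (run from the Hadoop bin/ directory)
    ./stop-dfs.sh

    # remove stale pid files left behind by crashed daemons
    rm -f /tmp/hadoop-*-namenode.pid /tmp/hadoop-*-datanode.pid
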