MapReduce, mail # user - Why the official Hadoop Documents are so messy?


Re: Why the official Hadoop Documents are so messy?
Oleg Zhurakousky 2013-01-08, 12:13
Just a little clarification:
This is NOT "how open source works" by any means, as there are many Open Source projects with well-written and well-maintained documentation.
It all comes down to two Open Source models:
1. ASF Open Source - a pure democracy, or maybe even anarchy, with no governing body (individual or corporate) other than the ASF procedures/guidelines themselves
2. Stewardship-based Open Source - controlled and managed by an individual or company

Obviously, in the second model that individual or company has a vested interest in promoting the product, so things like documentation tend to be much crisper than their ASF counterparts. However, the Stewardship-based model is also much tighter about what goes in, code quality, etc., than its ASF counterpart, which allows a greater flow of free ideas from the community. So both are valid, both are open source, and both need to exist; we developers just have to deal with it. After all, it's Open Source, and the code is always a good source of documentation.

Cheers
Oleg

On Jan 8, 2013, at 6:59 AM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:

> Hello there,
>
>      Thank you for the comments. But, just to let you know,
> it's community work, and no one in particular can be held
> responsible for these kinds of small things. This is how open
> source works. The guys working on Hadoop have a lot
> of things to do. In spite of that, they are giving their best, and
> in the process these kinds of things sometimes happen.
>
> I really appreciate your effort. But rather than this, you could
> raise a JIRA when you find something wrong and either fix it
> yourself or let somebody else fix it.
>
> Many thanks.
>
>
> P.S.: Please don't take it the wrong way.
>
>
> Best Regards,
> Tariq
> +91-9741563634
> https://mtariq.jux.com/
>
>
> On Tue, Jan 8, 2013 at 5:05 PM, javaLee <[EMAIL PROTECTED]> wrote:
> For example, look at the documents about the HDFS shell guide:
>
> In 0.17, the prefix of the HDFS shell is hadoop dfs:
> http://hadoop.apache.org/docs/r0.17.2/hdfs_shell.html
>
> In 0.19, the prefix of the HDFS shell is hadoop fs:
> http://hadoop.apache.org/docs/r0.19.1/hdfs_shell.html#lsr
>
> In 1.0.4, the prefix of the HDFS shell is hdfs dfs:
> http://hadoop.apache.org/docs/r1.0.4/file_system_shell.html#ls
>
> Reading the official Hadoop documents is such a pain.
> As an end user, I am confused...
>
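For what it's worth, the three prefixes compared above all reach the same file system shell. A minimal sketch of an equivalent directory listing under each form (assuming a configured Hadoop installation on the PATH; "hadoop dfs" is the older spelling that was later deprecated):

    # List the HDFS root directory; all three invocations run the same shell code.
    hadoop dfs -ls /    # old form, deprecated in later releases
    hadoop fs -ls /     # generic form, works with any Hadoop-supported file system
    hdfs dfs -ls /      # HDFS-specific form used in newer docs

In later releases, "hadoop fs" remains the generic entry point while "hdfs dfs" is the HDFS-specific one, which is part of why the documented prefix shifts between versions.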
