Maybe you've already seen this.
On Oct 9, 2013, at 2:16 PM, SF Hadoop <[EMAIL PROTECTED]>
I am preparing to deploy multiple cluster / distros of Hadoop for testing / benchmarking.
In my research I have noticed discrepancies in the JDK versions that various groups are using. For example: Hortonworks suggests JDK 6u31, CDH recommends either 6 or 7 provided you stick to certain guidelines for each, and Apache Hadoop seems to be something of a "no man's land", with a lot of people using a lot of different versions.
Does anyone have any insight they could share about how to approach choosing the best JDK release? (I'm a total Java newb, so any info / further reading you guys can provide is appreciated.)
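Whichever JDK you settle on, a first sanity check is confirming what each node actually runs before benchmarking. A minimal sketch (assumes `java` is on the PATH; to fan out across a cluster you'd wrap it in pdsh/clush or ssh, which is not shown here):

```shell
#!/bin/sh
# Report the JDK in use on this node, or note its absence.
# Hypothetical single-node check; adapt for your host list.
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo "no JDK found on PATH"
fi
```

Comparing this one-line report across nodes catches the common benchmarking pitfall of mixed JDK versions within a single cluster.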