Building and Deploying MRv2


Re: Building and Deploying MRv2
MR-279 actually works fine with maven 3.0.3 (save for a few warnings
that are, IMO, bogus). You can leave out the "tar" target (which depends
on the "docs" target, which in turn requires forrest 0.8) to unblock
progress, as mvn-install suffices for the common and hdfs builds.
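
For reference, a minimal sketch of that flow, assuming the usual MR-279
checkout layout and the mvn-install ant target discussed in this thread
(paths illustrative):

$ cd common
$ ant veryclean mvn-install   # installs common artifacts into the local
                              # maven repo; skips the tar/docs targets
$ cd ../hdfs
$ ant veryclean mvn-install   # same for hdfs; no forrest needed once
                              # docs are skipped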

On Thu, Jun 16, 2011 at 7:55 PM, Praveen Sripati
<[EMAIL PROTECTED]> wrote:
> Tom,
>
> I downgraded maven and also changed from OpenJDK to the Sun JDK, but
> there is still no progress. I am using Ubuntu 11.04 and could not find
> sun-java5-jdk in the Ubuntu repositories, so I installed sun-java6-jdk.
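
For anyone hitting the same packaging gap: on Ubuntu 11.04 the Sun JDK
lives in the Canonical partner repository rather than the main archive,
so something along these lines should install it ("natty" being the
11.04 release name):

$ sudo add-apt-repository "deb http://archive.canonical.com/ natty partner"
$ sudo apt-get update
$ sudo apt-get install sun-java6-jdk   # prompts to accept the Sun license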
>
> praveensripati@praveensripati:~$ java -version
> java version "1.6.0_24"
> Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
>
> praveensripati@praveensripati:~$ mvn -version
> Apache Maven 2.2.1 (r801777; 2009-08-07 00:46:01+0530)
>
> Thanks,
> Praveen
>
>
> On Friday 17 June 2011 02:04 AM, Thomas Graves wrote:
>>
>> I know at one time maven 3.x didn't work so I've been using maven 2.x.
>>
>> Well, I've never tried using java6 for java5.home, but I would think it
>> wouldn't work. I thought it was forrest that required java5. I would
>> suggest using java5.
>>
>> Tom
>>
>>
>> On 6/16/11 12:24 PM, "Praveen Sripati"<[EMAIL PROTECTED]>  wrote:
>>
>>> Tom,
>>>
>>>>> Note, it looks like your java5.home is pointing to java6?
>>>
>>> I have java6 on my laptop and pointed java5.home variable to java6. The
>>> hadoop doc says "Java 1.6.x - preferable from Sun". Is this the problem?
>>>
>>>>> What version of protobufs are you using?
>>>
>>> I have protobuf 2.4.1.
>>>
>>>>> What about mvn version?
>>>
>>> Apache Maven 3.0.3 (r1075438; 2011-02-28 23:01:09+0530)
>>>
>>>>> So you had both common and hdfs built before doing mapreduce and
>>>>> common built before building hdfs? Or was common failing with the
>>>>> error you mention below? If you haven't already you might simply
>>>>> try veryclean on everything and go again in order.
>>>
>>> I tried common first and there were some errors related to fop, but the
>>> common jars were created, so I started with hdfs and it was successful.
>>> Then I started the yarn build which led to the
>>> java_generate_equals_and_hash error.
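
A java_generate_equals_and_hash failure usually means the protoc the
build picked up is older than that option (per Tom's note below, it
appeared in protobuf 2.4.0), so it is worth confirming which binary is
actually first on the PATH:

$ which protoc
$ protoc --version   # should report libprotoc 2.4.0 or newer

My assumption is that the yarn .proto files carry a line like
"option java_generate_equals_and_hash = true;", which an older protoc
rejects with exactly this error.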
>>>
>>> Thanks,
>>> Praveen
>>>
>>>
>>> On Thursday 16 June 2011 09:54 PM, Thomas Graves wrote:
>>>>
>>>> Note, it looks like your java5.home is pointing to java6?
>>>>
>>>> I've never seen this particular error. The java_generate_equals_and_hash
>>>> option seems to have been added in protobuf 2.4.0. What version of
>>>> protobufs are you using? The instructions say to use at least 2.4.0a;
>>>> I'm using 2.4.1 right now.
>>>>
>>>> You need to define the following (I use a build.properties file). These
>>>> are the versions I'm currently using. All of these are just downloaded
>>>> from the corresponding websites. Some links to those can be found here:
>>>> http://yahoo.github.com/hadoop-common/installing.html
>>>>
>>>> java5.home=/home/tgraves/hadoop/jdk1.5.0_22/
>>>> forrest.home=/home/tgraves/hadoop/apache-forrest-0.8
>>>> ant.home=/home/tgraves/hadoop/apache-ant-1.8.2
>>>> xercescroot=/home/tgraves/hadoop/xerces-c-src_2_8_0
>>>> eclipse.home=/home/tgraves/hadoop/eclipse
>>>> findbugs.home=/home/tgraves/hadoop/findbugs-1.3.9
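
These can also be passed directly on the ant command line with -D flags
instead of a build.properties file (standard ant property handling), e.g.:

$ ant mvn-install -Djava5.home=/home/tgraves/hadoop/jdk1.5.0_22 \
      -Dforrest.home=/home/tgraves/hadoop/apache-forrest-0.8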
>>>>
>>>> I thought this was the same as for trunk but perhaps I'm mistaken.
>>>>
>>>> What about mvn version?
>>>> /home/y/libexec/maven/bin/mvn --version
>>>> Apache Maven 2.2.1 (r801777; 2009-08-06 19:16:01+0000)
>>>>
>>>> So you had both common and hdfs built before doing mapreduce and common
>>>> built before building hdfs? Or was common failing with the error you
>>>> mention below? If you haven't already you might simply try veryclean on
>>>> everything and go again in order.
>>>>
>>>> Tom
>>>>
>>>>
>>>> On 6/16/11 8:10 AM, "Praveen Sripati"<[EMAIL PROTECTED]>   wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> The hdfs build was successful after adding the -Dforrest.home
>>>>> property to the ant command.
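
For concreteness, that invocation would look something like the
following (forrest path illustrative):

$ cd hdfs
$ ant veryclean mvn-install -Dforrest.home=/path/to/apache-forrest-0.8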
>>>>>
>>>>> ***********
>>>>>
>>>>> When I started the mapreduce build, I got the below error.