MapReduce >> mail # dev >> Building and Deploying MRv2


Re: Building and Deploying MRv2

Hi,

Finally, got all the jars built. Now it's time to run MRv2.

It would be nice if the documentation below were updated:

http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL

----------

yarn build

Ubuntu had an older version of the protoc binary. After putting the latest
protoc binary, which I built myself, on the PATH, the following error no
longer appears:

yarn_protos.proto:4:8: Option "java_generate_equals_and_hash" unknown.

Then I had to install autoconf ('sudo apt-get install autoconf'), and the
hadoop-mapreduce-1.0-SNAPSHOT-all.tar.gz file got generated.
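The protoc check and PATH fix described above can be sketched as follows (the /usr/local install prefix is an assumption; adjust to wherever the self-built protoc was installed):

```shell
# MR-279 expects protoc 2.4.x; Ubuntu's packaged protobuf-compiler may be older.
# Print the version of whichever protoc is currently first on the PATH.
if command -v protoc >/dev/null 2>&1; then
  protoc --version
fi

# After building protobuf 2.4.1 from source (./configure && make &&
# sudo make install, with the prefix assumed to be /usr/local), make sure
# the newly built binary shadows the packaged one:
export PATH=/usr/local/bin:$PATH
hash -r   # clear the shell's cached command lookup so the new protoc is found
```

Prepending /usr/local/bin avoids having to remove Ubuntu's protobuf-compiler package outright.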

----------

common build

After including Forrest 0.8, the following error no longer appears
(HADOOP-7394 has been created for this):

       [exec] Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/fop/messaging/MessageHandler
       [exec]     at
org.apache.cocoon.serialization.FOPSerializer.configure(FOPSerializer.java:122)
But now I am getting the following error (the jars are generated in the
build folder):

      [exec] validate-sitemap:
      [exec]
/home/praveensripati/Installations/apache-forrest-0.8/main/webapp/resources/schema/relaxng/sitemap-v06.rng:72:31:
error: datatype library "http://www.w3.org/2001/XMLSchema-datatypes" not
recognized

      [exec] BUILD FAILED
      [exec]
/home/praveensripati/Installations/apache-forrest-0.8/main/targets/validate.xml:158:
Validation failed, messages should have been provided.
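As Luke notes further down the thread, the Forrest-dependent docs step can be sidestepped entirely. A sketch, assuming the ant target names quoted in the thread ("veryclean", "mvn-install") match your checkout's build.xml:

```shell
# Run "ant veryclean mvn-install" in a source tree, skipping the "tar" target
# (which depends on "docs" and hence on Forrest 0.8). mvn-install is enough
# to produce the common and hdfs artifacts that the mapreduce build consumes.
build_without_docs() {
  tree="$1"
  ( cd "$tree" && ant veryclean mvn-install )
}

# Usage, from the MR-279 branch root (uncomment in a real checkout):
# build_without_docs common
# build_without_docs hdfs
```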

Thanks,
Praveen
On Saturday 18 June 2011 03:44 AM, Siddharth Seth wrote:
> Ubuntu seems to install the protocol buffer library (protobuf-compiler) as
> part of the standard install. Can you run 'protoc --version' to figure out
> which version is being used?
> If you've installed it separately, you could adjust the PATH, remove the
> package installed by Ubuntu, etc. to make sure protoc 2.4 is used.
>
> - Sid
>
> On Fri, Jun 17, 2011 at 1:15 AM, Luke Lu<[EMAIL PROTECTED]>  wrote:
>
>> MR-279 actually works fine with maven 3.0.3 (sans a few (IMO bogus)
>> warnings). You can leave out the "tar" target (which depends on the
>> "docs" target, which requires forrest 0.8) to unblock the progress, as
>> mvn-install would suffice for common and hdfs builds.
>>
>> On Thu, Jun 16, 2011 at 7:55 PM, Praveen Sripati
>> <[EMAIL PROTECTED]>  wrote:
>>> Tom,
>>>
>>> I downgraded Maven and also changed from OpenJDK to the Sun JDK, and there
>>> is still no progress. I am using Ubuntu 11.04 and could not find
>>> sun-java5-jdk in the Ubuntu repositories, so I installed sun-java6-jdk.
>>>
>>> praveensripati@praveensripati:~$ java -version
>>> java version "1.6.0_24"
>>> Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
>>>
>>> praveensripati@praveensripati:~$ mvn -version
>>> Apache Maven 2.2.1 (r801777; 2009-08-07 00:46:01+0530)
>>>
>>> Thanks,
>>> Praveen
>>>
>>>
>>> On Friday 17 June 2011 02:04 AM, Thomas Graves wrote:
>>>> I know at one time Maven 3.x didn't work, so I've been using Maven 2.x.
>>>>
>>>> Well I've never tried using java6 for java5 home but I would think it
>>>> wouldn't work.  I thought it was forrest that required java5. I would
>>>> suggest using java5.
>>>>
>>>> Tom
>>>>
>>>>
>>>> On 6/16/11 12:24 PM, "Praveen Sripati"<[EMAIL PROTECTED]> wrote:
>>>>> Tom,
>>>>>
>>>>>>> Note, it looks like your java5.home is pointing to java6?
>>>>> I have java6 on my laptop and pointed the java5.home variable to java6.
>>>>> The hadoop doc says "Java 1.6.x - preferable from Sun". Is this the
>>>>> problem?
>>>>>>> What version of protobufs are you using?
>>>>> I have protobuf 2.4.1.
>>>>>
>>>>>>> What about mvn version?
>>>>> Apache Maven 3.0.3 (r1075438; 2011-02-28 23:01:09+0530)
>>>>>
>>>>>>> So you had both common and hdfs built before doing mapreduce and
>>>>> common built before building hdfs? Or was common failing with the error
>>>>> you mention  below? If you haven't already you might simply try
>>>>> veryclean on everything and go again in order.
>>>>> I tried common first and there were some errors related to fop, but the