The fact that you're mentioning the avro plugin indicates that you're
working with obsolete patches. The current code uses avro-maven-plugin
1.5.1, which is available via Maven Central.
Please follow the current instructions; I updated the MAPREDUCE-279
jira's description to point to them a couple of days ago.
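Since the plugin is on Maven Central, nothing needs to be built locally; declaring the standard coordinates in the pom should be enough. A minimal sketch (the execution configuration here is illustrative, not copied from the MR-279 pom):

```xml
<!-- Sketch only: the <executions> details are illustrative,
     not taken from the branch's actual pom. -->
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.5.1</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```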
On Sat, Jun 18, 2011 at 1:20 AM, Thomas Anderson
<[EMAIL PROTECTED]> wrote:
> Thanks. Recompiling common and hdfs with the following commands
> seems to solve the problem below.
> 1st, compile common:
> ant veryclean mvn-install (installs hadoop-common-0.22.0-SNAPSHOT.jar
> to the m2 repository)
> 2nd, compile hdfs:
> ant veryclean mvn-install -Dresolvers=internal (installs
> hadoop-hdfs-0.22.0-SNAPSHOT.jar to the repository)
> [javac] mr-279/hdfs/build.xml:339: warning: 'includeantruntime' was
> not set, defaulting to build.sysclasspath=last; set to false for
> repeatable builds
> [javac] Compiling 237 source files to mr-279/hdfs/build/classes
> [javac] mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:119:
> cannot find symbol
> Now the issue seems to be how to build avro-maven-plugin-1.4.0.
> Executing the commands
> export MAVEN_OPTS=-Xmx512m
> mvn clean install assembly:assembly
> produces the error
> [ERROR] Failed to execute goal
> org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on
> project yarn-api: Command execution failed. Process exited with an
> error: 127(Exit value: 127) -> [Help 1]
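Exit value 127 is the shell's "command not found" status, so this failure usually means the external tool that the exec-maven-plugin's generate-sources step shells out to (likely protoc in this setup, though that is an assumption worth checking against the pom) is not on the PATH. A quick sketch:

```shell
# 127 is the shell's standard "command not found" exit status:
sh -c 'some-nonexistent-command' 2>/dev/null
echo $?   # prints 127

# So verify the tool the build invokes is actually on the PATH, e.g.
# (assuming the generate-sources goal runs protoc):
which protoc || echo "protoc not found - check your PATH"
```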
> But the avro maven plugin obtained from
> https://github.com/phunt/avro-maven-plugin.git only provides 1.0. Would
> modifying the version in the pom (pointing it to 1.4.0) work? Or what
> is the next step for building yarn-api?
> Thanks for the help.
> On Fri, Jun 17, 2011 at 9:09 PM, Thomas Graves <[EMAIL PROTECTED]> wrote:
>> Did you build common and hdfs before doing mvn install in mapreduce? You
>> have to build them in order as stated in the INSTALL doc - common, hdfs,
>> then mapreduce.
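Putting the thread's commands together, the build order can be sketched as follows (the layout is an assumption: an mr-279 checkout with common/, hdfs/ and mapreduce/ side by side; `mvn-install` puts each SNAPSHOT jar in the local m2 repository so the next project can resolve it):

```shell
# Sketch of the build order described in this thread.
cd mr-279/common
ant veryclean mvn-install                       # installs hadoop-common-0.22.0-SNAPSHOT.jar

cd ../hdfs
ant veryclean mvn-install -Dresolvers=internal  # installs hadoop-hdfs-0.22.0-SNAPSHOT.jar

cd ../mapreduce
export MAVEN_OPTS=-Xmx512m
mvn clean install assembly:assembly             # builds yarn and mapreduce last
```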
>> On 6/17/11 3:33 AM, "Thomas Anderson" <[EMAIL PROTECTED]> wrote:
>>> I was not aware that the source I downloaded (a few months ago) had
>>> become obsolete. I have now switched by doing svn update, which
>>> solves the stale-code issue.
>>> I then followed the instructions at mapreduce/INSTALL, which point
>>> to the README for installing yarn's dependencies first. From
>>> searching the mailing list, it seems the avro plugin does not need
>>> to be installed manually, so I only installed protobuf 2.4.1
>>> (configure / make / make install works ok). But after that, a mvn
>>> install under mapreduce produces the error
>>> Failed to execute goal on project yarn-api: Could not resolve
>>> dependencies for project org.apache.hadoop:yarn-api:jar:1.0-SNAPSHOT:
>>> Failure to find org.apache.hadoop:hadoop-hdfs:jar:0.22.0-SNAPSHOT in
>>> https://repository.jboss.org/nexus/content/groups/public-jboss/ was
>>> cached in the local repository, resolution will not be reattempted
>>> until the update interval of jboss-public-repository-group has elapsed
>>> or updates are forced -> [Help 1]
>>> That looks like the hdfs jar artifact is missing. So I cd to
>>> mr-279/hdfs, and executing `ant clean package` generates the error
>>> message shown in the compile-hdfs-classes section.
>>> What is the right order/procedure to successfully compile mr-279? Or
>>> will there be an update to the instructions?
>>> Thanks for the help.
>>> . mapreduce/INSTALL.
>>> . README.
>>> [javac] mr-279/hdfs/build.xml:339: warning: 'includeantruntime'
>>> was not set, defaulting to build.sysclasspath=last; set to false for
>>> repeatable builds
>>> [javac] Compiling 237 source files to mr-279/hdfs/build/classes