HBase >> mail # user >> Building Hadoop for HBase


Building Hadoop for HBase
Yes, I think I could use some clues.  The HBase instructions send me to
(http://wiki.apache.org/hadoop/HowToRelease), and I looked at
(http://wiki.apache.org/hadoop/HowToRelease#Building), which is daunting
and confusing (I thought I was supposed to build with Java 6, not 5).  I
went looking for instructions on building for my own purposes rather
than making a release, and found
(http://wiki.apache.org/hadoop/HowToContribute).  I fetched
branch-0.20-append from SVN and then built with this command (copied
from HowToContribute):
ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean test tar
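Incidentally, since ant only runs the targets named on the command line
(plus their dependencies), dropping "test" from the target list should
produce the tarball without the multi-hour test run --- assuming the tar
target does not itself depend on test, which I have not verified:

```shell
# Same build, minus the test target; ant runs only the named targets
# (and their dependencies), so the JUnit suite is skipped --- assuming
# "tar" does not itself depend on "test" in this build.xml.
ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean tar
```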

The build itself produces no complaints, but several of the tests fail
--- over the course of about 3 hours.  The typescript ends like this:
BUILD FAILED
/root/apachedev/hadoop-common/build.xml:817: The following error occurred
while executing this line:
/root/apachedev/hadoop-common/build.xml:806: The following error occurred
while executing this line:
/root/apachedev/hadoop-common/src/contrib/build.xml:48: The following
error occurred while executing this line:
/root/apachedev/hadoop-common/src/contrib/streaming/build.xml:40: The
following error occurred while executing this line:
/root/apachedev/hadoop-common/src/contrib/build-contrib.xml:245: Tests
failed!

Total time: 184 minutes 11 seconds

The following searches through my typescript show the failures and errors.

# egrep -C 1 "Failures: [^0]" typescript.txt
    [junit] Running org.apache.hadoop.cli.TestCLI
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 32.243 sec
    [junit] Test org.apache.hadoop.cli.TestCLI FAILED
--
    [junit] Running org.apache.hadoop.fs.TestLocalDirAllocator
    [junit] Tests run: 5, Failures: 3, Errors: 0, Time elapsed: 0.493 sec
    [junit] Test org.apache.hadoop.fs.TestLocalDirAllocator FAILED
--
    [junit] Running org.apache.hadoop.mapred.lib.TestCombineFileInputFormat
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 5.061 sec
    [junit] Test org.apache.hadoop.mapred.lib.TestCombineFileInputFormat FAILED

# egrep -C 1 "Errors: [^0]" typescript.txt
    [junit] Running org.apache.hadoop.hdfs.TestDistributedFileSystem
    [junit] Tests run: 4, Failures: 0, Errors: 1, Time elapsed: 9.057 sec
    [junit] Test org.apache.hadoop.hdfs.TestDistributedFileSystem FAILED
--
    [junit] Running org.apache.hadoop.hdfs.TestFileAppend4
    [junit] Tests run: 18, Failures: 0, Errors: 1, Time elapsed: 315.85 sec
    [junit] Test org.apache.hadoop.hdfs.TestFileAppend4 FAILED
--
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.datanode.TestDiskError FAILED (timeout)
--
    [junit] Running org.apache.hadoop.streaming.TestStreamingBadRecords
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.streaming.TestStreamingBadRecords FAILED (timeout)
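
Incidentally, both searches can be folded into one pattern that matches
either a nonzero Failures or a nonzero Errors count; a small sketch over
sample lines in the same shape as the junit output above:

```shell
# Illustrative sample in the shape the junit output takes in the typescript:
cat > typescript.txt <<'EOF'
    [junit] Running org.apache.hadoop.cli.TestCLI
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 32.243 sec
    [junit] Running org.apache.hadoop.fs.TestDU
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 1.001 sec
    [junit] Running org.apache.hadoop.hdfs.TestDistributedFileSystem
    [junit] Tests run: 4, Failures: 0, Errors: 1, Time elapsed: 9.057 sec
EOF

# One pass: any "Tests run:" line whose Failures or Errors count starts
# with a nonzero digit marks a failed suite; -B 1 keeps the suite name.
egrep -B 1 "Tests run:.*(Failures: [1-9]|Errors: [1-9])" typescript.txt
```

Here -B 1 keeps the preceding "Running" line, so the failing suite's
name shows up, much like the -C 1 above.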

Also, I am puzzled that HowToContribute does not mention building the
native library while HowToRelease does.
(http://hadoop.apache.org/common/docs/current/native_libraries.html)
says "Hadoop has native implementations of certain components for
performance reasons and for non-availability of Java implementations"
--- which suggests to me that the native library is not optional (at
least for mapreduce, which is the client mentioned on that page).  What
is going on here?
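
For what it's worth, that native_libraries page also says the library
can be built from the standard build.xml by setting compile.native; as I
read the page, the invocation is along these lines (a sketch --- the
target name here is only an example, and the page lists native
prerequisites, such as zlib and the GNU autotools, that I am assuming
are installed):

```shell
# Sketch per the native_libraries page: pass compile.native=true so the
# standard build also compiles the native libhadoop code; "tar" is only
# an example target, and the prerequisites that page lists must be
# installed first.
ant -Dcompile.native=true tar
```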

Thanks,
Mike Spreitzer