MapReduce, mail # dev - Build failure in map reduce trunk


Re: Build failure in map reduce trunk
Praveen Sripati 2011-08-26, 03:40
Tharindu,

Try to get more information for debugging.

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
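
For example (just a sketch, assuming you are running the build from your
MapReduce trunk checkout with the same goals as before; substitute whatever
module or profile flags you are actually passing):

    mvn clean install -e                        # print the full stack trace of the failure
    mvn clean install -X                        # enable full debug logging (very verbose)
    mvn clean install -X -e > build.log 2>&1    # capture all output to a file to share on the list

The -X output should show exactly which plugin and goal the failure comes from.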

Thanks,
Praveen

On Fri, Aug 26, 2011 at 1:55 AM, Tharindu Mathew <[EMAIL PROTECTED]> wrote:

> Thanks Praveen.
>
> I managed to proceed further. Now I'm stuck at this point. I'd appreciate it
> if you can tell me what I'm doing wrong.
>
> Stacktrace:
>
> [INFO] --- make-maven-plugin:1.0-beta-1:configure (make) @
> hadoop-yarn-server-nodemanager ---
> [INFO] checking for a BSD-compatible install... /usr/bin/install -c
> [INFO] checking whether build environment is sane... yes
> [INFO] checking for a thread-safe mkdir -p... ./install-sh -c -d
> [INFO] checking for gawk... no
> [INFO] checking for mawk... no
> [INFO] checking for nawk... no
> [INFO] checking for awk... awk
> [INFO] checking whether make sets $(MAKE)... yes
> [INFO] ./configure: line 2226: CHECK_INSTALL_CFLAG: command not found
> [INFO] ./configure: line 2227: HADOOP_UTILS_SETUP: command not found
> [INFO] checking for gcc... gcc
> [INFO] checking for C compiler default output file name... a.out
> [INFO] checking whether the C compiler works... yes
> [INFO] checking whether we are cross compiling... no
> [INFO] checking for suffix of executables...
> [INFO] checking for suffix of object files... o
> [INFO] checking whether we are using the GNU C compiler... yes
> [INFO] checking whether gcc accepts -g... yes
> [INFO] checking for gcc option to accept ISO C89... none needed
> [INFO] checking for style of include used by make... GNU
> [INFO] checking dependency style of gcc... gcc3
> [INFO] checking whether gcc and cc understand -c and -o together... yes
> [INFO] checking how to run the C preprocessor... gcc -E
> [INFO] checking for grep that handles long lines and -e... /usr/bin/grep
> [INFO] checking for egrep... /usr/bin/grep -E
> [INFO] checking for ANSI C header files... yes
> [INFO] checking for sys/types.h... yes
> [INFO] checking for sys/stat.h... yes
> [INFO] checking for stdlib.h... yes
> [INFO] checking for string.h... yes
> [INFO] checking for memory.h... yes
> [INFO] checking for strings.h... yes
> [INFO] checking for inttypes.h... yes
> [INFO] checking for stdint.h... yes
> [INFO] checking for unistd.h... yes
> [INFO] checking for unistd.h... (cached) yes
> [INFO] checking for stdbool.h that conforms to C99... yes
> [INFO] checking for _Bool... yes
> [INFO] checking for an ANSI C-conforming const... yes
> [INFO] checking for off_t... yes
> [INFO] checking for size_t... yes
> [INFO] checking whether strerror_r is declared... yes
> [INFO] checking for strerror_r... yes
> [INFO] checking whether strerror_r returns char *... no
> [INFO] checking for mkdir... yes
> [INFO] checking for uname... yes
> [INFO] configure: creating ./config.status
> [INFO] config.status: creating Makefile
> [INFO] config.status: executing depfiles commands
> [INFO]
> [INFO] --- make-maven-plugin:1.0-beta-1:make-install (install) @
> hadoop-yarn-server-nodemanager ---
> [INFO] depbase=`echo impl/configuration.o | sed
> 's|[^/]*$|.deps/&|;s|\.o$||'`;\
> [INFO] gcc -DPACKAGE_NAME=\"linux-container-executor\"
> -DPACKAGE_TARNAME=\"linux-container-executor\" -DPACKAGE_VERSION=\"1.0.0\"
> -DPACKAGE_STRING=\"linux-container-executor\ 1.0.0\" -DPACKAGE_BUGREPORT=\"
> [EMAIL PROTECTED]\" -D_GNU_SOURCE=1
> -DPACKAGE=\"linux-container-executor\" -DVERSION=\"1.0.0\" -DSTDC_HEADERS=1
> -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1
> -DHAVE_STRING_H=1
> -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1
> -DHAVE_UNISTD_H=1 -DHAVE_UNISTD_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1
> -DHAVE_DECL_STRERROR_R=1 -DHAVE_STRERROR_R=1 -DHAVE_MKDIR=1 -DHAVE_UNAME=1
> -I.    -I./impl -Wall -g -Werror -DHADOOP_CONF_DIR= -MT
> impl/configuration.o
> -MD -MP -MF $depbase.Tpo -c -o impl/configuration.o impl/configuration.c
> &&\
> [INFO] mv -f $depbase.Tpo $depbase.Po