Re: Building LZO on hadoop
Actually, if one installs the latest liblzo and sets CFLAGS, LDFLAGS
and LFLAGS correctly, things work fine.
Saptarshi Guha
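
For concreteness, a minimal sketch of what "setting the flags correctly" can look like, assuming LZO was installed under a custom prefix such as $CUSTROOT (the prefix used in the original post); the -I/-L form and the CPPFLAGS export are assumptions on my part, not taken from the thread:

  # hypothetical install prefix for liblzo
  export CUSTROOT=/usr/local
  # point the preprocessor, compiler and linker at that prefix
  export CFLAGS="-I$CUSTROOT/include"
  export CPPFLAGS="-I$CUSTROOT/include"
  export LDFLAGS="-L$CUSTROOT/lib"
  # then rebuild the native code from the hadoop-core source tree
  ant -Dcompile.native=true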

On Wed, Apr 1, 2009 at 3:55 PM, Saptarshi Guha <[EMAIL PROTECTED]> wrote:
> Fixed. In the configure script under src/native/, in the block
>   echo 'int main(int argc, char **argv){return 0;}' > conftest.c
>   if test -z "`${CC} ${LDFLAGS} -o conftest conftest.c -llzo2 2>&1`"; then
>     if test ! -z "`which objdump | grep -v 'no objdump'`"; then
>       ac_cv_libname_lzo2="`objdump -p conftest | grep NEEDED | grep lzo2 | sed 's/\W*NEEDED\W*\(.*\)\W*$/\"\1\"/'`"
>     elif test ! -z "`which ldd | grep -v 'no ldd'`"; then
>       ac_cv_libname_lzo2="`ldd conftest | grep lzo2 | sed 's/^[^A-Za-z0-9]*\([A-Za-z0-9\.]*\)[^A-Za-z0-9]*=>.*$/\"\1\"/'`"
>     else
>       { { echo "$as_me:$LINENO: error: Can't find either 'objdump' or 'ldd' to compute the dynamic library for '-llzo2'" >&5
> echo "$as_me: error: Can't find either 'objdump' or 'ldd' to compute the dynamic library for '-llzo2'" >&2;}
>       { (exit 1); exit 1; }; }
>     fi
>   else
>     ac_cv_libname_lzo2=libnotfound.so
>   fi
>   rm -f conftest*
>
> change lzo2 to lzo.so.2 (again, this depends on what the user has); also set
> CFLAGS and LDFLAGS to include your LZO libs/includes.
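
Purely as an illustration of the check above: the same objdump/ldd trick can be run by hand to see which dynamic-library name the substitution should produce on a particular system (this assumes GNU binutils and that the link step succeeds; it is not part of the original fix):

  echo 'int main(int argc, char **argv){return 0;}' > conftest.c
  cc $CFLAGS $LDFLAGS -o conftest conftest.c -llzo2
  # the NEEDED entry is the name the configure test tries to compute,
  # e.g. liblzo2.so.2 or liblzo.so.2 depending on what is installed
  objdump -p conftest | grep NEEDED | grep lzo
  # fallback when objdump is not available
  ldd conftest | grep lzo
  rm -f conftest*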
>
>
>
> Saptarshi Guha
>
>
>
> On Wed, Apr 1, 2009 at 2:29 PM, Saptarshi Guha <[EMAIL PROTECTED]> wrote:
>> I checked out hadoop-core-0.19
>> export CFLAGS=$CUSTROOT/include
>> export LDFLAGS=$CUSTROOT/lib
>>
>> (they contain lzo which was built with --shared)
>>>ls $CUSTROOT/include/lzo/
>> lzo1a.h  lzo1b.h  lzo1c.h  lzo1f.h  lzo1.h  lzo1x.h  lzo1y.h  lzo1z.h
>> lzo2a.h  lzo_asm.h  lzoconf.h  lzodefs.h  lzoutil.h
>>
>>>ls $CUSTROOT/lib/
>> liblzo2.so  liblzo.a  liblzo.la  liblzo.so  liblzo.so.1  liblzo.so.2
>> liblzo.so.2.0.0
>>
>> I then run (from hadoop-core-0.19.1/)
>> ant -Dcompile.native=true
>>
>> I get messages like the following (many others like this):
>>     [exec] configure: WARNING: lzo/lzo1x.h: accepted by the compiler, rejected by the preprocessor!
>>     [exec] configure: WARNING: lzo/lzo1x.h: proceeding with the compiler's result
>>     [exec] checking for lzo/lzo1x.h... yes
>>     [exec] checking Checking for the 'actual' dynamic-library for '-llzo2'... (cached)
>>     [exec] checking lzo/lzo1y.h usability... yes
>>     [exec] checking lzo/lzo1y.h presence... no
>>     [exec] configure: WARNING: lzo/lzo1y.h: accepted by the compiler, rejected by the preprocessor!
>>     [exec] configure: WARNING: lzo/lzo1y.h: proceeding with the compiler's result
>>     [exec] checking for lzo/lzo1y.h... yes
>>     [exec] checking Checking for the 'actual' dynamic-library for '-llzo2'... (cached)
>>
>> and finally,
>> ive/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c  -fPIC -DPIC -o .libs/LzoCompressor.o
>>     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c: In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
>>     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:137: error: expected expression before ',' token
>>
>>
>> Any ideas?
>> Saptarshi Guha
>>
>