Compressed Files NPE
Hello

I am trying to write compressed files to HDFS. When I use the Flume CompressedStream file type with the HDFS sink, it works fine. But when I write similar code outside Flume and run it, I get a NullPointerException. I must be missing something. Any help?
If I change from a simple file to a sequence file it works, and it also works if I don't use compression.
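
For reference, the working Flume sink configuration looks roughly like this (agent and sink names here are placeholders):

        agent.sinks.hdfsSink.type = hdfs
        agent.sinks.hdfsSink.hdfs.path = hdfs://namenode/flume/events
        agent.sinks.hdfsSink.hdfs.fileType = CompressedStream
        agent.sinks.hdfsSink.hdfs.codeC = gzip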

Error:
java.lang.NullPointerException
        at org.apache.hadoop.io.compress.zlib.ZlibFactory.isNativeZlibLoaded(ZlibFactory.java:64)
        at org.apache.hadoop.io.compress.GzipCodec.createOutputStream(GzipCodec.java:97)
        at cieg.atap.DiSnAbs.open(DiSnAbs.java:52)
        at cieg.atap.DiAgManager$DiAgWorker.run(DiAgManager.java:205)

Sample code:
        InputStream in = new FileInputStream(new File(inFile));
        OutputStream out = fileSystem.create(outFile);
        if (System.getProperty("hdfs.compression.type") != null) {
            // wrap the HDFS output stream with the selected codec
            CompressionCodec lCodec = getCompressionCodec(System.getProperty("hdfs.compression.type"));
            out = lCodec.createOutputStream(out);    // NPE is thrown here
        }

    public static CompressionCodec getCompressionCodec(String value)
    {
        if (value.equals("gzip"))
            return new GzipCodec();
        else if (value.equals("bzip2"))
            return new BZip2Codec();
        else
            return new SnappyCodec();
    }
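
Looking at the stack trace, my guess is that the codec never gets a Configuration: ZlibFactory.isNativeZlibLoaded(conf) reads the codec's conf, and a codec created with plain new is never configured, so conf is null. Would something like the following be the right fix? ReflectionUtils.newInstance() calls setConf() on Configurable codecs; the extra Configuration parameter is my addition:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.compress.*;
    import org.apache.hadoop.util.ReflectionUtils;

    public static CompressionCodec getCompressionCodec(String value, Configuration conf)
    {
        Class<? extends CompressionCodec> codecClass;
        if (value.equals("gzip"))
            codecClass = GzipCodec.class;
        else if (value.equals("bzip2"))
            codecClass = BZip2Codec.class;
        else
            codecClass = SnappyCodec.class;
        // newInstance() calls setConf() on Configurable codecs,
        // so the codec no longer carries a null Configuration
        return ReflectionUtils.newInstance(codecClass, conf);
    }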

Thanks
Mallikharjuna Rao
Director, Software Engineering
603-791-8604 (W), 603-296-7721 (C)