Flume >> mail # dev >> Compressed Files NPE

Compressed Files NPE

I am trying to write compressed files to HDFS. When I use Flume's HdfsSink with the CompressedStream type, it works fine. But when I write equivalent code myself and run it, I get a NullPointerException. I must be missing something; any help?
If I switch from a simple file to a sequence file, it works. It also works if I do not use compression.

java.lang.NullPointerException
        at org.apache.hadoop.io.compress.zlib.ZlibFactory.isNativeZlibLoaded(ZlibFactory.java:64)
        at org.apache.hadoop.io.compress.GzipCodec.createOutputStream(GzipCodec.java:97)
        at cieg.atap.DiSnAbs.open(DiSnAbs.java:52)
        at cieg.atap.DiAgManager$DiAgWorker.run(DiAgManager.java:205)

        // Open the local input file and the HDFS output stream.
        InputStream in = new FileInputStream(new File(inFile));
        OutputStream out = fileSystem.create(outFile);
        if (System.getProperty("hdfs.compression.type") != null) {
            CompressionCodec lCodec = getCompressionCodec(System.getProperty("hdfs.compression.type"));
            out = lCodec.createOutputStream(out);   // NPE is thrown here
        }

    public static CompressionCodec getCompressionCodec(String value) {
        if (value.equals("gzip"))
            return new GzipCodec();
        else if (value.equals("bzip2"))
            return new BZip2Codec();
        else
            return new SnappyCodec();
    }
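A likely cause, for anyone hitting the same trace: a codec constructed with a bare `new` has no Hadoop Configuration, and `GzipCodec.createOutputStream` consults `ZlibFactory.isNativeZlibLoaded(conf)`, which dereferences that null conf. A sketch of the factory rewritten to inject the Configuration via `ReflectionUtils.newInstance` (the class name `CodecFactory` is illustrative; the codec classes are Hadoop's standard ones):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.util.ReflectionUtils;

public class CodecFactory {
    // Instantiate the codec through ReflectionUtils so its Configuration
    // is set; a bare "new GzipCodec()" leaves conf null, and
    // ZlibFactory.isNativeZlibLoaded(conf) then throws an NPE.
    public static CompressionCodec getCompressionCodec(String value, Configuration conf) {
        Class<? extends CompressionCodec> codecClass;
        if (value.equals("gzip"))
            codecClass = GzipCodec.class;
        else if (value.equals("bzip2"))
            codecClass = BZip2Codec.class;
        else
            codecClass = SnappyCodec.class;
        return ReflectionUtils.newInstance(codecClass, conf);
    }
}
```

For codecs that extend DefaultCodec, such as GzipCodec, calling `setConf(conf)` on the instance after construction should have the same effect.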

Mallikharjuna Rao
Director, Software Engineering
603-791-8604 (W) | 603-296-7721 (C)

Mike Percy 2013-03-19, 23:05
Rao, Mallik 2013-03-20, 12:06
Mike Percy 2013-03-20, 16:48