Subject: Oracle BLOB to avro then MR to HBase
I have some BLOB content in an Oracle database which I Sqoop out to HDFS as
an external LOB. When I try to read from the LOB file it works for about 80%
of the records; for the rest I see "Reader has been closed":
at org.apache.sqoop.lib.LobRef.getDataStream(
at org.apache.sqoop.lib.LobRef.getDataStream(
at com.trgr.platform.riptide.mapreduce.MyImporter$MyDBImporter
This is how I initialize the BlobRef:

String blobRefStr = new String(key.datum().getBODY().array());
BlobRef blobref = BlobRef.parse(blobRefStr);
if (blobref.isExternal()) {
    ...
    // line 210:
    try (InputStream is = blobref.getDataStream(context)) {
The Avro-generated class has:

java.nio.ByteBuffer BODY;
Is there something I am doing wrong here?

This MR job decompresses the BLOB content.
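For the decompression step itself, a try-with-resources pattern keeps the stream handling contained to a single record. This is a minimal sketch assuming the BLOBs are gzip-compressed (the post doesn't say which codec is used), using only the JDK's `java.util.zip` classes:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipDecompress {

    // Fully drains a gzip-compressed stream into a byte array.
    // In the mapper, `in` would be the stream from blobref.getDataStream(context).
    static byte[] decompress(InputStream in) throws IOException {
        try (GZIPInputStream gz = new GZIPInputStream(in);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = gz.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        // Round-trip: compress a sample payload, then decompress it.
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(compressed)) {
            gz.write("hello blob".getBytes(StandardCharsets.UTF_8));
        }
        byte[] plain = decompress(new ByteArrayInputStream(compressed.toByteArray()));
        System.out.println(new String(plain, StandardCharsets.UTF_8));
        // prints: hello blob
    }
}
```

Draining the stream fully inside the try block also guarantees it is closed per record, which matters if the "Reader has been closed" error comes from a stream outliving one `map()` call.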


