Avro >> mail # user >> Backwards compatible - Optional fields


Backwards compatible - Optional fields
Hi all,

I had the impression that the reader works with an older version of the object as
long as the new fields are optional.  Is that true?  If not, what would you
recommend?  Thanks a lot in advance.

For example:

{
  "namespace": "org.apache.avro.examples",
  "protocol": "MyProtocol",

  "types": [
    { "name": "Metadata", "type": "record", "fields": [
      {"name": "S1", "type": "string"}
    ]},

    { "name": "Metadatav2", "type": "record", "fields": [
      {"name": "S1", "type": "string"},
      {"name": "S2", "type": ["string", "null"]}      // optional field in
the new version
    ]}
  ]
}
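
One thing I am not sure about: from my reading of the Avro spec, when the reader's
schema has a field that the writer's data lacks, schema resolution fills it in from
the reader field's "default".  So my guess (not what I currently have) is that the
optional field also needs an explicit default, with "null" listed first in the union
so the default value matches:

    { "name": "Metadatav2", "type": "record", "fields": [
      {"name": "S1", "type": "string"},
      {"name": "S2", "type": ["null", "string"], "default": null}   // default used when reading old data
    ]}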

    public static <A extends SpecificRecordBase> A parseAvroObject(final A a, final byte[] bb)
            throws IOException {
        if (bb == null) {
            return null;
        }
        ByteArrayInputStream bais = new ByteArrayInputStream(bb);
        DatumReader<A> dr = new SpecificDatumReader<A>(a.getSchema());
        Decoder d = DecoderFactory.get().binaryDecoder(bais, null);
        return dr.read(a, d);
    }
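
The toBytes helper is not included in the post; a typical counterpart to
parseAvroObject, using a SpecificDatumWriter with a binary encoder, would look
roughly like this (sketch only -- the original helper may differ):

    public static <A extends SpecificRecordBase> byte[] toBytes(final A a)
            throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DatumWriter<A> dw = new SpecificDatumWriter<A>(a.getSchema());
        Encoder e = EncoderFactory.get().binaryEncoder(baos, null);
        dw.write(a, e);
        e.flush();   // flush the encoder so all bytes reach the stream
        return baos.toByteArray();
    }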
    public static void main(String[] args) throws IOException {

        Metadata.Builder mb = Metadata.newBuilder();
        mb.setS1("S1 value");
        byte[] bs = toBytes(mb.build());

        Metadata m = parseAvroObject(new Metadata(), bs);
        System.out.println("parse as Metadata " + m);

        // I thought this also worked, reading data written with the older schema
        Metadatav2 m2 = parseAvroObject(new Metadatav2(), bs);
        System.out.println("parse as Metadatav2 " + m2);
    }
Exception in thread "main" java.io.EOFException
    at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:145)
    at org.apache.avro.io.BinaryDecoder.readIndex(BinaryDecoder.java:405)
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:229)
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
    at org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:206)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:142)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:166)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:138)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:129)
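
My guess from the stack trace is that, since parseAvroObject passes only the new
schema, the ResolvingDecoder expects the S2 union index in the data and hits end of
stream.  Would the fix be to also pass the schema the data was written with, via the
two-schema SpecificDatumReader constructor?  Something like this (the extra
writerSchema parameter is my addition, not in the code above):

    // requires org.apache.avro.Schema
    public static <A extends SpecificRecordBase> A parseAvroObject(final A a,
            final Schema writerSchema, final byte[] bb) throws IOException {
        if (bb == null) {
            return null;
        }
        ByteArrayInputStream bais = new ByteArrayInputStream(bb);
        // resolve data written with writerSchema against the (newer) reader schema
        DatumReader<A> dr = new SpecificDatumReader<A>(writerSchema, a.getSchema());
        Decoder d = DecoderFactory.get().binaryDecoder(bais, null);
        return dr.read(a, d);
    }

    // usage: Metadatav2 m2 = parseAvroObject(new Metadatav2(), Metadata.SCHEMA$, bs);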
Thanks
-gabe