Avro >> mail # user >> Exception while encoding generic record - java.lang.String cannot be cast to org.apache.avro.generic.IndexedRecord


Re: Exception while encoding generic record - java.lang.String cannot be cast to org.apache.avro.generic.IndexedRecord
In the code you sent, I don't see you ever setting the "ud" field.
Also note that GenericRecord#toString() does not validate against the
schema, so a record with field "x" and a map with key "x" print
identically even when only one of them is valid according to the
schema.
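Since toString() output can't be trusted for this, one way to check a built record against its schema before writing is GenericData.validate. A minimal sketch, assuming Avro 1.6.x on the classpath; the `schema` and `record` parameters stand in for your parsed schema and the record built with GenericRecordBuilder:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class ValidateBeforeWrite {
    // Sketch only: returns true if every field datum's runtime type
    // matches the schema, which toString() does not check.
    static boolean looksValid(Schema schema, GenericRecord record) {
        return GenericData.get().validate(schema, record);
    }
}
```

If validate returns false, some field holds a datum of the wrong runtime type (e.g. a Map or String where the schema declares a record), which is what GenericDatumWriter later surfaces as a ClassCastException.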

If you can provide a complete, self-contained example, as a .java
file, then perhaps we can help.

Doug

On Thu, Jul 19, 2012 at 12:36 PM, Sameer Deokule <[EMAIL PROTECTED]> wrote:
> Using avro-1.6.2
>
> I am encountering the following exception when I try to serialize a
> constructed GenericRecord using either the binary or JSON encoder.
> The schema compiles fine (see the Schema.txt attachment to this
> email).
>
> Am I setting up one of the Java maps incorrectly?
>
> java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.avro.generic.IndexedRecord
>     at org.apache.avro.generic.GenericData.getField(GenericData.java:518)
>     at org.apache.avro.generic.GenericData.getField(GenericData.java:533)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:103)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:65)
>     at org.apache.avro.generic.GenericDatumWriter.writeMap(GenericDatumWriter.java:165)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:68)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:105)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:65)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:57)
>
> I was looking at TestGenericRecordBuilder.java in the Avro GitHub
> test suite, and it also uses builder.set(...) to build the record.
>
>
> The GenericRecord is constructed as follows (the schema is created by
> parsing the attached JSON schema file):
>
>                 GenericRecord root = new GenericData.Record(schema);
>                 GenericRecordBuilder grb = new GenericRecordBuilder(schema);
>                 GenericRecord postBody;
>                 JSONObject kvm = new JSONObject(kvmStr);
>                 ObjectMapper mapper = new ObjectMapper();
>                 LinkedHashMap<CharSequence, CharSequence> _kvm = mapper.readValue(kvm.toString().getBytes(), LinkedHashMap.class);
>                 LinkedHashMap<CharSequence, CharSequence> arguments = new
> LinkedHashMap<CharSequence, CharSequence>();
>                 arguments.put("update_interval", "1");
>                 arguments.put("attr", "pr");
>                 arguments.put("keys", "10,pr");
>                 arguments.put("f", "/var/f");
>                 arguments.put("pause", "0");
>
>                 LinkedHashMap<CharSequence, Object> ud = new
> LinkedHashMap<CharSequence, Object>();
>                 ud.put("type", "MapStringUD");
>
>
>                 LinkedHashMap<CharSequence, Object> data = new
> LinkedHashMap<CharSequence, Object>();
>                 LinkedHashMap<CharSequence, Object> pr = new
> LinkedHashMap<CharSequence, Object>();
>                 ud.put("data", data);
>                 data.put("pr", pr);
>                 pr.put("type", "ListInt");
>                 Integer [] iData = {1,2,3,4,5,6,7,8,9,10};
>                 List<Integer> lData = Arrays.asList(iData);
>                 pr.put("data", lData );
>
>                 List<LinkedHashMap<CharSequence,CharSequence>> requests = Arrays.asList(new LinkedHashMap<CharSequence,CharSequence>());
>
>                 grb.set("requests", requests);
>                 grb.set("upd", upd);
>                 grb.set("kvm", _kvm);
>                 grb.set("context", new LinkedHashMap<CharSequence,
> CharSequence>());
>                 grb.set("properties", new LinkedHashMap<CharSequence,
> CharSequence>());
>                 grb.set("arguments", arguments);
>
>                 postBody = grb.build();
>                 JSONObject jj = new JSONObject(postBody.toString());
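For what it's worth, the trace above (writeRecord → writeMap → writeRecord) suggests the schema declares a record nested inside a map, while the code supplies plain LinkedHashMaps and Strings at that position. A hedged sketch of the distinction; the record name and field below are hypothetical, not taken from the attached Schema.txt:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class RecordVsMap {
    // Hypothetical nested record type; the real names live in Schema.txt.
    static final Schema INNER = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"UD\",\"fields\":"
        + "[{\"name\":\"type\",\"type\":\"string\"}]}");

    static GenericRecord rightShape() {
        // Where the schema says "record", build a GenericRecord...
        GenericRecord inner = new GenericData.Record(INNER);
        inner.put("type", "MapStringUD");
        return inner;
        // ...not a LinkedHashMap<CharSequence,Object>, which the writer
        // would try to cast to IndexedRecord and fail, as in the trace.
    }
}
```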