In the past I have used the avro-tools jar's "fromjson" command to convert a JSON file
containing UTF-8 multibyte characters to Avro, and it worked as expected.  The field in
question is type "bytes" in the schema.

Today this isn't working for me - instead each multibyte character is
represented in my Avro output as a single ? (question mark).

No doubt this is due to me changing something in my environment. Does anyone
know what I need to set/download to get back to normal running?
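
To illustrate what I think is happening (this is just my guess at the mechanism, not
the avro-tools code itself): if the bytes are being produced via the JVM's default
charset rather than explicit UTF-8, any character that charset can't represent gets
replaced with '?'. A minimal Java sketch of that behaviour:

    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;

    public class CharsetCheck {
        public static void main(String[] args) {
            // What charset is the JVM defaulting to? (I suspect mine changed.)
            System.out.println("default charset: " + Charset.defaultCharset());

            // A string containing multibyte UTF-8 characters.
            String s = "caf\u00e9 \u4e2d\u6587";

            // Encoding through a charset that can't represent these characters
            // substitutes '?' for each one -- the same symptom I see in my output.
            byte[] ascii = s.getBytes(StandardCharsets.US_ASCII);
            System.out.println(new String(ascii, StandardCharsets.US_ASCII));  // caf? ??

            // Encoding as UTF-8 round-trips cleanly, which is what I used to get.
            byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
            System.out.println(new String(utf8, StandardCharsets.UTF_8));      // café 中文
        }
    }

If that is the cause, I'm guessing that running the jar with -Dfile.encoding=UTF-8
(or fixing my locale) would restore the old behaviour, but I haven't verified that yet.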

Thanks,

Nick

 
