Avro mailing list (dev): AVRO-1026 (C++) results in invalid JSON when converting a ValidSchema to JSON

AVRO-1026 (C++) results in invalid JSON when converting a ValidSchema to JSON
Has anyone else seen this?


A change made in NodeImpl.cc to introduce namespace support has
introduced a bug that corrupts JSON schemas when writing them to disk or
dumping to a stream.  This manifested when using DataFileWriter to
generate a data file: the schema stored in the file header no longer
matched the original schema provided.







  "type": "fixed",

  "name": "Unsigned16",

  "size": 2






  "type": "fixed",

  "size": 2,

  "name" : "Unsigned16",



Note the extraneous comma after the 'name' attribute.


I fixed it by reordering some lines in NodeFixed::printJson so the
fields are written as type:, name:, then size:, and by removing the
trailing comma after the size attribute.  I haven't checked the other
types.


I couldn't see a JIRA ticket for this, so I assume I should just go
ahead and raise one?  I'm happy to provide the fix and an accompanying
test case.




Steve Roehrs

Senior Software Engineer | Lockheed Martin


| p: +61 8 7389 3342     | m: +61 4 3891 5622     | f: +61 8 7389 4551

| w: www.rlmgroup.com.au | e: [EMAIL PROTECTED]

| Company address: 82-86 Woomera Ave, Edinburgh, SA 5111
