Error while using Avro, Pig and HCat
I am getting the following error while executing the Pig script:
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
open iterator for alias eventData. Backend error : Unable to recreate
exception from backed error: Error:
org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
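The missing method lives in org.codehaus.jackson.JsonFactory, so my guess is a Jackson version clash on the job classpath: Avro 1.7.2 builds against Jackson 1.8.x, while older Hadoop/Pig distributions ship an older jackson-core-asl that does not have JsonFactory.enable(). As a workaround sketch, assuming that clash is the cause (the jar paths below are placeholders, not paths from my setup), registering matching Jackson jars ahead of the job might help:

-- hypothetical fix: put the Jackson 1.8.x jars that Avro 1.7.2 builds
-- against on the classpath before the job runs (paths are placeholders)
register /path/to/jackson-core-asl-1.8.8.jar;
register /path/to/jackson-mapper-asl-1.8.8.jar;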

The script I am using is:

AvroDataDumpFromTable.pig

register /home/y/libexec/hive/lib/hcatalog.jar;
register /home/y/libexec/hive/lib/hive-metastore.jar;
register /homes/immilind/avro-1.7.2.jar;
register /homes/immilind/avro-mapred-1.7.2.jar;
register /homes/immilind/haivvreo-1.0.12-avro15-hive81-SNAPSHOT.jar;

eventData = load 'serdetestdb.employee_table'
    using org.apache.hcatalog.pig.HCatLoader();
dump eventData;
Table Creation Hive Script

USE SerDeTestDB;

CREATE EXTERNAL TABLE SerDeTestDB.employee_table
PARTITIONED BY (schema_def string, gen_time string, arr_time string)
ROW FORMAT SERDE 'com.linkedin.haivvreo.AvroSerDe'
WITH SERDEPROPERTIES (
    'schema.literal' = '{
        "type" : "record",
        "name" : "employee",
        "fields" : [
            {"name" : "name", "type" : "string", "default" : "NU"},
            {"name" : "age",  "type" : "int",    "default" : 0},
            {"name" : "dept", "type" : "string", "default" : "DU"}
        ]
    }'
)
STORED AS
  INPUTFORMAT 'com.linkedin.haivvreo.AvroContainerInputFormat'
  OUTPUTFORMAT 'com.linkedin.haivvreo.AvroContainerOutputFormat'
LOCATION '/user/immilind/AvroHcatTest';
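For reference, HCatLoader is expected to expose the Avro record fields plus the Hive partition columns as the Pig schema. A quick sanity-check sketch (assuming the same register statements as in the script above):

-- inspect the schema HCatLoader derives from the table; I would expect
-- name/age/dept from the Avro record, followed by the partition columns
-- schema_def, gen_time and arr_time
eventData = load 'serdetestdb.employee_table' using org.apache.hcatalog.pig.HCatLoader();
describe eventData;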
A partition is registered with this table, and it contains a file with data serialized using the above schema. The serialized file itself is fine: I tried deserializing the data from the same files, and that worked.
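For concreteness, the cross-check I mean is a direct load that bypasses HCatalog and the SerDe entirely; a sketch using piggybank's AvroStorage (the data path comes from the table LOCATION above; the piggybank.jar path is a placeholder):

-- read the Avro files directly, without HCatalog or the Haivvreo SerDe
register /homes/immilind/avro-1.7.2.jar;
register /homes/immilind/avro-mapred-1.7.2.jar;
register /path/to/piggybank.jar;  -- placeholder; AvroStorage ships in piggybank
raw = load '/user/immilind/AvroHcatTest' using org.apache.pig.piggybank.storage.avro.AvroStorage();
dump raw;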

Any clue what is going wrong?

Thanks