Hive >> mail # user >> Custom Serde/Connector Null Pointer Exception


Custom Serde/Connector Null Pointer Exception
Greetings all. I am not sure whether this is a Hive issue or a custom SerDe
issue, so I am asking in both places. I am trying to use the MongoDB
connector written by yc-huang, which could have great potential with our
data. The link is here:

https://github.com/yc-huang/Hive-mongo
I followed the instructions for compiling and for running the first CREATE
TABLE. I have verified that the MongoDB instance on 192.168.0.11 is exposed
on the proper port and that the referenced collection exists.

That said, this is a hard one for me, a non-Java expert, to troubleshoot,
so I wanted to get thoughts on whether this is a connector problem or a
Hive issue. I am running Hive 0.9.0 on MapR. Note: all other aspects of
Hive are working for me right now. I look forward to any thoughts.

Thanks!
hive> create external table mongo_users(name STRING, age INT)
    > COMMENT 'Ya boy'
    > stored by "org.yong3.hive.mongo.MongoStorageHandler"
    > with serdeproperties ( "mongo.column.mapping" = "name,age" )
    > tblproperties (
"mongo.host"="192.168.0.11","mongo.port"="27017","mongo.db"="test","mongo.collection"="users");
12/10/13 14:30:13 INFO ql.Driver: <PERFLOG method=Driver.run>
12/10/13 14:30:13 INFO ql.Driver: <PERFLOG method=compile>
12/10/13 14:30:13 DEBUG parse.VariableSubstitution: Substitution is on:
create external table mongo_users(name STRING, age INT)
COMMENT 'Ya boy'
stored by "org.yong3.hive.mongo.MongoStorageHandler"
with serdeproperties ( "mongo.column.mapping" = "name,age" )
tblproperties (
"mongo.host"="192.168.0.11","mongo.port"="27017","mongo.db"="test","mongo.collection"="users")
12/10/13 14:30:13 INFO parse.ParseDriver: Parsing command: create external
table mongo_users(name STRING, age INT)
COMMENT 'Ya boy'
stored by "org.yong3.hive.mongo.MongoStorageHandler"
with serdeproperties ( "mongo.column.mapping" = "name,age" )
tblproperties (
"mongo.host"="192.168.0.11","mongo.port"="27017","mongo.db"="test","mongo.collection"="users")
12/10/13 14:30:13 INFO parse.ParseDriver: Parse Completed
12/10/13 14:30:13 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
12/10/13 14:30:13 INFO parse.SemanticAnalyzer: Creating table mongo_users
position=22
12/10/13 14:30:13 INFO ql.Driver: Semantic Analysis Completed
12/10/13 14:30:13 DEBUG parse.SemanticAnalyzer: validation start
12/10/13 14:30:13 INFO ql.Driver: Returning Hive schema:
Schema(fieldSchemas:null, properties:null)
12/10/13 14:30:13 INFO ql.Driver: </PERFLOG method=compile
start=1350156613478 end=1350156613635 duration=157>
12/10/13 14:30:13 INFO ql.Driver: <PERFLOG method=Driver.execute>
12/10/13 14:30:13 INFO ql.Driver: Starting command: create external table
mongo_users(name STRING, age INT)
COMMENT 'Ya boy'
stored by "org.yong3.hive.mongo.MongoStorageHandler"
with serdeproperties ( "mongo.column.mapping" = "name,age" )
tblproperties (
"mongo.host"="192.168.0.11","mongo.port"="27017","mongo.db"="test","mongo.collection"="users")
12/10/13 14:30:13 INFO exec.DDLTask: Use StorageHandler-supplied
org.yong3.hive.mongo.MongoSerDe for table mongo_users
12/10/13 14:30:13 INFO hive.log: DDL: struct mongo_users { string name, i32
age}
FAILED: Error in metadata: java.lang.NullPointerException
12/10/13 14:30:13 ERROR exec.Task: FAILED: Error in metadata:
java.lang.NullPointerException
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:544)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3305)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:242)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1326)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1118)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: java.lang.NullPointerException
	at org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector.init(StandardStructObjectInspector.java:116)
	at org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector.<init>(StandardStructObjectInspector.java:106)
	at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory.getStandardStructObjectInspector(ObjectInspectorFactory.java:274)
	at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory.getStandardStructObjectInspector(ObjectInspectorFactory.java:259)
	at org.yong3.hive.mongo.MongoSerDe.initialize(MongoSerDe.java:101)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:203)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:260)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:253)
	at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:490)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:518)
	... 17 more
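
The innermost frames show MongoSerDe.initialize handing Hive's
StandardStructObjectInspector a field list it cannot iterate. One common
cause of this shape of NPE (an assumption on my part, not confirmed from the
connector's source) is the number of entries in "mongo.column.mapping" not
lining up with the declared table columns, leaving the SerDe with a null or
short inspector list. A quick standalone sanity check is sketched below;
MappingCheck and mappingMatches are hypothetical names of mine, not part of
Hive or Hive-mongo:

```java
import java.util.Arrays;
import java.util.List;

public class MappingCheck {
    // Compare the Hive column list against the comma-separated
    // "mongo.column.mapping" serde property. A count mismatch is one way a
    // SerDe can end up passing null inspectors to
    // StandardStructObjectInspector.init, which then throws NPE.
    static boolean mappingMatches(List<String> hiveCols, String mapping) {
        String[] mapped = mapping.split(",");
        return mapped.length == hiveCols.size();
    }

    public static void main(String[] args) {
        List<String> cols = Arrays.asList("name", "age");
        System.out.println(mappingMatches(cols, "name,age"));     // true
        System.out.println(mappingMatches(cols, "_id,name,age")); // false
    }
}
```

If the counts do match, the next thing worth checking would be whether the
SerDe receives the columns property at all during CREATE TABLE on this Hive
build, since the inspector list is built from it.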
