Hive, mail # user - Question about how to add the debug info into the hive core jar

Question about how to add the debug info into the hive core jar
java8964 2013-03-20, 20:45

I have Hadoop running in pseudo-distributed mode on my Linux box. Right now I am facing a problem with Hive: it throws an exception when querying a table whose data uses my custom SerDe and InputFormat classes.
To trace the root cause, I need to modify the code of org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe to add more debug logging and understand why the exception happens.
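Hive's own classes log through commons-logging (`LOG.debug(...)`). As a self-contained sketch of the kind of trace line one might add just before the failing point, here is an analogue using java.util.logging so it runs standalone; the message format and field names are illustrative, not Hive's actual ones:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Illustrative analogue of extra trace logging one might add inside
// LazySimpleSerDe.deserialize(); Hive itself uses commons-logging
// (LOG.debug(...)), shown here with java.util.logging to stay standalone.
public class SerDeDebugSketch {
    private static final Logger LOG =
            Logger.getLogger(SerDeDebugSketch.class.getName());

    // Builds the message you would log before the point that throws.
    static String describeRow(byte[] rawBytes, int fieldCount) {
        return "deserialize: " + rawBytes.length + " bytes, "
                + fieldCount + " fields expected";
    }

    public static void main(String[] args) {
        LOG.setLevel(Level.FINE);
        LOG.fine(describeRow(new byte[]{1, 2, 3}, 2));
        System.out.println(describeRow(new byte[]{1, 2, 3}, 2));
    }
}
```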
After I modify the Hive code, I can compile it and generate a new hive-serde.jar file with the same name as the release version; only the size changed.
Now I put my new hive-serde.jar under the $HIVE_HOME/lib folder, replacing the old one, and re-run the query that failed. But after the failure, when I check $HADOOP_HOME/logs/user_logs/, the exception stack trace still looks as if it was generated by the original hive-serde classes: the line numbers shown in the log don't match the new code I changed to add the debug information.
My question is: besides $HIVE_HOME/lib, where else should I put this newly compiled hive-serde.jar?
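One way to see whether the task JVM is really picking up the rebuilt jar is to make the running code report where a class was loaded from. A hedged sketch: the `getProtectionDomain()` lookup is standard JDK behavior, and the class name below is the one discussed in this thread:

```java
import java.security.CodeSource;

// Reports which jar (or directory) a class was loaded from; dropping a
// line like this into LazySimpleSerDe, or running it from the command
// line, shows which copy of hive-serde the JVM actually picked up.
public class WhichJar {
    static String describe(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return c.getName() + " loaded from: "
                + (src != null ? src.getLocation() : "<bootstrap classpath>");
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0]
                : "org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe";
        try {
            System.out.println(describe(Class.forName(name)));
        } catch (ClassNotFoundException e) {
            System.out.println(name + " not on the classpath");
        }
    }
}
```

If the printed location is not the jar you replaced, some other copy (for example one shipped to the task via the job classpath) is shadowing it.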
1) This is a pseudo-distributed environment: everything (NameNode, DataNode, JobTracker, and TaskTracker) runs in one box.
2) After replacing hive-serde.jar with my new jar, I even stopped all the Hadoop Java processes and restarted them.
3) But when I run the query in the Hive session, I still see the log generated by the old hive-serde.jar classes. Why?
Thanks for any help.
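A quick check for the symptom described above (restart done, old classes still showing up) is to scan the classpath for a second, stale copy of the jar shadowing the new one. An illustrative sketch; the "hive-serde" fragment is just the jar name from this thread:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Lists classpath entries whose path contains a given fragment, to spot
// a duplicate or stale hive-serde jar shadowing the rebuilt one.
public class FindDupJars {
    static List<String> matches(String classpath, String fragment) {
        List<String> hits = new ArrayList<>();
        for (String entry : classpath.split(File.pathSeparator)) {
            if (entry.contains(fragment)) {
                hits.add(entry);
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        String fragment = args.length > 0 ? args[0] : "hive-serde";
        for (String hit : matches(
                System.getProperty("java.class.path"), fragment)) {
            System.out.println("classpath entry: " + hit);
        }
    }
}
```

Two or more hits would explain the behavior: the JVM loads whichever entry appears first on the classpath.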
Abdelrhman Shettia 2013-03-21, 00:35
java8964 2013-03-21, 01:05