Hive >> mail # user >> Hive + mongoDB


Re: Hive + mongoDB
Can you share your CREATE TABLE DDL for the table named docs?

The SELECT statement does not need all those details. Those are part of the
CREATE TABLE DDL only.
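For reference, the storage-handler details go into the DDL once, roughly like this. This is a sketch using the hive-mongo syntax shown later in this thread; the column list is shortened and the column types are assumptions, not taken from the original collection:

```sql
-- Sketch: the storage handler, serde properties, and tbl properties
-- belong in the CREATE TABLE DDL (column types assumed here).
create external table docs (id string, dayOfWeek string, bc3Year double)
stored by "org.yong3.hive.mongo.MongoStorageHandler"
with serdeproperties ( "mongo.column.mapping" = "_id,dayOfWeek,bc3Year" )
tblproperties ( "mongo.host" = "127.0.0.1", "mongo.port" = "27017",
  "mongo.db" = "sample", "mongo.user" = "sample",
  "mongo.passwd" = "password", "mongo.collection" = "docs" );

-- After that, the query itself is plain HiveQL with no extra clauses:
select * from docs;
```

Putting `input format` / `with serdeproperties` in the SELECT itself is what triggers a parse error like the one quoted below in the thread.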
On Fri, Sep 13, 2013 at 4:24 PM, Sandeep Nemuri <[EMAIL PROTECTED]>wrote:

> Hi nithin
>
> Thanks for your help
> I have used this query in hive to retrieve the data from mongodb
>
> add jar /usr/lib/hadoop/lib/mongo-2.8.0.jar;
> add jar /usr/lib/hive/lib/hive-mongo-0.0.3-jar-with-dependencies.jar;
>
> select * from docs
> input format "org.yong3.hive.mongo.MongoStorageHandler"
> with serdeproperties ( "mongo.column.mapping" = "_id,dayOfWeek,bc3Year,bc5Year,bc10Year,bc20Year,bc1Month,bc2Year,bc3Year,bc30Year,bc1Year,bc7Year,bc6Year"
> )
> tblproperties ( "mongo.host" = "127.0.0.1", "mongo.port" = "27017",
> "mongo.db" = "sample", "mongo.user" = "sample", "mongo.passwd" = "password", "mongo.collection" = "docs" );
>
>
> I got an Error
>
> FAILED: Parse Error: line 2:6 mismatched input 'format' expecting EOF near
> 'input'
>
>
>
> On Thu, Sep 12, 2013 at 6:23 PM, Nitin Pawar <[EMAIL PROTECTED]>wrote:
>
>> Try creating a table against your existing Mongo DB and collection, and
>> see whether the data can be read by the user or not.
>> What you need to do is map the Mongo collection columns, with exactly the
>> same names, into the Hive column definitions.
>>
>> If you cannot see the Mongo data from a Hive query, do let me know what
>> errors you see.
>>
>>
>> On Thu, Sep 12, 2013 at 5:28 PM, Sandeep Nemuri <[EMAIL PROTECTED]>wrote:
>>
>>> How will we get the Mongo data into the Hive table?
>>>
>>> Using this, we can only create the table:
>>>
>>> create external table mongo_users(id int, name string, age int)
>>> stored by "org.yong3.hive.mongo.MongoStorageHandler"
>>> with serdeproperties ( "mongo.column.mapping" = "_id,name,age" )
>>> tblproperties ( "mongo.host" = "192.168.0.5", "mongo.port" = "11211",
>>> "mongo.db" = "test", "mongo.user" = "testUser", "mongo.passwd" = "testPasswd", "mongo.collection" = "users" );
>>>
>>>
>>>
>>>
>>> On Thu, Sep 12, 2013 at 5:02 PM, Nitin Pawar <[EMAIL PROTECTED]>wrote:
>>>
>>>> If you are importing from hive to mongo, why can't you just select from
>>>> mongo table and insert into hive table?
>>>>
>>>>
>>>> On Thu, Sep 12, 2013 at 4:24 PM, Sandeep Nemuri <[EMAIL PROTECTED]>wrote:
>>>>
>>>>> Hi Nitin Pawar,
>>>>> I have checked that link. There, data is imported from Hive to Mongo,
>>>>> but my requirement is to import data from Mongo to Hive.
>>>>>
>>>>>
>>>>> On Thu, Sep 12, 2013 at 3:49 PM, Nitin Pawar <[EMAIL PROTECTED]>wrote:
>>>>>
>>>>>> Sandeep,
>>>>>>
>>>>>> Did you try using hive-mongo (https://github.com/yc-huang/Hive-mongo)?
>>>>>>
>>>>>> It's pretty easy to use as well, if you want to start with analytics
>>>>>> directly.
>>>>>>
>>>>>>
>>>>>> On Thu, Sep 12, 2013 at 2:02 PM, Sandeep Nemuri <[EMAIL PROTECTED]
>>>>>> > wrote:
>>>>>>
>>>>>>> Thanks all,
>>>>>>> I am trying to import data with this program,
>>>>>>> but when I compiled the code I got errors.
>>>>>>> Here is the code
>>>>>>>
>>>>>>> import java.io.*;
>>>>>>> import org.apache.commons.logging.*;
>>>>>>> import org.apache.hadoop.conf.*;
>>>>>>> import org.apache.hadoop.fs.Path;
>>>>>>> import org.apache.hadoop.io.*;
>>>>>>> import org.apache.hadoop.mapreduce.lib.output.*;
>>>>>>> import org.apache.hadoop.mapreduce.*;
>>>>>>> import org.bson.*;
>>>>>>> import com.mongodb.hadoop.*;
>>>>>>> import com.mongodb.hadoop.util.*;
>>>>>>>
>>>>>>> public class ImportWeblogsFromMongo {
>>>>>>>
>>>>>>>     private static final Log log =
>>>>>>>             LogFactory.getLog(ImportWeblogsFromMongo.class);
>>>>>>>
>>>>>>>     public static class ReadWeblogsFromMongo
>>>>>>>             extends Mapper<Object, BSONObject, Text, Text> {
>>>>>>>
>>>>>>>         public void map(Object key, BSONObject value, Context context)
>>>>>>>                 throws IOException, InterruptedException {
>>>>>>>
>>>>>>>             System.out.println("Key: " + key);
>>>>>>>             System.out.println("Value: " + value);
>>>>>>>
>>>>>>>             String md5 = value.get("md5").toString();

Nitin Pawar
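The copy Nitin suggests upthread — selecting from the Mongo-backed table and inserting into a native Hive table — could be sketched as follows. The table name mongo_users and its columns come from the example in this thread; the native table name users_local and its storage are assumptions:

```sql
-- Native Hive table to receive the copied rows (name and schema assumed).
create table users_local (id int, name string, age int);

-- Pull rows through the Mongo storage handler into the native table;
-- from here on, the data lives in Hive and Mongo is no longer needed.
insert overwrite table users_local
select id, name, age from mongo_users;
```

This keeps the Mongo-backed external table as a read-through view and materializes a Hive-owned copy for further analytics.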