To add on to what Bejoy mentioned, while Sqoop is generally used to
transfer data between a JDBC compliant RDBMS and Hadoop, it is not
restricted to it - an example is Sqoop's connector to Couchbase.
Furthermore, in Sqoop 2 connectors will no longer be restricted to the
JDBC model but can instead define their own vocabulary; e.g. the
Couchbase connector will no longer need to accept a table name only to
overload it as a backfill or dump operation.
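For contrast, a minimal sketch of how that overloading looks with the Sqoop 1 Couchbase connector (the host, port, and credentials are illustrative, and the exact connector flags may differ by version):

```shell
# Sketch: dump an entire Couchbase bucket into HDFS with Sqoop 1.
# Couchbase has no tables, so the connector overloads --table:
# "DUMP" here requests a full-bucket dump rather than naming a table.
sqoop import \
  --connect http://couchbase-host:8091/pools \
  --table DUMP \
  --username default
```

In Sqoop 2 the connector can expose a purpose-built option instead of repurposing `--table` this way.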
On Wed, Jul 11, 2012 at 10:56 AM, prabhu k <[EMAIL PROTECTED]> wrote:
> Thanks for the response, I will go through the guide and come back to you.
> On Wed, Jul 11, 2012 at 2:26 PM, Bejoy KS <[EMAIL PROTECTED]> wrote:
>> Hi Prabhu
>> Sqoop is used to transfer data between a JDBC-compliant RDBMS and Hadoop.
>> For Hive the underlying storage is mostly HDFS, so the question of data
>> transfer between HDFS and Hive is largely moot.
>> You can use Sqoop import to transfer data from an RDBMS to Hadoop. Specify
>> the connection parameters and other required arguments. Please refer to the
>> Sqoop User Guide for details on sqoop import.
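A minimal sketch of such an import from MySQL into HDFS, assuming a MySQL server at `dbhost` (the database name, table, credentials, and target directory are all illustrative):

```shell
# Sketch: import one MySQL table into HDFS as delimited text files.
# Reads the password from an HDFS file rather than the command line.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username dbuser \
  --password-file /user/hadoop/.dbpass \
  --table orders \
  --target-dir /user/hadoop/orders
```

The result lands under `/user/hadoop/orders` as part files, one per map task; add `--num-mappers 1` if the table has no primary key to split on.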
>> Bejoy KS
>> Sent from handheld, please excuse typos.
>> From: prabhu k <[EMAIL PROTECTED]>
>> Date: Wed, 11 Jul 2012 14:09:49 +0530
>> To: <[EMAIL PROTECTED]>
>> ReplyTo: [EMAIL PROTECTED]
>> Subject: Load from Hive table to HDFS using sqoop
>> Hi Users list,
>> Please help me with the below.
>> 1. How to load data from a Hive table to HDFS using Sqoop?
>> 2. How to load data from a MySQL database to HDFS using Sqoop?
>> 3. How to load data from HDFS to a Hive table using Sqoop?