Ramakrishna Nalam 2013-07-19, 11:42
Update: I did some more digging into the Sqoop code and realized the Avro
check happens only in JdbcExportJob, while DirectMySQLManager uses
MySQLExportJob, which does not extend JdbcExportJob. My guess is that Avro
is supported only for the record-based export path and not for the direct
(bulk) MySQL export.
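
To make the dispatch concrete, here's a toy model of what I think the
relevant classes do (paraphrased from my reading of the Sqoop source, not
a verbatim copy; only the input-format selection is modelled):

// Toy model of Sqoop's input-format dispatch. Class and method names
// mirror Sqoop's, but the bodies are my paraphrase of the logic.
public class ExportDispatchSketch {

  enum FileType { SEQUENCE_FILE, AVRO_DATA_FILE, UNKNOWN }

  static class ExportJobBase {
    protected FileType fileType = FileType.AVRO_DATA_FILE;

    // Default for every export job: the plain record-based input format.
    protected String getInputFormatClass() {
      return "org.apache.sqoop.mapreduce.ExportInputFormat";
    }
  }

  // JdbcExportJob is the only place the Avro check happens.
  static class JdbcExportJob extends ExportJobBase {
    @Override
    protected String getInputFormatClass() {
      if (fileType == FileType.AVRO_DATA_FILE) {
        return "org.apache.sqoop.mapreduce.AvroInputFormat";
      }
      return super.getInputFormatClass();
    }
  }

  // MySQLExportJob (what DirectMySQLManager uses) extends ExportJobBase
  // directly, so it inherits the default and never sees the Avro branch.
  static class MySQLExportJob extends ExportJobBase { }

  public static void main(String[] args) {
    System.out.println(new JdbcExportJob().getInputFormatClass());  // Avro
    System.out.println(new MySQLExportJob().getInputFormatClass()); // Export
  }
}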

The mapper itself seems to iterate over the rows, though. Can anyone help
me understand why the Avro format is not supported by MySQLExportJob?
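
(For anyone wanting to reproduce the observation from my first mail below:
a minimal, hedged sketch of reading the resolved input format back from
the submitted job's configuration; the job.xml path and the old-API
property key are assumptions on my side:)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class InputFormatCheck {
  public static void main(String[] args) {
    // Don't load the cluster defaults; read only the job's own settings.
    Configuration conf = new Configuration(false);
    // Path to the submitted job's job.xml (take it from the jobtracker UI
    // or the job history directory; this argument is an assumption).
    conf.addResource(new Path(args[0]));
    // Prints org.apache.sqoop.mapreduce.ExportInputFormat for the direct
    // MySQL export run, rather than AvroInputFormat.
    System.out.println(conf.get("mapred.inputformat.class"));
  }
}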

Regards,
Rama.

On Fri, Jul 19, 2013 at 9:27 AM, Ramakrishna Nalam <[EMAIL PROTECTED]> wrote:

>
> (Please ignore the duplicated --direct in the pasted command below; I'm
> using it only once, and the problem still remains.)
>
>
> On Fri, Jul 19, 2013 at 9:25 AM, Ramakrishna Nalam <[EMAIL PROTECTED]> wrote:
>
>> Hi,
>>
>> I'm trying to export a set of Avro files to MySQL using
>> DirectMySQLManager, with the following sqoop export command:
>>
>> sqoop export --verbose --connect jdbc:mysql://<mysqluri>/<dbname>
>> --direct --username <username> --password <password> --direct
>> --export-dir <dir with avro files> --table <tablename>
>>
>>
>> On running this command, I see that the job's mapred.inputformat.class
>> is set to 'org.apache.sqoop.mapreduce.ExportInputFormat', whereas I was
>> expecting it to be 'org.apache.sqoop.mapreduce.AvroInputFormat'.
>>
>> I have also checked that
>> org.apache.sqoop.mapreduce.ExportJobBase.getFileType(conf, path) on the
>> same path returns AVRO_DATA_FILE.
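>>
>> (For reference, a minimal sketch of how I ran that check, assuming the
>> Sqoop and Hadoop jars are on the classpath and that getFileType is
>> callable as written above:)
>>
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.fs.Path;
>> import org.apache.sqoop.mapreduce.ExportJobBase;
>>
>> public class FileTypeCheck {
>>   public static void main(String[] args) throws Exception {
>>     Configuration conf = new Configuration();
>>     // The same directory I pass to --export-dir.
>>     Path exportDir = new Path(args[0]);
>>     // Prints AVRO_DATA_FILE for my directory of avro files.
>>     System.out.println(ExportJobBase.getFileType(conf, exportDir));
>>   }
>> }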
>>
>> Please let me know if I'm missing something here.
>>
>> TIA,
>> Rama.
>>
>
>