Sqoop dev mailing list: Customizing HBase Import


Re: Customizing HBase Import
My bad, I interpreted your question incorrectly. It seems difficult to add a
custom processor, as you pointed out.
The quickest and dirtiest way would be to hack your Processor in place
of the original and re-build Sqoop.
A second way would be to add a custom property; you can go ahead and create a
JIRA so this can be explored in detail.
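
For illustration, a rough sketch of what that custom property could look like
inside HBaseImportJob. The property name sqoop.hbase.put.processor.class is
hypothetical (not an existing Sqoop option); the delegate property and the
Configuration.getClass/setClass calls match the hard-coded snippet quoted
further down in this thread:

    // Hypothetical patch: read the processor class from configuration instead
    // of hard-coding HBasePutProcessor (default behaviour stays the same).
    Configuration conf = job.getConfiguration();
    Class<? extends FieldMapProcessor> processorClass = conf.getClass(
        "sqoop.hbase.put.processor.class",   // hypothetical new property
        HBasePutProcessor.class,             // current hard-coded default
        FieldMapProcessor.class);
    conf.setClass("sqoop.output.delegate.field.map.processor.class",
        processorClass, FieldMapProcessor.class);

A user could then pass -Dsqoop.hbase.put.processor.class=<class_name> on the
sqoop command line, while existing jobs would keep the current default.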

Best Regards,
Abhijeet Gaikwad

On 23 December 2013 00:35, Ravi Kiran <[EMAIL PROTECTED]> wrote:

> Hi Abhijeet,
>
>     Thanks for the response. I primarily would like to register a custom
> org.apache.sqoop.hbase.HBasePutProcessor, as we need to do some custom
> processing to get metadata of the HBase table. I noticed HBaseImportJob
> has HBasePutProcessor hard-coded and hence was curious how a custom
> processor can be passed. Any ideas on how to address it?
>
>
>     Configuration conf = job.getConfiguration();
>     conf.setClass("sqoop.output.delegate.field.map.processor.class",
>         HBasePutProcessor.class, FieldMapProcessor.class);
>
> Regards
> Ravi
>
>
>
> On Sun, Dec 22, 2013 at 1:14 PM, abhijeet gaikwad <[EMAIL PROTECTED]>
> wrote:
>
> > Usage:
> > sqoop -D<key=value> <sqoop_command>
> > eg: sqoop -Dsqoop.hbase.insert.put.transformer.class=<class_name>
> > import/export <options>
> >
> > More details on PutTransformer:
> > http://sqoop.apache.org/docs/1.4.4/SqoopDevGuide.html#_hbase_serialization_extensions
> >
> > Hope this helps!
> >
> > Best Regards,
> > Abhijeet Gaikwad
> >
> > On 22 December 2013 00:58, Ravi Kiran <[EMAIL PROTECTED]> wrote:
> >
> > > Hi all,
> > >
> > >    I am currently working on integrating Apache Phoenix (currently at
> > > https://github.com/forcedotcom/phoenix) with Sqoop for importing data
> > > into HBase. To do this, we need to write a custom HBasePutProcessor and
> > > have it called as part of the import process.
> > >    I notice that one can pass a custom PutTransformer through the -D
> > > argument of the sqoop-import command, but I couldn't find a way to pass
> > > a custom PutProcessor. Can anyone please provide pointers / suggestions
> > > on how this can be achieved?
> > >
> > >   Appreciate your help!!
> > >
> > > Regards
> > > Ravi
> > >
> >
>
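
As a footnote to the PutTransformer route mentioned earlier in the thread: a
minimal sketch of a custom transformer, assuming the org.apache.sqoop.hbase.PutTransformer
API described in the linked 1.4.4 dev guide (getPutCommand building HBase Puts
from the field map, with getRowKeyColumn()/getColumnFamily() accessors, as
ToStringPutTransformer uses). The class and package names below are made up
for illustration.

    package com.example;

    import java.io.IOException;
    import java.util.Collections;
    import java.util.List;
    import java.util.Map;

    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.sqoop.hbase.PutTransformer;

    // Illustrative custom transformer; registered via the property shown in
    // the usage above:
    //   sqoop -Dsqoop.hbase.insert.put.transformer.class=com.example.ExamplePutTransformer import <options>
    public class ExamplePutTransformer extends PutTransformer {

      @Override
      public List<Put> getPutCommand(Map<String, Object> fields) throws IOException {
        // Row key column is the one configured via --hbase-row-key.
        Object rowKey = fields.get(getRowKeyColumn());
        if (rowKey == null) {
          return Collections.emptyList();   // skip rows without a row key
        }
        Put put = new Put(Bytes.toBytes(rowKey.toString()));
        byte[] family = Bytes.toBytes(getColumnFamily());
        for (Map.Entry<String, Object> field : fields.entrySet()) {
          if (field.getValue() == null || field.getKey().equals(getRowKeyColumn())) {
            continue;
          }
          // Custom serialization (e.g. a Phoenix-friendly encoding) would go
          // here; plain string bytes keep the sketch simple.
          put.add(family, Bytes.toBytes(field.getKey()),
              Bytes.toBytes(field.getValue().toString()));
        }
        return Collections.singletonList(put);
      }
    }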