Sqoop >> mail # user >> Support of arrays in fields


Re: Support of arrays in fields
Jarek,

This is the second time :-) you are asking me to open a JIRA on Apache for
an issue that is already present on Cloudera. If you are saying that the
Cloudera site is no longer used:
https://issues.apache.org/jira/browse/SQOOP-390?focusedCommentId=13631487&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-13631487
then why haven't all of its issues been migrated in batch?

Thanks
On Mon, Apr 15, 2013 at 8:58 AM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:

> Hi Ruslan,
> I'm afraid that Sqoop currently does not support arrays natively. The
> import case can be worked around by using the array_to_string function,
> but I'm not sure how to easily work around the export. Would you mind
> opening a new JIRA on Apache JIRA [1] for that?
>
> Jarcec
>
> Links:
> 1: https://issues.apache.org/jira/browse/SQOOP
>
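The import workaround Jarcec describes can be sketched as a free-form query import. This is a hypothetical sketch, not from the thread: the table and column names are taken from Ruslan's CREATE TABLE quoted in this thread, while the '|' delimiter, target directory, and single-mapper setting are assumptions.

```shell
# Hypothetical sketch: flatten the array column in SQL with
# array_to_string so Sqoop sees a plain string. $CONDITIONS is
# required by Sqoop in any free-form --query import; the '|'
# delimiter and target dir here are assumptions.
sqoop import \
  --connect jdbc:postgresql://localhost/hadooppipeline \
  --username postgres --password postgres \
  --query "SELECT array_to_string(image_urls, '|') AS image_urls FROM tablename WHERE \$CONDITIONS" \
  --target-dir /user/hadoop/tablename \
  -m 1
```

Using -m 1 avoids the need for --split-by, which matters here since the table as shown has no obvious key column to split on.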
> On Tue, Apr 09, 2013 at 08:40:54PM +0400, Ruslan Al-Fakikh wrote:
> > Hey guys,
> >
> > Sorry for raising this old question, but is there a workaround for
> > uploading data with arrays in fields?
> > I have a table like this:
> > CREATE TABLE tablename
> > (
> > image_urls character varying(300)[]
> > );
> > and I am uploading from a file on HDFS. Basically I can change the
> > format of the file, but in what form should I write it in order to make
> > Sqoop upload it to this unsupported data type? Maybe there is a
> > workaround.
> >
> > Also I saw this issue, but it is still unresolved:
> > https://issues.cloudera.org/browse/SQOOP-160
> >
> > Any help would be appreciated
> >
> >
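One possible export workaround, sketched here as an editorial aside (nobody in the thread proposed it, and the staging-table name and file format are assumptions): write each array in the HDFS file as a PostgreSQL array literal such as {url1,url2}, export into a plain-text staging table, then cast to the array type server-side.

```shell
# Hypothetical sketch; assumes a staging table
#   tablename_staging (image_urls text)
# already exists in the database.
# Step 1: export the HDFS file into the text staging column.
sqoop export \
  --connect jdbc:postgresql://localhost/hadooppipeline \
  --username postgres --password postgres \
  --table tablename_staging \
  --export-dir /user/hadoop/tablename

# Step 2: parse the array literals ('{url1,url2}') inside Postgres
# by casting the text column to the target array type.
psql -d hadooppipeline <<'SQL'
INSERT INTO tablename (image_urls)
SELECT image_urls::character varying(300)[]
FROM tablename_staging;
SQL
```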
> > On Fri, Aug 24, 2012 at 10:07 AM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
> >
> > > You might consider utilizing PostgreSQL's function array_to_string to
> > > "join" the array into one string. You would have to change the import
> > > from --table to --query then, though.
> > >
> > > Jarcec
> > >
> > > On Fri, Aug 24, 2012 at 10:40:46AM +0530, Adarsh Sharma wrote:
> > > > Thanks Jarcec for the update. So Sqoop is not suitable for shifting
> > > > data from the DB to HDFS if some columns have integer[] or bigint[]
> > > > datatypes.
> > > >
> > > > Is there any way I can shift data having bigint[] datatypes from the
> > > > postgresql DB to HDFS using Sqoop, or do I need to test another tool
> > > > like Talend etc.?
> > > >
> > > >
> > > > Thanks
> > > >
> > > >
> > > > On Thu, Aug 23, 2012 at 11:45 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
> > > >
> > > > > Hi Adarsh,
> > > > > as far as I know, Sqoop should not have any issues with the bigint
> > > > > data type.
> > > > >
> > > > > Based on the provided log fragment, it seems that you're having
> > > > > issues with SQL type 2003, which should be ARRAY (see 1). I'm afraid
> > > > > that arrays are really not supported in Sqoop at the moment.
> > > > >
> > > > > Jarcec
> > > > >
> > > > > 1:
> > > > > http://docs.oracle.com/javase/1.5.0/docs/api/constant-values.html#java.sql.Types.ARRAY
> > > > >
> > > > > On Thu, Aug 23, 2012 at 08:47:10PM +0530, Adarsh Sharma wrote:
> > > > > > Hi all,
> > > > > >
> > > > > > Please forgive me if I violate any rules in posting to this
> > > > > > mailing list.
> > > > > > I am using Sqoop for some testing in my Hadoop standalone setup.
> > > > > >
> > > > > > Hadoop Version: 0.20.2-cdh3u5, 580d1d26c7ad6a7c6ba72950d8605e2c6fbc96cc
> > > > > > Sqoop Version: Sqoop 1.4.1-incubating
> > > > > > Also tried: Sqoop 1.4.0-incubating
> > > > > > Postgresql Version: edb-psql (9.0.4.14)
> > > > > >
> > > > > >
> > > > > > I am able to export data from HDFS to postgresql, but when I try
> > > > > > to import data from the DB to HDFS, the problem below arises:
> > > > > > hadoop@test123:~/project/sqoop-1.4.1-incubating__hadoop-0.20$ bin/sqoop import --connect jdbc:postgresql://localhost/hadooppipeline --table test_table --username postgres --password postgres
> > > > > > Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> > > > > > Please set $HBASE_HOME to the root of your HBase installation.