Re: /tmp dir for import configurable?
Thanks for the idea, Alex. I considered this, but it would mean changing
my cluster setup for Sqoop (a last-resort solution). I'd much rather
point Sqoop to existing large disks.
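
Ideally something like this would work, assuming Sqoop's S3 uploads
buffer through Hadoop's fs.s3.buffer.dir (untested sketch; /data/big is
just a placeholder for a large disk, and the generic -D options have to
come before the Sqoop-specific arguments):

$ sqoop import -D fs.s3.buffer.dir=/data/big/s3 \
    -D hadoop.tmp.dir=/data/big/hadoop-tmp \
    --connect jdbc:mysql://server:port/db --username user --password pass \
    --table tablename --target-dir s3n://xyz@somewhere/a/b/c \
    --fields-terminated-by='\001' -m 1 --direct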

Cheers,
Christian
On Thu, Mar 28, 2013 at 3:50 PM, Alexander Alten-Lorenz
<[EMAIL PROTECTED]> wrote:

> You could mount a bigger disk on /tmp, or symlink /tmp to another
> directory that has enough space.
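>
> Roughly like this (a sketch; /data/big is a placeholder for any mount
> point with enough space, and /tmp should not be in use while you move
> it):
>
> # bind-mount a larger disk over /tmp
> $ sudo mount --bind /data/big/tmp /tmp
>
> # or replace /tmp with a symlink
> $ sudo mv /tmp /tmp.orig && sudo ln -s /data/big/tmp /tmp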
>
> Best
> - Alex
>
> On Mar 28, 2013, at 4:35 PM, Christian Prokopp <[EMAIL PROTECTED]>
> wrote:
>
> > Hi,
> >
> > I am using Sqoop to copy data from MySQL to S3:
> >
> > (Sqoop 1.4.2-cdh4.2.0)
> > $ sqoop import --connect jdbc:mysql://server:port/db \
> >     --username user --password pass --table tablename \
> >     --target-dir s3n://xyz@somewhere/a/b/c \
> >     --fields-terminated-by='\001' -m 1 --direct
> >
> > My problem is that Sqoop temporarily stores the data in /tmp, which
> > is not big enough. I am unable to find a configuration option to
> > point Sqoop to a bigger partition/disk. Any suggestions?
> >
> > Cheers,
> > Christian
> >
>
> --
> Alexander Alten-Lorenz
> http://mapredit.blogspot.com
> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>
>
--
Best regards,

*Christian Prokopp*
Data Scientist, PhD
Rangespan Ltd. <http://www.rangespan.com/>