split big files into small ones to later copy
I have a 500 GB plain-text file in HDFS that I want to copy to a local disk,
zip, and move to a local disk on another machine. The problem is that the
local disk on the HDFS machine doesn't have enough space to hold the file
and then zip it before transferring it to the other host.

Can I split the file into smaller files so that I can copy them to the local
disk one piece at a time? Any suggestions on how to do the copy?
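For example, would something like the following work? This is only a sketch;
/data/big.txt, user@otherhost, /backup, and the 10 GB chunk size are all made
up. I believe GNU split can hand each chunk to a filter command as it is
produced, so no chunk would ever need to sit on the local disk in full:

  # Stream the file out of HDFS; split hands each 10 GB chunk to the filter
  # command, which gzips it and ships it straight to the other machine.
  # split exports $FILE as the name of the current chunk (chunk_aa, chunk_ab, ...).
  hadoop fs -cat /data/big.txt \
    | split -b 10G --filter='gzip | ssh user@otherhost "cat > /backup/$FILE.gz"' - chunk_

  # Or, if splitting isn't needed, compress and send in one pipeline:
  hadoop fs -cat /data/big.txt | gzip | ssh user@otherhost 'cat > /backup/big.txt.gz'

The second pipeline never stages anything on the local disk, but I'm not sure
how well a single 500 GB stream over ssh would hold up.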

--
Best regards,