distcp question
I have two different versions of Hadoop running. I need to copy a significant
amount of data (100 TB) from one cluster to the other. I know distcp is the
way to do it. On the target cluster I have webhdfs running. Would that work?

The DistCp manual says I need to use "HftpFileSystem". Is that necessary,
or will webhdfs do the job?
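
For clarity, here is roughly what I have in mind; the hostnames and ports are
placeholders (50070 being the usual NameNode HTTP port), and I am not sure
which form is correct:

# Using HftpFileSystem: hftp is read-only, so I assume the job would have
# to run on the target cluster and pull from the source over hftp:
hadoop distcp hftp://source-nn:50070/src/path hdfs://dest-nn:8020/dst/path

# Using webhdfs: since webhdfs is readable and writable over HTTP, I assume
# the job could instead run on the source cluster and push to the target:
hadoop distcp hdfs:///src/path webhdfs://dest-nn:50070/dst/path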

--
Get your facts first, then you can distort them as you please.