MapReduce >> mail # user >> Uploading file to HDFS
超级塞亚人 2013-04-19, 04:35
RE: Uploading file to HDFS
I just realized another trick you might try. The Hadoop dfs client can
read input from STDIN, so you could use netcat to pipe the data across to
HDFS without it ever touching the local disk. I haven't tried it, but
here's what I think might work:


On the Hadoop box, open a listening port and feed that to the HDFS command:

nc -l 2342 | hdfs dfs -copyFromLocal - /tmp/x.txt


On the remote server:

cat my_big_2tb_file | nc 10.1.1.1 2342


I haven't tried it yet, but in theory this should work. I have only
verified that the hdfs dfs command can read from stdin, so you may need to
correct the syntax above; I wrote it off the top of my head.
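A quick local stand-in for that pipeline, if you want to test the plumbing without a cluster: a FIFO plays the role of the netcat connection, and `cat > file` stands in for `hdfs dfs -copyFromLocal -` (all paths and data here are made up for illustration):

```shell
# Create a scratch directory and a named pipe (FIFO)
tmp=$(mktemp -d)
mkfifo "$tmp/pipe"

# "Hadoop side": a background reader drains the pipe into a file,
# just as "nc -l 2342 | hdfs dfs -copyFromLocal - /tmp/x.txt" would
cat "$tmp/pipe" > "$tmp/out.txt" &

# "Remote side": stream data into the pipe, like "cat file | nc host 2342"
printf 'hello hdfs\n' > "$tmp/pipe"

# Wait for the reader to finish, then check what arrived
wait
result=$(cat "$tmp/out.txt")
echo "$result"
```

The key property being exercised is the same one the real pipeline relies on: the receiving command reads a byte stream from stdin, so the 2TB file never needs to exist on the Hadoop box's local disk.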


Dave


From: 超级塞亚人 [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2013 11:35 AM
To: [EMAIL PROTECTED]
Subject: Uploading file to HDFS


I have a problem. Our cluster has 32 nodes, each with a 1TB disk. I want to
upload a 2TB file to HDFS. How can I get the file onto the namenode and
upload it to HDFS?

Replies:
  Wellington Chevreuil 2013-04-19, 10:01
  Olivier Renault 2013-04-22, 08:37
  超级塞亚人 2013-04-23, 11:05
  Mohammad Tariq 2013-04-23, 15:53
  shashwat shriparv 2013-04-23, 16:02
  David Parks 2013-04-19, 07:02