Hadoop >> mail # user >> Piping output of hadoop command


Julian Wissmann 2013-02-18, 17:16
Harsh J 2013-02-18, 18:01
Re: Piping output of hadoop command
Hi Julian,

I think it's not writing to standard output but to standard error.

You might want to test that:
hadoop fs -copyToLocal FILE_IN_HDFS 2>&1 | ssh REMOTE_HOST "dd of=FILE_ON_REMOTE_HOST"

Which will redirect stderr to stdout too.
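You can check the redirection mechanics locally, without Hadoop at all (a minimal sketch; the messages are illustrative): a producer that writes only to stderr leaves the pipe empty, and adding 2>&1 on the producer routes its stderr into the pipe.

```shell
# A command writing only to stderr: the pipe sees nothing, wc reads 0 bytes.
printf 'diagnostics\n' 1>&2 | wc -c

# With 2>&1 on the producer, stderr is duplicated onto stdout and
# flows through the pipe, so wc now counts the 12 bytes.
{ printf 'diagnostics\n' 1>&2; } 2>&1 | wc -c
```

Note the order matters: 2>&1 must be applied to the command on the left of the pipe, after its stdout has been connected to the pipe.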

Not sure, but it might be your issue.
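For the copy itself, one guess (my assumption, not something verified against this cluster): -copyToLocal writes to a local path, defaulting to the current directory, rather than to stdout, which would explain both the empty pipe and the file turning up in the home folder. A producer that streams the file to stdout would feed the pipe instead.

```shell
# Hedged sketch: stream the HDFS file straight over ssh, e.g.
#   hadoop fs -cat FILE_IN_HDFS | ssh REMOTE_HOST "dd of=FILE_ON_REMOTE_HOST"
# (placeholders from the thread). The same pipe-into-dd mechanics,
# demonstrated with purely local commands:
printf 'sample payload' > /tmp/src.bin
cat /tmp/src.bin | dd of=/tmp/dst.bin 2>/dev/null
cmp -s /tmp/src.bin /tmp/dst.bin && echo "copy matches"
```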

JM

2013/2/18, Julian Wissmann <[EMAIL PROTECTED]>:
> Hi,
>
> we're running a Hadoop cluster with HBase to evaluate it as a
> database for a research project, and we've more or less decided to go
> with it. Now I'm exploring backup mechanisms and have decided to
> experiment with Hadoop's export functionality for that.
>
> What I am trying to achieve is getting data out of hbase and into hdfs
> via hadoop export and then copy it out of hdfs onto a backup system.
> However while copying data out of hdfs to the backup machine I am
> experiencing problems.
>
> What I am trying to do is the following:
>
> hadoop fs -copyToLocal FILE_IN_HDFS | ssh REMOTE_HOST "dd of=TARGET_FILE"
>
> It creates a file on the remote host, but that file is 0 KB in size;
> instead of any data being copied over there, the file just lands in
> my home folder.
>
> The command output looks like this: hadoop fs -copyToLocal
> FILE_IN_HDFS | ssh REMOTE_HOST "dd of=FILE_ON_REMOTE_HOST"
> 0+0 records in
> 0+0 records out
> 0 bytes (0 B) copied, 1.10011 s, 0.0 kB/s
>
> I cannot think of any reason why this command would behave in this
> way. Is this some Java-ism that I'm missing here (like not correctly
> treating stdout), or am I actually doing it wrong?
>
> The Hadoop version is 2.0.0-cdh4.1.2.
>
> Regards
>
> Julian
>