I read the documentation on Hadoop Streaming. Is it possible to construct a job chain as a pipeline with Hadoop Streaming?
The first job looks like this:
first job: hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
    -input /alex/messages \
    -output /alex/stout4 \
    -mapper /bin/cat \
    -reducer /tmp/mycount.pl \
    -file /tmp/mycount.pl
I want the first job's output to become the second job's input. If that is possible, how do I do it? Thanks!
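For concreteness, this is the kind of second job I am imagining: it takes /alex/stout4 (the first job's output directory) as its -input. Here mycount2.pl and the /alex/stout5 output path are just hypothetical placeholders for the second stage:

```shell
# Second job in the chain: read the first job's output directory.
# mycount2.pl is a hypothetical second-stage reducer script.
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
    -input /alex/stout4 \
    -output /alex/stout5 \
    -mapper /bin/cat \
    -reducer /tmp/mycount2.pl \
    -file /tmp/mycount2.pl
```

Is simply pointing the second job's -input at the first job's -output directory like this the right approach, or is there a dedicated chaining mechanism?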