MapReduce, mail # user - Query about "hadoop dfs -cat" in hadoop-0.20.2


Re: Query about "hadoop dfs -cat" in hadoop-0.20.2
Marcos Ortiz 2011-06-17, 13:42
On 06/17/2011 07:41 AM, Lemon Cheng wrote:
> Hi,
>
> I am using hadoop-0.20.2. After running ./start-all.sh, I can run
> "hadoop dfs -ls" successfully.
> However, when I run "hadoop dfs -cat
> /usr/lemon/wordcount/input/file01", the error shown below appears.
> I have searched for this problem on the web, but I can't find a
> solution.
> Can anyone offer a suggestion?
> Many thanks.
>
>
>
> 11/06/17 19:27:12 INFO hdfs.DFSClient: No node available for block:
> blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
> 11/06/17 19:27:12 INFO hdfs.DFSClient: Could not obtain block
> blk_7095683278339921538_1029 from any node:  java.io.IOException: No
> live nodes contain current block
> 11/06/17 19:27:15 INFO hdfs.DFSClient: No node available for block:
> blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
> 11/06/17 19:27:15 INFO hdfs.DFSClient: Could not obtain block
> blk_7095683278339921538_1029 from any node:  java.io.IOException: No
> live nodes contain current block
> 11/06/17 19:27:18 INFO hdfs.DFSClient: No node available for block:
> blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
> 11/06/17 19:27:18 INFO hdfs.DFSClient: Could not obtain block
> blk_7095683278339921538_1029 from any node:  java.io.IOException: No
> live nodes contain current block
> 11/06/17 19:27:21 WARN hdfs.DFSClient: DFS Read: java.io.IOException:
> Could not obtain block: blk_7095683278339921538_1029
> file=/usr/lemon/wordcount/input/file01
>         at
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
>         at
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
>         at
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
>         at java.io.DataInputStream.read(DataInputStream.java:83)
>         at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:47)
>         at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
>         at org.apache.hadoop.fs.FsShell.printToStdout(FsShell.java:114)
>         at org.apache.hadoop.fs.FsShell.access$100(FsShell.java:49)
>         at org.apache.hadoop.fs.FsShell$1.process(FsShell.java:352)
>         at
> org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1898)
>         at org.apache.hadoop.fs.FsShell.cat(FsShell.java:346)
>         at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1543)
>         at org.apache.hadoop.fs.FsShell.run(FsShell.java:1761)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
>
>
> Regards,
> Lemon
Are you sure that all your DataNodes are online?
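
A quick way to check is to look at the DataNode processes and the block report. A sketch using the 0.20.x shell commands (the file path is taken from your post; log paths may differ on your install):

```shell
# 1. Confirm a DataNode JVM is actually running on each slave node.
jps | grep -i datanode

# 2. Ask the NameNode how many DataNodes it currently sees as live.
hadoop dfsadmin -report

# 3. Check the file itself for missing or corrupt blocks.
hadoop fsck /usr/lemon/wordcount/input/file01 -files -blocks -locations

# 4. If no DataNodes are live, inspect the DataNode log for startup
#    errors; a namespaceID mismatch after re-formatting the NameNode
#    is a common cause of this on 0.20.x.
tail -n 50 $HADOOP_HOME/logs/hadoop-*-datanode-*.log
```

If fsck reports the file's blocks with zero live replicas, the data exists in the NameNode's metadata but no DataNode is serving it, which matches the "No live nodes contain current block" messages you posted.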
--
Marcos Luís Ortíz Valmaseda
  Software Engineer (UCI)
  http://marcosluis2186.posterous.com
  http://twitter.com/marcosluis2186