Re: hftp can list directories but won't send files
Robert,

Another thing you can try: set

export HADOOP_ROOT_LOGGER=DEBUG,console

and re-run the hadoop dfs -cat command against the hftp URL; you should get more detailed logging on the client.
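
For example, reusing the namenode and file path from the quoted messages below (substitute your own):

export HADOOP_ROOT_LOGGER=DEBUG,console
hadoop fs -cat hftp://hdenn00.trueffect.com:50070/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0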

Also, since you are running CDH, it might be better to ask on the CDH mailing lists.

--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/

On Dec 18, 2012, at 3:28 PM, Robert Rapplean <[EMAIL PROTECTED]> wrote:

> The cluster says this:
>
> Hadoop 2.0.0-cdh4.0.0
> Subversion file:///data/1/jenkins/workspace/generic-package-rhel64-6-0/topdir/BUILD/hadoop-2.0.0-cdh4.0.0/src/hadoop-common-project/hadoop-common -r 5d678f6bb1f2bc49e2287dd69ac41d7232fc9cdc
> Compiled by jenkins on Mon Jun  4 16:52:21 PDT 2012
> From source with checksum 64f877fc49f5adc0d7d55c13089e866e
>
> That took a bit of digging to retrieve. Can you suggest which logs you want me to look at?
>
> Robert Rapplean
> Senior Software Engineer
> 303-872-2256 direct | 303-438-9597 main | www.trueffect.com
>
>
> -----Original Message-----
> From: Harsh J [mailto:[EMAIL PROTECTED]]
> Sent: Tuesday, December 18, 2012 4:17 PM
> To: <[EMAIL PROTECTED]>
> Subject: Re: hftp can list directories but won't send files
>
> What version/distribution of Hadoop is your source cluster?
>
> Also, I'd take a look at your NN's and a few of your DNs' logs right after encountering this issue, to see the reason and stack trace printed for the Server Error 500 (a code for a server-side fault). That'd give us more ideas on the why; one way to search those logs is sketched at the end of this page.
>
> On Wed, Dec 19, 2012 at 4:13 AM, Robert Rapplean <[EMAIL PROTECTED]> wrote:
>> Hey, everyone. I just finished reading all of the unsubscribe messages from Sept-Oct, and was hoping someone has a clue about what my system is doing wrong. I suspect this is a configuration issue, but I don't even know where to start looking for it. I'm a developer, and my sysadmin is tied up until the end of the year.
>>
>> I'm trying to move files from one cluster to another with distcp, using the hftp protocol as the distcp instructions specify (a reference invocation is sketched at the end of this page).
>>
>> I can read directories over hftp, but when I attempt to get a file I get a 500 (internal server error). To rule out network and firewall issues, I'm running hadoop fs -ls and hadoop fs -cat commands on the source server to narrow the problem down.
>>
>> This gives a correct listing of the directory:
>>
>> hadoop fs -ls ourlogs/day_id=19991231/hour_id=1999123123
>> -rw-r--r--   3 username supergroup        812 2012-12-16 17:21 logfiles/day_id=19991231/hour_id=1999123123/000008_0
>>
>> This gives me a "file not found" error, which is also correct because the file isn't there:
>>
>> hadoop fs -cat hftp://hdenn00.trueffect.com:50070/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0x
>> cat: `hftp://hdenn00.trueffect.com:50070/user/prodman/ods_fail/day_id=19991231/hour_id=1999123123/000008_0x': No such file or directory
>>
>> This command gives me a 500 internal server error, even though the file is confirmed to be on the server (a way to reproduce the request by hand is sketched at the end of this page):
>>
>> hadoop fs -cat hftp://hdenn00.trueffect.com:50070/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0
>> cat: HTTP_OK expected, received 500
>>
>> Here is a stack trace of what distcp logs when I attempt this:
>>
>> java.io.IOException: HTTP_OK expected, received 500
>>    at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderUrlOpener.connect(HftpFileSystem.java:365)
>>    at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:119)
>>    at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
>>    at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:187)
>>    at java.io.DataInputStream.read(DataInputStream.java:83)
>>    at org.apache.hadoop.tools.DistCp$CopyFilesMapper.copy(DistCp.java:424)
>>    at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:547)
>>    at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:314)
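
A minimal sketch of the log check Harsh suggests above, assuming the default CDH4 package log location (the directory and file names are assumptions; adjust for your installation):

# On the NameNode, right after reproducing the failed read:
grep -B 2 -A 30 'Exception' /var/log/hadoop-hdfs/hadoop-hdfs-namenode-*.log | tail -n 80
# Then on a few of the DataNodes:
grep -B 2 -A 30 'Exception' /var/log/hadoop-hdfs/hadoop-hdfs-datanode-*.log | tail -n 80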
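
One way to reproduce the failing read by hand: hftp serves file contents through a /data servlet on the NameNode's HTTP port, which redirects to a DataNode. The servlet path and ugi parameter below are assumptions based on the hftp protocol; the host and file path are the ones from the thread:

# Request the file from the NameNode; -L follows the redirect to the DataNode,
# -i prints the status line and headers of each hop, so the 500 and which server sent it show up
curl -s -i -L 'http://hdenn00.trueffect.com:50070/data/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0?ugi=username'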
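
For reference, the distcp-over-hftp pattern Robert describes reads from the source cluster's hftp port and is normally run on the destination cluster. A sketch with placeholder cluster names (not taken from this thread):

hadoop distcp hftp://source-nn.example.com:50070/user/username/logfiles hdfs://dest-nn.example.com:8020/user/username/logfiles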