In my experience with Flume, this issue occurs when the HDFS file is not
properly closed. Until the file is closed, the NameNode does not update its
reported length, so the listing shows a size of zero even though
hadoop fs -cat can still stream the data from the DataNodes. Once the file
is closed, the correct size is reported and Hive can read the content.
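One way to make sure Flume closes its HDFS files promptly is to set roll
policies on the HDFS sink, so each file is rolled (closed) on a schedule. A
minimal sketch of such a config; the agent, sink, and path names (a1, k1,
/flume/events) are placeholders, not from the original thread:

```
# Hypothetical Flume agent config; a1/k1 and the path are placeholders.
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /flume/events

# Roll (close) the current file every 300 seconds, so its final
# length becomes visible to the NameNode and to Hive.
a1.sinks.k1.hdfs.rollInterval = 300

# Also roll after roughly 128 MB, whichever limit is hit first.
a1.sinks.k1.hdfs.rollSize = 134217728

# Disable event-count-based rolling (0 = never roll on count).
a1.sinks.k1.hdfs.rollCount = 0
```

With settings like these, an open file stays "zero-length" for at most one
roll interval before it is closed and its real size is published.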
On Wed, Sep 25, 2013 at 12:30 AM, ch huang <[EMAIL PROTECTED]> wrote:
> I have a question: I started pumping data into an HDFS file, but I find
> the file size is zero. If I use hadoop fs -cat on the file, I can see the
> data, but if I query the Hive table (which points at the file), I cannot
> read anything. Why?