MapReduce user mailing list: extracting lzo compressed files


Re: extracting lzo compressed files
Hi Bejoy,

I am sorry. I can see the file size of the compressed one, but I am trying to
find out what the size of the file would be if it were not compressed, without
extracting the whole set of files.
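For example, I was wondering whether streaming the decoded output and counting
the bytes would give the uncompressed size, something like the below (the path
is just an example, and it assumes the lzo codec is configured as you
mentioned), or whether there is a cheaper way that avoids decoding every block:

hadoop fs -text /user/manoj/logs/part-00000.lzo | wc -c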
Cheers!
Manoj.

On Sun, Oct 21, 2012 at 3:28 PM, Manoj Babu <[EMAIL PROTECTED]> wrote:

> Hi Bejoy,
>
> 'hadoop fs -ls' is not displaying the file size. Is there any other way to
> find the original file size?
>
> Thanks in advance.
>
> Cheers!
> Manoj.
>
>
>
> On Sun, Oct 21, 2012 at 1:47 PM, Bejoy KS <[EMAIL PROTECTED]> wrote:
>
>>
>> Hi Manoj
>>
>> You can get the file in a readable format using
>> hadoop fs -text <fileName>
>>
>> Provided you have the lzo codec listed in the property 'io.compression.codecs'
>> in core-site.xml (see the sample snippet after this thread).
>>
>> A 'hadoop fs -ls' command would itself display the file size (the compressed
>> size as stored in HDFS).
>> Regards
>> Bejoy KS
>>
>> Sent from handheld, please excuse typos.
>> ------------------------------
>> From: Manoj Babu <[EMAIL PROTECTED]>
>> Date: Sun, 21 Oct 2012 13:10:55 +0530
>> To: <[EMAIL PROTECTED]>
>> Reply-To: [EMAIL PROTECTED]
>> Subject: extracting lzo compressed files
>>
>> Hi,
>>
>> Is there any option to extract the lzo compressed file in HDFS from the
>> command line, and any option to find the original size of the compressed
>> file?
>>
>> Thanks in Advance!
>>
>> Cheers!
>> Manoj.
>>
>>
>
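On Bejoy's point about 'io.compression.codecs': a minimal core-site.xml entry
could look like the sketch below (the com.hadoop.compression.lzo classes assume
the hadoop-lzo package is installed and on the classpath; the
org.apache.hadoop.io.compress codecs ship with Hadoop):

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.BZip2Codec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>

With that in place, 'hadoop fs -text <fileName>' should decode .lzo files to
stdout rather than printing the raw compressed bytes.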