
Harsh J 2013-05-09, 15:42
Re: issues with decrease the default.block.size
Are you looking to decrease it to get more parallel map tasks out of
the small files? Are you currently CPU bound on processing these small
files?

On Thu, May 9, 2013 at 9:12 PM, YouPeng Yang <[EMAIL PROTECTED]> wrote:
> Hi all,
>     I am going to set up a new Hadoop environment. Because there are
> lots of small files, I would like to change the default block size to
> 16 MB rather than merging the files into larger ones (e.g. using
> SequenceFiles).
>     I want to ask: are there any bad influences or issues?
> Regards

Harsh J
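Harsh's question turns on how block size translates into map-task parallelism: with the stock FileInputFormat, split size is roughly the block size, so a large file yields more map tasks at a smaller block size, while a file already smaller than the block occupies a single block (and a single map) either way. A minimal sketch of that arithmetic (the helper name is made up for illustration):

```python
import math

MB = 1024 * 1024

def approx_map_tasks(file_size_bytes: int, block_size_bytes: int) -> int:
    """Rough map-task count for one file under the default
    FileInputFormat, where split size ~= block size."""
    return max(1, math.ceil(file_size_bytes / block_size_bytes))

# A 128 MB file: one map at a 128 MB block size, eight at 16 MB.
print(approx_map_tasks(128 * MB, 128 * MB))  # 1
print(approx_map_tasks(128 * MB, 16 * MB))   # 8

# But a 4 MB file is one block (and one map) at either setting,
# so shrinking the block size adds no parallelism for small files.
print(approx_map_tasks(4 * MB, 16 * MB))     # 1
```

This is why shrinking the block size mainly helps split up *large* files; for files already under 16 MB it changes neither the map count nor the NameNode's per-file metadata load.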
yypvsxf19870706 2013-05-10, 14:59
Harsh J 2013-05-10, 15:24
shashwat shriparv 2013-05-12, 17:38
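For reference, the setting YouPeng asks about lives in hdfs-site.xml. A sketch of lowering it to 16 MB, assuming Hadoop 2.x (where the key is dfs.blocksize; 1.x releases used dfs.block.size); it only affects files written after the change:

```xml
<!-- hdfs-site.xml: hypothetical sketch, not from the thread.
     Value is in bytes; 16777216 = 16 MB. -->
<property>
  <name>dfs.blocksize</name>
  <value>16777216</value>
</property>
```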