Drill >> mail # user >> create a huge size of parquet file


Re: create a huge size of parquet file
Hi
  Ha, thanks, that's great. I had also found it via Google.

Regards

2013/12/20 Ted Dunning <[EMAIL PROTECTED]>

> Oops.
>
> I forgot to include a link:
>
> https://github.com/tdunning/log-synth
>
>
>
> On Thu, Dec 19, 2013 at 11:32 PM, YouPeng Yang <[EMAIL PROTECTED]> wrote:
>
> > Hi Ted
> >
> >   Thank you very much for your reply.
> > I am considering building a map-reduce version.
> >
> >
> > Regards
> >
> >
> > 2013/12/20 Ted Dunning <[EMAIL PROTECTED]>
> >
> > > You could adapt the log-synth program to produce files in Parquet. At the
> > > same time, I would recommend that you build a map-reduce version as well,
> > > so that you can generate much bigger files.
> > >
> > >
> > > On Thu, Dec 19, 2013 at 10:55 PM, YouPeng Yang <[EMAIL PROTECTED]> wrote:
> > >
> > > > Hi
> > > >    I have compiled Drill M1 with Hadoop 2.2, and I have tested
> > > > sample-data/region.parquet in HDFS. It works well.
> > > >
> > > >   I want to run a deep performance test of Drill.
> > > >   The first thing I need is to create a very large Parquet file. Are there
> > > > any tools to convert my flat file, or files in other formats, to Parquet?
> > > >
> > > >
> > > >
> > > > Regards.
> > > >
> > >
> >
>