Re: [Error]Finding average using hbase hadoop
Here is the output table:
>
> hbase(main):004:0> scan 'nyse5'
> ROW                   COLUMN+CELL
>  symbol               column=stocks_output:average, timestamp=1376749641978, value=@\xC6o\x11
>
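An aside on the cell shown above: `value=@\xC6o\x11` is the shell's rendering of the four raw bytes of the float average, as written by HBase's `Bytes.toBytes(float)` (big-endian IEEE 754; `@` is 0x40 and `o` is 0x6F). A minimal, HBase-free sketch of decoding those bytes:

```java
import java.nio.ByteBuffer;

public class DecodeFloat {
    public static void main(String[] args) {
        // The four raw bytes printed by the shell: '@' = 0x40, \xC6, 'o' = 0x6F, \x11
        byte[] raw = {(byte) 0x40, (byte) 0xC6, (byte) 0x6F, (byte) 0x11};
        // ByteBuffer defaults to big-endian, the same order Bytes.toBytes(float) uses
        float average = ByteBuffer.wrap(raw).getFloat();
        System.out.println(average); // ~6.201058, matching the average printed by the job
    }
}
```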
Sample output in my Eclipse console:

13/08/17 07:27:21 INFO mapred.Merger: Merging 1 sorted segments
13/08/17 07:27:21 INFO mapred.Merger: Down to the last merge-pass, with 1
segments left of total size: 42242 bytes
13/08/17 07:27:21 INFO mapred.LocalJobRunner:
13/08/17 07:27:21 INFO mapred.JobClient:  map 100% reduce 0%
For 2640, the average is: 6.201058
On Sat, Aug 17, 2013 at 11:59 PM, Jean-Marc Spaggiari <
[EMAIL PROTECTED]> wrote:

> Are you outputting to a table? From your code, I don't see any output
> configured.
>
> 2013/8/17 manish dunani <[EMAIL PROTECTED]>
>
> > Thanx a lot!!
> > Jean.
> >
> > I am very thankful to you. And of course Ted is also doing a very good job.
> >
> > Revised Code:
> >
> > > package com.maddy;
> > >
> > > import java.io.IOException;
> > >
> > > import org.apache.hadoop.conf.Configuration;
> > > import org.apache.hadoop.fs.Path;
> > > import org.apache.hadoop.hbase.HBaseConfiguration;
> > > import org.apache.hadoop.hbase.client.Put;
> > > import org.apache.hadoop.hbase.client.Result;
> > > import org.apache.hadoop.hbase.client.Scan;
> > > import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
> > > import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
> > > import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
> > > import org.apache.hadoop.hbase.mapreduce.TableMapper;
> > > import org.apache.hadoop.hbase.mapreduce.TableReducer;
> > > import org.apache.hadoop.hbase.util.Bytes;
> > > //import org.apache.hadoop.io.DoubleWritable;
> > > import org.apache.hadoop.io.FloatWritable;
> > > import org.apache.hadoop.mapreduce.Job;
> > > import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
> > >
> > >
> > > public class openaveragestock
> > > {
> > >     public static class map extends
> > > TableMapper<ImmutableBytesWritable,FloatWritable>
> > >     {
> > >         private static String col_family="stocks";
> > >         private static String qul="open";
> > >
> > >         private static String col_family1="stocks";
> > >         private static String qul1="symbol";
> > >
> > >         private static byte[] colfamily2=Bytes.toBytes(col_family);
> > >         private static byte[] qul2=Bytes.toBytes(qul);
> > >
> > >         private static byte[] colfamily3=Bytes.toBytes(col_family1);
> > >         private static byte[] qul3=Bytes.toBytes(qul1);
> > >
> > > //        public static float toFloat(int qul2)
> > > //        {
> > > //            return Float.intBitsToFloat((qul2));
> > > //
> > > //        }
> > > //
> > >
> > >
> > >
> > >         public void map(ImmutableBytesWritable row,Result value,Context
> > > context) throws IOException
> > >         {
> > >
> > >
> > >             //byte[]
> > > val1=(value.getValue("stocks".getBytes(),"symbol".getBytes()));
> > >            byte[] val=value.getValue(colfamily2,qul2);
> > >
> > >
> > >             ImmutableBytesWritable stock_symbol=new
> > > ImmutableBytesWritable(qul3);
> > >
> > >
> > >
> > >             try
> > >             {
> > >
> > >                 context.write(stock_symbol,new
> > > FloatWritable(Float.parseFloat(Bytes.toString(val))));
> > >             }
> > >
> > >             catch(InterruptedException e)
> > >
> > >             {
> > >                  throw new IOException(e);
> > >             }
> > >
> > >
> > >         }
> > >
> > >
> > >     }
> > >
> > >
> > >     public static class reduce extends
> > > TableReducer<ImmutableBytesWritable,FloatWritable,ImmutableBytesWritable>
> > >     {
> > >
> > >         @Override
> > >         public void reduce(ImmutableBytesWritable
> > > key,Iterable<FloatWritable>values,Context context) throws IOException,
> > > InterruptedException
> > >         {
> > >             float sum=0;
> > >             int count=0;
> > >             float average=0;
> > >
> > >             for(FloatWritable val:values)
> > >             {
> > >                 sum+=val.get();
> > >                 count++;
> > >             }
> > >
> > >             average=sum/count;
> > >
> > >             // write the average back as raw float bytes, producing the
> > >             // stocks_output:average cell shown in the scan above
> > >             Put put=new Put(key.get());
> > >             put.add(Bytes.toBytes("stocks_output"),Bytes.toBytes("average"),Bytes.toBytes(average));
> > >             context.write(key,put);
> > >         }
> > >     }
> > > }
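Regarding Jean-Marc's question about output configuration: the job driver is what wires the map and reduce classes above to the HBase tables, and it is not included in the message. The following is only a hedged sketch of how such a driver is typically set up with `TableMapReduceUtil`; the input table name `nyse` and the driver class name are assumptions (only the output table `nyse5` appears in the thread).

```java
// Hypothetical driver, NOT from the original message: a sketch of wiring the
// map/reduce classes above to HBase tables. Input table 'nyse' is an assumption;
// output table 'nyse5' is taken from the scan in the thread.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.mapreduce.Job;

public class openaveragestockDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "openaveragestock");
        job.setJarByClass(openaveragestock.class);

        Scan scan = new Scan();
        scan.setCaching(500);        // fetch more rows per RPC during the scan
        scan.setCacheBlocks(false);  // recommended for full-table MapReduce scans

        // Feed rows of the input table to the map class, which emits
        // (ImmutableBytesWritable, FloatWritable) pairs
        TableMapReduceUtil.initTableMapperJob("nyse", scan,
                openaveragestock.map.class,
                ImmutableBytesWritable.class, FloatWritable.class, job);

        // Direct the Puts produced by the reduce class into the output table
        TableMapReduceUtil.initTableReducerJob("nyse5",
                openaveragestock.reduce.class, job);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Without the `initTableReducerJob` call (or an equivalent output format), the job has no output configured, which is what the earlier reply was pointing at.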

Regards

Manish Dunani
Contact No: +91 9408329137
Skype id: manish.dunani