HBase user mailing list: TableRecordReaderImpl is not able to get the rows


Re: TableRecordReaderImpl is not able to get the rows
Are you doing something specific with the RecordReader? Maybe you can post
more of your code, as it is difficult to tell what is going on right now.
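
For reference, the job wiring usually looks roughly like the sketch below (a
minimal example against the 0.92 API; the class and mapper names here are made
up, and the table name "test" is taken from the thread). TableMapReduceUtil
serializes the Scan into the job configuration, which is what
TableInputFormat.setConf later deserializes:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

public class ScanTestSketch {

  // Hypothetical mapper: counts every row the RecordReader hands it.
  static class MyMapper extends TableMapper<ImmutableBytesWritable, Result> {
    @Override
    protected void map(ImmutableBytesWritable key, Result value, Context context)
        throws IOException, InterruptedException {
      // If "value" arrives null or empty here, the scan returned no cells.
      context.getCounter("debug", "rows").increment(1);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "scan-test");
    job.setJarByClass(ScanTestSketch.class);

    Scan scan = new Scan();
    scan.setCaching(500);        // avoid one RPC per row
    scan.setCacheBlocks(false);  // recommended for MR scans

    // Wires up TableInputFormat and stores the Scan in the job config.
    TableMapReduceUtil.initTableMapperJob(
        "test", scan, MyMapper.class,
        ImmutableBytesWritable.class, Result.class, job);
    job.setOutputFormatClass(NullOutputFormat.class);
    job.setNumReduceTasks(0);

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

If your own setup differs from this shape (for example, configuring
TableInputFormat directly instead of going through TableMapReduceUtil), that
difference is worth posting.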

Best Regards,
Sonal
Crux: Reporting for HBase <https://github.com/sonalgoyal/crux>
Nube Technologies <http://www.nubetech.co>

<http://in.linkedin.com/in/sonalgoyal>

On Wed, Jun 13, 2012 at 7:28 PM, Subroto <[EMAIL PROTECTED]> wrote:

> Hi Sonal,
>
> The Scan is being created by
> org.apache.hadoop.hbase.mapreduce.TableInputFormat.setConf(Configuration
> configuration). I am not providing any other scan options… :-(
>
> Cheers,
> Subroto Sanyal
>
> On Jun 13, 2012, at 1:30 PM, Sonal Goyal wrote:
>
> > Hi Subroto,
> >
> > How are you configuring your job? Are you providing any Scan options?
> > Check Chapter 7 of the ref guide at
> >
> > http://hbase.apache.org/book/mapreduce.example.html
> >
> > Best Regards,
> > Sonal
> > Crux: Reporting for HBase <https://github.com/sonalgoyal/crux>
> > Nube Technologies <http://www.nubetech.co>
> >
> > <http://in.linkedin.com/in/sonalgoyal>
> >
> >
> >
> >
> >
> > On Wed, Jun 13, 2012 at 4:47 PM, Subroto <[EMAIL PROTECTED]> wrote:
> >
> >> Hi,
> >>
> >> I have a table with details:
> >> hbase(main):024:0> scan 'test'
> >> ROW                    COLUMN+CELL
> >>  row1                  column=cf:a, timestamp=1339581548508, value=value1
> >>  row2                  column=cf:b, timestamp=1339581557585, value=value2
> >>  row3                  column=cf:c, timestamp=1339581566227, value=value3
> >> 3 row(s) in 0.0200 seconds
> >>
> >> When my MR job reads the table through the TableRecordReader,
> >> org.apache.hadoop.hbase.client.HTable.ClientScanner.nextScanner sets the
> >> "value" to null on every call, which makes my MR application believe
> >> there are no records in the table.
> >>
> >> I am using HBase 0.92.1 and Hadoop 0.23.1.
> >>
> >> What could be the possible reason(s) behind this?
> >>
> >> Cheers,
> >> Subroto Sanyal
>
>