HBase >> mail # user >> Unit Test: HBase Map/Reduce


Re: Unit Test: HBase Map/Reduce
Hi Andrey,

Thanks a lot, your classes are quite useful.
However, my problem was not related to the HBaseClusterTestCase; it turned
out to be a problem in my own code.

Thanks
--
Renaud Delbru
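
For readers landing here with the same question, a rough sketch of the kind of test discussed in the quoted thread below (spin up a table in the mini cluster, then run a TableMapper job, as in the RowCounter example) could look like the following. It is written from memory against the 0.20-era API and is only a sketch: the class names, table name, and column family are invented, the conf field is assumed to be exposed by HBaseClusterTestCase, and the in-process row counter only works because the local job runner shares the JVM with the test.

import java.util.concurrent.atomic.AtomicLong;

import org.apache.hadoop.hbase.HBaseClusterTestCase;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

// Hypothetical sketch, not code from the thread.
public class RowCountTest extends HBaseClusterTestCase {

    /** Counts rows in-process; works because the local job runner runs in this JVM. */
    static class CountMapper extends TableMapper<ImmutableBytesWritable, Result> {
        static final AtomicLong ROWS = new AtomicLong();

        @Override
        protected void map(ImmutableBytesWritable key, Result value, Context context) {
            ROWS.incrementAndGet();
        }
    }

    public void testRowCount() throws Exception {
        // Create a table and load ten rows into the mini cluster started by setUp().
        HBaseAdmin admin = new HBaseAdmin(conf);
        HTableDescriptor desc = new HTableDescriptor("test");
        desc.addFamily(new HColumnDescriptor("cf"));
        admin.createTable(desc);

        HTable table = new HTable(conf, "test");
        for (int i = 0; i < 10; i++) {
            Put put = new Put(Bytes.toBytes("row-" + i));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
            table.put(put);
        }
        table.flushCommits();

        // Wire a TableMapper job against the test table, RowCounter-style.
        Job job = new Job(conf, "rowcount-test");
        job.setJarByClass(RowCountTest.class);
        TableMapReduceUtil.initTableMapperJob("test", new Scan(), CountMapper.class,
                ImmutableBytesWritable.class, Result.class, job);
        job.setOutputFormatClass(NullOutputFormat.class);
        job.setNumReduceTasks(0);

        assertTrue(job.waitForCompletion(true));
        assertEquals(10, CountMapper.ROWS.get());
    }
}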

On 19/04/10 19:39, Andrey S wrote:
> 2010/4/19 Renaud Delbru<[EMAIL PROTECTED]>
>
>    
>> Hi,
>>
>> I am trying to create a unit test using the HBaseClusterTestCase and the
>> RowCounter example.
>> I am able to spin up an HBase table, load data into it, and access the data
>> (lookup and scan), but whenever I try to launch a map/reduce job
>> (TableMapper), the map/reduce functions are never executed because the
>> list of splits returned by the TableInputFormat is empty. After some
>> debugging, I noticed that the line (in TableInputFormatBase)
>> final byte [][] startKeys = table.getStartKeys();
>> is returning an empty array.
>>
>> In fact, even though I am able to access table data using HTable#get,
>> HTable#getStartKeys returns nothing. Any ideas on this issue? Also, does
>> anyone have advice/examples on how to write and run unit tests
>> involving HBase?
>>
>> Thanks,
>> Regards
>> --
>> Renaud Delbru
>>
>>      
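
The symptom described above comes down to how the splits are computed: TableInputFormatBase derives one input split per region from the array returned by table.getStartKeys(), so an empty array means an empty split list and map() is never called. The following is a rough paraphrase of that logic, for illustration only; it is not the actual HBase source.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.mapreduce.TableSplit;
import org.apache.hadoop.mapreduce.InputSplit;

// Illustration only: a paraphrase of the per-region split computation
// referred to in the question (TableInputFormatBase.getSplits).
class SplitSketch {
    static List<InputSplit> splitsFromStartKeys(HTable table) throws IOException {
        byte[][] startKeys = table.getStartKeys();   // one entry per region
        List<InputSplit> splits = new ArrayList<InputSplit>();
        if (startKeys == null || startKeys.length == 0) {
            // The failure mode in the question: no region start keys visible,
            // therefore zero splits and the TableMapper is never invoked.
            return splits;
        }
        for (int i = 0; i < startKeys.length; i++) {
            byte[] stopRow = (i + 1 < startKeys.length)
                    ? startKeys[i + 1]
                    : HConstants.EMPTY_END_ROW;      // last region is open-ended
            splits.add(new TableSplit(table.getTableName(), startKeys[i], stopRow,
                    "localhost"));                   // region location simplified away
        }
        return splits;
    }
}
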
> I use the following "recompilation" of the standard test helper classes, and
> they now work fine. There was a lot of magic :) with the parameters, but I am
> finally able to use them with Pig (0.7.0) to run tests in map/reduce mode
> under Maven.
> The importTable() method can import tables produced by the Export class from
> the HBase distribution.
>
> Hope this helps.
>
> import org.apache.commons.logging.Log;
> import org.apache.commons.logging.LogFactory;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.IOUtils;
> import org.junit.AfterClass;
> import org.junit.BeforeClass;
>
> import java.io.File;
> import java.io.IOException;
> import java.io.InputStream;
> import java.io.OutputStream;
> import java.net.URISyntaxException;
>
> /**
>  * @author octo
>  */
> public abstract class BaseHadoopTestCase {
>
>     private static final Log LOG = LogFactory.getLog(BaseHadoopTestCase.class);
>
>     protected static FileSystem localFs;
>     protected static Configuration localConf;
>
>     @BeforeClass
>     public static void beforeClass() throws IOException, URISyntaxException {
>         //System.setProperty("java.io.tmpdir", new File("target/tmp").getAbsolutePath());
>         System.setProperty("hadoop.tmp.dir", new File("target/hadoop-test").getAbsolutePath());
>         System.setProperty("test.build.data", new File("target/hadoop-test").getAbsolutePath());
>
>         localConf = new Configuration();
>         localConf.set("fs.default.name", "file://" + new File("target/hadoop-test/dfs").getAbsolutePath());
>         localFs = FileSystem.getLocal(localConf);
>         LOG.info(String.format("Filesystem at %s", localFs.getWorkingDirectory()));
>         localConf.set("hadoop.log.dir", new Path("target/hadoop-test/logs").makeQualified(localFs).toString());
>         localConf.set("mapred.system.dir", new Path("target/hadoop-test/mapred/sys").makeQualified(localFs).toString());
>         localConf.set("mapred.local.dir", new File("target/hadoop-test/mapred/local").getAbsolutePath());
>         localConf.set("mapred.temp.dir", new File("target/hadoop-test/mapred/tmp").getAbsolutePath());
>         System.setProperty("hadoop.log.dir", localConf.get("hadoop.log.dir"));
>     }
>
>     @AfterClass
>     public static void afterClass() throws IOException {
>         if (localFs != null)
>             localFs.close();
>     }
>
>     protected Path localResourceToPath(String path, String target) throws IOException {
>         try {
>             final InputStream resource = this.getClass().getResourceAsStream(path);
>             if (resource == null)
>                 throw new IllegalArgumentException(path + " not found");
>             final Path targetPath = new Path(localFs.getWorkingDirectory(),
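
The quoted message is cut off at this point, so the importTable() method mentioned above is not shown. As a hypothetical illustration of how the helper might be used, a subclass could look roughly like the following; only localConf, localFs, and localResourceToPath come from the code above, while the test class name, the fixture path, and the output path are invented.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.junit.Test;

import static org.junit.Assert.assertTrue;

// Hypothetical usage of the BaseHadoopTestCase helper above.
public class LocalJobRunnerTest extends BaseHadoopTestCase {

    @Test
    public void runsOnTheLocalFileSystem() throws Exception {
        // Copy a classpath fixture onto the local test filesystem set up in beforeClass().
        Path input = localResourceToPath("/fixtures/input.txt", "job/input.txt");
        Path output = new Path(localFs.getWorkingDirectory(), "job/output");

        // localConf points fs.default.name and the mapred.* directories at target/,
        // so the job runs under the local job runner inside the Maven build.
        Job job = new Job(localConf, "local-runner-test");
        job.setMapperClass(Mapper.class);            // identity mapper keeps the sketch minimal
        job.setNumReduceTasks(0);
        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, input);
        FileOutputFormat.setOutputPath(job, output);

        assertTrue(job.waitForCompletion(true));
    }
}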