Re: Fault injection framework for testing
Konstantin Boudnik 2013-01-16, 03:13
Hadoop-1 includes a framework called Herriot that lets you develop
on-cluster fault-injection (FI) system tests. However, because of some
unfortunate timing, it hasn't been hooked into the Maven build system of the
Hadoop-2 branches.
Basically, I see two ways of doing what you need here:
- wait until Herriot is integrated back (that might take a while);
- go with MOP using Groovy and develop a cluster test for your
feature. MOP won't require much of anything but a Groovy jar to be
added to the classpath of the Java process(es) in question. With it in
place you can instrument anything you want, the way you need, during the
application bootstrap. In fact, I think Herriot would be better off with
that approach instead of its initial AspectJ build-time mechanism.
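To make the second option concrete, here is a minimal Groovy sketch of
MOP-based fault injection. The class name `TaskRunner` and its `run()`
method are hypothetical stand-ins for whatever class you want to break,
not real Hadoop identifiers; the point is only that assigning a closure
to a class's `metaClass` replaces the method at runtime, with no
build-time weaving:

```groovy
import java.io.IOException

// Hypothetical stand-in for the class under test.
class TaskRunner {
    String run() { 'completed' }
}

// MOP-based fault injection: replace run() at runtime so it throws.
// This would be done during application bootstrap in a real cluster test.
TaskRunner.metaClass.run = { ->
    throw new IOException('injected fault: simulated MapTask failure')
}

try {
    new TaskRunner().run()
} catch (IOException e) {
    println "caught: ${e.message}"
}
```

Because the interception happens at runtime, the production jars stay
untouched; only the Groovy jar and the bootstrap script need to be on the
classpath of the process being tested.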
Hope it helps,
On Wed, Jan 16, 2013 at 02:19AM, Tsuyoshi OZAWA wrote:
> I've created a patch for MAPREDUCE-4502. I confirmed that it works
> well for the usual case, and I also added code to handle MapTask failure.
> As a next step, I need to add test code for MapTask failure.
> So I have questions:
> Is there a fault injection framework for MapReduce testing?
> If not, do you have any ideas on how to test it?
> OZAWA Tsuyoshi