Re: Fault injection framework for testing
Tsuyoshi OZAWA 2013-01-16, 04:18
Thanks for your comment; it was very helpful.
I'd like to go with the second approach, MOP with Groovy. In that case, how
can I add the test code to trunk?
Is it acceptable for the Hadoop project to include test code written in Groovy?
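For reference, here is a minimal sketch of what runtime fault injection via Groovy's
MOP could look like. The class and method names (MapTask, run) are purely
illustrative, not actual Hadoop APIs, and this assumes only the Groovy jar on the
classpath:

```groovy
// Hypothetical target class standing in for the code under test.
class MapTask {
    String run() { "map output" }
}

// Intercept MapTask.run() at runtime through the metaClass, without
// touching the class's source or bytecode: the first invocation throws
// an injected IOException to simulate a MapTask failure.
int calls = 0
MapTask.metaClass.run = { ->
    if (calls++ == 0) {
        throw new IOException("injected fault")  // simulated failure
    }
    "map output"  // subsequent attempts succeed
}

def task = new MapTask()
def failed = false
try {
    task.run()
} catch (IOException e) {
    failed = (e.message == "injected fault")
}
assert failed                       // first call hit the injected fault
assert task.run() == "map output"   // retry succeeds
```

The instrumentation happens entirely at application bootstrap, which is what makes
the approach attractive compared to build-time weaving.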
On Wed, Jan 16, 2013 at 12:13 PM, Konstantin Boudnik <[EMAIL PROTECTED]> wrote:
> Hadoop-1 includes a framework called Herriot that allows you to develop
> on-the-cluster fault-injection (FI) system tests. However, because of timing, it
> hasn't been hooked into the Maven build system of the Hadoop-2 branches.
> Basically, I see two ways of doing what you need here:
> - wait until Herriot is integrated back (that might take a while),
> - go with MOP using Groovy and develop a cluster test for your
> feature. MOP requires little more than the Groovy jar to be
> added to the classpath of the Java process(es) in question. With it in
> place you can instrument anything you want, the way you need, during the
> application bootstrap. In fact, I think Herriot would be better off with
> that approach instead of its initial AspectJ build-time mechanism.
> Hope it helps,
> On Wed, Jan 16, 2013 at 02:19AM, Tsuyoshi OZAWA wrote:
>> I've created a patch for MAPREDUCE-4502. I've confirmed that it works
>> well in the usual case, and I've also added code to handle MapTask failure.
>> As a next step, I need to add test code for MapTask failure.
>> So I have two questions:
>> Is there a fault-injection facility in the MapReduce testing framework?
>> If not, do you have any ideas on how to test this?
>> OZAWA Tsuyoshi