Re: How to run Fault injection in HDFS
Oh, I see. Right, the injection framework was introduced in early 0.21 and
has never been backported to 0.20.

On 11/23/09 19:38, Thanh Do wrote:
> The reason I changed the build.xml is that the build.xml in the
> hadoop-common trunk release (0.20.1) does not contain an injectfaults
> target (I want to use AspectJ in the Hadoop release that contains both
> hdfs and mapred). I just added the following two targets:
>
> <target name="compile-fault-inject" depends="compile-hdfs-classes">
>   <!-- AspectJ task definition -->
>   <taskdef
>       resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties">
>     <classpath>
>       <pathelement location="${common.ivy.lib.dir}/aspectjtools-1.6.4.jar"/>
>     </classpath>
>   </taskdef>
>   <echo message="Start weaving aspects in place"/>
>   <iajc
>       encoding="${build.encoding}"
>       srcdir="${hdfs.src.dir};${build.src}"
>       includes="org/apache/hadoop/**/*.java,
>                 org/apache/hadoop/myaspect/**/*.aj"
>       destDir="${build.classes}"
>       debug="${javac.debug}"
>       target="${javac.version}"
>       source="${javac.version}"
>       deprecation="${javac.deprecation}">
>     <classpath refid="test.classpath"/>
>   </iajc>
>   <echo message="Weaving of aspects is finished"/>
> </target>
>
> <target name="injectfaults"
>         description="Instrument HDFS classes with faults and other AOP advices">
>   <subant buildpath="${basedir}" target="compile-fault-inject">
>     <property name="build.dir" value="${build.dir}"/>
>   </subant>
> </target>
>
> So that when I want to weave my aspect, I only type:
>
> ant injectfaults
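For anyone following along, here is a minimal sketch of the kind of aspect that includes pattern (org/apache/hadoop/myaspect/**/*.aj) would weave. The package name, the DataNode pointcut, and the failure rate are illustrative assumptions, not code from this thread:

    // org/apache/hadoop/myaspect/DataNodeFaults.aj (hypothetical example)
    package org.apache.hadoop.myaspect;

    import java.io.IOException;

    public aspect DataNodeFaults {

      // Only DataNode methods that already declare IOException, so the
      // injected exception stays within each method's contract.
      pointcut faultyOps() :
        execution(* org.apache.hadoop.hdfs.server.datanode.DataNode.*(..)
                  throws IOException);

      // Before the real method runs, fail roughly 1% of the time.
      before() throws IOException : faultyOps() {
        if (Math.random() < 0.01) {
          throw new IOException("FI: injected fault at "
              + thisJoinPointStaticPart.getSignature());
        }
      }
    }

With the targets above, iajc compiles a file like this from the srcdir together with the HDFS sources and weaves the advice into the generated classes.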
>
>
> On Fri, Nov 20, 2009 at 3:21 PM, Konstantin Boudnik <[EMAIL PROTECTED]> wrote:
>
>     Generally the idea was that the current build.xml in Common and Hdfs
>     already provides everything needed for injection. Would you mind
>     sharing what extra changes you needed, and why?
>
>     Cos
>
>
>     On 11/20/09 12:32, Thanh Do wrote:
>
>         Thank you folks!
>
>         Finally, I am able (really) to run FI with HADOOP. I added some
>         aspects
>         into the source code, changed the build.xml, and that's it.
>
>         AspectJ is awesome!
>
>         Have a nice weekend!
>
>         On Fri, Nov 20, 2009 at 1:08 PM, Konstantin Boudnik
>         <[EMAIL PROTECTED]> wrote:
>
>             Hi Thanh.
>
>             hmm, it sounds like you have some issue with compilation of
>             your code.
>
>             addDeprecation() has been added to Configuration in 0.21, I
>             believe. And it is there no matter how you compile your code
>             (with FI or without).
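As a side note, a minimal sketch of how that call is typically used, assuming the 0.21-era Configuration.addDeprecation(String, String[]) overload; the key names are made up for illustration:

    import org.apache.hadoop.conf.Configuration;

    public class DeprecationExample {
      public static void main(String[] args) {
        // Register the mapping before any Configuration reads the old key.
        Configuration.addDeprecation("my.old.key",
                                     new String[] { "my.new.key" });

        Configuration conf = new Configuration();
        conf.set("my.old.key", "42");            // accepted, but flagged as deprecated
        System.out.println(conf.get("my.new.key"));
      }
    }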
>
>             Cos
>
>
>             On 11/19/09 10:12, Thanh Do wrote:
>
>                 Sorry to dig up this thread again!
>
>                 I am waiting for the release of 0.21 so that I don't have to
>                 manually play around with AspectJ FI any more.
>
>                 I still have a problem running HDFS with instrumented code
>                 (with aspects).
>
>                 Here is what I did:
>
>                 In the root directory of HDFS:
>
>                 $ ant injectfaults
>                 $ ant jar-fault-inject
>
>                 At this point, I have a jar file containing the hdfs classes,
>                 namely hadoop-hdfs-0.22.0-dev-fi.jar, located in the build-fi
>                 folder.
>
>                 Now I go to the HADOOP folder (which contains the running
>                 scripts in the bin directory), and do the following:
>
>                 $ ant compile-core-classes
>
>                 (now I need the additional hdfs classes to be able to run
>                 start-dfs.sh, right?)
>
>                 What I did is copy
>                 $HDFS/build-fi/hadoop-hdfs-0.22.0-dev-fi.jar to
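Putting the steps discussed in this thread together, a condensed sketch of the sequence; the destination for the instrumented jar (a lib/ directory on the classpath used by the bin/ scripts) is an assumption for illustration, not something stated above:

    # In the HDFS source tree: weave the aspects, then build the instrumented jar
    $ ant injectfaults
    $ ant jar-fault-inject          # -> build-fi/hadoop-hdfs-0.22.0-dev-fi.jar

    # In the Hadoop tree that holds the bin/ scripts
    $ ant compile-core-classes

    # Assumption: put the fault-injected jar on the classpath the start
    # scripts use, e.g. a lib/ directory (illustrative path only)
    $ cp $HDFS/build-fi/hadoop-hdfs-0.22.0-dev-fi.jar $HADOOP_HOME/lib/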