
From: Ben West [[EMAIL PROTECTED]]
Sent: Sunday, January 29, 2012 11:11 PM
Subject: Unit tests


I'm trying to write a unit test for my patch at https://issues.apache.org/jira/browse/HADOOP-7943, and I have a few questions based on reading the wiki (http://wiki.apache.org/hadoop/HowToDevelopUnitTests):

1. The patch is for v1.0, not trunk. Should I be using the "junit.framework" style tests or the "org.junit" style? (I think this corresponds to JUnit 3 vs. JUnit 4. It seems trunk has switched to the new style, but 1.0 has not.)
[Uma Ans] JUnit 4 style tests are recommended.
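For anyone unfamiliar with the two styles, here is a hedged sketch of the difference (class and method names are hypothetical, not from Hadoop; assumes JUnit 4 is on the test classpath, as Hadoop's build provides):

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// JUnit 3 ("junit.framework") style: extend TestCase and name methods testXxx.
// public class MyOldStyleTest extends junit.framework.TestCase {
//     public void testAddition() { assertEquals(2, 1 + 1); }
// }

// JUnit 4 ("org.junit") style: a plain class with @Test annotations
// and statically imported assertions.
public class MyNewStyleTest {
    @Test
    public void addition() {
        assertEquals(2, 1 + 1);
    }
}
```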
2. The wiki says "Avoid starting servers (including Jetty and Mini{DFS|MR}Clusters) in unit tests... Try to use one of the lighter weight test doubles." How do I do this? I want to test copying a file from HDFS to a local file system; is it possible to run this test without starting a mini cluster?
[Uma Ans] If you have something DFS-specific to test that requires a server, then you must start a MiniDFSCluster. It looks like your test needs to exercise DFS permissions. If you are testing local filesystem behaviour, you need not start a cluster; you can do the operations from local to local.
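A minimal sketch of the local-to-local approach, assuming hadoop-common and JUnit 4 on the test classpath (the class name and file paths here are hypothetical examples):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class TestLocalCopy {
    @Test
    public void copyLocalToLocal() throws Exception {
        // getLocal() returns a LocalFileSystem, so no cluster is started.
        FileSystem fs = FileSystem.getLocal(new Configuration());
        Path src = new Path("build/test/data/src.txt"); // hypothetical paths
        Path dst = new Path("build/test/data/dst.txt");
        fs.create(src).close();           // create an empty source file
        fs.copyFromLocalFile(src, dst);   // copy through the FileSystem API
        assertTrue(fs.exists(dst));
    }
}
```

Because LocalFileSystem goes through the same FileSystem interface, tests written this way still exercise the copy code paths without the cost of a mini cluster.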
3. Are there any existing tests of the FsShell that I could piggy back off of? I see trunk has some tests relating to the shell, but none testing full-out functionality like copying a file.
[Uma Ans] You should be able to find them in TestDFSShell.java.
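For tests that genuinely need DFS behaviour, TestDFSShell-style tests start a MiniDFSCluster and drive FsShell programmatically. A hedged sketch of that pattern (the constructor signature shown is the Hadoop 1.0-era one and may differ on other branches; the file paths are hypothetical):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FsShell;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class TestShellCopy {
    public void testGet() throws Exception {
        Configuration conf = new Configuration();
        // Start an in-process HDFS with one datanode, formatting fresh storage.
        MiniDFSCluster cluster = new MiniDFSCluster(conf, 1, true, null);
        try {
            FsShell shell = new FsShell(conf);
            // Run "-get" exactly as the command line would; 0 means success.
            // (The source file would need to be written to HDFS first.)
            int ret = shell.run(new String[] {"-get", "/somefile", "/tmp/localcopy"});
        } finally {
            cluster.shutdown(); // always tear the cluster down
        }
    }
}
```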

Also, adding the answers to #1 and #2 to the wiki would be helpful.