...

  • All public classes and methods should have informative Javadoc comments (a short Javadoc sketch follows this list). 

    • Do not use @author tags. 
  • Code must be formatted according to Sun's conventions, with one exception: 

    • Indent two spaces per level, not four. 
  • The code formatter XML is available at https://github.com/apache/hadoop/tree/trunk/dev-support/code-formatter. IntelliJ users can import hadoop_idea_formatter.xml directly.
  • Contributions must pass existing unit tests. 
    • New unit tests should be provided to demonstrate bugs and fixes. JUnit is our test framework: 

    • You must implement a class that uses @Test annotations for all test methods. Please note that Hadoop uses JUnit v4 (a minimal test sketch follows this list). 

    • Define methods within your class whose names begin with test, and call JUnit's many assert methods to verify conditions; these methods will be executed when you run mvn test. Please add meaningful messages to the assert statements to facilitate diagnostics. 

    • By default, do not let tests write any temporary files to /tmp. Instead, the tests should write to the location specified by the test.build.data system property. 

    • If an HDFS cluster or a MapReduce/YARN cluster is needed by your test, please use org.apache.hadoop.hdfs.MiniDFSCluster and org.apache.hadoop.mapred.MiniMRCluster (or org.apache.hadoop.yarn.server.MiniYARNCluster), respectively. TestMiniMRLocalFS is an example of a test that uses MiniMRCluster (a mini-cluster sketch follows this list). 

    • Place your class in the src/test tree. 

    • TestFileSystem.java and TestMapRed.java are examples of standalone MapReduce-based tests. 

    • TestPath.java is an example of a non-MapReduce-based test. 

    • You can run all the project unit tests with mvn test, or a specific unit test with mvn -Dtest=<class name without package prefix> test. Run these commands from the hadoop-trunk directory. 

  • If you modify the Unix shell scripts, see the UnixShellScriptProgrammingGuide.  
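
To make the Javadoc and formatting guidelines above concrete, here is a minimal sketch; the PathJoiner class and its method are hypothetical and exist only for illustration. Note the informative class- and method-level comments, the two-space indentation, and the absence of an @author tag:

    /**
     * Joins path components with a single separator.
     * (Hypothetical example class, not part of Hadoop.)
     */
    public class PathJoiner {

      /**
       * Joins two path components, inserting a separator only when needed.
       *
       * @param parent the parent path, must not be null
       * @param child the child path component, must not be null
       * @return the joined path
       */
      public String join(String parent, String child) {
        return parent.endsWith("/") ? parent + child : parent + "/" + child;
      }
    }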
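
The following is a minimal JUnit 4 sketch of the testing guidelines above. The test class, the PathJoiner class it exercises, and the fallback directory are assumptions made for illustration; the sketch shows @Test-annotated methods whose names begin with test, assert calls with meaningful messages, and temporary data written under the directory named by test.build.data rather than /tmp:

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.io.File;

    import org.junit.Test;

    public class TestPathJoiner {

      // Resolve the scratch directory from test.build.data; the fallback path
      // used here is an assumption, not a Hadoop convention.
      private static final File TEST_DIR =
          new File(System.getProperty("test.build.data", "target/test/data"));

      @Test
      public void testJoinInsertsSingleSeparator() {
        PathJoiner joiner = new PathJoiner();
        assertEquals("join() should insert exactly one separator",
            "/a/b", joiner.join("/a", "b"));
        assertEquals("join() should not duplicate an existing separator",
            "/a/b", joiner.join("/a/", "b"));
      }

      @Test
      public void testTemporaryFilesGoUnderTestBuildData() {
        // Write scratch data under test.build.data, never directly to /tmp.
        File scratch = new File(TEST_DIR, "scratch");
        assertTrue("could not create scratch directory " + scratch,
            scratch.mkdirs() || scratch.isDirectory());
      }
    }

Such a test can be run on its own with mvn -Dtest=TestPathJoiner test from the hadoop-trunk directory, as described above.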
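
The sketch below shows a test that brings up a single-node HDFS mini-cluster using the MiniDFSCluster.Builder API from the hadoop-hdfs test artifact. The test class name is hypothetical, and exact package names and builder options can differ between Hadoop releases, so treat this as an outline under those assumptions rather than a drop-in test:

    import static org.junit.Assert.assertTrue;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.MiniDFSCluster;
    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;

    public class TestWithMiniDFSCluster {

      private MiniDFSCluster cluster;
      private FileSystem fs;

      @Before
      public void setUp() throws Exception {
        Configuration conf = new Configuration();
        // One NameNode and one DataNode is usually enough for a unit test.
        cluster = new MiniDFSCluster.Builder(conf).numDataNodes(1).build();
        cluster.waitActive();
        fs = cluster.getFileSystem();
      }

      @After
      public void tearDown() throws Exception {
        if (cluster != null) {
          cluster.shutdown();
        }
      }

      @Test
      public void testMkdirsOnMiniCluster() throws Exception {
        Path dir = new Path("/test/mini-cluster-dir");
        assertTrue("mkdirs() should succeed on the mini DFS cluster",
            fs.mkdirs(dir));
        assertTrue("directory should exist after mkdirs()", fs.exists(dir));
      }
    }

MiniMRCluster and MiniYARNCluster are used in the same way: start the cluster in a setup method, run the test against it, and shut it down in a teardown method.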

...