Testing Strategies

Currently, Maven only supports unit testing out of the box. What we want is something flexible enough to cover the other kinds of testing without making the test definitions arbitrary.

Types of Tests

Unit Tests

  • run before package
  • generally want 100% success before moving on

Integration Tests

  • require the artifact, so they run after package but before install (see sketch below)
  • generally want 100% success before moving on
  • may use JUnit or another plugin such as Cactus, including deploying to a server
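
For example, one way to express this ordering with the current plugins is to bind a second Surefire execution to the integration-test phase. This is only a sketch; the **/*IT.java naming convention is an assumption, chosen so the default test phase does not pick these classes up:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <executions>
      <execution>
        <id>integration-tests</id>
        <!-- after package, before install -->
        <phase>integration-test</phase>
        <goals>
          <goal>test</goal>
        </goals>
        <configuration>
          <!-- assumed naming convention for integration test classes -->
          <includes>
            <include>**/*IT.java</include>
          </includes>
        </configuration>
      </execution>
    </executions>
  </plugin>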

Acceptance Tests

  • also called functional or regression tests
  • not run every time; probably a report and/or part of a profile
  • run after install, before deploy
  • regression tests (acceptance tests from previous milestones) should be at 100%
  • acceptance (functional) tests only need to reach 100% at release time
  • performance/stress tests are another form of acceptance test

Use cases

  • plugin integration tests (cf. the Eclipse plugin)
  • report integration tests (see Vincent S's work - htmlunit)
  • Cactus plugin
  • Geronimo itest plugin
  • need to be able to compile additional sources and have their own dependencies (basically reapplying the whole test lifecycle again; given that this matches the main lifecycle, perhaps it can be made reusable/forked)
  • we should have a setup and teardown phase around the actual test goal
  • may want to reapply test tooling to other test sets (e.g. JUnit report, Clover)

A separate project is ideal, but:

  • a bit unnatural, more strung out
  • doesn't fit IDEs well
  • you can still use a separate project if in-project tests are supported, so it is better to support the latter

While a separate project might be something we recommend, I think we need to support these tests within the same project.

Design

Plugin integration tests

Testing a plugin effectively means running a separate instance of Maven, so these should always be separate projects utilising the built archive without requiring it to be installed (bound to the integration-test phase).

(question) How can we ensure the integration test of the subproject is run before installation of the parent?
(info) Perhaps it is better to have a plugin integration testing plugin that actually runs from the parent and execs Maven on the subproject.
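
A sketch of what that could look like, assuming a plugin along the lines of the maven-invoker-plugin (the directory layout and the goals run in the sub-builds are illustrative assumptions):

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-invoker-plugin</artifactId>
    <configuration>
      <!-- sample projects that exercise the plugin under test -->
      <projectsDirectory>src/it</projectsDirectory>
      <cloneProjectsTo>${project.build.directory}/it</cloneProjectsTo>
      <!-- stage the freshly built plugin into a throwaway repository rather than installing it -->
      <localRepositoryPath>${project.build.directory}/local-repo</localRepositoryPath>
      <goals>
        <goal>verify</goal>
      </goals>
    </configuration>
    <executions>
      <execution>
        <id>run-plugin-its</id>
        <goals>
          <goal>install</goal>
          <goal>run</goal>
        </goals>
      </execution>
    </executions>
  </plugin>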

Report integration tests

This should be done in an identical fashion to a plugin integration test, but using htmlunit to test the generated output. This means having JUnit test cases.
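
On the dependency side this might look like the sketch below; the groupId and the version property are assumptions, since htmlunit has been published under different coordinates over time. The JUnit test cases using HtmlUnit would then run from the same separate-project setup described for plugin integration tests.

  <dependency>
    <groupId>net.sourceforge.htmlunit</groupId>
    <artifactId>htmlunit</artifactId>
    <version>${htmlunit.version}</version>
    <!-- needed only by the test classes, not shipped with the artifact -->
    <scope>test</scope>
  </dependency>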

Junit Integration Tests

We have an issue here in that we would have to reconfigure the whole plugin, essentially needing to know its configuration. Can we preconfigure it, like we do with the forked lifecycles? Or is it best to just wire this one in by default to the integration-testing parts of the lifecycle and use different mojos (surefire:itest, compiler:itestCompile)?
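
To make the second option concrete, the explicit equivalent of that default wiring might look like the sketch below. The itestCompile and itest mojos are the hypothetical ones named above; they do not exist yet:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <executions>
      <execution>
        <!-- compile the integration test sources just before they are run -->
        <phase>pre-integration-test</phase>
        <goals>
          <goal>itestCompile</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <executions>
      <execution>
        <phase>integration-test</phase>
        <goals>
          <goal>itest</goal>
        </goals>
      </execution>
    </executions>
  </plugin>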

TODO

Cactus Plugin

TODO

Junit Acceptance Tests

These would be set up in the same way as the JUnit integration tests, except for the time at which they are run. The main difference may be that the default configuration would change what is required to pass (per the above, acceptance tests only need to reach 100% at release time). It may be better to have a different plugin for these in general, but the default would also be Surefire, as above.
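
A sketch of the profile variant; the *AcceptanceTest naming convention and the profile id are assumptions, and since there is no standard phase between install and deploy this binds to integration-test. Failures are reported but do not break the build until the configuration is tightened for release:

  <profile>
    <id>acceptance</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <executions>
            <execution>
              <id>acceptance-tests</id>
              <phase>integration-test</phase>
              <goals>
                <goal>test</goal>
              </goals>
              <configuration>
                <!-- the normal test phase would need to exclude this pattern -->
                <includes>
                  <include>**/*AcceptanceTest.java</include>
                </includes>
                <!-- tolerate failures before release time -->
                <testFailureIgnore>true</testFailureIgnore>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>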

Reapplying junit report/clover to other test sets

TODO

Resources


6 Comments

  1. Unknown User (davesag)

    With respect to UATs I would like to see support for FIT-style tests à la FitNesse (http://fitnesse.org).

  2. Unknown User (davesag)

    I've mentioned this before in other forums, but the surefire plugin needs to run with assertions enabled for almost any of my unit tests to pass properly. Currently this can only be done by setting MAVEN_OPTS=-ea in my .profile, which is not a portable solution.

    This is discussed in Jira. http://jira.codehaus.org/browse/MNG-441

  3. Unknown User (martinus)

    Hi, I have just recently started using maven, so apologies if this is nonsense. Nevertheless, this is what I think would be a good way to handle different types of testing:

    I think the test goal should have different stages, for example:

    1. mandatory
    2. default
    3. extensive
    4. everything

    By default, "mvn test" executes all mandatory tests and all default tests. "mvn test::everything" executes all available tests.

    Some different types of tests, and their level:

    • Unit Tests to test individual classes or objects (default)
    • Acceptance tests / Functional tests (everything)
    • Performance Tests: how fast it is (extensive)
    • Load Tests: how it performs under a large load (extensive)
    • Smoke Tests: fast tests for the key functionality (mandatory)
    • Integration Tests: how the pieces work together (default)
    • Mock Client Tests: tests from the client's point of view (default)
    • ...

    It should also be possible to define custom test types, assign them to a level, and to run them.

    Well, at least that's what I would really like to have (smile)

  4. Martin: currently you can use profiles to modify which tests are run, depending on the environment and your command-line parameters.
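
     For example (a sketch; the profile id, activation property, and exclude pattern are all assumptions):

     <profile>
       <id>quick</id>
       <activation>
         <property>
           <!-- activate with: mvn test -Dquick -->
           <name>quick</name>
         </property>
       </activation>
       <build>
         <plugins>
           <plugin>
             <groupId>org.apache.maven.plugins</groupId>
             <artifactId>maven-surefire-plugin</artifactId>
             <configuration>
               <!-- leave the slower test set out of this run -->
               <excludes>
                 <exclude>**/*SlowTest.java</exclude>
               </excludes>
             </configuration>
           </plugin>
         </plugins>
       </build>
     </profile>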

  5. From an architectural viewpoint, I believe that requiring a separate project for integration tests is a poor choice -- integration tests have exactly the same affinity with the sources being tested that unit tests do.  In particular, a source code release of the main project should always include the sources for the integration tests too.  That requires error-prone manual intervention in the current model.

     My suggestion is that you add a series of lifecycle phases similar in operation to the current testing section, but with its own source directory (perhaps it/java, it/resources, etc.).  Most important is that this also needs a separate scope for dependencies ... for example, my Ant-based integration tests of webapps in Shale require HtmlUnit, but the unit tests do not (see the sketch after this comment).

     As I mentioned in MNG-2344, I can sort of live with this for Shale itself, because we can teach our own committers to always run the integration tests project, and teach our release managers to release both projects simultaneously.  But I was hoping to leverage the archetype capabilities of Maven to create "starter" webapp project structures, complete with starter unit and integration tests.  Without the ability to "show by example" that integration tests are important, we're likely to see more under-tested or un-tested webapps in the world (sad).
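
     A sketch of how the separate source directory part of this could be approximated today, assuming the build-helper-maven-plugin's add-test-source goal and the it/java layout suggested above; note that this still does not provide the separate dependency scope asked for:

     <plugin>
       <groupId>org.codehaus.mojo</groupId>
       <artifactId>build-helper-maven-plugin</artifactId>
       <executions>
         <execution>
           <id>add-integration-test-sources</id>
           <phase>generate-test-sources</phase>
           <goals>
             <goal>add-test-source</goal>
           </goals>
           <configuration>
             <!-- compiles alongside the regular test sources -->
             <sources>
               <source>it/java</source>
             </sources>
           </configuration>
         </execution>
       </executions>
     </plugin>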

  6. Unknown User (rauar)

    I agree 100% with Craig's view of this issue, and I see several problems with the separate-project solution.

    • integration (and other) tests have exactly the same importance as unit tests from a project's view - a project should strictly not be allowed to be installed/deployed/released if its integration tests fail. A separate-project approach completely prevents enforcing such a requirement, as a separate project is outside the actual project's lifecycle.
    • test feedback for the developer should be provided when working on the project - not on any other project
    • the only difference between different types of tests (e.g. unit tests vs. integration tests) is IMHO the current build stage of the project:
      • Unit tests - compiled sources, resources, test sources, test resources (currently separated from the actual sources and tied together in the ./test/ folder)
      • Integration tests - precondition of the unit tests AND a prepackaged build (e.g. war:exploded) - currently NOT separated/integrated
      • Acceptance tests - precondition of the integration tests and a completely built project - currently NOT separated/integrated

    Therefore I propose the following project directory structure besides other directories:

    • main/src
    • main/resources
    • unittest/src (earlier test/src)
    • unittest/resources (earlier test/resource)
    • integrationtest/src
    • integrationtest/resources
    • acceptancetest/src (Java sources needed for acceptance tests?)
    • acceptancetest/resources

    This would by convention allow different (re-)sources to be used at different stages of the build. Compatibility could be preserved if the ./test/ folder were treated as an alias for the ./unittest/ folder.

    A second step would be to find a solution for aggregating common output (e.g. Clover reports, which should be collected over integration tests as well, perhaps even over acceptance tests as far as they are automated).

    I'm pretty new to the Maven business, but this would be my intuitive view of a Maven solution. I can't seriously estimate the effort.