...

Verifying a Release Candidate

Legal: (required checks for a valid ASF compliant release)

  • Check that the checksums and GPG signatures match the corresponding release files
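    These checks can be scripted roughly as follows (file names are placeholders for the actual release artifacts; the checksum file format may differ between releases):

        # Import the Flink release signing keys
        wget https://dist.apache.org/repos/dist/release/flink/KEYS
        gpg --import KEYS

        # Verify the GPG signature of the source archive
        gpg --verify flink-<version>-src.tgz.asc flink-<version>-src.tgz

        # Verify the checksum
        sha512sum -c flink-<version>-src.tgz.sha512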

  • Check if the source archive contains any binaries
    • binaries are not allowed in the source release
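    One way to scan for binaries, assuming the source archive has been extracted (a rough sketch, not an exhaustive check):

        # List files that the 'file' utility does not classify as text;
        # anything reported here needs manual inspection
        find flink-<version>-src -type f -exec file {} \; | grep -v -i 'text\|empty'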

  • Check that the source release builds with Maven, including the license header check (enabled by default) and checkstyle, and that the tests are executed (mvn clean install)
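    For example, from the extracted source directory:

        cd flink-<version>-src
        # Builds all modules and runs the license header check, checkstyle, and the tests
        mvn clean install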

  • Verify that the LICENSE and NOTICE files are correct for both the binary and the source release.
    • All dependencies must be checked for their license and the license must be ASL 2.0 compatible (http://www.apache.org/legal/resolved.html#category-x)
    • The LICENSE and NOTICE files in the root directory refer to dependencies in the source release, i.e., files in the git repository (such as fonts, css, JavaScript, images)
    • The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer to the binary distribution and mention all of Flink's Maven dependencies as well
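    To get an overview of the Maven dependencies that have to be covered by LICENSE and NOTICE, the resolved dependency list can be generated per module, e.g. (the output still needs manual review against the ASF category list):

        # Appends groupId:artifactId:version:scope of every module's dependencies to one file
        mvn dependency:list -DoutputFile=/tmp/flink-dependencies.txt -DappendOutput=true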

  • Check that all POM files point to the same version (this is mostly relevant for the quickstart artifact files)
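    A quick sketch to spot version mismatches (the quickstart archetype resources deserve a closer look on top of this):

        # Print the first <version> tag of every pom.xml; all values should match the RC version
        find . -name pom.xml -exec sh -c 'printf "%s: " "$1"; grep -m1 "<version>" "$1"' _ {} \;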

  • Check the README.md file

 

Functional: (checks to ensure a release of good quality)

  • Run the start-local.sh, start-cluster.sh, start-webclient.sh scripts and verify that the processes come up
    • Examine the *.out files (should be empty) and the log files (should contain no exceptions)
    • Test on Linux, OS X, and Windows (on Windows as far as possible, since not all scripts exist there)
    • Shut down and verify that there are no exceptions in the log output (after shutdown)
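    A rough sequence for the local setup (script names as listed above; the stop scripts are assumed to mirror the start scripts; run from the unpacked binary distribution):

        ./bin/start-local.sh
        ./bin/start-webclient.sh
        jps                              # the Flink JVM processes should be listed

        # *.out files should be empty, log files free of exceptions
        cat log/*.out
        grep -i exception log/*.log

        ./bin/stop-webclient.sh
        ./bin/stop-local.sh
        grep -i exception log/*.log      # re-check after shutdown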
  • Verify that the examples run from both ./bin/flink and from the web-based job submission tool
    • Should be run in
      • local mode (start-local.sh)
      • cluster mode (start-cluster.sh)
      • multi-node cluster (can be simulated locally by starting two taskmanagers)
    • The flink-conf.yaml should define more than one task slot
    • Results of the job are produced and correct
    • Check also that the examples run with the built-in data and with external sources
    • Examine the log output - no error messages should be encountered
    • Web interface shows progress and finished job in history
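    For instance, running the bundled WordCount example against the local setup (the exact example jar path and argument style differ between releases):

        # Run with the built-in data
        ./bin/flink run ./examples/WordCount.jar

        # Run with an external input and output path
        ./bin/flink run ./examples/WordCount.jar file:///tmp/input.txt file:///tmp/result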

  • Test on a cluster with HDFS.
    • Check that a good portion of the input splits is read locally
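    For example, by pointing the same example at HDFS paths (cluster and paths are placeholders) and then inspecting the JobManager log for local split assignments:

        ./bin/flink run ./examples/WordCount.jar hdfs:///user/flink/input/data.txt hdfs:///user/flink/output/wordcount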
  • Verify that Flink is working properly on OS X, Linux and Windows
    • No exceptions in the log output (after shutdown)
    • Web interface shows progress and finished job in history
    • Results of job are produced and correct
    • (JobManager log reveals local assignments)

  • Test the ./bin/flink command line client
    • Test the "info" option: paste the JSON output into the plan visualizer HTML file and check that the plan is rendered
    • Test the parallelism flag (-p) to override the configured default parallelism
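    Example invocations (jar path as above; CLI options may vary by version):

        # Prints the execution plan as JSON (older CLI versions may require an extra option);
        # paste it into the plan visualizer HTML file
        ./bin/flink info ./examples/WordCount.jar

        # Override the configured default parallelism with -p
        ./bin/flink run -p 4 ./examples/WordCount.jar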

  • Verify the plan visualizer with different browsers/operating systems

  • Verify that the quickstarts for Scala and Java work with the staging repository, for both IntelliJ and Eclipse.
    • In particular, the dependencies of the quickstart project need to be set correctly, and the quickstart project needs to build from the staging repository (replace the snapshot repo URL with the staging repo URL)
    • The dependency tree of the quickstart project must not contain any dependencies we shade away upstream (guava, netty, ...)
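    The shading check can be done with the Maven dependency tree, run inside the generated quickstart project after it has been pointed at the staging repository:

        # None of the shaded-away dependencies should show up here
        mvn dependency:tree | grep -iE 'guava|netty'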

  • Run examples on a YARN cluster
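    A sketch for the YARN test (option names differ between Flink versions; consult the documentation of the release under test):

        # Start a YARN session with two TaskManagers and submit an example to it
        ./bin/yarn-session.sh -n 2 &
        ./bin/flink run ./examples/WordCount.jar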

  • Run all examples from the IDE (Eclipse & IntelliJ)

  • Run an example with the RemoteEnvironment against a cluster started from the shell script

 

Creating a release candidate

...