...

Expand
titleBuild a release candidate

The core of the release process is the build-vote-fix cycle. Each cycle produces one release candidate. The Release Manager repeats this cycle until the community approves one release candidate, which is then finalized.

Build and stage Java and Python artifacts

Set up a few environment variables to simplify Maven commands that follow. This identifies the release candidate being built. Start with RC_NUM equal to 1 and increment it for each candidate.

Code Block
languagebash
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
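
The commands in this guide also assume that RELEASE_VERSION and CURRENT_SNAPSHOT_VERSION are set; a minimal sketch with purely illustrative values:

Code Block
languagebash
# illustrative values only -- substitute the actual versions for your release
RELEASE_VERSION="1.17.0"
CURRENT_SNAPSHOT_VERSION="1.17-SNAPSHOT"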

Now, create a release branch ((warning) this step cannot be skipped for minor releases):

Code Block
languagebash
$ cd tools
tools $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh

Tag the release commit:

Code Block
git tag -s ${TAG} -m "${TAG}"

You can use -c "user.signingkey=<gpg-fingerprint>" as an additional git parameter if you have multiple signing keys in your keychain.
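
For example, tagging with an explicit signing key might look like this (the fingerprint placeholder is illustrative):

Code Block
languagebash
git -c "user.signingkey=<gpg-fingerprint>" tag -s ${TAG} -m "${TAG}"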

We now need to do several things:

  • Create the source release archive
  • Deploy jar artifacts to the Apache Nexus Repository, which is the staging area for deploying the jars to Maven Central
  • Build PyFlink wheel packages (since 1.11)

You might want to create a directory on your local machine for collecting the various source and binary releases before uploading them. Creating the binary releases is a lengthy process, but you can do it on another machine (for example, in the "cloud"). When doing so, you can skip signing the release files on the remote machine, download them to your local machine, and sign them there.
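
A minimal sketch of such a collection setup (the directory name and host placeholders are purely illustrative):

Code Block
languagebash
# scratch directory on your local machine for collecting release files before upload
mkdir -p ~/flink-release-collect
# after a remote binary build, pull the unsigned files down and sign them locally
scp <remote-host>:<remote-flink-dir>/tools/releasing/release/* ~/flink-release-collect/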

First, we build the source release:

Code Block
languagebash
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh

Next, we stage the maven artifacts:

Code Block
languagebash
tools $ releasing/deploy_staging_jars.sh

Review all staged artifacts in the staging repositories (https://repository.apache.org/#stagingRepositories). They should contain all relevant parts for each module, including pom.xml, jar, test jar, source, test source, javadoc, etc. Carefully review any new artifacts.

Close the staging repository on Apache Nexus. When prompted for a description, enter “Apache Flink, version X, release candidate Y”.

Then, you need to build the PyFlink wheel packages (since 1.11):

  1. Set up an Azure Pipeline in your own Azure account. You can refer to Azure Pipelines for more details on how to set up an Azure Pipeline for a fork of the Flink repository. Note that a Google Cloud mirror in Europe is used for downloading Maven artifacts, so it is recommended to set your Azure organization region to Europe to speed up the downloads.
  2. Push the release candidate branch to your forked personal Flink repository, e.g.

    Code Block
    languagebash
    tools $ git push <remote> refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}


  3. Trigger the Azure Pipelines manually to build the PyFlink wheel packages
    1. Go to your Azure Pipelines Flink project → Pipelines
    2. Click the "New pipeline" button on the top right
    3. Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines YAML file"
    4. Select your branch → Set path to "/azure-pipelines.yaml" → click on "Continue" → click on "Variables"
    5. Then click the "New variable" button, set the name to "MODE" and the value to "release". Click "OK" to set the variable and the "Save" button to save the variables, then back on the "Review your pipeline" screen click "Run" to trigger the build.
    6. You should now see a build where only the "CI build (release)" is running
  4. Download the PyFlink wheel packages from the build result page after the "build_wheels mac" and "build_wheels linux" jobs have finished.
    1. Download the PyFlink wheel packages
      1. Open the build result page of the pipeline
      2. Go to the `Artifacts` page (build_wheels linux -> 1 artifact)
      3. Click `wheel_Darwin_build_wheels mac` and `wheel_Linux_build_wheels linux` separately to download the zip files
    2. Unzip these two zip files

      Code Block
      languagebash
      $ cd /path/to/downloaded_wheel_packages
      $ unzip wheel_Linux_build_wheels\ wheel_Linux_build_wheels_on_Linux.zip
      $ unzip wheel_Darwin_build_wheels\ wheel_Darwin_build_wheels_on_macos.zip


    3. Create directory `dist` under the directory of flink-python

      Code Block
      languagebash
      $ cd <flink-dir>
      $ mkdir flink-python/dist


    4. Move the unzipped wheel packages to the directory of flink-python/dist

      Code Block
      languagebash
      $ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
      $ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
      $ cd tools


Finally, we create the binary convenience release files:

Code Block
languagebash
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh

If you want to run this step in parallel on a remote machine, you have to make the release commit available there (for example by pushing it to a repository). This is important: the commit inside the binary builds has to match the commit of the source builds and the tagged release commit. When building remotely, you can skip gpg signing by setting SKIP_GPG=true. You would then sign the files manually after downloading them to your machine:

Code Block
languagebash
for f in flink-*-bin*.tgz; do gpg --armor --detach-sig $f; done

gpg --armor --detach-sig apache-flink-*.tar.gz
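
For reference, the remote build invocation with signing skipped might look like this (a minimal sketch; SKIP_GPG comes from the paragraph above, everything else mirrors the local build):

Code Block
languagebash
tools $ SKIP_GPG=true RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh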

The release manager needs to make sure that the PyPI projects `apache-flink` and `apache-flink-libraries` have enough available space for the Python artifacts. The remaining space must be larger than the size of `tools/releasing/release/python`. Log in with the PyPI admin account (account info is only available to PMC members) and check the remaining space in the project settings.

Request an increase if there is not enough space. Note that it can take several days for PyPI to review the request.
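
To check how much space the Python artifacts will need, you can measure the release directory locally (a minimal sketch; the path is the default output location of the release scripts):

Code Block
languagebash
# run from the root of the Flink repository after the release scripts have produced their output
du -sh tools/releasing/release/python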

Stage source and binary releases on dist.apache.org

Copy the source release to the dev repository of dist.apache.org.

  1. If you have not already, check out the Flink section of the dev repository on dist.apache.org via Subversion. In a fresh directory:

    Code Block
    languagebash
    svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates


  2. Make a directory for the new release:

    Code Block
    languagebash
    mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
    Copy the Flink source/binary distributions, hashes, GPG signatures, and the python subdirectory:
    Code Block
    mv <flink-dir>/tools/releasing/release/* flink/flink-${RELEASE_VERSION}-rc${RC_NUM}


  3. Add and commit all the files.

    Code Block
    languagebash
    cd flink 
    svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
    svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"


  4. Verify that the files are present.
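
    A minimal sketch for this check, listing the uploaded files directly from the dev repository:

    Code Block
    languagebash
    svn list https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}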

(Push the release tag)

If you haven't pushed the release tag yet, here's the command:

Code Block
git push <remote> refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}

Propose a pull request for website updates

The final step of building the candidate is to propose a website pull request containing the following changes:

  • update docs/data/flink.yml
    • Add a new major version or update minor version as required
  • update docs/data/release_archive.yml
  • update version references in quickstarts (q/ directory) as required (outdated?)
  • add a blog post announcing the release in _posts
  • (major, or minor release of latest stable version) update FlinkStableVersion in docs/config.toml to point to the release version.
  • (major only) add an organized release notes page under docs/content/release-notes and docs/content.zh/release-notes (like https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15/). The page is based on the non-empty release notes collected from the issues; only issues that affect existing users should be included (i.e., not new functionality). It should be in a separate PR since it will be merged into the flink project.

Don’t merge the PRs before finalizing the release.

Checklist to proceed to the next step

  1. Maven artifacts deployed to the staging repository of repository.apache.org
  2. Source distribution deployed to the dev repository of dist.apache.org
  3. Website pull request proposed to list the release
  4. (major only) Check docs/config.toml to ensure that
    • the version constants refer to the new version
    • the baseurl does not point to flink-docs-master but to flink-docs-release-X.Y instead

You can (optionally) also do additional verification by:

  1. Check hashes (e.g. shasum -c *.sha512)
  2. Check signatures (e.g. gpg --verify flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz)
  3. Grep for legal headers in each file (see the sketch after this list).
  4. If time allows, check in advance the NOTICE files of the modules whose dependencies have changed in this release, since license issues pop up from time to time during voting. See the "Checking License" section of Verifying a Flink Release.
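
A minimal sketch of the license-header check mentioned above (the header string and the file filter are illustrative; adjust them to the file types you want to scan):

Code Block
languagebash
# run inside the extracted source release: list files that do NOT contain the ASF license header
grep -rL "Licensed to the Apache Software Foundation" --include='*.java' .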

...

Expand
titleFinalize the release

Once the release candidate has been reviewed and approved by the community, the release should be finalized. This involves the final deployment of the release candidate to the release repositories, merging of the website changes, etc.

Deploy Python artifacts to PyPI (Since 1.9)

The release manager should create a PyPI account and ask the PMC to add this account to the pyflink collaborator list with the Maintainer role (the PyPI admin account info can be found here; NOTE: only visible to PMC members) in order to deploy the Python artifacts to PyPI. The artifacts can be uploaded using twine (https://pypi.org/project/twine/). To install twine, run:

Code Block
languagetext
pip install --upgrade twine==1.12.0

Note: Please ensure that the version of urllib3 in your Python environment is less than 2.0; otherwise you may encounter errors such as "unexpected keyword argument 'method_whitelist'".
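
For example, you can pin urllib3 below 2.0 in the environment used for uploading (a minimal sketch):

Code Block
languagetext
pip install 'urllib3<2.0'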

Info
titleapitoken

Please note that username/password authentication is no longer supported; use API tokens instead: https://pypi.org/help/#apitoken

Download the python artifacts from dist.apache.org and upload them to pypi.org:

Code Block
languagetext
svn checkout https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
cd flink-${RELEASE_VERSION}-rc${RC_NUM}

cd python

# upload wheels
for f in *.whl; do twine upload -r pypi $f; done

# upload source packages
twine upload -r pypi apache-flink-libraries-${RELEASE_VERSION}.tar.gz

twine upload -r pypi apache-flink-${RELEASE_VERSION}.tar.gz

PyPI has removed support for GPG signatures, so signatures should no longer be uploaded to PyPI.

If an upload fails or is incorrect for some reason (e.g., a network transmission problem), you need to delete the already uploaded release package of the same version (if it exists), rename the artifact to apache-flink-${RELEASE_VERSION}.post0.tar.gz, and re-upload it.

Note: re-uploading to pypi.org must be avoided as much as possible because it causes irreparable problems. If it happens, users can no longer install the apache-flink package by explicitly specifying the original package version, i.e. "pip install apache-flink==${RELEASE_VERSION}" will fail. Instead, they have to run "pip install apache-flink" or "pip install apache-flink==${RELEASE_VERSION}.post0" to install the apache-flink package.

Deploy artifacts to Maven Central Repository

Use the Apache Nexus repository to release the staged binary artifacts to the Maven Central repository. In the Staging Repositories section, find the relevant release candidate orgapacheflink-XXX entry and click Release. Drop all other release candidates that are not being released.

Deploy source and binary releases to dist.apache.org

Copy the source and binary releases from the dev repository to the release repository at dist.apache.org using Subversion.

Code Block
svn move -m "Release Flink ${RELEASE_VERSION}" https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM} https://dist.apache.org/repos/dist/release/flink/flink-${RELEASE_VERSION}

(Note: Only PMC members have access to the release repository. If you do not have access, ask on the mailing list for assistance.)

Remove old release candidates from dist.apache.org

Remove the old release candidates from https://dist.apache.org/repos/dist/dev/flink using Subversion.

Code Block
languagebash
titleRemove old release candidates from dist.apache.org
svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
cd flink
svn remove flink-${RELEASE_VERSION}-rc*
svn commit -m "Remove old release candidates for Apache Flink ${RELEASE_VERSION}"

Git tag

Create and push a new Git tag for the released version by copying the tag for the final release candidate, as follows:

Code Block
git tag -s "release-${RELEASE_VERSION}" refs/tags/${TAG}^{} -m "Release Flink ${RELEASE_VERSION}"
git push <remote> refs/tags/release-${RELEASE_VERSION}

Mark the version as released in JIRA

In JIRA, inside version management, hover over the current release and a settings menu will appear. Click Release, and select today’s date.

(Note: Only PMC members have access to the project administration. If you do not have access, ask on the mailing list for assistance.)

If PRs have been merged to the release branch after the last release candidate was tagged, make sure that the corresponding Jira tickets have the correct Fix Version set.

Publish the Dockerfiles for the new release

Note: the official Dockerfiles fetch the binary distribution of the target Flink version from an Apache mirror. After publishing the binary release artifacts, mirrors can take some hours to start serving the new artifacts, so you may want to wait to do this step until you are ready to continue with the "Promote the release" steps below.
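
A quick way to check whether the artifacts are already being served (a minimal sketch; the host and URL layout follow the standard Apache download location and are an assumption):

Code Block
languagebash
# expect HTTP 200 once the release directory has propagated
curl -sI https://downloads.apache.org/flink/flink-${RELEASE_VERSION}/ | head -n 1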

Follow the instructions in the flink-docker repo to build the new Dockerfiles and send an updated manifest to Docker Hub so the new images are built and published.

Info
titlepublishing to DockerHub: apache/flink

Please make sure "publishing to DockerHub: apache/flink" is finished before the announce mail or announcement blog post is sent.


Checklist to proceed to the next step

  • Python artifacts released and indexed in the PyPI Repository https://pypi.org/project/apache-flink/#history
  • Maven artifacts released and indexed in the Maven Central Repository (usually takes about a day to show up; see the sketch after this list)
  • Source & binary distributions available in the release repository of https://dist.apache.org/repos/dist/release/flink/
  • Dev repository https://dist.apache.org/repos/dist/dev/flink/ is empty
  • Release tagged in the source code repository
  • Release version finalized in JIRA. (Note: Not all committers have administrator access to JIRA. If you end up getting permissions errors ask on the mailing list for assistance)
  • Website contains links to new release binaries and sources in download page
  • For major releases, the front page references the correct new major release version and directs to the correct link
  • Dockerfiles in flink-docker updated for the new Flink release and pull request opened on the Docker official-images with an updated manifest
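
For the Maven Central check above, a minimal sketch (flink-core is used as a representative artifact; any released module works):

Code Block
languagebash
# expect HTTP 200 once the artifacts are indexed
curl -sI https://repo1.maven.org/maven2/org/apache/flink/flink-core/${RELEASE_VERSION}/ | head -n 1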

...