General Concept

At Apache we have usually set up build jobs manually for every project, and usually more than one per project, especially when it comes to branched development. This can be annoying.

The pattern was to manually duplicate the develop/master build and change the branch it checks out and monitors. 

The Multi-Branch Pipeline Plugin adds a new type of job to Jenkins. This job is a lot simpler to set up than other build jobs because it hardly needs to be set up at all: the build instructions become part of the project itself.

The new job type monitors the branches of a given repository and checks whether they contain a file called "Jenkinsfile". For every branch that does, a build job is automatically configured.

The Jenkinsfile contains all the information Jenkins needs to execute the build.

If you have set up a correct Jenkinsfile in your develop branch, every branch created from it will automatically have its build set up, without any manual intervention.

A Jenkinsfile used to be a plain Groovy script in which the different build steps were coded (the so-called scripted pipeline); the preferred way today is the declarative Jenkinsfile Groovy DSL.
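
For comparison, a scripted-style Jenkinsfile is just ordinary Groovy built around node and stage blocks. The following is only a minimal sketch; the node label "ubuntu" and the Maven call are placeholders, not taken from a real project setup:

#!groovy
// Scripted pipeline: plain Groovy code, no surrounding 'pipeline {}' block.
node('ubuntu') {
    stage('Checkout') {
        // Check out the branch this build was triggered for.
        checkout scm
    }
    stage('Build') {
        // Placeholder build command.
        sh 'mvn clean verify'
    }
}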

With the declarative DSL you define the different stages of your build. Here is an example skeleton:

#!groovy
pipeline {

    agent {
        node {
            label 'ubuntu'
        }
    }

    environment {
		// ... setup any environment variables ...
    }

    tools {
		// ... tell Jenkins what java version, maven version or other tools are required ...
    }

    options {
        // Configure an overall timeout of one hour for the build.
        timeout(time: 1, unit: 'HOURS')
        // If there are test failures, for example, there is no need to run the remaining stages.
        skipStagesAfterUnstable()
    }

    stages {
        stage('Initialization') {
            steps {
				// ... define any initialization ...
            }
        }

        stage('Cleanup') {
            steps {
				// ... define any cleanup operations ...
            }
        }

        stage('Checkout') {
            steps {
				// ... checkout the current branch ...
            }
        }

        stage('Build') {
            when {
                expression {
                    env.BRANCH_NAME != 'develop'
                }
            }
            steps {
				// ... perform the build of any non-develop branch ...
            }
            post {
                always {
                    junit(testResults: '**/surefire-reports/*.xml', allowEmptyResults: true)
                    junit(testResults: '**/failsafe-reports/*.xml', allowEmptyResults: true)
                }
            }
        }

        stage('Build develop') {
            when {
                branch 'develop'
            }
            steps {
				// ... perform the build of the develop branch ...
            }
            post {
                always {
                    junit(testResults: '**/surefire-reports/*.xml', allowEmptyResults: true)
                    junit(testResults: '**/failsafe-reports/*.xml', allowEmptyResults: true)
                }
            }
        }

        stage('Code Quality') {
            when {
                branch 'develop'
            }
            steps {
				// ... perform code quality checks ...
            }
        }

        stage('Deploy') {
            when {
                branch 'develop'
            }
            steps {
				// ... deploy snapshots ...
            }
        }

        stage('Build site') {
            when {
                branch 'develop'
            }
            steps {
				// ... generate the projects website ...
            }
        }

        stage('Stage site') {
            when {
                branch 'develop'
            }
            steps {
				// ... stage the projects website ...
            }
        }

        stage('Deploy site') {
            when {
                branch 'develop'
            }
            steps {
				// ... deploy the projects website ...
            }
        }
    }

	// Do any post-build work, such as sending emails depending on the overall build result.
    post {
        // If this build failed, send an email to the list.
        failure {
        }

        // If this build didn't fail, but there were failing tests, send an email to the list.
        unstable {
        }

        // Send an email, if the last build was not successful and this one is.
        success {
        }

        always {
        }
    }

}

The above skeleton defines a build that runs on a Jenkins build agent labeled with "ubuntu". It has multiple stages, not all of which run in every case. 

All stages up to "Build" or "Build develop" are always executed. For non-develop branches, however, only the "Build" stage runs; it might just compile and run all the tests, and nothing beyond that happens.

If it's the "develop" branch, however, after building, the QA checks are done, Snapshots are deployed and the website is generated and deployed. 

Preparing your IDE

While it is possible to write your Jenkinsfile without any tooling, it helps a lot, especially if you're new to this, if your IDE can provide content assist, auto-completion and validation. Depending on your IDE of choice, the setup differs greatly.

Probably the two resources you will need are the GDSL file generated by Jenkins and a declarative pipeline definition available here: https://gist.github.com/arehmandev/736daba40a3e1ef1fbe939c6674d7da8

Save this file to some directory inside your project (it doesn't have to be checked in).

In order to get the GDSL file Jenkins generates, log in to the ASF Jenkins at https://builds.apache.org and select a Multi-Branch Pipeline build. You can do this by selecting "All" build jobs and looking for the pipeline build icon:

Select that build and open the "Pipeline Syntax" link:

This opens a page that generates different code snippets.

IntelliJ IDEA

To have IntelliJ help you, make sure IntelliJ treats Jenkinsfiles as Groovy files. Go to "Preferences" and select "Editor"/"File Types".

In the "Recognized File Types" list, select "Groovy". Then, in the "Registered Patterns" section, click the "+" button and add the pattern "Jenkinsfile" to the list.

After that IntelliJ will treat Jenkinsfiles as Groovy files. Unfortunately it still doesn't help much regarding content assist and code completion.

Jenkins provides a way to download a so-called GDSL file. To get it, go to https://builds.apache.org and open the "Pipeline Syntax" page as described in the general "Preparing your IDE" section.

There, click on the "Download IDEA GDSL" link:

Save this file somewhere in your project (doesn't have to be checked in). Ideally you should add it to a directory dedicated to containing such Groovy files.

For IntelliJ to pick this file up, make the directory containing the GDSL file a source directory: open IntelliJ IDEA's "Project Structure" settings, go to "Project Settings"/"Modules" and select the module containing that directory. On the "Sources" tab, right-click the directory with the GDSL file and mark it as "Sources".

Now you should be all set for editing your Jenkinsfile.

Deploying Artifacts

When using tools like Maven, you deploy new artifacts as part of the build process. 

In some projects it is hard to deploy snapshots because the dedicated project build node doesn't have the credentials to deploy to Nexus. To solve this, split build and deploy into two steps. In the "Build" stage, run a normal Maven build, but provide an alternate deployment repository that points to a local directory. After the build, this directory contains all the files a normal build would have transferred to Nexus.

Next, run the actual deployment on a node that has the credentials to deploy to Nexus. Transfer the artifacts to it using Jenkins' "stash" and "unstash" steps: "stash" packs up the content of a directory on the build node, and "unstash" unpacks it on the deploy node.

Here is the relevant part of our Jenkinsfile:

#!groovy
pipeline {

    agent {
        node {
            label 'plc4x'
        }
    }

	// ... snip ...

    stages {

		// ... snip ...

        stage('Build develop') {
            steps {
                echo 'Building'
                // Make sure the directory is wiped.
                dir("local-snapshots-dir/") {
                    deleteDir()
                }

                // We'll deploy to a relative directory so we can save
                // that and deploy in a later step on a different node
                sh 'mvn -DaltDeploymentRepository=snapshot-repo::default::file:./local-snapshots-dir clean deploy'

                // Stash the build results so we can deploy them on another node
                stash name: 'plc4x-build-snapshots', includes: 'local-snapshots-dir/**'
            }
        }

		// ... snip ...


        stage('Deploy') {
            // Only the official build nodes have the deployment credentials set up.
            agent {
                node {
                    label 'ubuntu'
                }
            }
            steps {
                echo 'Deploying'
                // Clean up the snapshots directory.
                dir("local-snapshots-dir/") {
                    deleteDir()
                }

                // Unstash the previously stashed build results.
                unstash name: 'plc4x-build-snapshots'

                // Deploy the artifacts using the wagon-maven-plugin.
                sh 'mvn -f jenkins.pom -X -P deploy-snapshots wagon:upload'
            }
        }

		// ... snip ...
    }

	// ... snip ...

}

The content of the "jenkins.pom" is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>org.apache</groupId>
        <artifactId>apache</artifactId>
        <version>21</version>
    </parent>

    <groupId>org.apache.plc4x</groupId>
    <artifactId>plc4x-jenkins-tools</artifactId>
    <version>0.2.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <name>PLC4X: Jenkins Tools</name>
    <description>Set of helpers to do individual tasks only needed on our Jenkins build.</description>



	<!-- ... snip ... -->

    <profiles>
        <!--
            This profile is used to deploy all the artifacts in the
            'local-snapshots-dir' to Apache's SNAPSHOT repo.
        -->
        <profile>
            <id>deploy-snapshots</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>wagon-maven-plugin</artifactId>
                        <version>2.0.0</version>
                        <configuration>
                            <fromDir>${project.basedir}/local-snapshots-dir</fromDir>
                            <includes>**</includes>
                            <serverId>apache.snapshots.https</serverId>
                            <url>${distMgmtSnapshotsUrl}</url>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
        </profile>


		<!-- ... snip ... -->

    </profiles>

</project>

Note that the second wipe in the "Deploy" stage is necessary: the initial wipe only cleaned things up on the build node, not on the deploy node. Because the artifacts are not named "mymodule-1.2.3-SNAPSHOT" but use the timestamped SNAPSHOT format, more and more versions would otherwise accumulate on the deploy node, increasing the deployment time with every build.

Deploying Website Content

Similar to the problem of not being able to deploy to Nexus from every node, only Jenkins nodes labeled "git-websites" are able to push to an ASF Git repository's "asf-site" branch (and only to that branch).

To deal with this, again split website generation and deployment into two separate steps. Run the generation on the build node and the deployment on a "git-websites" node. Transferring the results from one node to the other is again handled by Jenkins' "stash" and "unstash" steps.

Here is an example Jenkinsfile:

#!groovy
pipeline {

    agent {
        node {
            label 'plc4x'
        }
    }

	// ... snip ...

    stages {


		// ... snip ...

        stage('Build site') {
            when {
                branch 'develop'
            }
            steps {
                echo 'Building Site'
                sh 'mvn -P${JENKINS_PROFILE} site'
            }
        }

        stage('Stage site') {
            when {
                branch 'develop'
            }
            steps {
                echo 'Staging Site'
                sh 'mvn -P${JENKINS_PROFILE} site:stage'
                // Stash the generated site so we can publish it on a 'git-websites' node.
                stash includes: 'target/staging/**/*', name: 'plc4x-site'
            }
        }

        stage('Deploy site') {
            when {
                branch 'develop'
            }
            // Only the nodes labeled 'git-websites' have the credentials to commit to the asf-site branch.
            agent {
                node {
                    label 'git-websites'
                }
            }
            steps {
                echo 'Deploying Site'
                // Unstash the previously stashed site.
                unstash 'plc4x-site'
                // Publish the site with the scm-publish plugin.
                sh 'mvn -f jenkins.pom -X -P deploy-site scm-publish:publish-scm'
            }
        }
    }

	// ... snip ...

}

The content of the "jenkins.pom" is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>org.apache</groupId>
        <artifactId>apache</artifactId>
        <version>21</version>
    </parent>

    <groupId>org.apache.plc4x</groupId>
    <artifactId>plc4x-jenkins-tools</artifactId>
    <version>0.2.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <name>PLC4X: Jenkins Tools</name>
    <description>Set of helpers to do individual tasks only needed on our Jenkins build.</description>

    <!-- We are publishing the site to a different repository -->
    <distributionManagement>
        <site>
            <id>apache.website</id>
            <url>scm:git:https://gitbox.apache.org/repos/asf/incubator-plc4x-website.git</url>
        </site>
    </distributionManagement>

    <profiles>


		<!-- ... snip ... -->

        <profile>
            <id>deploy-site</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-scm-publish-plugin</artifactId>
                        <configuration>
                            <!-- mono-module doesn't require site:stage -->
                            <content>${project.build.directory}/staging</content>
                            <!-- branch where to deploy -->
                            <scmBranch>asf-site</scmBranch>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>

</project>

To avoid cluttering the commit history and confusing GUI tools, deploy to an alternate Git repository that contains only the website. This is configured in the distributionManagement section of the pom above.

SonarQube Analysis

The SonarQube instance on builds.apache.org was protected with a login, which caused some problems with our builds. Here is how we got the analysis working.

Here's the part of our Jenkinsfile that handles the code analysis:

#!groovy
pipeline {

	// ... snip ...

    stages {

		// ... snip ...

        stage('Code Quality') {
            steps {
                echo 'Checking Code Quality'
                withSonarQubeEnv('ASF Sonar Analysis') {
                    sh 'mvn -P${JENKINS_PROFILE} sonar:sonar'
                }
            }
        }

		// ... snip ...

    }

	// ... snip ...

}

The important part is the "withSonarQubeEnv" block, which injects the credentials for accessing the ASF SonarQube server.

The important parts of the pom.xml are as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>org.apache</groupId>
    <artifactId>apache</artifactId>
    <version>21</version>
  </parent>

  <groupId>org.apache.plc4x</groupId>
  <artifactId>plc4x-parent</artifactId>
  <version>0.3.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- ... snip ... -->

  <properties>
	<!-- ... snip ... -->

    <!-- URL of the ASF SonarQube server -->
    <sonar.host.url>https://builds.apache.org/analysis</sonar.host.url>
    <!-- Exclude all generated code -->
    <sonar.exclusions>**/generated-sources</sonar.exclusions>


	<!-- ... snip ... -->
  </properties>

  <!-- ... snip ... -->

  <build>


	<!-- ... snip ... -->

    <pluginManagement>
      <plugins>


		<!-- ... snip ... -->

        <plugin>
          <groupId>org.sonarsource.scanner.maven</groupId>
          <artifactId>sonar-maven-plugin</artifactId>
          <version>3.5.0.1254</version>
        </plugin>


		<!-- ... snip ... -->

      </plugins>
    </pluginManagement>
  </build>

  <!-- ... snip ... -->

</project>

In the pom, pin the sonar-maven-plugin to a version matching the ASF SonarQube installation and set the sonar.host.url property so the plugin knows which server to connect to.

Sending Emails

While sending emails from normal Jenkins jobs is easy, with a Multi-Branch Pipeline build it is more involved because you have to set the notifications up manually in the Jenkinsfile. Fortunately you only have to configure this once and can then almost copy and paste it into any other Multi-Branch project:

#!groovy
pipeline {

	// ... snip ...

	options {
	
		// ... snip ...

	    // If there are test failures, for example, there is no need to run the remaining stages.
	    skipStagesAfterUnstable()
	}

    stages {


		// ... snip ...

        stage('Build') {
            steps {


				// ... snip ...

            }
            post {
                always {
                    junit(testResults: '**/surefire-reports/*.xml', allowEmptyResults: true)
                    junit(testResults: '**/failsafe-reports/*.xml', allowEmptyResults: true)
                }
            }
        }

		// ... snip ...

    }

    // Send out notifications on unsuccessful builds.
    post {
        // If this build failed, send an email to the list.
        failure {
            script {
                if(env.BRANCH_NAME == "develop") {
                    emailext(
                        subject: "[BUILD-FAILURE]: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]'",
                        body: """
BUILD-FAILURE: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]':

Check console output at "<a href="${env.BUILD_URL}">${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]</a>"
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }

        // If this build didn't fail, but there were failing tests, send an email to the list.
        unstable {
            script {
                if(env.BRANCH_NAME == "develop") {
                    emailext(
                        subject: "[BUILD-UNSTABLE]: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]'",
                        body: """
BUILD-UNSTABLE: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]':

Check console output at "<a href="${env.BUILD_URL}">${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]</a>"
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }

        // Send an email, if the last build was not successful and this one is.
        success {
            script {
                if ((env.BRANCH_NAME == "develop") && (currentBuild.previousBuild != null) && (currentBuild.previousBuild.result != 'SUCCESS')) {
                    emailext (
                        subject: "[BUILD-STABLE]: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]'",
                        body: """
BUILD-STABLE: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]':

Is back to normal.
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }

        always {
            script {
                if(env.BRANCH_NAME == "master") {
                    emailext(
                        subject: "[COMMIT-TO-MASTER]: A commit to the master branch was made'",
                        body: """
COMMIT-TO-MASTER: A commit to the master branch was made:

Check console output at "<a href="${env.BUILD_URL}">${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]</a>"
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }
    }

}

In the options block, tell Jenkins to skip all subsequent stages as soon as one of them has marked the build as unstable.

Then configure the Maven build so that it does not fail the job when tests fail (e.g. by running Maven with -Dmaven.test.failure.ignore=true). The post step of the "Build" stage then collects the unit- and integration-test results and marks the build as unstable if any tests failed. Without this, a build can only be successful or failed; there is no "unstable" state for builds with failing tests.
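
As a rough sketch of how these pieces fit together (the node label and the Maven goals here are placeholder assumptions, not the project's actual configuration), the snipped build step combined with the junit collection could look like this:

#!groovy
pipeline {

    agent {
        node {
            label 'ubuntu'
        }
    }

    stages {
        stage('Build') {
            steps {
                // Let Maven finish even if tests fail, so the junit steps below
                // can evaluate the reports and mark the build UNSTABLE.
                sh 'mvn -Dmaven.test.failure.ignore=true clean install'
            }
            post {
                always {
                    // Failed tests recorded here turn the build UNSTABLE instead of FAILED.
                    junit(testResults: '**/surefire-reports/*.xml', allowEmptyResults: true)
                    junit(testResults: '**/failsafe-reports/*.xml', allowEmptyResults: true)
                }
            }
        }
    }
}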

In the large "post" block, define what should happen if a build is "successful", "failing" or "unstable". There is also a step that is executed always. 

If you only want "success", "failure" and "unstable" emails for the "develop" branch, a simple if-statement filters out the messages for other branches. You can also set up the emails so you are not informed about every success, but only about the first success after a failing build: in the "success" block, check whether the previous build was not "SUCCESS" and only send an email in that case.

You can add an "always" block which automatically sends an email if someone happens to have committed anything to the "master" branch. This should only happen after a successful release, so now for every commit to "master" the dev-list gets informed and may take action if this was an accident.