We have usually set up build jobs manually for every project at Apache, usually even more than one, especially when it comes to branched development. This can be annoying.
The usual pattern was to manually duplicate the develop/master build and change the branch it checks out and monitors.
The Multi-Branch Pipeline Plugin adds a new type of job to Jenkins. This job is a lot simpler to set up than other build jobs because it doesn't actually have to be set up: the build instructions become part of the project itself.
The new job type monitors the branches of a given repository and checks if these branches contain a file called "Jenkinsfile". If one or more do, a build job is automatically configured for each of them.
The Jenkinsfile contains all the information Jenkins needs to execute the build.
If you have set up a correct Jenkinsfile in your develop branch, every branch created from it will automatically have its build set up without manual intervention.
A Jenkinsfile used to be a Groovy script in which the different build steps were configured; the now preferred way is to use the declarative Jenkinsfile Groovy DSL.
...
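The skeleton itself is not reproduced here. As an illustration only, here is a minimal sketch of what such a declarative Jenkinsfile could look like; the stage names, the "ubuntu" agent label and the overall structure follow the description below, while the concrete Maven goals are assumptions.

```groovy
#!groovy
pipeline {
    // Run on a build agent labeled 'ubuntu' (label taken from the description below).
    agent { node { label 'ubuntu' } }
    stages {
        // This stage is always executed, regardless of the branch.
        stage('Checkout') {
            steps { checkout scm }
        }
        // For non-develop branches: just compile and run the tests.
        stage('Build') {
            when { not { branch 'develop' } }
            steps { sh 'mvn clean verify' } // illustrative Maven goals
        }
        // For the develop branch: build and deploy snapshots.
        stage('Build develop') {
            when { branch 'develop' }
            steps { sh 'mvn clean deploy' } // illustrative Maven goals
        }
        // QA checks, only for develop.
        stage('Code Quality') {
            when { branch 'develop' }
            steps { sh 'mvn sonar:sonar' } // illustrative
        }
        // Generate (and later deploy) the website, only for develop.
        stage('Website') {
            when { branch 'develop' }
            steps { sh 'mvn site' } // illustrative
        }
    }
}
```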
The above skeleton defines a build that runs on a Jenkins build agent labeled with "ubuntu". It has multiple stages, not all of which run in every case.
Up to the step "Build" or "Build develop" all steps are always executed. For non-develop builds, only "Build" is run, which might just compile and run all the tests; it doesn't do anything beyond that.
For the "develop" branch, however, after building, the QA checks are run, snapshots are deployed, and the website is generated and deployed.
...
While it is possible to write your Jenkinsfile without tooling, sometimes it does help, especially if you're new to this and your IDE can provide you with content assist, auto-completion and validation. Depending on your IDE of choice, the setup might differ greatly.
The two resources you will probably need are the GDSL file generated by Jenkins and a declarative pipeline definition available here: https://gist.github.com/arehmandev/736daba40a3e1ef1fbe939c6674d7da8
Save this file to some directory inside your project (it doesn't have to be checked in).
To get the GDSL file Jenkins generates, log in to the ASF Jenkins at https://builds.apache.org and select a Multi-Branch Pipeline build. You can do this by selecting "All" build jobs and searching for the icon for a pipeline build:
...
Select that build and open the "Pipeline Syntax" link:
This opens a page that generates different code snippets.
To have IntelliJ help you, make sure IntelliJ treats Jenkinsfiles as Groovy files. Go to "Preferences" and select "Editor"/"File Types".
In "Recognized File Types", select "Groovy". Then, in the "Registered Patterns" section, click the "+" button and add the pattern "Jenkinsfile" to the list.
...
Jenkins provides a way to download a so-called GDSL file. To get this file, go to https://builds.apache.org and select the "Pipeline Syntax" link as described in the general "Preparing your IDE" section.
There, click on the "Download IDEA GDSL" link:
...
Save this file somewhere in your project (it doesn't have to be checked in). Ideally, add it to a directory dedicated to such Groovy files.
For IntelliJ to pick this file up, you need to manually make the directory containing the GDSL file a "source directory". To do this, open the IntelliJ IDEA "Project Structure" settings, go to the "Project Settings"/"Modules" tab and select the module containing the directory with the GDSL files. On the Sources tab of that module's settings, right-click the directory with the GDSL file and select "Sources".
...
When using tools like Maven, new artifacts are deployed as part of the build process.
In some projects, it is hard to deploy snapshots directly because the dedicated project build node doesn't have the credentials to deploy to Nexus. To solve this, split build and deploy into two steps. In the "Build" step, run a normal Maven build, but provide an alternate remote repository to deploy to; in this case, simply deploy to a relative directory within the workspace. After the build, this directory contains all the files that a normal build would transfer to Nexus.
Next, run the actual deployment on a node that has the credentials to deploy to Nexus. Since that node's workspace doesn't contain the artifacts, transfer them using the "stash" and "unstash" commands. These commands pack up the content of a local directory on the build node and unpack it on the deploy node.
...
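The original snippet is omitted above. As a rough sketch only, the following shows how such a split could look; the "nexus-deploy" node label, the stash name, the local repository path and the deploy script are hypothetical and not taken from any real build.

```groovy
#!groovy
pipeline {
    agent { node { label 'ubuntu' } } // illustrative label
    stages {
        stage('Build') {
            steps {
                // Deploy to a directory inside the workspace instead of Nexus
                // (the repository id and path are illustrative).
                sh 'mvn clean deploy -DaltDeploymentRepository=snapshot-repo::default::file:./local-snapshots-dir'
                // Pack up the locally "deployed" artifacts so another node can use them.
                stash includes: 'local-snapshots-dir/**', name: 'local-snapshots'
            }
        }
        stage('Deploy') {
            // Run the actual deployment on a node that has the Nexus credentials
            // (the label is a hypothetical example).
            agent { node { label 'nexus-deploy' } }
            steps {
                // Unpack the artifacts stashed on the build node.
                unstash 'local-snapshots'
                // Upload the staged artifacts to Nexus, e.g. with the wagon-maven-plugin
                // or a small helper script (hypothetical).
                sh './tools/deploy-staged-artifacts.sh local-snapshots-dir'
            }
        }
    }
}
```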
Similar to the problem of not being able to deploy to Nexus on every node, only Jenkins nodes tagged with "git-websites" are able to push to an ASF Git repository's "asf-site" branch (and only to that branch).
To deal with this, again split website generation and deployment into two separate steps. Run the generation on the build node and the deployment on a "git-websites" node. Transferring the generated site from one node to the other is again handled by the Jenkins "stash" and "unstash" commands.
Here are the relevant parts of an example Jenkinsfile:
```groovy
#!groovy
pipeline {
    agent {
        node {
            label 'plc4x'
        }
    }
    // ... snip ...
    stages {
        // ... snip ...
        stage('Build site') {
            when {
                branch 'develop'
            }
            steps {
                echo 'Building Site'
                sh 'mvn -P${JENKINS_PROFILE} site'
            }
        }
        stage('Stage site') {
            when {
                branch 'develop'
            }
            steps {
                echo 'Staging Site'
                sh 'mvn -P${JENKINS_PROFILE} site:stage'
                // Stash the generated site so we can publish it on the 'git-websites' node.
                stash includes: 'target/staging/**/*', name: 'plc4x-site'
            }
        }
        stage('Deploy site') {
            when {
                branch 'develop'
            }
            // Only the nodes labeled 'git-websites' have the credentials to commit to the "asf-site" branch.
            agent {
                node {
                    label 'git-websites'
                }
            }
            steps {
                echo 'Deploying Site'
                // Unstash the previously stashed site.
                unstash 'plc4x-site'
                // Publish the site with the scm-publish plugin.
                sh 'mvn -f jenkins.pom -X -P deploy-site scm-publish:publish-scm'
            }
        }
    }
    // ... snip ...
}
```
The content of the "jenkins.pom" is as follows:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>org.apache</groupId>
    <artifactId>apache</artifactId>
    <version>21</version>
  </parent>

  <groupId>org.apache.plc4x</groupId>
  <artifactId>plc4x-jenkins-tools</artifactId>
  <version>0.2.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <name>PLC4X: Jenkins Tools</name>
  <description>Set of helpers to do individual tasks only needed on our Jenkins build.</description>

  <!-- We are publishing the site to a different repository -->
  <distributionManagement>
    <site>
      <id>apache.website</id>
      <url>scm:git:https://gitbox.apache.org/repos/asf/incubator-plc4x-website.git</url>
    </site>
  </distributionManagement>

  <profiles>
    <!-- ... snip ... -->
    <profile>
      <id>deploy-site</id>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-scm-publish-plugin</artifactId>
            <configuration>
              <!-- mono-module doesn't require site:stage -->
              <content>${project.build.directory}/staging</content>
              <!-- branch where to deploy -->
              <scmBranch>asf-site</scmBranch>
            </configuration>
          </plugin>
        </plugins>
      </build>
    </profile>
  </profiles>
</project>
```
To avoid having the commits of the site generation clutter the commit history and confuse GUI tools, we deploy to an alternate Git repository which contains only the website. This is configured in the distributionManagement section of the pom above.
The SonarQube instance on builds.apache.org is protected with a login, which can cause quite some problems with builds. Here is how to set up the code analysis anyway.
Here's the part of our Jenkinsfile that handles the code analysis:
...
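The original snippet is omitted above. As a sketch only, such a stage could look like the following, assuming the SonarQube Scanner plugin is installed on Jenkins; the server name "ASF Sonar Analysis" and the "develop"-only restriction are illustrative assumptions.

```groovy
stage('Code Quality') {
    when { branch 'develop' } // illustrative: only analyse the develop branch
    steps {
        echo 'Checking Code Quality on SonarQube'
        // withSonarQubeEnv (SonarQube Scanner plugin) injects the configured server URL
        // and authentication token into the environment, so the login is not hard-coded.
        withSonarQubeEnv('ASF Sonar Analysis') {
            sh 'mvn sonar:sonar'
        }
    }
}
```

The matching Maven configuration is in the pom shown below.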
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>org.apache</groupId>
    <artifactId>apache</artifactId>
    <version>21</version>
  </parent>

  <groupId>org.apache.plc4x</groupId>
  <artifactId>plc4x-parent</artifactId>
  <version>0.3.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- ... snip ... -->

  <properties>
    <!-- ... snip ... -->

    <!-- URL of the ASF SonarQube server -->
    <sonar.host.url>https://builds.apache.org/analysis</sonar.host.url>
    <!-- Exclude all generated code -->
    <sonar.exclusions>**/generated-sources</sonar.exclusions>

    <!-- ... snip ... -->
  </properties>

  <!-- ... snip ... -->

  <build>
    <!-- ... snip ... -->
    <pluginManagement>
      <plugins>
        <!-- ... snip ... -->
        <plugin>
          <groupId>org.sonarsource.scanner.maven</groupId>
          <artifactId>sonar-maven-plugin</artifactId>
          <version>3.5.0.1254</version>
        </plugin>
        <!-- ... snip ... -->
      </plugins>
    </pluginManagement>
  </build>

  <!-- ... snip ... -->
</project>
```
In the pom it was necessary to pin the version of the sonar plugin to one matching the ASF SonarQube version, and to tell the plugin which server to connect to.
While sending emails from normal Jenkins jobs is easy, with a Multibranch Pipeline build it's a lot more complicated, as you have to set it up manually in the Jenkinsfile. Fortunately, you only have to configure this once and can almost just copy and paste it into any other Multibranch project:
```groovy
#!groovy
pipeline {
    // ... snip ...
    options {
        // ... snip ...
        // When we have test-fails e.g. we don't need to run the remaining steps
        skipStagesAfterUnstable()
    }
    stages {
        // ... snip ...
        stage('Build') {
            steps {
                // ... snip ...
            }
            post {
                always {
                    junit(testResults: '**/surefire-reports/*.xml', allowEmptyResults: true)
                    junit(testResults: '**/failsafe-reports/*.xml', allowEmptyResults: true)
                }
            }
        }
        // ... snip ...
    }
    // Send out notifications on unsuccessful builds.
    post {
        // If this build failed, send an email to the list.
        failure {
            script {
                if(env.BRANCH_NAME == "develop") {
                    emailext(
                        subject: "[BUILD-FAILURE]: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]'",
                        body: """
BUILD-FAILURE: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]':
Check console output at "<a href="${env.BUILD_URL}">${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]</a>"
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }
        // If this build didn't fail, but there were failing tests, send an email to the list.
        unstable {
            script {
                if(env.BRANCH_NAME == "develop") {
                    emailext(
                        subject: "[BUILD-UNSTABLE]: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]'",
                        body: """
BUILD-UNSTABLE: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]':
Check console output at "<a href="${env.BUILD_URL}">${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]</a>"
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }
        // Send an email, if the last build was not successful and this one is.
        success {
            script {
                if ((env.BRANCH_NAME == "develop") && (currentBuild.previousBuild != null) && (currentBuild.previousBuild.result != 'SUCCESS')) {
                    emailext(
                        subject: "[BUILD-STABLE]: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]'",
                        body: """
BUILD-STABLE: Job '${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]':
Is back to normal.
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }
        always {
            script {
                if(env.BRANCH_NAME == "master") {
                    emailext(
                        subject: "[COMMIT-TO-MASTER]: A commit to the master branch was made",
                        body: """
COMMIT-TO-MASTER: A commit to the master branch was made:
Check console output at "<a href="${env.BUILD_URL}">${env.JOB_NAME} [${env.BRANCH_NAME}] [${env.BUILD_NUMBER}]</a>"
""",
                        to: "dev@yourproject.apache.org",
                        recipientProviders: [[$class: 'DevelopersRecipientProvider']]
                    )
                }
            }
        }
    }
}
```
In the options we tell Jenkins to skip all remaining stages as soon as one stage has marked the build as unstable.
Then we configure the Maven build not to fail in case of failing tests. The post step of the "Build" stage collects the unit- and integration-test results and marks the build unstable in case of failed tests. Without this, builds would only ever be successful or failed; there would be no unstable state.
In the large "post" block, we define what should happen if a build is "successful", "failing" or "unstable". There is also a step that is always executed.
If you only want "success", "failure" and "unstable" emails for the "develop" branch, a simple if-statement filters out the other messages. You can also set up the emails so you are not informed about every success, but only about the first success after a failing build: in the "success" block we check if the previous build was not "SUCCESS" and only in that case send an email.
You can also add an "always" block which automatically sends an email if someone commits anything to the "master" branch. Commits to "master" should only happen after a successful release, so for every such commit the dev list gets informed and can take action if it was an accident.