The Jenkins jobs are defined in ./test-infra/jenkins. These jobs are written in the Jenkins Job DSL, which is based on Apache Groovy.
Job definitions should be as simple as possible, and ideally each identifies a single Gradle target to execute.
To test changes locally, without breaking the project’s job definitions, you can use Local Dockerized Jenkins.
It is possible to test a PR that includes Jenkins changes, but doing so is potentially destructive because the job definitions are a shared resource. To update the job definitions from a PR, use the “Run Seed Job” trigger phrase. This causes the job definitions to be read and parsed, and they remain active until the next scheduled execution of beam_SeedJob or beam_SeedJob_Standalone.
Beam committers can trigger a job through the Jenkins UI. Non-committers can trigger a job only if it has a trigger phrase.
Each pre-commit and post-commit job file defines several jobs with different suffixes. Pre-commits have _Commit, _Phrase, and _Cron suffixes. The _Commit job runs on every push to a pull request. The _Phrase job runs when its trigger phrase is entered as a comment on the pull request. The _Cron pre-commit runs periodically against the master branch, as a signal of whether the pre-commit would pass without any changes.
Most Beam Jenkins jobs specify the label beam, which uses Beam executors 1-15.
The performance test jobs specify the label beam-perf, which uses Beam executor 16.
The website publishing job specifies the label git-websites, which allows publishing generated documentation to the asf-site branch.
Accessing test history
To reach a test's history, navigate from a job page:
failed job -> test result -> navigate to the failed test -> history
Reaching the history page for tests that passed can be difficult through the UI; in that case, edit the relevant fields of the URL directly.
Jenkins infrastructure setup
Jenkins continuous integration jobs run on 16 GCE instances. The compute instances are managed by the instance group 'apache-beam-jenkins-jnlp-group' in the Google Cloud project 'apache-beam-testing'. Each instance has 16 CPUs and 104 GB of memory.
| Instance Name (Jenkins agent names are the same) | Jenkins Label |
| --- | --- |
Installing and upgrading software on Jenkins workers
All software updates must first be performed and verified on a temporary, experimental compute instance.
To install or upgrade tools for Jenkins instances:
Create a compute instance from the most recently created boot image. You can create the instance through VM Instances in the Cloud Console or by running the gcloud command.
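A minimal sketch of the gcloud invocation, assuming the 'apache-beam-testing' project and the 'jenkins-slave-boot-image' image family; the instance name and zone below are placeholders you should replace:

```shell
# Create a temporary experimental instance from the latest image in the
# boot-image family. Instance name and zone are placeholders.
gcloud compute instances create jenkins-experiment-1 \
  --project=apache-beam-testing \
  --zone=us-central1-a \
  --image-family=jenkins-slave-boot-image \
  --image-project=apache-beam-testing
```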
SSH into the instance via the Cloud Console or the gcloud command, then install or upgrade the required tools.
Verify the tool by running the corresponding Beam tests, and verify backwards compatibility by running the Beam PostCommits against all SDKs.
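For example, assuming the experimental instance created above, SSH access with gcloud looks roughly like this (instance name, zone, and package name are placeholders):

```shell
# SSH into the experimental instance.
gcloud compute ssh jenkins-experiment-1 \
  --project=apache-beam-testing \
  --zone=us-central1-a

# Then install or upgrade the tool under test, e.g. with apt on a
# Debian-based image (package name is a placeholder):
sudo apt-get update && sudo apt-get install --only-upgrade <package-name>
```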
Clone Beam repository from Git:
Run your tests with the Gradle wrapper. For example:
Remove the Beam repository after verification.
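The clone, test, and cleanup steps above can be sketched as follows; the Gradle target shown is only an illustrative example, not a required one:

```shell
# Clone the Beam repository.
git clone https://github.com/apache/beam.git
cd beam

# Run a test target through the Gradle wrapper (example target).
./gradlew :sdks:java:core:test

# Clean up once verification is done.
cd .. && rm -rf beam
```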
Create a new image from the experimental instance and name it jenkins-slave-boot-image-YYYYMMDD. Put the image in the 'jenkins-slave-boot-image' image family.
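Following the naming convention above, the gcloud invocation looks roughly like this; the source instance name, zone, and date suffix are placeholders:

```shell
# Create a boot image from the experimental instance's disk and place it
# in the 'jenkins-slave-boot-image' family. The date suffix (20240101)
# stands in for the actual YYYYMMDD creation date.
gcloud compute images create jenkins-slave-boot-image-20240101 \
  --project=apache-beam-testing \
  --source-disk=jenkins-experiment-1 \
  --source-disk-zone=us-central1-a \
  --family=jenkins-slave-boot-image
```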
For each instance, stop the VM, then click Edit. Delete the existing boot disk by clicking the 'X'. Create a new boot disk from the newly generated image. Name the disk 'apache-beam-jenkins-[1-16]-YYYYMMDD'. Make sure the boot disk 'Mode' is 'Boot, read/write'.
Go to the Disks page in the Cloud Console and remove all deprecated Jenkins worker disks.
- Ask a PMC member to reboot the Jenkins executors. (PMC members: SSH into the VMs and run the command listed in the Beam SVN repo, https://svn.apache.org/repos/private/pmc/beam/accounts/JenkinsWorkersSecrets.txt.)