Channel: Jenkins Blog

Closure on enumerators in Pipeline


While at Jenkins World, Kohsuke Kawaguchi presented two long-time Jenkins contributors, Andrew Bayer and Jesse Glick, with a "Small Matter of Programming" award. "Small Matter of Programming" being:

a phrase used to ironically indicate that a suggested feature or design change would in fact require a great deal of effort; it often implies that the person proposing the feature underestimates its cost.

— Wikipedia

In this context the "Small Matter" relates to Jenkins Pipeline and a very simple snippet of Scripted Pipeline:

[1, 2, 3].each { println it }

For a long time in Scripted Pipeline, this simply did not work as users would expect. Originally filed as JENKINS-26481 in 2015, it became one of the most voted-for, and most watched, tickets in the entire issue tracker until it was ultimately fixed earlier this year.

Photo by Kohsuke

At least some closures are executed only once inside of Groovy CPS DSL scripts managed by the workflow plugin.

— Original bug description by Daniel Tschan

At a high level, what has been confusing for many users is that Scripted Pipeline looks like Groovy and quacks like Groovy, but it’s not exactly Groovy. Rather, a custom Groovy interpreter (CPS) executes the Scripted Pipeline in a manner which provides the durability/resumability that defines Jenkins Pipeline.
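
To illustrate the kind of workaround users resorted to before the fix, a closure-based iteration like the snippet above was often rewritten as a plain counted loop, which the CPS interpreter handled correctly. This is a sketch of the pattern, not official project guidance:

```groovy
// Sketch of a common pre-fix workaround: replace the closure-based
// iteration with a counted loop, which CPS executed as expected.
def items = [1, 2, 3]
for (int i = 0; i < items.size(); i++) {
    println items[i]
}
```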

Without diving into too much detail (refer to the pull requests linked to JENKINS-26481 for that), the code snippet above was particularly challenging to rectify inside the Pipeline execution layer. As one of the chief architects for Jenkins Pipeline, Jesse made a number of changes around the problem in 2016, but it wasn’t until early 2017 when Andrew, working on Declarative Pipeline, started to identify a number of areas of improvement in CPS and provided multiple patches and test cases.

As luck would have it, combining two of the sharpest minds in the Jenkins project resulted in the "Small Matter of Programming" being finished, and released in May of this year with Pipeline: Groovy 2.33.

Please join me in congratulating, and thanking, Andrew and Jesse for their diligent and hard work smashing one of the most despised bugs in Jenkins history :).


Parallel stages with Declarative Pipeline 1.2


After a few months of work on its key features, I’m happy to announce the 1.2 release of Declarative Pipeline! On behalf of the contributors developing Pipeline, I thought it would be helpful to discuss three of the key changes.

A Pipeline with Parallel stages

Parallel Stages

First, we’ve added syntax support for parallel stages. In earlier versions of Declarative Pipeline, the only way to run chunks of Pipeline code in parallel was to use the parallel step inside the steps block for a stage, like this:

/* .. snip .. */
stage('run-parallel-branches') {
  steps {
    parallel(
      a: {
        echo "This is branch a"
      },
      b: {
        echo "This is branch b"
      }
    )
  }
}
/* .. snip .. */

While this works, it doesn’t integrate well with the rest of the Declarative Pipeline syntax. For example, to run each parallel branch on a different agent, you need to use a node step, and if you do that, the output of the parallel branch won’t be available for post directives (at a stage or pipeline level). Basically the old parallel step required you to use Scripted Pipeline within a Declarative Pipeline.
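
For illustration, running the branches on separate agents with the old approach might look like the following sketch, where the branches drop into Scripted node steps (the "windows" and "linux" agent labels are hypothetical):

```groovy
/* Sketch: pre-1.2 style, mixing Scripted node steps into the parallel
   branches so each one runs on its own agent. Labels are hypothetical. */
stage('run-parallel-branches') {
  steps {
    parallel(
      windows: {
        node('windows') {
          bat 'run-tests.bat'
        }
      },
      linux: {
        node('linux') {
          sh './run-tests.sh'
        }
      }
    )
  }
}
```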

But now with Declarative Pipeline 1.2, we’ve introduced a true Declarative syntax for running stages in parallel:

Jenkinsfile
pipeline {
    agent none
    stages {
        stage('Run Tests') {
            parallel {
                stage('Test On Windows') {
                    agent {
                        label "windows"
                    }
                    steps {
                        bat "run-tests.bat"
                    }
                    post {
                        always {
                            junit "**/TEST-*.xml"
                        }
                    }
                }
                stage('Test On Linux') {
                    agent {
                        label "linux"
                    }
                    steps {
                        sh "run-tests.sh"
                    }
                    post {
                        always {
                            junit "**/TEST-*.xml"
                        }
                    }
                }
            }
        }
    }
}

You can now specify either steps or parallel for a stage, and within parallel, you can specify a list of stage directives to run in parallel, with all the configuration you’re used to for a stage in Declarative Pipeline. We think this will be really useful for cross-platform builds and testing, as an example. Support for parallel stages will be in the soon-to-be-released Blue Ocean Pipeline Editor 1.3 as well.

You can find more documentation on parallel stages in the User Handbook.

Defining Declarative Pipelines in Shared Libraries

Until the 1.2 release, Declarative Pipeline did not officially support defining your pipeline blocks in a shared library. Some of you may have tried that out and found that it could work in some cases, but since it was never an officially supported feature, it was vulnerable to breaking due to necessary changes for the supported use cases of Declarative. With 1.2, we’ve added official support for defining pipeline blocks in src/*.groovy files in your shared libraries. Within your src/*.groovy file’s call method, you can call pipeline { ... }, or choose between different pipeline { ... } blocks depending on if conditions and the like. Note that only one pipeline { ... } block can actually be executed per run; you’ll get an error if a second one tries to execute!
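
As a sketch of what this can look like (the step name and condition below are hypothetical, not from the release), a library file might select between two pipeline { ... } blocks, only one of which executes per run:

```groovy
// Hypothetical shared library step choosing one of two pipeline blocks.
// Only the selected block executes; running a second would be an error.
def call(boolean deploy = false) {
    if (deploy) {
        pipeline {
            agent any
            stages {
                stage('Build and Deploy') {
                    steps {
                        echo 'building and deploying'
                    }
                }
            }
        }
    } else {
        pipeline {
            agent any
            stages {
                stage('Build Only') {
                    steps {
                        echo 'building'
                    }
                }
            }
        }
    }
}
```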

Major Improvements to Parsing and Environment Variables

Hopefully, you’ll never actually care about this change, but we’re very happy about it nonetheless. The original approach used for taking the pipeline { ... } block and executing its contents was designed almost two years ago, and wasn’t well suited to how you all are actually using Declarative Pipeline. In our attempts to work around some of those limitations, we made the parsing logic even more complicated and fragile, resulting in an impressive number of bugs, mainly relating to inconsistencies and bad behavior with environment variables.

In Declarative 1.2, we’ve replaced the runtime parsing logic completely with a far more robust system, which also happens to fix most of those bugs at the same time! While not every issue has been resolved, you may find that you can use environment variables in more places, escaping is more consistent, Windows paths are no longer handled incorrectly, and a lot more. Again, we’re hoping you’ve never had the misfortune to run into any of these bugs, but if you have, well, they’re fixed now, and it’s going to be a lot easier for us to fix any future issues that may arise relating to environment variables, when `expression`s, and more. Also, the parsing at the very beginning of your build may be about 0.5 seconds faster. =)

More to Come!

While we don’t have any concrete plans for what will be going into Declarative Pipelines 1.3, rest assured that we’ve got some great new features in mind, as well as our continuing dedication to fixing the bugs you encounter and report. So please do keep opening tickets for issues and feature requests. Thanks!

Pipeline and Blue Ocean Demos from Jenkins World


At Jenkins World last month, we continued the tradition of "lunch-time demos" at the Jenkins project’s booth, which we started in 2016. We invited a number of Jenkins contributors to present brief 10-15 minute demos on something they were working on or considered themselves experts in. Continuing the post-Jenkins World tradition, we also just hosted a "Jenkins Online Meetup" featuring a selection of those lunch-time demos.

I would like to thank Alyssa Tong for organizing this online meetup, Liam Newman for acting as the host, and our speakers:

Below are some links from the sample projects demonstrated and the direct links to each session.

Developing Pipeline Libraries Locally

If you have ever tried developing Pipeline libraries, you may have noticed how long it takes to deploy a new version to the server only to discover yet another syntax error. I will show how to edit and test Pipeline libraries locally before committing to the repository (with Configuration-as-Code and Docker).

Delivery Pipelines with Jenkins

Showing off how to set up holistic Delivery Pipelines with the DevOps enabler tool Jenkins.

Pimp my Blue Ocean

How to customize Blue Ocean: creating a custom plugin and extending Blue Ocean with a custom theme and custom components.

Deliver Blue Ocean Components at the Speed of Light

Using storybook.js.org for the Blue Ocean frontend to speed up the delivery process and validate the UX with product managers and designers, showing how quickly you can develop your components.

Mozilla’s Declarative + Shared Libraries Setup

How Mozilla is using Declarative Pipelines and shared libraries together.

See also the #fx-test IRC channel on irc.mozilla.org

Git Tips and Tricks

The latest capabilities in the git plugin, like large file support and reference repositories, plus some reminders of existing tips that can reduce server load, decrease job time, and decrease disk use.

Visual Pipeline Creation in Blue Ocean

We will show how to use Blue Ocean to build a real-world continuous delivery pipeline using the visual pipeline editor. We will coordinate multiple components of a web application across test and production environments, simulating a modern development and deployment workflow.

Jenkins Contributors Awarded Top Honors at Jenkins World 2017


This is a guest post by Alyssa Tong, who runs the Jenkins Area Meetup program and is also responsible for Marketing & Community Programs at CloudBees, Inc.

Awards

For the first time at Jenkins World, the Jenkins project honored the achievements of three Jenkins contributors in the categories of Most Valuable Contributor, Jenkins Security MVP, and Most Valuable Advocate. These three individuals have consistently demonstrated excellence and proven value to the project. With gratitude and congratulations, below are the well-deserved winners:

Alex Earl - Most Valuable Contributor

Alex is the current or previous maintainer of some of the most used Jenkins plugins and has been for years. He’s a regular contributor to project policy discussions, and helps to keep the project running by improving the Jenkins project infrastructure, moderating the mailing lists and processing requests for hosting new plugins.

Steve Marlowe - Jenkins Security MVP

Steve is one of the most prolific reporters of security vulnerabilities in Jenkins. His reports are well-written, clearly identify the problematic behavior, and provide references that help quickly resolve the reported issue. On top of that, Steve is always responsive when asked for clarification.

Tomonari Nakamura - Most Valuable Advocate

Ikikko

Tomonari leads the Jenkins User Group in Tokyo, one of the largest and most active groups, with a long history. The group has organized more than 10 meet-ups so far, and every meet-up fills to capacity very quickly, with a regular turn-out of 100-200 people. At one point, the group under his leadership organized a fully volunteer-run "Jenkins User Conference" in Tokyo that drew 1000+ attendees.

Congratulations to our winners.

We can’t wait to recognize more contributors at Jenkins World 2018!

Share a standard Pipeline across multiple projects with Shared Libraries

This is a guest post by Philip Stroh, Software Architect at TimoCom.

When building multiple microservices - e.g. with Spring Boot - the integration and delivery pipelines of your services will most likely be very similar. Surely, you don’t want to copy-and-paste Pipeline code from one Jenkinsfile to another if you develop a new service or if there are adaptations in your delivery process. Instead you would like to define something like a pipeline "template" that can be applied easily to all of your services.

The requirement for a common pipeline that can be used in multiple projects does not only emerge in microservice architectures. It applies to all areas where applications are built on a similar technology stack or deployed in a standardized way (e.g. pre-packaged as containers).

In this blog post I’d like to outline the possibility to create such a pipeline "template" using Jenkins Shared Libraries. If you’re not yet familiar with Shared Libraries I’d recommend having a look at the documentation.

The following code shows a (simplified) integration and delivery Pipeline for a Spring Boot application in declarative syntax.

Jenkinsfile
pipeline {
    agent any
    environment {
        branch = 'master'
        scmUrl = 'ssh://git@myScmServer.com/repos/myRepo.git'
        serverPort = '8080'
        developmentServer = 'dev-myproject.mycompany.com'
        stagingServer = 'staging-myproject.mycompany.com'
        productionServer = 'production-myproject.mycompany.com'
    }
    stages {
        stage('checkout git') {
            steps {
                git branch: branch, credentialsId: 'GitCredentials', url: scmUrl
            }
        }

        stage('build') {
            steps {
                sh 'mvn clean package -DskipTests=true'
            }
        }

        stage ('test') {
            steps {
                parallel(
                    "unit tests": { sh 'mvn test' },
                    "integration tests": { sh 'mvn integration-test' }
            }
        }

        stage('deploy development'){
            steps {
                deploy(developmentServer, serverPort)
            }
        }

        stage('deploy staging'){
            steps {
                deploy(stagingServer, serverPort)
            }
        }

        stage('deploy production'){
            steps {
                deploy(productionServer, serverPort)
            }
        }
    }
    post {
        failure {
            mail to: 'team@example.com', subject: 'Pipeline failed', body: "${env.BUILD_URL}"
        }
    }
}

This Pipeline builds the application, runs unit as well as integration tests, and deploys the application to several environments. It uses a global variable "deploy" that is provided within a Shared Library. The deploy method copies the JAR file to a remote server and starts the application. Beforehand, a previous version of the application is stopped through the handy REST endpoints of Spring Boot Actuator. Afterwards, the deployment is verified via the application’s health status endpoint.

vars/deploy.groovy
def call(def server, def port) {
    httpRequest httpMode: 'POST', url: "http://${server}:${port}/shutdown", validResponseCodes: '200,408'
    sshagent(['RemoteCredentials']) {
        sh "scp target/*.jar root@${server}:/opt/jenkins-demo.jar"
        sh "ssh root@${server} nohup java -Dserver.port=${port} -jar /opt/jenkins-demo.jar &"
    }
    retry (3) {
        sleep 5
        httpRequest url:"http://${server}:${port}/health", validResponseCodes: '200', validResponseContent: '"status":"UP"'
    }
}

The common approach to reusing pipeline code is to put methods like "deploy" into a Shared Library. If we now start developing the next application in the same fashion, we can use this method for its deployments as well. But often there are even more similarities between projects within one company, e.g. applications are built, tested and deployed in the same way into the same environments (development, staging and production). In this case it is possible to define the whole Pipeline as a global variable within a Shared Library. The next code snippet defines a Pipeline "template" for all of our Spring Boot applications.

vars/myDeliveryPipeline.groovy
def call(Map pipelineParams) {

    pipeline {
        agent any
        stages {
            stage('checkout git') {
                steps {
                    git branch: pipelineParams.branch, credentialsId: 'GitCredentials', url: pipelineParams.scmUrl
                }
            }

            stage('build') {
                steps {
                    sh 'mvn clean package -DskipTests=true'
                }
            }

            stage ('test') {
                steps {
                    parallel(
                        "unit tests": { sh 'mvn test' },
                        "integration tests": { sh 'mvn integration-test' }
                }
            }

            stage('deploy development'){
                steps {
                    deploy(pipelineParams.developmentServer, pipelineParams.serverPort)
                }
            }

            stage('deploy staging'){
                steps {
                    deploy(pipelineParams.stagingServer, pipelineParams.serverPort)
                }
            }

            stage('deploy production'){
                steps {
                    deploy(pipelineParams.productionServer, pipelineParams.serverPort)
                }
            }
        }
        post {
            failure {
                mail to: pipelineParams.email, subject: 'Pipeline failed', body: "${env.BUILD_URL}"
            }
        }
    }
}

Now we can set up the Pipeline of one of our applications with the following method call:

Jenkinsfile
myDeliveryPipeline(branch: 'master',
    scmUrl: 'ssh://git@myScmServer.com/repos/myRepo.git',
    email: 'team@example.com',
    serverPort: '8080',
    developmentServer: 'dev-myproject.mycompany.com',
    stagingServer: 'staging-myproject.mycompany.com',
    productionServer: 'production-myproject.mycompany.com')

The Shared Library documentation mentions the ability to encapsulate similarities between several Pipelines with a global variable. It shows how we can enhance our template approach and build a higher-level DSL step:

vars/myDeliveryPipeline.groovy
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()

    pipeline {
        // our complete declarative pipeline can go in here
        ...
    }
}

Now we can even use our own DSL-step to set up the integration and deployment Pipeline of our project:

Jenkinsfile
myDeliveryPipeline {
    branch = 'master'
    scmUrl = 'ssh://git@myScmServer.com/repos/myRepo.git'
    email = 'team@example.com'
    serverPort = '8080'
    developmentServer = 'dev-myproject.mycompany.com'
    stagingServer = 'staging-myproject.mycompany.com'
    productionServer = 'production-myproject.mycompany.com'
}

This blog post showed how a common Pipeline template can be developed using the Shared Library functionality in Jenkins. The approach allows you to create a standard Pipeline that can be reused by applications that are built in a similar way.

The approach works for both Declarative and Scripted Pipelines. For Declarative Pipelines, the ability to define a pipeline block in a Shared Library has been officially supported since version 1.2 (see the recent blog post on Declarative Pipeline 1.2).

Hacktoberfest. Contribute to Jenkins!


Once again it’s October in our calendars. That means the regular Hacktoberfest event is back! During this one-month hackathon you can support open source and earn a limited edition T-shirt. The Jenkins project offers an opportunity to participate and to get reviews and help from Jenkins contributors.

Hacktoberfest

How do I sign up?

  1. Sign up for Hacktoberfest on the event website.

  2. Everything is set, just start coding!

What can I do?

There are lots of ways to contribute to Jenkins during Hacktoberfest. You can…​

  • Write code

  • Improve documentation, write blogposts

  • Automate Tests

  • Translate and internationalize components

  • Design - artwork and UI improvements also count!

See the Contribute and Participate page for more information.

Where can I contribute?

The project spans several organizations on GitHub. Core and plugins are located in the jenkinsci organization, and infrastructure in jenkins-infra. You can contribute to any component within these organizations.

For example, you could contribute to the following components:

You can also create new Jenkins plugins and get them hosted in the organization.

Which issues can I work on?

Our issue tracker contains lots of issues you could work on. If you are new to Jenkins, you could start by fixing some easier issues. In the issue tracker we mark such issues with the newbie-friendly label (search query). You can also submit your own issue and propose a fix.

How do I label issues and pull requests?

The Hacktoberfest project requires issues and/or pull requests to be labeled with the hacktoberfest label. You may not have permission to set labels on your own, but do not worry! Just mention @jenkinsci/hacktoberfest or @jenkins-infra/hacktoberfest in the repository, and we will set the labels for you.

How do I get reviews?

All examples above are monitored by Jenkins contributors, and you will likely get a review within a few days. Reviews in other repositories and plugins may take longer. In the case of delays, ping @jenkinsci/code-reviewers in your pull request or send a message to the mailing list.

Where can I find info?

The Jenkins project has lots of material about contributing. Here are some entry links:

Need help?

You can reach out to us via IRC channels and the Jenkins Developer Mailing List. For the mailing list, it is recommended to mention Hacktoberfest in the email subject.

Important security updates for Jenkins core and plugins


We just released security updates to Jenkins, versions 2.84 and 2.73.2, that fix several security vulnerabilities. Additionally, we published a new release of Swarm Plugin whose client contains a security fix, and Maven Plugin 3.0 was recently released to resolve a security issue. Users of Swarm Plugin and Maven Plugin should update these to their respective newest versions.

For an overview of what was fixed, see the security advisory. For an overview on the possible impact of these changes on upgrading Jenkins LTS, see our LTS upgrade guide.

We also published information about a vulnerability in Speaks! Plugin. There is no fix available and we recommend it be uninstalled. Its distribution has been suspended.

Subscribe to the jenkinsci-advisories mailing list to receive important notifications related to Jenkins security.

Jenkins World 2017 Session Videos are Available


This is a guest post by Alyssa Tong, who runs the Jenkins Area Meetup program and is also responsible for Marketing & Community Programs at CloudBees, Inc.

Jenkins World 2017 keynote and breakout session videos are now available here. Photos from the conference can be seen here.

Save the date for Jenkins World 2018:

  • Conference dates are September 16-19, 2018 in San Francisco.

  • Registration will open on October 16, 2017.

  • Call for Papers will open on December 1, 2017.


Security updates for multiple Jenkins plugins


Multiple Jenkins plugins received updates today that fix several security vulnerabilities.

Additionally, the Multijob Plugin also received a security update several weeks ago.

For an overview of these security fixes, see the security advisory.

Active Choices Plugin distribution had been suspended since April due to its mandatory dependency on the suspended Scriptler Plugin. That dependency has been made optional, so Active Choices can be used without having Scriptler installed. This means we are able to resume distribution of Active Choices Plugin again. It should be available on update sites later today.

We also announced a medium severity security vulnerability in SCP publisher plugin that does not have a fix at this time.

Subscribe to the jenkinsci-advisories mailing list to receive important future notifications related to Jenkins security.

Jenkins User Conference China


This is a guest post by Forest Jing, who runs the Shanghai Jenkins Area Meetup


I am excited to announce the inaugural Jenkins User Conference China will be taking place on November 19, 2017 in Shanghai, China. The theme of JUC China is “Jenkins Driven CD and DevOps”. Much like in the US, CD and DevOps are big topics of interest in China. We are honored to have Kohsuke Kawaguchi join us as one of the keynote speakers at this inaugural Jenkins event. We will also have sessions from many of China’s big-name companies like Baidu, Tencent, Pinterest, Ctrip, Huawei, Microsoft, and more. Below are some highlights of the event.

Sunday Nov 19th Agenda

Morning keynote sessions

There will be 4 keynote speeches:

  1. Kohsuke Kawaguchi, creator of Jenkins will introduce Jenkins Past, Present & Future.

  2. Le Zhang, a famous DevOps and CD expert, will present pipeline-driven CD and DevOps.

  3. An Engineering Director from Huawei will present CD and DevOps practices at Huawei.

  4. Xu Zheng from Pinterest will present running Jenkins infrastructure as a service in Kubernetes.

In the afternoon, we have set up three tracks:

  1. CD & DevOps user stories from Microsoft, Tencent, Ctrip and JinDong - all big companies in China.

  2. Enterprise Jenkins: experiences using Jenkins as an enterprise tool, not only for individual teams.

  3. A workshop leading engineers through hands-on practice with CloudBees Jenkins and open source Jenkins features.


If you’re in the neighborhood, we sincerely invite you to join us at Jenkins User Conference China.

Follow us on Twitter @china_juc

Security updates for Jenkins core


We just released security updates to Jenkins, versions 2.89 and 2.73.3, that fix two low-severity security vulnerabilities.

For an overview of what was fixed, see the security advisory. For an overview on the possible impact of these changes on upgrading Jenkins LTS, see our LTS upgrade guide.

Subscribe to the jenkinsci-advisories mailing list to receive important notifications related to Jenkins security.

Introducing Tutorials in the Jenkins User Documentation


Regular perusers of the Jenkins User Documentation may have noticed the presence of the Tutorials part (between the Guided Tour and the User Handbook) that appeared in the last couple of months and has gradually been populated with much of my recent work writing Jenkins tutorials.

My name’s Giles and I’ve been a technical writer in the software development field for several years now. I’ve always been passionate about technical writing and, more recently, the technologies that go into developing written content and automating its generation - like Jenkins! I’m a former Atlassian who recently joined CloudBees as a Senior Technical Writer, working remotely from the "Sydney Office", with my current focus on the Jenkins User Documentation.

Why tutorials?

My exposure to Jenkins and its usage over the years has been patchy at best. During this time, however, I’ve had some degree of experience as a user of various continuous delivery (CD) tools like Jenkins and am reasonably familiar with the advantages these tools can offer software development teams.

I’ve also found that while many software developers are familiar with the broader concept of "developer operations" (or simply "devops"), fewer seem familiar with the concepts of CD and the related tools that facilitate devops within organizations.

The CD process is based on the fundamental flow of building the application > testing it > delivering it, where typically:

  • The building part involves compiling the application and/or ensuring all necessary libraries and dependencies are in place for the application to run as intended.

  • The testing part involves testing the built application with automated tests to ensure that changes implemented by developers function as expected.

  • The delivering part involves packaging or presenting the application in a way that can be delivered to customers or other users for any kind of purpose.
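
As a rough sketch of how that flow maps onto a Declarative Pipeline (the shell commands below are illustrative placeholders, not from a specific tutorial):

```groovy
// Minimal sketch of the build > test > deliver flow.
// The shell commands are placeholders for your project's tooling.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'  // compile and resolve dependencies
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'  // run the automated tests
            }
        }
        stage('Deliver') {
            steps {
                sh './deliver.sh'  // package or publish the application
            }
        }
    }
}
```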

Now, as one of the major contributors to the Jenkins User Documentation (and faced with a reasonably steep learning curve), it quickly became apparent that there was a lack of accessible documentation to guide people relatively new to Jenkins through this CD process. I couldn’t find anything in the Jenkins User Documentation to demonstrate how Jenkins implements this process on a simple app that delivers an end result.

With the guidance and assistance of helpful colleagues, I therefore decided to embark on creating a series of Jenkins tutorials to help fill these documentation and knowledge gaps. These tutorials are based on Daniele Procida’s description of how tutorials should be presented in his blog post "What nobody tells you about documentation".

Introductory tutorials

The first set of tutorials on the Tutorials overview page (beginning with "Using Jenkins to build …") demonstrate how to implement this fundamental CD process in Jenkins on a simple application for a given technology stack.

So far, there’s one for Java with Maven and another for Node.js and React with npm. Another for Python will be added to this list in the near future.

These tutorials define your application’s entire CD process (i.e. your Pipeline) in a Jenkinsfile, written in Groovy-like Declarative Pipeline syntax and checked in to your Git source repository. Managing your Pipeline with your application’s source code like this forms the fundamentals of "Pipeline as code".

The Introductory tutorials also cover how to use some powerful features of Jenkins, like Blue Ocean, which makes it easy to connect to an existing cloud, web or locally hosted Git repository and create your Pipeline with limited knowledge of Pipeline syntax.

Advanced tutorials

Also soon to be released will be the first Advanced tutorial on building multibranch Pipelines in Jenkins. This tutorial takes the "Pipeline as code" concept to a new level, where a single Jenkinsfile (defining the entire CD process across all branches of your application’s Git repository) consists of multiple stages which are selectively executed based on the branch that Jenkins is building.
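
The idea can be sketched with a when directive gating a stage on the branch being built (the branch name below is hypothetical):

```groovy
// Sketch: one Jenkinsfile serving every branch, with a stage that
// is only executed when Jenkins is building the master branch.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'runs on every branch'
            }
        }
        stage('Deploy') {
            when {
                branch 'master'  // hypothetical production branch
            }
            steps {
                echo 'runs only on master'
            }
        }
    }
}
```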

Additional tutorials that demonstrate more advanced features of Jenkins and how to manage your Pipelines with greater sophistication and flexibility will be added to this section in future.

Summing up

You can access all currently available tutorials from the Tutorials overview page in the Jenkins User Documentation. It’s worth checking that page from time to time as it’ll be updated whenever a new tutorial is published.

Also, if you have any suggestions for tutorials or other content you’d like to see in the documentation, please post your suggestions in the Jenkins Documentation Google Group, which you can also post (and reply) to by emailing jenkinsci-docs@googlegroups.com.

The "Sydney Office" team
The Sydney Office team meeting at Carriageworks - from left to right, Giles Gaskell, Nicholae Pascu, Michael Neale and James Dumay

Jenkins User Conference China Recap


This is a guest post by Forest Jing, who runs the Shanghai Jenkins Area Meetup

The first Jenkins User Conference China was held on November 19, 2017 in Shanghai, China. It was an amazing conference for Jenkins users in China. Almost 450 attendees enjoyed a lovely day with Jenkins creator Kohsuke Kawaguchi.


There were 12 wonderful presentations, 1 workshop and 1 open space to set the stage for JUC China. The day began with Kohsuke, welcomed to the stage with warm applause from the attendees, all of them fans of Jenkins and Kohsuke. He gave a wonderful presentation showing the past, present and future of Jenkins to Chinese Jenkins users.


The co-founder of the DevOps Times community, Le Zhang, released a report on the adoption of deployment pipelines in Chinese IT organizations. The report shows that Jenkins is the most popular tool for DevOps and CD in China.


It also proved that “If it hurts, do it more often!” is right: IT organizations that deploy more frequently experience fewer deployment failures.


The Three Musketeers of DevOps in China (Le Zhang, Xuefeng Shi, Forest Jing) demonstrated a DevOps pipeline based on Jenkins and other open-source software such as Kubernetes, GitLab, etc.


After the presentation, lots of fans waited in line to take pictures with Kohsuke.


Here are some photos of the JUC China. We really enjoyed it.


Lastly, as an organizer of JUC China, I would like to thank Kohsuke Kawaguchi, Alyssa Tong, Sam Van Oort and the many friends from CloudBees and the Jenkins community who gave us so much help in making the first JUC China an amazing and successful conference for Jenkins users in China. We look forward to organizing many more JUC China events.

Security updates for Jenkins core


We just released security updates to Jenkins, versions 2.95 and 2.89.2, that fix two security vulnerabilities. For an overview of what was fixed, see the security advisory.

We usually announce core security updates well in advance on the jenkinsci-advisories mailing list, to give Jenkins administrators time to schedule a maintenance. Additionally, we try to align security updates with the regular LTS schedule. We have chosen not to do so in this case for two reasons:

  • The random failure to set up Jenkins is very noticeable and, given that we’ve seen automated exploits of unprotected Jenkins instances in the past, we consider it important to fix that issue as soon as possible, so that users setting up new instances of Jenkins can be confident they won’t start up insecurely.

  • The CSRF issue appears to only affect instances for a very short (seconds at most, if at all) time period immediately after startup, so administrators could apply the fix during the next scheduled Jenkins downtime, rather than immediately.

Auto-Convert Freestyle Jobs to Jenkins Pipeline


This is a guest post by Sanil Pillai, Director of Labs & Strategic Insights, Infostretch

Infostretch has created a plugin for teams upgrading from Freestyle Jobs to Pipelines as code with Jenkins Pipeline. This new plugin streamlines the process and accelerates pipeline on-boarding for any new set of applications. Previously, when upgrading to Jenkins Pipeline, converting Freestyle Jobs required developers to drill down into each one of those hundreds (or thousands!) of jobs to understand tools, configurations, URLs, parameters, and more before rewriting them in Pipeline syntax. This process is very manual, error-prone, lengthy, and not cost-effective. Beyond saving time, the new plugin also assures adherence to proper coding standards and separates complex business logic and standards declaration from execution flow.

Key features:

  • Convert a single freestyle job to a Pipeline

  • Convert a chain of freestyle jobs to a single Pipeline

  • Works with both Jenkins and CloudBees Jenkins Enterprise

  • Can be customized to support any freestyle plugin, an organization’s Pipeline Shared Library, or Groovy coding standards

  • Works with CloudBees' Role-based Access Control to help the new Pipelines comply with existing security policies

  • Direct migration of properties such as "Build with Parameters" to newly created Pipelines

  • Direct migration of the agent on which a job is to be run, with support for multiple agent labels across different downstream jobs

  • Environment properties: JDK, NodeJS

  • Supports Git SCM

  • Build steps: Maven, Ant, Shell, Batch, and Ansible Playbook

  • Post-build actions: artifact archiver, simple mailer, TestNG reports, JUnit reports, Checkstyle publisher

Now, let’s take a look at how to get started:

  1. Click the conversion link at the root, folder, or job level.

    Image01 Jenkins Pipeline Infostretch
  2. Select from the drop-down list the job that is the starting point of the "chain". If the job-level link was clicked, this drop-down list will not be visible.

    Image02 Jenkins Pipeline Infostretch

    Provide the new pipeline job name. If this is not specified, the plugin will attempt to create a new pipeline job with the naming convention of "oldname-pipeline".

  3. Check "Recursively convert downstream jobs if any?" if you wish to have all the downstream jobs converted into this new pipeline. The plugin will write all the logic of current and downstream jobs into a single pipeline.

  4. Check "Commit Jenkinsfile?" if you would like the plugin to create a Jenkinsfile and commit it back to the SCM. The plugin will commit the Jenkinsfile at the root of the SCM repository it finds in the first job (selected in step 1 above). It will attempt to commit to this repo using the credentials it finds in the first job.

  5. Do note that the plugin will check out the repo into a temporary workspace on the master (JENKINS_HOME/plugins/convert-to-pipeline/ws). Once the conversion is complete and the Jenkinsfile is committed back to the repo, the workspace will be deleted.

  6. Click "Convert" to convert the Freestyle job configurations to a single scripted pipeline job. Once the conversion is complete and the new job is created, you will be redirected to the newly created pipeline job.

That’s it!

To learn more about plugin usage, customization and to see a demo click here to watch the webinar replay on-demand.


Happy New Year!


The Jenkins project wishes all users and contributors a Happy New Year! Let’s take a look back at some of this year’s changes.

NewYear

Highlights

Some stats

In 2017 we had 60 weekly and 13 LTS releases, with 305 fixes/enhancements in the core alone. Next week Jenkins is going to hit version 2.100, and the core has changed greatly since the 2.0 release in April 2016. Jenkins security was one of the hottest areas this year: there were 7 security advisories for the core and 15 for plugins. For comparison, in 2016 there were only 6 security releases in total.

There were 2605 plugin releases, and more than 215 NEW plugins have been hosted in the Update Center. In particular, the Jenkins ecosystem has greatly expanded into the cloud space by offering dozens of new plugins (e.g. for Azure and Kubernetes). We also got many new plugins providing integrations with various development and DevOps tools.

Other subprojects and Jenkins components also got major updates. For example, Jenkins Remoting got 15 releases with stability improvements. The Stapler framework also got 6 releases.

Keep updating, Jenkins 2 is not only about Pipeline as Code!

Events

This year we got many new Jenkins Area Meetups. Currently there are 77 meetups with more than 20,000 members in total (full map). More than 100 meetup events have been organized around the globe.

There were also several Jenkins-focused conferences, including the following:

What’s next?

Next year we will have our traditional contributor meetings at FOSDEM and at Jenkins World 2018. If you are interested in Jenkins, stop by our community booths and join the contributor summits/hackathons. We also want to participate in Google Summer of Code 2018, and we are currently looking for mentors.

Stay tuned, there is much more to come next year!

FOSDEM 2018!


FOSDEM 2018 is a free event for software developers to meet, share ideas and collaborate. It is an annual event that brings open source contributors from around the world for two days of presentations, discussions, and learning.

Jenkins will be well-represented at FOSDEM 2018.

FOSDEM 2018

Happy Hour before FOSDEM

We’ll have a happy hour Friday evening before FOSDEM at Cafe Le Roy d’Espagne. See the meetup page for details.

Jenkins table at FOSDEM

A Jenkins table will be staffed by volunteers at FOSDEM to answer questions, discuss topics, and help users. See the meetup page for details.

Jenkins Hackfest after FOSDEM

A Jenkins Hackfest will be held the day after FOSDEM 2018. Those who would like to join us for the hackfest 5 Feb 2018 should register with the Post FOSDEM Jenkins Hackfest RSVP.

Meals, snacks, and beverages will be provided for the hackfest. Come join us, and let’s write some code!

Questions? Feel free to contact Alyssa Tong or Mark Waite.

Google Summer Of Code 2018: Call for mentors


This year the Jenkins project is interested in participating in Google Summer of Code (GSoC). As in 2016/2017, we are looking for mentors. So yes, we are looking for you :)

Jenkins GSoC

What is GSoC?

GSoC is an annual international program which encourages college-aged students to contribute to open source projects during the summer break between classes.

Students accepted into the program receive a stipend, paid by Google, to work on well-defined projects to improve or enhance the Jenkins project. In exchange, numerous Jenkins community members volunteer as mentors for students to help integrate them into the open source community and succeed in completing their summer projects.

What do mentors get?

  • A student who works full-time in the area of your interest for several months

  • Joint projects with Jenkins experts, lots of fun and ability to study something together

  • Limited-edition swag from Google and the Jenkins project

  • Maybe: Participation in GSoC Mentor Summit and other GSoC events/meetups

Conditions

Mentors are expected to…​

  • Be passionate about Jenkins

  • Lead the project in the area of their interest

  • Actively participate in the project during student selection, community bonding and coding phases (March - August)

  • Work in teams of 2+ mentors per student

  • Dedicate a consistent and significant amount of time, especially during the coding phase (~5 hours per week in a team of two mentors)

Mentorship does NOT require strong expertise in Jenkins plugin development. The main objective is to guide students and to get them involved in the Jenkins community. GSoC org admins will help to find advisors if special expertise is needed.

Disclaimer: We cannot guarantee that the Jenkins organization gets accepted to GSoC. Even if it gets accepted, we may need to select projects depending on student applications and the number of allocated project slots.

Timeline

  • Dec 2017 - started collecting project ideas

  • Jan 17 - Status review at the Jenkins Governance Meeting. Outcome: decision whether we apply to GSoC in 2018.

  • Jan 21 - Application to GSoC (deadline - Jan 23)

  • Feb 12 - List of accepted mentoring organizations published

  • Next - If accepted, follow the GSoC Timeline

How to apply?

If you are interested in proposing a project or joining an existing one, please respond to this thread in the Jenkins Developer mailing list. We aggregate/review proposals in this document where you just need to describe the idea and introduce yourself.

It is fine to propose project ideas until March 12, when the student application phase begins. We kindly ask that you do it before Jan 17, 2018 so that we can add it to the application materials.

Project requirements

  • GSoC is about code (though it may and likely should include some documentation and testing work)

  • Projects should be about Jenkins (plugins, core, infrastructure, integrations, etc.)

  • Projects should be potentially doable by a student in 3-4 months

You can find more information about requirements and practices in the GSoC Mentor Guide.

Moving from buddybuild to Jenkins for Android Developers


Last week, buddybuild — a hosted continuous integration service focused on mobile apps — announced that it had been acquired by Apple, and consequently its complete Android offering, along with its free tier for iOS users, will be discontinued at the beginning of March.

This was a fairly undesirable way to start 2018 for buddybuild’s Android users and, with less than two months to find an alternative, many took to Twitter to simultaneously congratulate buddybuild on their acquisition, and commiserate with others who have to find a new way to build and test their app.

While Jenkins is usually deployed as a self-hosted solution (with over 150k installs), rather than a hosted service like buddybuild, we thought this would be a good time to highlight — thanks to the rich plugin ecosystem of Jenkins — some of the possibilities offered to Android developers by Jenkins.

Common workflows

Android projects are fundamentally no different from how other types of software development projects might make use of a Continuous Integration & Continuous Delivery system (CI/CD) such as Jenkins: Android developers will collaborate using a source control management system (SCM) such as Git or Mercurial; they will create Pull Requests, which should be automatically verified; they expect to get feedback on test failures and code quality (e.g. via email or Slack); and they should be able to easily deploy new versions of their app to beta testers or end users.

To this end, Jenkins lets you define your build and deployment pipelines in a structured and auditable fashion (via Jenkinsfile), supports a multitude of SCMs, while the multibranch Pipeline feature automatically creates new Jenkins jobs for every new Pull Request in your repository, and cleans them up as branches get merged. The Blue Ocean user interface ties these features together in a clean, modern UI.

Blue Ocean build screenshot

Building Android Apps

To build an Android app, you need the Java development tools (JDK), which Jenkins can automatically install for you, plus the Android SDK, which you can also install on individual build agents using a tool installer, or you can use a Docker container with the Android SDK Tools preinstalled, for example.

Then, you can use your SCM plugin of choice to fetch your source code, and build the app using the Android Gradle Plugin via the Gradle Wrapper — in most cases this is as simple as running ./gradlew assembleDebug.
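Putting those two steps together, a minimal Scripted Pipeline sketch could look like the following (the 'android' agent label is an assumption; use whatever label matches your own build agents):

```groovy
// Minimal sketch: check out the code and build a debug APK.
// The 'android' label is illustrative and assumes an agent with
// the JDK and Android SDK installed.
node('android') {
    // Fetch the source code using the SCM configured for this job
    checkout scm
    // Build the debug variant of the app via the Gradle Wrapper
    sh './gradlew assembleDebug'
}
```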

Once your app has been built and packaged into a .apk file, you can use the archiveArtifacts build step to store the APK in Jenkins, enabling colleagues to download APKs directly from Jenkins so that they can try out the latest build.

Testing Android Apps

The Android SDK supports two types of test: unit tests, which run on the JVM, and instrumentation tests, which have to run on an Android device or emulator. Both types of test can be executed using Jenkins and, since the Android Gradle Plugin writes the test results to disk in JUnit XML format, the JUnit Plugin for Jenkins can be used to parse the results, enabling you to see a test report, and to be notified of test failures.

Compiling and executing the unit tests for your app is as simple as adding another build step which runs ./gradlew testDebugUnitTest.

Similarly, instrumentation tests can be compiled and executed via the connectedDebugAndroidTest task in Gradle. However, before you do this, you should ensure that an Android device is connected to your Jenkins build agent, or you can make use of the Android Emulator Plugin to automatically download, create, and start an emulator for you during a build. There are also plugins for cloud testing services such as AWS Device Farm.

Once you have finished executing the tests, you can use the junit step to analyse the results: junit '**/TEST-*.xml'.

Static Analysis

Similar to other Java or Kotlin projects, you can scan your codebase using static analysis tools like FindBugs or Checkstyle. Once again, Jenkins has analysis plugins which can parse the output of these tools, and present you with the results and trend graphs, or optionally flag the build as unstable or failed if too many problems have been detected.

The Android SDK provides a further useful static analysis tool called Lint. The output of this tool can be parsed by the Android Lint Plugin, which will analyse the issues found, and provide you with a detailed report within Jenkins. This functionality was demonstrated by the Android Tools Team at the Google I/O conference a few years back.

Securely signing and deploying Android apps

In order to distribute an Android app, it needs to be signed with a private key, which you should keep safe (losing it means you won’t be able to publish updates to your app!), and as secure as possible.

Instead of developers having to keep the signing keystore on their development machines, you can securely store the keystore and/or its passphrase on Jenkins using the Credentials Plugin. This avoids having to hardcode the passphrase into your build.gradle, or have it otherwise checked into your SCM.

The Credentials Plugin allows you to store secrets in Jenkins — which will be stored encrypted on disk when not in use — and those secrets can temporarily be made available during a build, either as a file in the build workspace, or exposed as an environment variable.

You can use such environment variables in a signingConfig block within your build.gradle, or you can make use of the Android Signing Plugin to sign your APK for you.
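As an illustration of the first approach, a signingConfig in build.gradle could read those environment variables instead of hardcoded values. This is only a sketch: the variable names match the credentials used in the sample pipeline below, and the key alias is an assumption.

```groovy
// build.gradle sketch: pull signing secrets from the environment at build time,
// so nothing sensitive is checked into the SCM.
// SIGNING_KEYSTORE and SIGNING_KEY_PASSWORD are illustrative variable names.
android {
    signingConfigs {
        release {
            storeFile file(System.getenv('SIGNING_KEYSTORE'))
            storePassword System.getenv('SIGNING_KEY_PASSWORD')
            keyAlias 'release' // assumed alias; use your own
            keyPassword System.getenv('SIGNING_KEY_PASSWORD')
        }
    }
    buildTypes {
        release {
            signingConfig signingConfigs.release
        }
    }
}
```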

Once you have your production-ready APK built and signed, you can automatically upload it to Google Play using the Google Play Android Publisher plugin. The benefit of using this plugin is that it supports multiple APK upload, expansion files, uploading of ProGuard mapping files, promotion of builds from alpha, to beta, to production — and once again, your Google Play credentials are securely stored on Jenkins thanks to integration with the Credentials Plugin.

Sample Pipeline

Here’s a straightforward example of a Jenkinsfile defining a pipeline to build, test, and optionally deploy an Android app, from a multibranch Pipeline job. It requires the Pipeline, JUnit, Android Lint, Google Play Android Publisher, and Mailer plugins to be installed.

Jenkinsfile
pipeline {
  // Run on a build agent where we have the Android SDK installed
  agent {
    label 'android'
  }
  // Stop the build early in case of compile or test failures
  options {
    skipStagesAfterUnstable()
  }
  stages {
    stage('Compile') {
      steps {
        // Compile the app and its dependencies
        sh './gradlew compileDebugSources'
      }
    }
    stage('Unit test') {
      steps {
        // Compile and run the unit tests for the app and its dependencies
        sh './gradlew testDebugUnitTest'
        // Analyse the test results and update the build result as appropriate
        junit '**/TEST-*.xml'
      }
    }
    stage('Build APK') {
      steps {
        // Finish building and packaging the APK
        sh './gradlew assembleDebug'
        // Archive the APKs so that they can be downloaded from Jenkins
        archiveArtifacts '**/*.apk'
      }
    }
    stage('Static analysis') {
      steps {
        // Run Lint and analyse the results
        sh './gradlew lintDebug'
        androidLint pattern: '**/lint-results-*.xml'
      }
    }
    stage('Deploy') {
      // Only execute this stage when building from the `beta` branch
      when {
        branch 'beta'
      }
      environment {
        // Assuming a file credential has been added to Jenkins, with the ID 'my-app-signing-keystore',
        // this will export an environment variable during the build, pointing to the absolute path of
        // the stored Android keystore file. When the build ends, the temporary file will be removed.
        SIGNING_KEYSTORE = credentials('my-app-signing-keystore')
        // Similarly, the value of this variable will be a password stored by the Credentials Plugin
        SIGNING_KEY_PASSWORD = credentials('my-app-signing-password')
      }
      steps {
        // Build the app in release mode, and sign the APK using the environment variables
        sh './gradlew assembleRelease'
        // Archive the APKs so that they can be downloaded from Jenkins
        archiveArtifacts '**/*.apk'
        // Upload the APK to Google Play
        androidApkUpload googleCredentialsId: 'Google Play', apkFilesPattern: '**/*-release.apk', trackName: 'beta'
      }
      post {
        success {
          // Notify if the upload succeeded
          mail to: 'beta-testers@example.com', subject: 'New build available!', body: 'Check it out!'
        }
      }
    }
  }
  post {
    failure {
      // Notify developer team of the failure
      mail to: 'android-devs@example.com', subject: 'Oops!', body: "Build ${env.BUILD_NUMBER} failed; ${env.BUILD_URL}"
    }
  }
}

Not just for Android

While buddybuild concentrated on Android and iOS apps, thanks to the distributed build agent architecture of Jenkins, you can automate any type of project.

For example, you can expand the capabilities of Jenkins by adding macOS (or Windows, Linux, BSD…) agents; you can dynamically spin up agents on AWS EC2 instances, Microsoft Azure VMs, or Azure Container Instances; you can create agents using VMware, and so on.

Conclusion

Thousands of Jenkins instances are already using the various Android-related plugins, and Pipeline along with the Blue Ocean User Interface make using Jenkins simpler than it’s ever been.

Give Jenkins a try for building your Android projects, check out the tutorials, and get in touch via the users' mailing list, or IRC.

Finally, as with Jenkins itself, all plugins distributed are open-source, so feel free to contribute!

JEP-200: Remoting / XStream whitelist integrated into Jenkins core


Overview

JEP-200 has been integrated into Jenkins weekly builds and (if all goes well) will be a part of the next LTS line. In a nutshell, this change is a security hardening measure to be less permissive about deserializing Java classes defined in the Java Platform or libraries bundled with Jenkins. For several years now, Jenkins has specifically blacklisted certain classes and packages according to known or suspected exploits; now it will reject all classes not explicitly mentioned in a whitelist, or defined in Jenkins core or plugins.

For Jenkins administrators

Before upgrade

Back up your Jenkins instance prior to upgrade so you have an easy way of rolling back. If you are running any of the plugins listed in Plugins affected by fix for JEP-200, update them after taking the backup but before upgrading Jenkins core.

If you have a way of testing the upgrade in an isolated environment before applying it to production, do so now.

Using backups and a staging server is good advice before any upgrade but especially this one, with a relatively high risk of regression.

After upgrade

To the extent that advance testing of the impact of this change on popular plugins has been completed, most users (and even plugin developers) should not notice any difference. If you do encounter a java.lang.SecurityException: Rejected: some.pkg.and.ClassName in the Jenkins UI or logs, you may have found a case where an unusual plugin, or an unusual usage mode of a common plugin, violates the existing whitelist. This will be visible in the Jenkins system log as a message from jenkins.security.ClassFilterImpl like the following:

some.pkg.and.ClassName in file:/var/lib/jenkins/plugins/some-plugin-name/WEB-INF/lib/some-library-1.2.jar might be dangerous, so rejecting; see https://jenkins.io/redirect/class-filter/

where the link would direct you here.

If you find such a case, please report it in the Jenkins issue tracker, under the appropriate plugin component. Link it to JENKINS-47736 and add the JEP-200 label. If at all possible, include complete steps to reproduce the problem from scratch. Jenkins developers will strive to evaluate the reason for the violation and offer a fix in the form of a core and/or plugin update. For more details and current status, see Plugins affected by fix for JEP-200.

Assuming you see no particular reason to think that the class in question has dangerous deserialization semantics, which is rare, it is possible to work around the problem in your own installation as a temporary expedient. Simply make note of any class name(s) mentioned in such log messages, and run Jenkins with this startup option (details will depend on your installation method):

-Dhudson.remoting.ClassFilter=some.pkg.and.ClassName,some.pkg.and.OtherClassName
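Where exactly this option goes depends on your installation method. As one hedged example, on a Debian/Ubuntu package installation you could extend JAVA_ARGS in /etc/default/jenkins (the path and variable name are assumptions that vary by installation method):

```shell
# /etc/default/jenkins (Debian/Ubuntu package install; illustrative only)
# Append the temporary whitelist entries to the JVM arguments for Jenkins.
JAVA_ARGS="$JAVA_ARGS -Dhudson.remoting.ClassFilter=some.pkg.and.ClassName,some.pkg.and.OtherClassName"
```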

For plugin developers

Testing plugins against Jenkins 2.102 and above

As a plugin developer encountering this kind of error, your first task is to ensure that it is reproducible in a functional (JenkinsRule) test when running Jenkins 2.102 or newer.

mvn test -Djenkins.version=2.102

The above assumes you are using a recent 2.x or 3.x parent Plugin POM. For certain cases you may need to use Plugin Compat Tester (PCT) to run tests against Jenkins core versions newer than your baseline.

Running PCT against the latest Jenkins core:

java -jar pct-cli.jar -reportFile $(pwd)/out/pct-report.xml \
    -workDirectory $(pwd)/work -skipTestCache true -mvn $(which mvn) \
    -includePlugins ${ARTIFACT_ID} -localCheckoutDir ${YOUR_PLUGIN_REPO}

You may need to run tests using an agent (e.g., JenkinsRule.createSlave) or force saves of plugin settings.

For Maven-based plugins you can also specify custom Jenkins versions in your Jenkinsfile to run tests against JEP-200:

buildPlugin(jenkinsVersions: [null, '2.102'])

(again picking whatever version you need to test against) so that the test is included during CI builds, even while your minimum core baseline predates JEP-200.

If your plugins are built with Gradle, your mileage may vary.

Making plugins compatible with Jenkins 2.102 or above

If you discover a compatibility issue in your plugin, you then have several choices for fixing the problem:

  • Ideally, simplify your code so that the mentioned class is not deserialized via Jenkins Remoting or XStream to begin with:

    • If the problem occurred when receiving a response from an agent, change your Callable (or FileCallable) to return a plainer type.

    • If the problem occurred when saving an XML file (such as a config.xml or build.xml), use a plainer type in non-transient fields in your persistable plugin classes.

  • If the class(es) are defined in the Java Platform or some library bundled in Jenkins core, propose a pull request adding it to core/src/main/resources/jenkins/security/whitelisted-classes.txt in jenkinsci/jenkins.

  • If the class(es) are defined in a third-party library bundled in your plugin, create a resource file META-INF/hudson.remoting.ClassFilter listing them. (example)

    • You may also do this for Java or Jenkins core library classes, as a hotfix until your core baseline includes the whitelist entry proposed above.

  • If the class(es) are defined in a JAR you build and then bundle in your plugin’s *.jpi, add a Jenkins-ClassFilter-Whitelisted: true manifest entry. This whitelists every class in the JAR. (example)
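The META-INF/hudson.remoting.ClassFilter resource mentioned above is simply a plain-text list of fully-qualified class names, one per line. For example (the class names here are illustrative, not real library classes):

```
com.example.library.SomeSerializableValue
com.example.library.AnotherSerializableValue
```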
