Software Delivery Mastery

Build Pipeline Plugin 1.3.0 Release

Posted on

New release

About a year after the last official update and after some unofficial releases, we are happy to announce a new release of the Build Pipeline Plugin.  This new version is meant to be more visual and responsive.  The release also includes a Build Pipeline Dashboard View to be used with the Dashboard View Plugin.  In addition, a number of bugs have been fixed and several requested features implemented.  We would like to thank our contributors and we hope this new release improves your experience with the Build Pipeline Plugin.

Pipeline version

Another thing we should mention is the build pipeline version.  Our initial intention was to display the Source Control Management (SVN, Git, Hg, etc.) revision of the checked out code of the first build in the pipeline.  This proved to be a challenge, since various SCM plugins implement retrieval of the SCM revision differently, and the Jenkins SCM abstract classes do not force SCM plugins to expose this information in a uniform way.  Furthermore, some users did not want the SCM revision displayed as the version of the pipeline, and in other cases it simply did not make sense (for example, when code is checked out at each build step rather than shared via the Clone Workspace plugin).  Thus, following the example of one of our contributors, we simply used the build number of the first job as the pipeline version.  We felt that this simplified things on our end while still providing a unique number for each execution of the pipeline.  We hope that this approach will satisfy most users.

Release notes


BRIEFING: Delivering Better Software (Tests as a Communication Tool)

Posted on

Come and join us for drinks, socialising and a special presentation.
This will be a really informal session, with plenty of opportunities to ask questions and interact. If that doesn’t sell it to you, how about the FREE DRINKS??

The Talk

Completing the circle: Automated web tests, ATDD and Acceptance Tests as a team communication tool

Acceptance Test Driven Development, or ATDD, has proven to be a very effective technique, both for driving and guiding development, and for enhancing communication between developers and other project stakeholders. But why stop there? Well designed Acceptance Tests can also act as a formidable documentation source and communication tool. Indeed, when written in a narrative, BDD-type style, Acceptance Tests have the potential to document in detail how the user interacts with the application.
In this talk we will look at the role of automated Acceptance Tests not only for testing, but also as part of the whole development lifecycle, from writing the user stories right through to deploying the application. We will also look at ways to make your automated acceptance tests more expressive and how to use them more effectively as a communication, reporting and documentation tool.
Finally, we will present and demonstrate a new open source library that helps developers and testers write automated acceptance tests for web applications using WebDriver/Selenium 2. This library also produces clean, narrative-style reports illustrated with screenshots that effectively describe the application’s functionality and behaviour, as well as any regressions or pending features.

The Speaker

CEO of Wakaleo Consulting, John is an experienced consultant and trainer specialising in Enterprise Java, Web Development, and Open Source technologies. John is well known in the Java community for his many published articles, and as the author of Java Power Tools and Jenkins: The Definitive Guide.
John helps organisations around the world to improve their Java development processes and infrastructures and provides training and mentoring in open source technologies, Test Driven Development (TDD, BDD and ATDD), Automated Web Testing, SDLC tools, and agile development processes in general.

The Begging

A fascinating subject that should give you some great ideas and techniques to take back to your team.
This is our first joint event and we’d really appreciate your support. We’ve booked a big room and need to fill it! PLEASE BRING YOUR FRIENDS…

When

Thursday, June 23, 2011 from 5:00 PM – 8:00 PM (GMT+1000)

Where

Combined Services Club (upstairs)
5-7 Barrack Street
(Cnr of Clarence, next to Officeworks)
Sydney, New South Wales 2000
Australia

Registration

Complete the free registration at: http://bettersoftwarebriefing.eventbrite.com

Centrum Systems at Agile Australia 2011

Posted on

Centrum Systems will be sponsoring Agile Australia 2011.

Agile Australia is going to be packed with case studies of how leading businesses are adopting an Agile approach to stay ahead.  Speakers include Agile dignitaries Alistair Cockburn and Martin Fowler, international Agile guru Jean Tabaka, and celebrated Australian industry author Rob Thomsett.

  • Learn how to respond quickly to change, minimise overall risk, improve quality, and enhance project outcomes
  • Discover compelling examples of innovation and business value achieved through Agile

Please come to our stand and say hello…

Coding standards harmony

Posted on

Coding standards

Most mature software development companies or departments define their coding standards.  The intention is simple: ensure all code looks alike to ease reading, writing, maintaining and communicating code.  As a first effort, these coding conventions may be expressed in some form of standalone document, but conventions that are not enforced are simply a waste of time.  In the Java world, various tools have existed for some time to help us enforce and adhere to coding standards: Checkstyle, Eclipse and Sonar.  Until fairly recently, it was laborious to make those tools work together to achieve code consistency.  Thankfully, as these tools matured, it is now possible to define and enforce coding standards effortlessly, and the synergy between them may even be surprising.

Recap

Let's quickly state the purpose of each tool before we move on.

Checkstyle

Checkstyle is a development tool to help programmers write Java code that adheres to a coding standard. It automates the process of checking Java code to spare humans of this boring (but important) task. This makes it ideal for projects that want to enforce a coding standard.
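
To make this concrete, a minimal Checkstyle rule set might look like the following sketch; the module names are standard Checkstyle checks, but the particular selection is illustrative rather than a recommendation:

    <?xml version="1.0"?>
    <!DOCTYPE module PUBLIC
        "-//Puppy Crawl//DTD Check Configuration 1.3//EN"
        "http://www.puppycrawl.com/dtds/configuration_1_3.dtd">
    <!-- An illustrative rule set: flag unused imports, require final
         parameters and cap line length at 120 characters -->
    <module name="Checker">
        <module name="TreeWalker">
            <module name="UnusedImports"/>
            <module name="FinalParameters"/>
            <module name="LineLength">
                <property name="max" value="120"/>
            </module>
        </module>
    </module>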

Eclipse formatter

The Eclipse formatter is a set of rules that defines how code will be formatted.

Eclipse clean up

The clean up utility helps to apply formatting rules and coding conventions to a single file or to a set of files in one go.

Eclipse save actions

Save actions are similar to clean up and they define what should happen to the code when a file is saved.  For example, save actions can ensure code is formatted, unused imports are removed and arguments are set to “final” right before the file is saved.
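
Eclipse stores these settings per project under .settings/org.eclipse.jdt.ui.prefs, which means they can be checked into source control and shared.  A small excerpt as a sketch (key names as used by Eclipse JDT; the selection of values is illustrative):

    # .settings/org.eclipse.jdt.ui.prefs (excerpt)
    # Run clean up actions as a save participant
    editor_save_participant_org.eclipse.jdt.ui.postsavelistener.cleanup=true
    sp_cleanup.on_save_use_additional_actions=true
    # Format code, organise imports and make parameters final on save
    sp_cleanup.format_source_code=true
    sp_cleanup.organize_imports=true
    sp_cleanup.make_parameters_final=true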

Sonar

Sonar is an open platform to manage code quality.

A common situation

It is quite common to define coding standards using Checkstyle and include the rules file as part of a project.  The Eclipse formatter, clean up and save actions are then configured manually to match the Checkstyle rules.  In addition, Checkstyle runs as part of the build to publish the code violations report to a file or to Sonar.  Better integrated teams also use the Checkstyle Eclipse plugin in order to see the violations directly in their code; as the code changes to adhere to the standards, the plugin reflects that.
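
For instance, with a Maven build (as the example below assumes), running Checkstyle during the build is a matter of declaring the plugin; a minimal sketch, assuming the rules live in a checkstyle.xml at the project root:

    <!-- pom.xml (build/plugins): run Checkstyle as part of the build -->
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-checkstyle-plugin</artifactId>
        <configuration>
            <!-- Location of the shared rules file (an assumption) -->
            <configLocation>checkstyle.xml</configLocation>
        </configuration>
    </plugin>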

Shortcomings

The common situation outlined above is a decent setup but it has some shortcomings.  If the coding standard rules change, a ripple is sent through all the tools: the Eclipse formatter needs to reflect the new rules, as do the clean up and save actions, and Sonar needs to be updated as well.  In addition, sharing the Checkstyle file between projects and teams can become a chore.  There are ways to share a remote coding standard file between teams, but that does not address the lack of synchronisation between all the tools … until recently.

Harmony

Checkstyle, Sonar and Eclipse have been around for a long time, and as these tools matured they developed great integration between them.  By aligning them it is possible to establish one central coding standard rule set and reflect it in the development environment automatically.  Furthermore, once configured, changes to the coding standards are propagated automatically, so developers are always informed of up-to-date coding standards and apply them as they code.

Example

Let's look at an example of how best to utilise Checkstyle, Eclipse and Sonar.  To give the example more relevance, let's start with an existing “legacy” project where coding standards have not necessarily been respected.

Assumptions

The example assumes the following:

  • Java project
  • Maven build
  • Checkstyle file expressing the coding standards
  • Sonar
  • Eclipse
  • Eclipse Checkstyle plugin
  • Eclipse Sonar plugin

Initial coding standard report

We’ll start from the point where an initial Checkstyle configuration has been uploaded to Sonar and a Sonar report has been produced for our existing project.

Reduce violations in IDE


Next, we’ll configure Eclipse to see those violations closer to the code.  In order to do so, we’ll need to configure the Eclipse Checkstyle plugin with the same rules as Sonar and apply the configuration to the projects.

  1. Obtain a link to the Checkstyle configuration (Sonar provides permalinks to a profile's rules, including the Checkstyle configuration)
  2. Reference the Checkstyle rules in Eclipse (Window > Preferences > Checkstyle > New)
  3. Configure Checkstyle for a project (right click project > Properties > Checkstyle).  Please note the Write formatter/cleanup config checkbox.  This is the part that synchronises the coding standards with the Eclipse formatter and clean up.  You can also right click on your project > Checkstyle > Create Formatter-Profile to achieve the same thing.  This kind of synchronisation alleviates the painful manual synchronisation between Checkstyle and Eclipse; brilliant!
  4. Once Checkstyle has been configured and enabled for a project, notice that violations are annotated alongside the code
  5. Now that Eclipse has been configured with the coding standard rules and the formatting profiles have been updated, we can bulk clean up existing code and go a long way towards ensuring the code adheres to standards.  After pressing Next > (to review the upcoming changes) or Finish, Eclipse will do what it can to bring the code in line with the standards.

  6. After republishing a coding standard report to Sonar, we can see a reduction in violations.

Save actions

Once the Eclipse formatter and clean up profiles have been updated, don't forget to update the save actions so that as many coding standards as possible are applied automatically before every save.

Eclipse Sonar integration

Similarly to the Checkstyle Eclipse plugin, there is a Sonar Eclipse plugin that annotates code with the violations as seen in Sonar.  In addition to Checkstyle violations, the Sonar Eclipse plugin shows Findbugs and PMD violations (indeed, those of all static code analysis tools configured).  The integration is quite simple.

  1. Install the Sonar Eclipse plugin
  2. Identify your Sonar installation
  3. Associate your Eclipse projects with their Sonar equivalents (note that your project must have at least one Sonar report published)
  4. Once the configuration is complete, you can see the violations as published to Sonar annotated in your code
  5. Please note that the code has been annotated with the violations as they were found in Sonar.  If code changes are made, those violations will remain and get out of sync.  Alternatively, you can choose to rerun the checks locally and refresh the violations view.

The usefulness of the Checkstyle Eclipse plugin

Although the Sonar Eclipse plugin may make the Checkstyle Eclipse plugin look superfluous, remember that it is the latter that updates the Eclipse formatting rules as well as the clean up profiles.  Until the Sonar Eclipse plugin fulfils the same duty, the Checkstyle Eclipse plugin remains very useful.

Not everything can be automated

Please note that although a lot of coding standards can be applied retroactively and automatically, some violations cannot be automatically eradicated.  Nonetheless, Checkstyle, Eclipse and Sonar can identify the problematic code and guide developers towards coding standard compliance.

Conclusion

Coding standards are a preoccupation for most software development teams.  Defining coding standards is one thing but enforcing them effectively is another.  Thankfully, as Checkstyle, Eclipse and Sonar have matured, defining and enforcing coding standards can be a straightforward and sustainable activity.

Publishing to Jenkins and Hudson – ease versus control

Posted on

The split

The popular open source continuous integration server, Hudson, forked a few months ago.  Its creator, Kohsuke Kawaguchi, along with a large part of the open source community, forked Hudson and created Jenkins.  Hudson itself is now under the Oracle and Sonatype umbrella.

In the meantime, we at Centrum Systems were writing the build pipeline plugin.  Not wanting to take sides, we decided to release the plugin to both Hudson and Jenkins.  Having released version 1.0.0 of the plugin and received a lot of positive and constructive feedback, we wanted to correct some problems with the plugin while adding some requested features.  I'll try and summarise our experience of releasing the plugin to each platform.

Hudson

  1. Publish the plugin using the same mechanism we used to publish 1.0.0
  2. Realise that the old process had changed: the plugin was deployed to the same repository but would not appear in Hudson's update center
  3. Read the new release process and comply with it
  4. Create a JIRA user
  5. Miss a few points and ask Oracle for some help
  6. Create a JIRA ticket expressing our willingness to publish the plugin
  7. Wait for someone to address the ticket and give us rights to publish to the staging repository
  8. Change the build to ensure the artifacts were signed as per the published instructions (a typical signing configuration is sketched after this list)
  9. Publish the plugin using release:prepare, release:perform (and the corresponding deployment management configuration as described in step 3)
  10. Log on to Sonatype’s Nexus instance and “close” the deployment (some verifications were performed on the plugin)
  11. The verification failed and I needed to publish our public key to a key server (actually, the error message was really clear and helpful)
  12. “Release” our plugin within Sonatype’s Nexus workflow (a single click, similar to step 10)
  13. Comment on the original JIRA ticket to ask them to release the plugin for real (to the Maven 2 repo)
  14. Wait a while for the JSON file listing all Hudson plugins to be updated and show the availability of a new version
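
For reference, the signing in step 8 is typically handled by the maven-gpg-plugin; a minimal sketch, not necessarily the exact configuration the Hudson instructions prescribe:

    <!-- pom.xml (build/plugins): sign artifacts during the verify phase -->
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-gpg-plugin</artifactId>
        <executions>
            <execution>
                <id>sign-artifacts</id>
                <phase>verify</phase>
                <goals>
                    <goal>sign</goal>
                </goals>
            </execution>
        </executions>
    </plugin>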

Elapsed time: About 6 working days

Jenkins

  1. Publish the plugin using the same mechanism we used to publish 1.0.0 (create an account, add the distribution management information to the pom file and update the settings.xml file with the server information; a sketch of this wiring follows the list)
  2. Wait a few hours for the JSON file listing all Jenkins plugins to be updated and show the availability of a new version
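
For anyone who has not published a Maven artifact before, the wiring in step 1 looks roughly like this; the repository id and URL are illustrative, not the official Jenkins values:

    <!-- pom.xml: where release artifacts are deployed -->
    <distributionManagement>
        <repository>
            <id>jenkins-releases</id>
            <url>http://repo.example.org/releases</url>
        </repository>
    </distributionManagement>

    <!-- settings.xml: credentials; the server id must match the repository id -->
    <servers>
        <server>
            <id>jenkins-releases</id>
            <username>your-account</username>
            <password>your-password</password>
        </server>
    </servers>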

Elapsed time: A few hours

Conclusion

I can certainly appreciate the steps to sign the plugin and the extra validations ensuring the plugin has all the right elements, but the Hudson process felt a bit heavy.  Hopefully the next release will not be as tedious, since we learned quite a bit during the exercise.  Finally, I would like to thank Winston from Oracle, who was helpful and responsive, and all the other people involved in the process.  I would also like to thank the Jenkins people, since their publishing process was streamlined and uneventful (in a good way).

The fact that Jenkins has had 11 new versions whilst Hudson has had only 2 since the fork is indicative of the different approaches.

BUILD PIPELINE PLUGIN RELEASE 1.1.2

Posted on

Following great community feedback on Build Pipeline Plugin 1.1.1, we are pleased to announce the availability of version 1.1.2.  The release addresses a few bugs and expands SCM support (namely Git and Mercurial).

BUG FIXES

The following defects have been resolved. Remaining defects are visible in the Google Code issue tracker (http://code.google.com/p/build-pipeline-plugin/issues/list).

Defect ID   Summary
2           Support Git SHA1 for “revision”
16          Add date/time of execution
19          Build view empty, NPE in the log
22          Support for Mercurial revision number
27          Build Pipeline and Copy Artifact plugin combination causes NPE


Build Pipeline Plugin Release 1.1.1

Posted on

At Centrum Systems we were curious to see the community's response to the initial release of the Hudson/Jenkins build pipeline plugin. We were pleased with the uptake and received some brilliant feedback, including bugs and enhancement requests. Yesterday we published a minor update to the plugin that resolves the majority of the issues as well as offering some new capabilities.

New Features

The major improvements we achieved as part of release 1.1.1 are:

  1. Improved GUI

    The GUI has been completely revamped to improve usability and build pipeline display. New features are:

    • The whole display is now centred with each build pipeline being clearly displayed.
    • Each build has a cleaner look displaying important build information.
    • The job and build information is hyperlinked to the respective pages.
    • Links have been added to:
      • Navigate to the view configuration page
      • Invoke a new build of the pipeline
  2. Support for Parallel Downstream Jobs

    You can now create multiple downstream projects from the same upstream project. This includes multiple downstream splits as shown below.

    Split Build Pipeline
  3. Support for Automatically and Manually Triggered Downstream Jobs

    You can now create both automatic and/or manually triggered downstream build steps on the same project.

  4. Support for Hudson/Jenkins Security Settings
  5. Improved overall plugin stability

Bug Fixes

The following defects have been resolved. Remaining defects are visible in the Google Code issue tracker (http://code.google.com/p/build-pipeline-plugin/issues/list).

Defect ID   Summary
1           Manual execution ignores security settings
3           Support for parallel downstream jobs and joins
4           Pipelines seem to be repeated to fill “No Of Displayed Builds”
5           Images not working in view if using --prefix
6           Configure errors
7           Manual build checkbox losing its value
8           Build pipeline revision box does not display SVN revision number
10          Images not found
11          Descriptions not being saved
12          Incorrect job paths
14          Pipeline doesn’t stop when one job fails
18          Can’t select any job as root

Known Limitations

Future Work

Documentation

The documentation is available from the Hudson and Jenkins wikis.

Ratcheting up code coverage with Sonar

Posted on

At Centrum we are often brought in by organisations that want to improve the quality of their software deliverables and remove some of the unwanted “excitement” from the delivery process.  We love engagements like this because it means that the client understands that there is a cost to neglecting to focus on quality and that they are open to changing processes and tools to move forward and start paying off that technical debt.

Unit test coverage – the easy bit…

A drive for change often starts when a new “green fields” project is chosen and high unit test coverage is encouraged (or enforced), perhaps alongside practices such as TDD.  The benefits can be seen by the team involved in the project and this message is taken on board by management.  Unit testing has been deemed to be “a good thing”.

So now for the legacy code…right?

So now the organisation or team has bought into the benefits of having a good level of unit test coverage and wants to roll it out to all their projects.  However, the problem seems insurmountable.  The code analysis shows that your current coverage is at < 2%.  How do you get up to your target?  Often the response is to only enforce coverage on the new projects that were built from day one enforcing high coverage.  This can mean that you are actually enforcing standards on a tiny proportion of your organisation's code.  Another option is of course to invest in writing the test cases for legacy code.  However, this investment is rarely made, nor is it necessarily recommended: test cases are most valuable when written before or at the time that the code is written.

The third way: ratcheting up coverage

What we often recommend when we hit the situation outlined above is to take a continual improvement approach.  Find ways to gradually improve the quality of your code and build momentum.  Find metrics that can show a positive view of the improvements being made; don't simply compare your legacy project's 2% coverage with your green fields project at 80%.  The 80% is an impossible short-term target and actually acts as a disincentive to improvement.

Sonar now reports coverage of recent changes

Sonar has just introduced functionality to show the coverage on recent changes.  This allows you to enforce coverage on every line of code added or changed during a project, and over time your overall coverage will get there.  It also has the effect of introducing tests for those parts of your code base that change most frequently, and which therefore get the most value out of them.

Sonar Dashboard

What is also pretty neat is the ability to show the source code marked up to highlight untested code, restricted to the period that you are interested in.  This gives developers the feedback they need to write tests that cover changed code.

Filtered code coverage

Footnote:  Sonar for the uninitiated

Sonar is an open source quality platform.  It collates and mines data from a variety of code analysis tools, as well as its own built-in ones, to give you a holistic view of the quality of your software.  The “7 axes of code quality” as described by Sonar are: Architecture & Design, Duplications, Unit Tests, Complexity, Potential Bugs, Coding Rules, and Comments (Documentation).

Build Pipeline (Hudson & Jenkins) Plugin 1.0.0 Released

Posted on

We are pleased to announce that we are donating the Centrum build pipeline plugin to the open source community.  This is a plugin developed for the popular Hudson and Jenkins Continuous Integration servers.

The plugin was developed to assist with orchestrating the promotion of a version of software through quality gates and into production. By extending the concepts of CI you can create a chain of jobs, each one subjecting your build to quality assurance steps. These QA steps may be a combination of manual and automated steps. Once a build has passed all of them, it can be automatically deployed into production.

Release Management, Continuous Integration, Automated Testing and Deployment Automation are all links in a chain that need to work together to get to high quality, low risk software deliveries.

Documentation

The documentation is available from the Hudson and Jenkins wikis.

Screenshot

build pipeline

Implementing Maven in a Legacy Code Base

Posted on

It’s your first day with a new company or client. The usual things happen: you get introduced to everyone, shown where the toilets and fire stairs are located, pointed towards your desk, allocated a PC and a login to the corporate network. Everything is going fine: you log in successfully, email works and you start to configure your PC to actually get some work done.

You install the Java IDE (in my case Eclipse) and get the URL for the source code repository. You check out the many modules that make up the application and start trawling through the code looking for hints on how the modules are built. pom.xml …, nowhere to be seen. Hmm, build.xml …, ah, there you are. OK, run the Ant build. FAIL: class not found, class not found…

You look to the IDE for guidance: red crosses everywhere, same problem; there are multiple dependencies that are not in place (as far as the IDE is concerned). You figure you'll discuss this with someone who is more familiar with the code base and application structure. What you find is that the application and module structure is tightly bound to that person's IDE and hidden within IDE metadata files. Worse still, these metadata files are actually checked into the source code repository, complete with hard-coded locations on a particular person's PC.

Sound familiar? Dealing with a legacy code base that has grown over the years can be very difficult. In some instances the knowledge of the application rests with one or two key staff members. Those people may have architected the application on the run and not followed industry standards. They may no longer be with the company. You're left to work it all out; where do you start?

If you have a legacy code base that has multiple dependencies and is currently built via Ant, you can implement Maven within this code base. Here are some tips that may prove helpful.

1. Baseline

  • Get the application and all of its modules to a known working state.
  • Ensure that you understand the dependencies between modules and to 3rd party libraries.
  • Ensure that you can successfully build the application via the Ant build scripts.
  • Ensure that you can successfully deploy and run the application.

2. Create POM files for each Application Module

Create a pom.xml file for each of the dependent application modules; a minimal skeleton is sketched after the list below. Ensure that it contains:

  • A <parent> section to provide the details of the parent POM.
  • A <repository> section to provide the details of the enterprise remote repository.
  • A <dependency> section to provide the details of the installer provider. This will be required for step 3 (in my case I used wagon-webdav).
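
A skeleton of such a module pom.xml might look like the following; the group ids, versions and repository URL are illustrative only:

    <?xml version="1.0" encoding="UTF-8"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <!-- Details of the parent POM -->
        <parent>
            <groupId>com.company</groupId>
            <artifactId>application-parent</artifactId>
            <version>1.0</version>
        </parent>
        <artifactId>module-1</artifactId>
        <packaging>jar</packaging>
        <!-- The enterprise remote repository -->
        <repositories>
            <repository>
                <id>inHouseRepo</id>
                <url>http://repo.example.com/releases</url>
            </repository>
        </repositories>
        <!-- The installer provider used when deploying (step 3) -->
        <dependencies>
            <dependency>
                <groupId>org.apache.maven.wagon</groupId>
                <artifactId>wagon-webdav</artifactId>
                <version>1.0-beta-1</version>
            </dependency>
        </dependencies>
    </project>
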
3. Deploy Application Modules to the Enterprise Remote Maven Repository

Modify the Ant build scripts of each of the dependent application modules to deploy the resulting artefact to the enterprise remote Maven repository. This can be achieved by using the Maven Ant Tasks (http://maven.apache.org/ant-tasks/index.html). The important points to remember are:

  • Ensure that you have a reference to the Maven-Ant task and that the file maven-ant-tasks-*.jar is on the classpath.
    <!-- Creates a classpath element for the Maven-Ant Task
         (http://maven.apache.org/ant-tasks/index.html) -->
    <path id="maven.ant.class.path">
        <fileset dir="${maven.ant.lib.dir}">
            <include name="*.jar" />
        </fileset>
    </path>
    <typedef resource="org/apache/maven/artifact/ant/antlib.xml"
             uri="antlib:org.apache.maven.artifact.ant"
             classpathref="maven.ant.class.path" />
  • Create a classpath element that points to the dependencies within the pom.xml file created in step 2. This can then be used when compiling the code.

    <target name="initDependencies">
        <artifact:dependencies pathId="maven.dependency.classpath">
            <pom file="${project.dir}/pom.xml"/>
        </artifact:dependencies>
    </target>
  • Create a deploy target that refers to the pom.xml file from step 2 and the correct enterprise remote repository for deploying artifacts.

    <target name="mvnDeploy">
        <!-- Refer to the local pom file -->
        <artifact:pom id="projectPom" file="${project.dir}/pom.xml" />
        <!-- Defines the Remote Repository -->
        <artifact:remoteRepository id="inHouseRepo" url="${maven.deploy.repository.url}">
            <releases enabled="true"/>
            <snapshots enabled="false"/>
            <authentication username="${maven.deploy.repository.username}"
                            password="${maven.deploy.repository.password}"/>
        </artifact:remoteRepository>
        <artifact:install-provider artifactId="wagon-webdav" version="1.0-beta-1"/>
        <!-- Deploy the artifact using the new pom file to the Suncorp in house repository -->
        <artifact:deploy file="${project.dist.dir}/${project.name}.jar">
            <remoteRepository refid="inHouseRepo"/>
            <pom refid="projectPom"/>
        </artifact:deploy>
    </target>

These application modules will now be stored in your enterprise remote Maven repository, conveniently available for a Maven build.


4. Create a Maven Project

Depending on the architecture of your application you can either:

  • Create a new top level module as a Maven project
  • Convert the current top level module into a Maven project

The key points with this activity are:

  a. If you are converting, ensure that you follow the directory structure required by Maven (http://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html).

  b. Ensure that your new pom.xml contains all of the dependencies for the application. For the 3rd party libraries this should be a matter of digging through the various modules, finding the currently used 3rd party JAR files and adding entries to the pom.xml.

For application modules, create an entry for each module as follows:

    <dependency>
        <groupId>com.company</groupId>
        <artifactId>module-1</artifactId>
        <version>1.0</version>
    </dependency>


5. Test

  • Package the application via Maven (i.e. mvn clean package).
  • Inspect the resulting artefact.
  • Does it match the baseline artefact created successfully in step 1?
  • Does it run successfully?

6. Repeat

Now convert the next highest application module to a Maven project following the steps outlined above.