legacy code

BRIEFING: Delivering Better Software (Tests as a Communication Tool)

Posted on

Come and join us for drinks, socialising and a special presentation.
This will be a really informal session, with plenty of opportunities to ask questions and interact. If that doesn’t sell it to you, how about the FREE DRINKS??

The Talk

Completing the circle: Automated web tests, ATDD and Acceptance Tests as a team communication tool

Acceptance Test Driven Development, or ATDD, has proven to be a very effective technique, both for driving and guiding development and for enhancing communication between developers and other project stakeholders. But why stop there? Well-designed Acceptance Tests can also act as a formidable documentation source and communication tool. Indeed, when written in a narrative, BDD-type style, Acceptance Tests have the potential to document in detail how the user interacts with the application.
In this talk we will look at the role of automated Acceptance Tests not only for testing, but also as part of the whole development lifecycle, from writing the user stories right through to deploying the application. We will also look at ways to make your automated acceptance tests more expressive and how to use them more effectively as a communication, reporting and documentation tool.
Finally, we will present and demonstrate a new open source library that helps developers and testers write automated acceptance tests for web applications using WebDriver/Selenium 2. This library also produces clean, narrative-style reports illustrated with screenshots that effectively describe the application’s functionality and behaviour, as well as any regressions or pending features.

The Speaker

CEO of Wakaleo Consulting, John is an experienced consultant and trainer specialising in Enterprise Java, Web Development, and Open Source technologies. John is well known in the Java community for his many published articles, and as the author of Java Power Tools and Jenkins: The Definitive Guide.
John helps organisations around the world to improve their Java development processes and infrastructures and provides training and mentoring in open source technologies, Test Driven Development (TDD, BDD and ATDD), Automated Web Testing, SDLC tools, and agile development processes in general.

The Begging

A fascinating subject that should give you some great ideas and techniques to take back to your team.
This is our first joint event and we’d really appreciate your support. We’ve booked a big room and need to fill it! PLEASE BRING YOUR FRIENDS…

When

Thursday, June 23, 2011 from 5:00 PM – 8:00 PM (GMT+1000)

Where

Combined Services Club (upstairs)
5-7 Barrack Street
(Cnr of Clarence, next to Officeworks)
Sydney, New South Wales 2000
Australia

Registration

Complete the free registration at: http://bettersoftwarebriefing.eventbrite.com

Coding standards harmony

Posted on

Coding standards

Most mature software development companies or departments define their coding standards.  The intention is simple: ensure all code looks alike to ease reading, writing, maintaining and communicating code.  As a first effort, these coding conventions may be expressed in some form of standalone document, but conventions that are not enforced are simply a waste of time.  In the Java world, various tools have existed for some time now to help us enforce and adhere to coding standards: Checkstyle, Eclipse and Sonar.  Until fairly recently, it has been laborious to make those tools work together to achieve code consistency.  Thankfully, as these tools have matured, it is now possible to define and enforce coding standards effortlessly, and the synergy between them may even be surprising.

Recap

Let's quickly state the purpose of each tool before we move on.

Checkstyle

Checkstyle is a development tool to help programmers write Java code that adheres to a coding standard. It automates the process of checking Java code to spare humans of this boring (but important) task. This makes it ideal for projects that want to enforce a coding standard.
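
By way of illustration, a minimal Checkstyle rules file might look like the sketch below. The module names follow Checkstyle 5.x conventions, and the rules themselves are only examples; your team's standards would replace them.

    <?xml version="1.0"?>
    <!DOCTYPE module PUBLIC
        "-//Puppy Crawl//DTD Check Configuration 1.2//EN"
        "http://www.puppycrawl.com/dtds/configuration_1_2.dtd">
    <!-- Example coding standard: one rule set shared by the build, the IDE and Sonar -->
    <module name="Checker">
        <!-- Checks that operate on individual Java source files -->
        <module name="TreeWalker">
            <module name="UnusedImports"/>      <!-- no unused imports -->
            <module name="FinalParameters"/>    <!-- method parameters declared final -->
            <module name="ConstantName"/>       <!-- constants in UPPER_CASE -->
            <module name="LineLength">
                <property name="max" value="120"/>
            </module>
        </module>
    </module>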

Eclipse formatter

The Eclipse formatter is a set of rules that defines how code will be formatted.

Eclipse clean up

The clean up utility helps to apply formatting rules and coding conventions to a single file or to a set of files in one go.

Eclipse save actions

Save actions are similar to clean up and they define what should happen to the code when a file is saved.  For example, save actions can ensure code is formatted, unused imports are removed and arguments are set to “final” right before the file is saved.

Sonar

Sonar is an open platform to manage code quality.

A common situation

It is quite common to define coding standards using Checkstyle and include it as part of a project.  The Eclipse formatter, clean up and save actions would then be configured manually to match the Checkstyle rules.  In addition, Checkstyle would run as part of the build to publish the code violations report to a file or to Sonar.  Some better-integrated teams would also use the Checkstyle Eclipse plugin in order to see the violations directly in their code; as the code changed to adhere to the standards, the plugin would reflect that.
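
For reference, running Checkstyle as part of a Maven build might look something like the following sketch; the plugin version and the configLocation are illustrative, and the configLocation could equally be an HTTP URL shared between teams.

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-checkstyle-plugin</artifactId>
                <version>2.6</version>
                <configuration>
                    <!-- the shared coding standard; could also be an http:// URL -->
                    <configLocation>build-tools/checkstyle.xml</configLocation>
                </configuration>
                <executions>
                    <execution>
                        <goals>
                            <!-- fails the build when violations are found -->
                            <goal>check</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>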

Shortcomings

The common situation outlined above is a decent setup, but it has some shortcomings.  If the coding standard rules change, that sends a ripple through all the tools.  The Eclipse formatter needs to reflect the new rules, as do the clean up and the save actions.  Furthermore, Sonar needs to be updated with the new rules.  In addition, sharing the Checkstyle file between projects and teams can become a chore.  There are ways to define a remote coding standard file shared between teams, but that does not address the lack of synchronisation between all the tools … until recently.

Harmony

Checkstyle, Sonar and Eclipse have been around for a long time, and as these tools have matured they have developed great integration with one another.  By aligning them it is possible to establish one central coding standard rule set and reflect it in the development environment automatically.  Furthermore, once configured, changes to the coding standards are propagated automatically, so developers are always informed about the up-to-date coding standards and apply them as they code.

Example

Let's look at an example of how best to use Checkstyle, Eclipse and Sonar together.  To give the example more relevance, let's start with an existing “legacy” project where coding standards have not necessarily been respected.

Assumptions

The example assumes the following:

  • Java project
  • Maven build
  • Checkstyle file expressing the coding standards
  • Sonar
  • Eclipse
  • Eclipse Checkstyle plugin
  • Eclipse Sonar plugin

Initial coding standard report

We’ll start from the point where an initial Checkstyle configuration has been uploaded to Sonar and a Sonar report has been produced for our existing project.
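
As a side note, producing that report is typically just a matter of running mvn clean install sonar:sonar against the project.  If the Sonar server is not on localhost with its default settings, a profile along these lines in Maven's settings.xml points the build at the right server (a sketch only; sonar.company.com is a placeholder, and older Sonar versions may also need the sonar.jdbc.* properties here):

    <settings>
        <profiles>
            <profile>
                <id>sonar</id>
                <activation>
                    <activeByDefault>true</activeByDefault>
                </activation>
                <properties>
                    <!-- location of the Sonar server used by mvn sonar:sonar -->
                    <sonar.host.url>http://sonar.company.com:9000</sonar.host.url>
                </properties>
            </profile>
        </profiles>
    </settings>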

View and reduce violations in the IDE

Next, we’ll configure Eclipse to see those violations closer to the code.  In order to do so, we’ll need to configure the Eclipse Checkstyle plugin with the same rules as Sonar and apply the configuration to the projects.

  1. Get a link to the Checkstyle configuration (in Sonar, the quality profile's Permalinks section provides a link to the Checkstyle configuration)
  2. Reference the Checkstyle rules in Eclipse (Window > Preferences > Checkstyle > New)
  3. Configure Checkstyle for a project (right click project > Properties > Checkstyle).  Please note the “Write formatter/cleanup config” checkbox.  This is the part that synchronises the coding standards with the Eclipse formatter and clean up.  You can also right click on your project > Checkstyle > Create Formatter-Profile to achieve the same thing.  This kind of synchronisation alleviates the painful manual synchronisation between Checkstyle and Eclipse; brilliant!
  4. Once Checkstyle has been configured and enabled for a project, notice that violations are annotated alongside the code
  5. Now that Eclipse has been configured with the coding standard rules and the formatting profiles have been updated, we can bulk clean up existing code and go a long way towards ensuring it adheres to standards.  After pressing Next > (to review the upcoming changes) or Finish, Eclipse will do what it can to bring the code into line with the standards.
  6. After republishing a coding standard report to Sonar, we can see a reduction in violations

Save actions

Once the Eclipse formatter and clean up profiles have been updated, don’t forget to update save actions so that as many coding standards are automatically applied as soon as possible (before every save).

Eclipse Sonar integration

Similar to the Checkstyle Eclipse plugin, there is a Sonar Eclipse plugin that will annotate code with the violations as seen in Sonar.  In addition to showing Checkstyle violations, the Sonar Eclipse plugin will show Findbugs and PMD violations (all the static code analysis tools configured in Sonar).  The integration is quite simple.

  1. Install the Sonar Eclipse plugin
  2. Identify your Sonar installation
  3. Associate your Eclipse projects with their Sonar equivalents (note that your project has to have at least one Sonar report published)
  4. Once the configuration is complete, you can see the violations as published to Sonar annotated in your code
  5. Please note that the code is annotated with the violations as they were found in Sonar.  If the code changes, those annotations will remain and drift out of sync.  Alternatively, you can rerun the checks locally and refresh the violations view.

The usefulness of the Checkstyle Eclipse plugin

Although the Sonar Eclipse plugin may make the Checkstyle Eclipse plugin look superfluous, remember that it is the latter that updates the Eclipse formatting rules as well as the clean up profiles.  Unless and until the Sonar Eclipse plugin fulfils the same duty, the Checkstyle Eclipse plugin remains very useful.

Not everything can be automated

Please note that although a lot of coding standards can be applied retroactively and automatically, some violations cannot be automatically eradicated.  Nonetheless, Checkstyle, Eclipse and Sonar can identify the problematic code and guide developers towards coding standard compliance.

Conclusion

Coding standards are a preoccupation for most software development teams.  Defining coding standards is one thing, but enforcing them effectively is another.  Thankfully, as Checkstyle, Eclipse and Sonar have matured, defining and enforcing coding standards can be a straightforward and sustainable activity.

Ratcheting up code coverage with Sonar

Posted on

At Centrum we are often brought in by organisations that want to improve the quality of their software deliverables and remove some of the unwanted “excitement” from the delivery process.  We love engagements like this because it means that the client understands that there is a cost to neglecting to focus on quality and that they are open to changing processes and tools to move forward and start paying off that technical debt.

Unit test coverage – the easy bit…

A drive for change often starts with a new “greenfield” project being chosen and high unit test coverage being encouraged (or enforced), perhaps alongside practices such as TDD.  The benefits can be seen by the team involved in the project and the message is taken on board by management.  Unit testing has been deemed to be “a good thing”.

So now for the legacy code…right?

So now the organisation or team has bought into the benefits of having a good level of unit test coverage and wants to roll it out to all their projects.  However, the problem seems insurmountable.  The code analysis shows that your current coverage is at < 2%.  How do you get up to your target?  Often the response is to only enforce coverage on the new projects that were built with high coverage enforced from day one.  This can mean that you are actually enforcing standards on a tiny proportion of your organisation's code.  Another option is of course to invest in writing test cases for the legacy code.  However, this investment is rarely made, nor is it necessarily recommended.  Test cases are most valuable when written before or at the time that the code is written.

The third way: ratcheting up coverage

What we often recommend when we hit the situation outlined above is to take a continual improvement approach.  Find ways to gradually improve the quality of your code and build momentum.  Find some metrics that can show a positive view of the improvements being made; don't simply compare your legacy project's 2% coverage with your greenfield project's 80%.  The 80% is an impossible short-term target and actually acts as a disincentive to improvement.

Sonar now reports coverage of recent changes

Sonar has just introduced functionality to show the coverage on recent changes.  This allows you to enforce coverage on every line of code added or changed during a project, and over time your overall coverage will get there.  It also has the effect of introducing tests for the parts of your code base that change most frequently, which is where they deliver the most value.

Sonar Dashboard

What is also pretty neat is the ability to show the source code marked up to highlight only the untested code, and only for the period that you are interested in.  This gives developers the feedback they need to write tests that cover changed code.

Filtered code coverage

Footnote:  Sonar for the uninitiated

Sonar is an open source quality platform.  It collates and mines data from a variety of code analysis tools, as well as its own built-in ones, to give you a holistic view of the quality of your software.  The “seven axes of code quality” as described by Sonar are: Architecture & Design, Duplications, Unit Tests, Complexity, Potential Bugs, Rules, and Formatting & Comments (Documentation).

Gradually increasing code coverage in untested projects

Posted on

Untested code
There are many reasons and excuses why some applications are untested by automated tests, or at least not well tested.  It could be an older application, the application might have been hard to test, or the people writing it simply did not have the habit of writing automated tests.  Having said that, most of us either inherited or created an untested mess at one point or another.  This article explores techniques for increasing code coverage in such projects.

Coverage targets
Say we set a goal of 80% test coverage and fail builds that don't meet that threshold.  On a new project it is quite feasible to meet this goal with every check-in, so it is much easier to set and enforce this kind of code coverage target on new projects than on older, untested ones.  It would be very disruptive to expect and enforce 80% code coverage straight away on a project that is only 5% tested.  A more gradual approach is necessary.

Moving forward
Revisiting the case of an older untested project, let's see how we can work towards gradually increasing code coverage.  When working with an existing project, the only thing developers can realistically commit to is that new code will be tested.  Luckily, this commitment can also be enforced by using tools like Clover and its history threshold.  The history threshold can be used to enforce that coverage is not decreasing, meaning that if new code is added to the application, it needs to be covered by tests.  This practice can help build a culture of automated testing while increasing code coverage for the application.  Eventually, if efforts are made to increase code coverage for the rest of the application, an absolute threshold can be put in place to ensure that coverage does not fall below a certain acceptable level.
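
As a rough sketch only (goal and parameter names should be checked against your Clover version's documentation; the version and historyDir below are illustrative assumptions), the Maven Clover plugin can record a coverage “history point” on each build and check subsequent builds against it:

    <plugin>
        <groupId>com.atlassian.maven.plugins</groupId>
        <artifactId>maven-clover2-plugin</artifactId>
        <version>3.0.5</version>
        <configuration>
            <!-- where coverage history points are kept between builds -->
            <historyDir>${user.home}/.clover/history/${project.artifactId}</historyDir>
            <!-- once coverage has caught up, an absolute floor can be added, e.g.
                 <targetPercentage>80%</targetPercentage> -->
        </configuration>
    </plugin>

A CI job might then run something along the lines of mvn clover2:setup test clover2:check clover2:save-history, the idea being that each successful build records a new history point and the next build's coverage is compared against it.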

Testing can be disruptive
While we are fervent practitioners of automated testing and test driven development, we do recognise that in some situations creating tests for each line of code can be a bit disruptive.  Naturally, throwaway proofs of concept don't need the same strict level of code coverage as applications that will need maintenance.  Furthermore, it is not trivial to work with and test code using frameworks, tools or languages unfamiliar to the team.  There are also frameworks that are not easily testable.  But as a team learns how to write automated tests effectively, the technique described above can be applied to make up for the initial looseness.

Wrapping up
Finally, while all code should be checked by automated tests, we find that this is not always the case.  Many reasons lead to untested code, but there are tools and techniques like Clover’s history threshold that can put a project back on track in a controlled and steady fashion.  When a situation is bad, we can at least ensure that we are making it better, rather than adding to the problem.

Implementing Maven in a Legacy Code Base

Posted on

It’s your first day with a new company or client. The usual things happen; you get introduced to everyone, shown where the toilets and fire stairs are located, pointed towards your desk, allocated a PC and log in to the corporate network. Everything is going fine: you log in successfully, email works and you start to configure your PC to actually get some work done.

You install the Java IDE (in my case Eclipse) and get the URL for the source code repository. You check out the many modules that make up the application and start trawling through the code looking for hints on how the modules are built. pom.xml… nowhere to be seen. Hmm, build.xml… ah, there you are. OK, run the Ant build: FAIL, class not found, class not found…

You look to the IDE for guidance: red crosses everywhere, same problem; there are multiple dependencies that are not in place (as far as the IDE is concerned). You figure you’ll discuss this with someone who is more familiar with the code base and application structure. What you find is that the application and module structure is tightly bound to that person’s IDE and is hidden within IDE metadata files. Worse still, these metadata files are actually checked into the source code repository, complete with hard-coded paths to a particular person’s PC.

Sound familiar? Dealing with a legacy code base that has grown over the years can be very difficult. In some instances the knowledge of the application sits with one or two key staff members. Those people may have architected the application on the run and not followed industry standards. They may no longer be with the company. You’re left to work it all out; where do you start?

If you have a legacy code base that has multiple dependencies and is currently built via Ant, you can implement Maven within this code base. Here are some tips that may prove helpful.

1. Baseline

  • Get the application and all of its modules to a known working state.
  • Ensure that you understand the dependencies between modules and to 3rd party libraries.
  • Ensure that you can successfully build the application via the Ant build scripts.
  • Ensure that you can successfully deploy and run the application.

2. Create POM files for each Application Module

Create a pom.xml file for each of the dependent application modules (a minimal sketch follows the list below). Ensure that it contains:

  • A <parent> section to provide the details of the parent POM.
  • A <repository> section to provide the details of the enterprise remote repository.
  • A <dependency> section to provide the details of the installer provider. This will be required for step 3 (in my case I used wagon-webdav).
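
For illustration only, a minimal module pom.xml might look something like this; the group, artifact and repository details are placeholders, not real coordinates.

    <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>

        <!-- details of the parent POM -->
        <parent>
            <groupId>com.company</groupId>
            <artifactId>legacy-app-parent</artifactId>
            <version>1.0</version>
        </parent>

        <artifactId>module-1</artifactId>
        <packaging>jar</packaging>

        <!-- the enterprise remote repository -->
        <repositories>
            <repository>
                <id>inHouseRepo</id>
                <url>http://repo.company.com/maven/releases</url>
            </repository>
        </repositories>

        <!-- the install provider used when deploying from the Ant build (see step 3) -->
        <dependencies>
            <dependency>
                <groupId>org.apache.maven.wagon</groupId>
                <artifactId>wagon-webdav</artifactId>
                <version>1.0-beta-1</version>
            </dependency>
        </dependencies>
    </project>
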
3. Deploy Application Modules to the Enterprise Remote Maven Repository

Modify the Ant build scripts of each of the dependent application modules to deploy the resulting artefact to the enterprise remote Maven repository. This can be achieved by using the Maven Ant Tasks (http://maven.apache.org/ant-tasks/index.html). The important points to remember are:

  • Ensure that you have a reference to the Maven-Ant task and that the file maven-ant-tasks-*.jar is on the classpath.
    <!-- Creates a classpath element for the Maven-Ant Task (http://maven.apache.org/ant-tasks/index.html) -->
    <path id="maven.ant.class.path">
        <fileset dir="${maven.ant.lib.dir}">
            <include name="*.jar" />
        </fileset>
    </path>
    <typedef resource="org/apache/maven/artifact/ant/antlib.xml"
        uri="antlib:org.apache.maven.artifact.ant"
        classpathref="maven.ant.class.path" />
  • Create a classpath element that points to the dependencies within the pom.xml file created in step 2. This can then be used when compiling the code.

    <target name="initDependencies">
        <artifact:dependencies pathId="maven.dependency.classpath">
            <pom file="${project.dir}/pom.xml"/>
        </artifact:dependencies>
    </target>
  • Create a deploy target that refers to the pom.xml file from step 2 and the correct enterprise remote repository for deploying artifacts.

    <target name="mvnDeploy">
        <!-- Refer to the local pom file -->
        <artifact:pom id="projectPom" file="${project.dir}/pom.xml" />
        <!-- Defines the remote repository -->
        <artifact:remoteRepository id="inHouseRepo" url="${maven.deploy.repository.url}">
            <releases enabled="true"/>
            <snapshots enabled="false"/>
            <authentication username="${maven.deploy.repository.username}" password="${maven.deploy.repository.password}"/>
        </artifact:remoteRepository>
        <artifact:install-provider artifactId="wagon-webdav" version="1.0-beta-1"/>
        <!-- Deploy the artifact using the new pom file to the enterprise in-house repository -->
        <artifact:deploy file="${project.dist.dir}/${project.name}.jar">
            <remoteRepository refid="inHouseRepo"/>
            <pom refid="projectPom"/>
        </artifact:deploy>
    </target>

These application modules will now be stored in your enterprise remote Maven repository, conveniently available for a Maven build.


4. Create a Maven Project

Depending on the architecture of your application you can either:

  • Create a new top level module as a Maven project
  • Convert the current top level module into a Maven project

The key points with this activity are:

a. If you are converting, ensure that you follow the directory structure required by Maven (e.g. production code under src/main/java, resources under src/main/resources and tests under src/test/java); see http://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html

b. Ensure that your new pom.xml contains all of the dependencies for the application. For the 3rd party libraries this should be a matter of digging through the various modules, finding the currently used 3rd party JAR files and adding entries to the pom.xml.

For the application modules, create an entry for each module as follows:

    <dependency>
        <groupId>com.company</groupId>
        <artifactId>module-1</artifactId>
        <version>1.0</version>
    </dependency>


5. Test

  • Package the application via Maven (i.e. mvn clean package).
  • Inspect the resulting artefact.
  • Does it match the baseline artefact created successfully in step 1?
  • Does it run successfully?

6. Repeat

Now convert the next highest application module to a Maven project, following the steps outlined above.