testing, testing, testing

In his speech to the Labour Party conference back in October 1996, party leader Tony Blair said: “Ask me my three main priorities for government, and I tell you: education, education, education.”

Ask me my three main priorities for software development, and I tell you: “testing, testing, testing.”

Back in the day, I tended to use JUnit as a quick-and-dirty way of doing a proof of concept, testing that some Java method worked, or experimenting with libraries I hadn’t used before. That all changed when I read Practical Unit Testing with JUnit and Mockito by Tomek Kaczanowski.

On page one of his book, he quotes Michael Feathers, author of Working Effectively with Legacy Code (2004):

Code without tests is bad code. It doesn’t matter how well written it is; it doesn’t matter how pretty or object-oriented or well-encapsulated it is. With tests, we can change the behaviour of our code quickly and verifiably (sic). Without them, we really don’t know if our code is getting better or worse.
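To give a flavour of the style Kaczanowski teaches, here’s a minimal sketch of a JUnit 4 test using a Mockito mock. The PriceService and ExchangeRateProvider names are hypothetical, invented purely for illustration:

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import org.junit.Test;

    public class PriceServiceTest {

        // Hypothetical collaborator: looks up an exchange rate for a currency code.
        interface ExchangeRateProvider {
            double rateFor(String currencyCode);
        }

        // Hypothetical class under test: converts a GBP price using the provider.
        static class PriceService {
            private final ExchangeRateProvider rates;

            PriceService(ExchangeRateProvider rates) {
                this.rates = rates;
            }

            double inCurrency(double gbpPrice, String currencyCode) {
                return gbpPrice * rates.rateFor(currencyCode);
            }
        }

        @Test
        public void convertsPriceUsingMockedExchangeRate() {
            // Mock the collaborator so the test is fast and deterministic:
            // no database, no network, just the logic under test.
            ExchangeRateProvider rates = mock(ExchangeRateProvider.class);
            when(rates.rateFor("EUR")).thenReturn(1.25);

            PriceService service = new PriceService(rates);

            assertEquals(12.50, service.inCurrency(10.00, "EUR"), 0.001);
        }
    }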

In the spring of 2002, I had a contract with the English Football Association, working on their website for the FIFA World Cup taking place in Japan and South Korea. I was writing the server-side code that accessed the database and returned the data as objects to the BEA WebLogic Portal they were using. We appeared to operate in a blame culture: if any part of the website didn’t work, you blamed the other team. So the UI team were constantly claiming the fault lay with the server-side people whenever the umpteenth article about David Beckham or Sven didn’t display.

I soon got tired of this and decided to implement my own “testing framework” (I use the word loosely). I was using JBuilder as my IDE at the time, and JUnit integration only appeared in Eclipse in November 2002.

So I wrote a series of JavaServer Pages (JSPs), one for every TLD (Tag Library Descriptor), each calling a method on my backing bean, which in turn called my EJB’s method and rendered the result in the browser. It didn’t look pretty, but it proved the point, and I was no longer on the receiving end of the UI team’s pointing fingers.
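From memory, each page looked something like the sketch below. The bean, package and method names are invented for illustration; the originals are long gone:

    <%@ page import="java.util.List" %>
    <%@ page import="com.thefa.worldcup.NewsArticleBean" %>
    <html>
    <head><title>Smoke test: latest news articles</title></head>
    <body>
    <%
        // Hypothetical backing bean standing in for the original; it wrapped
        // the EJB call that fetched article data from the database.
        NewsArticleBean bean = new NewsArticleBean();
        List titles = bean.getLatestArticleTitles();
        out.println("<h1>" + titles.size() + " articles found</h1>");
        for (int i = 0; i < titles.size(); i++) {
            // If the EJB or database call failed, a stack trace appeared here
            // instead of article titles -- instant proof of where the fault lay.
            out.println("<p>" + titles.get(i) + "</p>");
        }
    %>
    </body>
    </html>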

Whenever they tweaked the database or added new content, I’d load my dozens of JSP pages in the browser and check that my ‘legacy’ code was still working.

By 2003 I’d switched to Eclipse and was running unit tests (JUnit 3), though I wasn’t big on assertions.
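For anyone who never saw JUnit 3: test classes extended TestCase, and any public method whose name started with “test” was picked up by the runner automatically. A minimal sketch, with an invented class under test inlined so it compiles, showing the assertions I should have been writing:

    import java.util.ArrayList;
    import java.util.List;

    import junit.framework.TestCase;

    public class BasketTest extends TestCase {

        // Hypothetical class under test, inlined to keep the sketch self-contained.
        static class Basket {
            private final List items = new ArrayList();
            void add(Object item) { items.add(item); }
            int itemCount() { return items.size(); }
        }

        public void testNewBasketIsEmpty() {
            assertEquals(0, new Basket().itemCount());
        }

        public void testAddingAnItemIncreasesTheCount() {
            Basket basket = new Basket();
            basket.add("cushion");
            assertEquals(1, basket.itemCount());
        }
    }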

It wasn’t until the mid-noughties, when I went for an interview at BP in London, that I realised the power of test-driven development (TDD) and how much I was missing out. The first thing said to me as I entered the room, shouted at me by the interviewer, in fact, was: “Are you test aware?” I thought he was on some evangelical mission and, frankly, a bit weird. The shocked look on my face said it all: I blatantly wasn’t. I didn’t get the contract.

Reflecting on the interview on the train home, I thought maybe there was something in test-driven development; it was the first time I’d come across a company using it on that scale.

It wasn’t until a short piece of ‘consultancy’ at the home furnishings retailer Dunelm that I saw the real-world success of TDD. The work itself involved running static code analysis tools over Java code written by a third party they had engaged, and writing reports to accompany the tools’ output.

Dunelm released a new version of the website into production nearly every day, sometimes multiple times. They were big users of continuous integration with Jenkins. Their Jenkins job first checked out the latest release of the code base from Git, then ran various static code analysis tools, followed by the hundreds of JUnit tests. The progress of these tests could be monitored by everyone in the department, displayed in real time on a huge wall-mounted 100+ inch screen at one end of the office. The progress bar gradually changed from amber to green as each test passed. If one failed, the whole bar went red, they rolled back to the previous master branch that had passed every test, and deployment of the new release was abandoned.

I saw the flip side a couple of years ago at another major online B2B retailer, which was attempting to upgrade its entire website from JBoss 4.0 (released in 2004) to JBoss WildFly 10.0 (released in 2016). They were hoping to move through twelve years of innovation and major changes to the new (JBoss) WildFly architecture in three months: not gradually, but with the ‘big bang’ approach.

As Eclipse was the de facto standard IDE at this place, they could have used its JUnit integration and been writing unit tests since the original J2EE project started in 2004. From the Confluence documentation retrospectively added from that period, and from talking to their architects, it was clear they had decided it was too much effort, that it would “slow development down”, and that “any bugs would be caught in system and user acceptance testing”. While I was there, a senior developer was assigned the task of investigating the writing of retrospective unit tests. I think she found the sheer number required overwhelming, and though there were lots of meetings to discuss it, nothing was decided. She went off on maternity leave, and I left the month she came back.

The (very optimistic) estimated three-month migration would involve almost the entire development department. During the ‘three months’, everything in all the Sprint backlogs was frozen and no new stories or tasks were added. The live website was not to be updated, no matter how important or serious the issue. As the sheer size of the task became apparent, no one was allowed to communicate with any of the upgrade teams, and they were all moved from their recently refurbished, spacious open-plan offices to an old office with no outside windows, attached to a warehouse. They couldn’t come out until it was completed!

The migration was complicated. They had planned to move through each iteration of the application server, upgrading all third-party libraries, the JVM, the version of Oracle and so on for each major JBoss release number. Only when they had tested it would they move on to the next version, until they finally arrived at WildFly 10 running on Java EE 7, with the latest release of Oracle and every external third-party library they used upgraded as well.

Testing involved sitting down at two PCs, one with the current live version of the website and the other with the test release. The testers would then perform the same product searches or purchasing sequences to check that both behaved the same, while a DBA queried the Oracle back end to confirm that the same data had been persisted. Laborious, not exactly scientific, and open to human error.
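Much of that comparison could have been automated. As a rough sketch (the host names and search path here are invented, and it assumes both environments are reachable over plain HTTP), a JUnit test could fetch the same page from each stack and compare the responses:

    import static org.junit.Assert.assertEquals;

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;

    public class LiveVersusCandidateTest {

        // Hypothetical hosts: the live site and the upgrade candidate.
        private static final String LIVE = "http://live.example.com";
        private static final String CANDIDATE = "http://candidate.example.com";

        @Test
        public void productSearchReturnsSameBodyOnBothStacks() throws Exception {
            String path = "/search?q=widget";   // invented example query
            assertEquals(fetch(LIVE + path), fetch(CANDIDATE + path));
        }

        // Fetch the response body of a URL as a single string.
        private static String fetch(String url) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            StringBuilder body = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line).append('\n');
                }
            } finally {
                conn.disconnect();
            }
            return body.toString();
        }
    }

In practice you would compare extracted data rather than raw HTML, since session tokens and timestamps legitimately differ between the two responses, but even that beats eyeballing two monitors.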

The person-power and time invested were huge, the schedule overran by many months, and bugs were still cropping up a year later. There’s no doubt that if they had started writing those unit tests before the migration commenced, and continued writing them in tandem, the whole process would have been a lot quicker and smoother, and the outcome more positive. It would also have given them a huge collection of unit tests to run on the next and future upgrades. They could also have incorporated all these tests into their already-in-place Jenkins environment, which they used solely for their very complex Ant build process.

My experience boils down to this: on a greenfield project, implement test-driven development; on an existing long-term project, retrofit unit tests.
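Retrofitting usually starts with what Feathers calls characterisation tests: pinning down what the code actually does today, right or wrong, before you change anything. A minimal sketch, with an invented legacy method inlined so it compiles:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class DeliveryChargeCharacterisationTest {

        // Hypothetical legacy method, inlined here so the sketch is self-contained.
        // In real retrofitting it would stay untouched in the legacy code base.
        static int deliveryChargePence(int orderTotalPence) {
            if (orderTotalPence >= 5000) {
                return 0;       // free delivery over fifty pounds
            }
            return 495;
        }

        // Characterisation tests: the expected values record what the code
        // does today, so any change in behaviour is caught immediately.
        @Test
        public void ordersOfFiftyPoundsOrMoreShipFree() {
            assertEquals(0, deliveryChargePence(5000));
        }

        @Test
        public void ordersUnderFiftyPoundsPayStandardCharge() {
            assertEquals(495, deliveryChargePence(4999));
        }
    }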

Tony Blair was in power for ten years; I doubt many of the websites I’ve worked on in the past will be around for ten years. When those new ‘cool’ features are added or an existing bug is fixed, the site will crash and be down for days if not weeks (TSB, May 2018), and consumers will go elsewhere.

One way to stop this is testing, testing, testing. Preferably automated, with a 150+ inch OLED 4K TV at one end of the office. At least the TV tech will have been upgraded, tested and working.
