This article discusses the technique of Test Driven Development (TDD), which is
part of the Extreme Programming (XP) approach. TDD provides the benefit of
writing high quality code faster, with very few bugs. With TDD you spend more
time designing and developing the interesting stuff, and very little time
testing, debugging, and fixing. I share the opinion of others in the industry
that the best practice to come from XP is TDD.
Do these kinds of development ideals appeal to you:
No more repeat bugs.
Full regression test of your system takes less than a minute.
You spend the significant majority of your time developing new functionality
rather than debugging existing code.
Debugging time is reduced by at least a factor of 10.
Each test is only ever performed manually once.
If the answer is yes, then I suspect we have something in common. Like me, I
suspect you DETEST time-consuming, boring testing, debugging, and fixing. Read
on and I'll show you how to pretty much eliminate these activities.
What is Test Driven Development
Test Driven Development (TDD) was formerly known as Test First Programming. In
TDD you write tests, in code, before the code you are about to write. This
provides the following benefits:
You will spot design flaws sooner rather than later.
By defining tests first you get to test the design at a very granular level
before the production code is built.
You will discover very quickly whether you fully understand the requirements,
because if you cannot define the tests then the requirement is poorly defined.
You will discover further tests you hadn't previously thought of.
Testing time will be reduced by a factor of 10, and testing will be exhaustive every time.
The tests will form part of a long term test bed that is run continuously.
The high quality from automated testing will significantly improve your productivity.
It will be extremely simple to identify the cause of a bug, because it will be
in the code just written for the current test.
I'll talk in more detail about the benefits later.
Building The System One Test Case At A Time
With TDD the entire system is built one test case at a time.
Some people choose to do more upfront design than others. In the XP world they
do very little design and head straight to writing tests and code and follow up
with heavy refactoring. Personally I prefer to do some design up front, even if
it is for a maximum of an hour sketching stuff out on a whiteboard, taking a
digital picture and popping it onto a Wiki. In my experience of leading
teams I've always found designing at the keyboard to
be more expensive than at the whiteboard. What is important is to
know when to stop the whiteboard work and start coding. Experience will
tell you when you are wasting time designing and it is time to try
something out in code.
Regardless of how you eventually get to it, you will have chosen a test case
which you need to code. A test case consists of an initial condition (state of
the system before the event), the event that was raised, and the expected
outcome (the state of the system after the event). I find object snapshots
are a great way of modelling this information because you can write the
tests straight from the before and after diagram.
An example of a test case might be to add a customer. The initial condition
could be something like 'displaying the add customer screen'. The event might
be to enter their name and click the Add button. The expected outcome could be
that the customer maintenance screen is displayed, and that the new customer is
shown in the existing customers list, in alphabetical order.
A test case should only consist of some data entry and one action to
submit that entry. If it does more than that then you are trying to do too
much, and the test should be split up.
The Steps in a TDD cycle
A preliminary step is to get the latest version of the code and run all the
tests to ensure they pass.
1. Choose a test case.
2. Write the test. The test will not compile, since the code is not yet written.
3. Write the code stub. The test will now compile, but you will get a red light because the test should fail.
4. Write the code until all tests pass.
Let's look at an example.
Let's say you want to build an account object with a deposit
method that adheres to the following rule:
You can't deposit a zero or negative amount.
If you are familiar with Design By Contract (DBC) then you can see that we
could state the behaviour rules like this:
Pre : depositAmount > 0
Post : Account.Balance = Account.Balance@pre + depositAmount
Inv : Balance >= 0
Note also the use of the ‘@pre’ symbol. This means the value of the property
before the body of the method call was invoked.
So, that is what we want to build. Let's go through the steps.
Step 1: Choose a test case
Let's look at the test case where we deposit a positive amount.
There are other test cases you could write to check the rules. Personally I
would not write tests for those, since they check negative behaviour, and doing
so seriously explodes the number of tests you have to write. Instead I would
code them as pre-condition checks, post-condition checks, and invariant checks.
I cover this whole subject in my article Contractual
Test Driven Development - DBC with TDD.
Step 2: Write the test:
Set up an initial condition and write some test code against it, with some
assertions.
My test code, written using NUnit 2.0, would look something like this:
Note that, in order to be able to test the Deposit method on the Account object
I needed to have a Balance property. Thus, TDD has driven out a design
decision. I still haven't written the Account object yet, which is why the
editor shows blue wiggly lines indicating it will not compile.
Step 3: Write the code stub:
To get things compiling I create an account class that does nothing, but
adheres to the interface defined in the test:
[I have also added the 'using Domain;' line to the test class to get it to
compile.]
So, now when we run the test in NUnit we get a red light, because the assertion
fails.
Step 4: Write the code until all the tests pass:
All we do now is complete the code until the test passes:
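The completed C# listing is not shown either; a Python sketch (same assumed names, with the DBC pre-condition included as a runtime assert) might be:

```python
import unittest

# Sketch only (Python stand-in for the article's C#): deposit is
# completed until the test passes.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, deposit_amount):
        # Pre-condition from the DBC rules: no zero or negative deposits.
        assert deposit_amount > 0, "deposit amount must be positive"
        self.balance += deposit_amount

class AccountTest(unittest.TestCase):
    def test_deposit_increases_balance(self):
        account = Account()
        account.deposit(100)
        self.assertEqual(100, account.balance)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(AccountTest)
result = unittest.TextTestRunner().run(suite)
print(result.wasSuccessful())  # True: green light
```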
Now, when we run the test we get a green light and we know we have completed the
requirement.
The tests now pass. There are also some additional questions you could ask. For
example, can I withdraw money?! Well, if we think in terms of TDD,
the only requirement we have at the moment is to deposit an amount, and we have
achieved that, so we move on. That is the essence of TDD with XP. Build only
what you need now. If we need to enhance things later then we will do so. We
will also refactor to change the design when required, with the comfort of a
suite of tests that tell us if we have broken anything.
Writing Tests For Bugs
All systems have bugs that need ironing out. When you come across a bug, use
TDD and follow these steps to fix the problem:
Write a test that fails - a red light.
Fix the code until it passes and all the other tests still pass - a green
light.
That's it. You will never see the bug again. It will always be tested whenever
the suite of tests is run.
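As a sketch of this workflow in Python's unittest (the zero-deposit bug here is invented purely for illustration):

```python
import unittest

# Hypothetical bug: deposit used to accept a zero amount silently.
# The test below was written first and failed (red light) against the
# buggy code; the fix makes it pass, and it pins the bug forever after.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        if amount <= 0:  # the fix: reject zero and negative amounts
            raise ValueError("deposit amount must be positive")
        self.balance += amount

class DepositBugTest(unittest.TestCase):
    def test_zero_deposit_is_rejected(self):
        account = Account()
        self.assertRaises(ValueError, account.deposit, 0)
        self.assertEqual(0, account.balance)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DepositBugTest)
result = unittest.TextTestRunner().run(suite)
print(result.wasSuccessful())  # True: the bug can never silently return
```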
Practicalities of TDD
At the time of writing I have 4 years of experience coding with TDD, going back
to when it was labelled TFP, and I have also trained and mentored two teams in
the techniques. Here's some practical advice and comments based on that experience:
Convincing Developers To Use TDD:
The only way I've found to convert traditional developers to TDD developers is
to show them exactly how it works. Sit in front of their machine and pair
program some tests. Words do not work. TDD is too counter intuitive.
However, as soon as you show them they immediately see the benefits.
Of over 20 people I've personally trained not a single one has reverted back to
'the old ways' of testing using manual interactive test harnesses. This is
simply because once you get good at TDD, development is quicker. Why
would you want to slow yourself down?!
Time To Learn TDD
This is one area to be most careful about before you start heading in a TDD
direction. In my experience to get fully fluent with all the TDD
techniques takes a good developer about one year. Here's how I've tended to
progress students and how long it has taken.
On average I find it takes about 1 hour to show someone how to use a tool like
NUnit and get them writing tests. It then takes them about 3 months before
they can work on their own, writing their own tests effectively. This
is the key: any fool can write a test; the skill is in writing the correct
test, and the correct set of tests, at the right level of
granularity. Getting to that stage takes about 3 months minimum. Whilst this
seems a long time, I find this is about the average. Remember that this way of
developing is a complete change in mindset. It doesn't happen overnight.
Once the developer has progressed to the stage where they can write their own
tests, they can then proceed to the next level. This next level means writing
test code that is of as high a quality as the production code. That means
factoring out common code and removing duplication. It takes about 3 further
months to get to this stage.
So, we are 6 months down the line. What next? Well, we then move onto using
Mock Objects. I've not discussed Mock Objects in this paper, as I will devote
another article to it. Essentially Mock Objects is a technique that
enables the developer to test what they might consider the untestable. With
tests where encapsulation prevents you from being able to assert what you
want you can use Mock Objects to test objects from the inside
out. Mock Objects are also very good for testing edge cases, and for
reducing test code bloat. Getting good at Mock Objects takes a further 3 months.
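Purely as a flavour of the technique, here is a minimal sketch using Python's unittest.mock; the audit-log collaborator is invented for illustration:

```python
from unittest.mock import Mock

# Hypothetical collaborator: an audit log the account writes to on each
# deposit. A mock lets us assert the interaction without a real log.
class Account:
    def __init__(self, audit_log):
        self.balance = 0
        self.audit_log = audit_log

    def deposit(self, amount):
        self.balance += amount
        self.audit_log.record("deposit", amount)

log = Mock()
account = Account(log)
account.deposit(100)

# The mock captured the call, so we can test the object "inside out".
log.record.assert_called_once_with("deposit", 100)
print(account.balance)  # 100
```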
Finally, the last stage is being able to design for testability. This means that
when you start drawing your various UML diagrams to achieve some requirement
you think from a Test-Driven point of view. This is not really something
you can teach, except to just keep asking the developer the same question,
namely 'How can you test it?' How long it takes to become a good Test-Driven
designer depends entirely on the experience the developer has in design. If the
developer is already a good OO designer then I would suggest 3 months is
enough, provided they have done the previous 9 months. If the developer is
still an apprentice designer then they need to get over the design
hurdle first before they go down the Test Driven direction. I think it is
unlikely you will find anyone with less than 5-7 years of experience who would be
able to design good systems from a TDD viewpoint.
Common Pitfalls When Using TDD For the First Time
These tend to be the mistakes developers make when using TDD for the first
time:
Tests are far too weak. For example, testing a recordset came back
from a query, instead of testing the insides of the recordset.
Test code is a mess and becomes cumbersome to maintain. It is not given as much
respect as the production code.
Tests are too big and do too much. They need to be broken down.
Test code is duplicated across multiple tests. Some tests are either not
required, or set up code needs to be factored out.
Tests are too far reaching. Tests should only test one object, and
not make assertions about other objects' behaviour.
Of those mentioned above, the last one is by far the most common one I see. As
I've said before, any fool can write a test, but it is writing a good test that
is the key. A good test should only make assertions about the
responsibility of the object it is testing.
How Much Test Code Should There Be?
There is no golden rule. However, you should find that you end up with as
many lines of test code as you do production code. It is very useful to monitor
the ratio of test code to production code throughout development. It will
indicate where your design could be improved, or where the test code is
becoming too bloated.
Testing The User Interface
This is tricky and quite advanced stuff. The best bet is to use a Model View
Controller design pattern, combined with Mock Objects. In my article about
Extreme Web Architectures I discuss a technique which I used to
ensure all testing of a large web based system could be done in seconds.
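As a flavour of that approach, here is a minimal sketch in Python: a controller based on the earlier 'add customer' example is tested against a mock view, so no real screen is driven. The controller and view interface are invented for illustration:

```python
from unittest.mock import Mock

# Hypothetical controller: it updates the model and tells the view what
# to display, so the UI logic can be tested without a real UI.
class CustomerController:
    def __init__(self, view):
        self.view = view
        self.customers = []

    def add_customer(self, name):
        self.customers.append(name)
        self.customers.sort()  # existing customers list, alphabetical
        self.view.show_customer_list(self.customers)

view = Mock()
controller = CustomerController(view)
controller.add_customer("Smith")
controller.add_customer("Jones")

# Assert against the mock view instead of driving a real screen.
view.show_customer_list.assert_called_with(["Jones", "Smith"])
print(controller.customers)  # ['Jones', 'Smith']
```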
Total Test Time
In my opinion you must ensure the whole suite of tests can be run in under 1
minute, maximum. Personally I set a target of 30 seconds.
Tests must be quick to ensure they are run continuously. If they are slow then
the continuous integration process becomes less frequent and quality suffers as
developers fight to get the application running again. With slow tests the
Refactoring process becomes more time consuming, and it tends not to occur as
much, which results in increased code rot. Once quality suffers, productivity
suffers too.
The areas that become slow for testing are the front end, which my
article addresses, and back end data access, which techniques like Mock
Objects can address.
It is of paramount importance that the tests run fast.
Benefits of Test-Driven Development
There are a wealth of benefits to using TDD.
Captain Courageous. Let's say you have a new requirement that gets
requested right before the release date. You have to try to get it in. Which
situation would you like to be in?
A) Whatever changes you make, you can test the effects within seconds, or
B) You make the changes, but don't have time to test, so you release it hoping it
works.
Of course, you want to be in camp A. But if you don't have the tests, then you
may well decide that the code is too fragile to even attempt the change. The
message here is that with automated tests, we don't have to worry about code
being fragile. We can try out some stuff we think might work, and test it in
seconds. We can be more courageous with our coding, and not have to code
wrapped in cotton wool.
Development is quicker.
As I've stated before, not a single developer I've trained on
TDD has ever gone back.
Lengthy regression testing is no more.
After you’ve built the test code the first time, you end up with a regression
test harness as well. On your next iteration when you perhaps enhance or extend
the code, you add more tests and run the lot to make sure nothing has broken
which was okay before. The tests run in seconds, not minutes or hours. No more
lengthy manual testing.
Every time you run the tests, it tests everything every single time. You do not
have to worry about forgetting to do certain tests. With manual testing, which
we all hate, we rarely do all the tests because we can’t be bothered.
With manual testing, the list of tests is more often than not in the developer's
head. If someone else has to test their code, then they would not know where to
start. With automated tests, all the tests are there for everyone to use.
Manual tests with buttons called Command1, Command2 etc. are very difficult for
other people to use. I'd go as far as to say that they are pointless. With
automated tests, you simply press one button. It is a standard mechanism, and
everyone knows how to run the tests.
This article has been a simple introduction to Test Driven Development. It is
an alien concept to most developers, and it is something that will take a while to
master. For it to be effective you need to be writing component based systems
that can be broken down into manageable testable chunks.
Extreme Programmers use automated tests continuously. It is at the core of the
whole approach. The idea is to continuously test. Not just unit tests, but
integration tests, and acceptance tests.
Good luck with these methods. Please feel free to email me if you need some
advice. I also like to hear from people who try this and find it as useful as I
do.