The ideas presented here are nothing new; however, a recent discussion with Dror Helper made me want to state my views on this subject (and make the whole world agree with me, whatever it takes), because, to be honest, despite my best efforts in meditation, when I see a class called "MyClassTester" it makes me... want to start a holy war. Let's just say it makes me nervous.
The idea is, if you do TDD, you don't start with a class, because if you do, you already have some design before you have written any tests. So, you start with a user story. And let it be a Web app, to be more concrete. Here it goes:
When a user fills out the registration form correctly and hits the "Register" button, several things happen:
- she becomes, in some sense, "registered" in the system;
- an email is sent, containing the confirmation link;
- she's redirected to a particular screen.
Now, if I were to start writing unit tests for a particular class, which class would it be? The specs say nothing about classes or such. So, first, I decide that it's going to be an MVC app, like all the cool guys are doing these days. And I'm going to start with the most exciting thing: my controller. So, I'm creating a class called AccountControllerTester or something. Next, I have to write a test, so I meditate over the requirements, and here's what I think (there's a sketch of where this leads after the list):
- "Registered" is kinda vague, but let's do it with the built-in Membership system. I'll inject this service into my controller, mock the service in my test and verify the call to it.
- Email is easy. A service again, I'll write it later (after lunch), let's use an interface for the moment.
- Redirection is just return View(".."). Easy to test.
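To make the problem concrete, here's roughly what that first test ends up looking like. This is only a sketch: I'm assuming NUnit and Moq, and IMembershipService, IEmailService, and RegisterModel are made-up names standing in for whatever the real app would have.

using Moq;
using NUnit.Framework;
using System.Web.Mvc;

[TestFixture]
public class AccountControllerTester
{
    [Test]
    public void Register_WithValidData_RegistersUserSendsEmailAndShowsConfirmation()
    {
        // Arrange: mock the services the controller depends on.
        var membership = new Mock<IMembershipService>();
        var email = new Mock<IEmailService>();
        var controller = new AccountController(membership.Object, email.Object);

        // Act: post a (hypothetical) view model to the action.
        var result = (ViewResult)controller.Register(
            new RegisterModel { UserName = "jane", Email = "jane@example.com", Password = "p@ssw0rd" });

        // Assert: three asserts in one test -- or, alternatively, three nearly identical test methods.
        membership.Verify(m => m.CreateUser("jane", "p@ssw0rd", "jane@example.com"));
        email.Verify(e => e.SendConfirmationEmail("jane@example.com"));
        Assert.AreEqual("RegistrationConfirmed", result.ViewName); // the view name is made up, too
    }
}

Note that nothing here mentions the user story: the test is named after the class and the action it exercises.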
So you end up with either one test with three asserts (which is a bad practice), or three test methods related to one particular feature. What makes it a mess is that you're going to add more tests: tests for input validation, and tests for the other features handled by this controller (password change and retrieval, email confirmation, and so on). For every action method, you can have 5-10 tests. On the other hand, the actual sending of the confirmation email belongs to a different class, so you probably put that test there, together with the other email-related tests.
You see, one big problem is that each test class becomes huge. But what's more important, it becomes hard to tell what your system does. One purpose of tests is documentation, and documentation should be readable to outsiders. If I want to figure out how the system behaves during registration, where should I look? How do I know that part of it is in AccountControllerTests, and another part is in EmailServiceTests?
And still another problem is that you've just written a lot of code, but delivered zero business value with it. I mean, I can't register at your site yet! And I won't be able to until you write all the pieces and connect them together (for which you should probably write an integration test).
To summarize, what you get with this approach is:
- premature, rather than test-driven, design;
- potentially brittle tests (since they are coupled to your classes, you can't refactor easily);
- huge test classes;
- documenting your classes rather than your system;
- no clear relationship between the specs and your tests;
- no business value until you have all the pieces.
There's a better way
Now, let's do it another way. Let's create a folder called Membership, and inside it a folder called Registration. We're testing the case where we submit valid registration data, so let's make a test class called WhenSubmittedValidRegistrationData. The test methods will be called ShouldSendConfirmationEmail and so on, so the test output will show something like,
WhenSubmittedValidRegistrationData.ShouldSendConfirmationEmail -- passed.
This is quite close to documenting your system!
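In code, assuming NUnit, the naming looks something like this (just the skeleton for now; the method bodies are filled in below):

using NUnit.Framework;

namespace Tests.Membership.Registration   // mirroring the Membership/Registration folders is my assumption
{
    [TestFixture]
    public class WhenSubmittedValidRegistrationData
    {
        [Test] public void UserShouldBeRegistered() { /* see below */ }
        [Test] public void ShouldSendConfirmationEmail() { /* see below */ }
        [Test] public void ShouldRedirectToTheWelcomeScreen() { /* see below */ }
    }
}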
Now, you can still write it as a unit test for your controller, if you prefer, but I suggest you start with integration tests. I use Ivonna for testing, and it makes things a lot easier: my "integration" is server-side only. In addition, I can use several built-in CThru aspects, like EmailSpy, and I can use a lightweight in-memory database. I initialize the posted values (Arrange) and execute the request (Act) in the FixtureSetup method, and all my test methods are one or two lines of code in which I verify the results (a sketch of such a fixture follows the list).
- UserShouldBeRegistered -- I just check it via the Membership API.
- ShouldSendConfirmationEmail -- I use EmailSpy, which prevents the message from being sent and saves it for further investigation, so that I can verify that it contains the confirmation link, is sent to the correct address, etc.
- ShouldRedirectToTheWelcomeScreen -- check the Ivonna.Framework.WebResponse.RedirectLocation property.
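Filled in, the fixture looks roughly like this. The Ivonna request plumbing and the EmailSpy access are hidden behind hypothetical private helpers (PostValidRegistrationForm, InterceptedConfirmationEmail), and the concrete values are made up; the point is the shape of the fixture, not the exact API calls.

using System.Web.Security;   // the built-in Membership API
using NUnit.Framework;

[TestFixture]
public class WhenSubmittedValidRegistrationData
{
    private Ivonna.Framework.WebResponse _response;

    [TestFixtureSetUp]
    public void FixtureSetup()
    {
        // Arrange the posted values and execute the request once for the whole fixture.
        _response = PostValidRegistrationForm();
    }

    [Test]
    public void UserShouldBeRegistered()
    {
        Assert.IsNotNull(Membership.GetUser("jane"));
    }

    [Test]
    public void ShouldSendConfirmationEmail()
    {
        // EmailSpy keeps the message instead of sending it, so we can look inside.
        StringAssert.Contains("/Account/Confirm", InterceptedConfirmationEmail().Body);
    }

    [Test]
    public void ShouldRedirectToTheWelcomeScreen()
    {
        Assert.AreEqual("/Account/Welcome", _response.RedirectLocation);
    }

    // The nasty details live at the end of the class.
    private Ivonna.Framework.WebResponse PostValidRegistrationForm()
    {
        throw new System.NotImplementedException("wraps the Ivonna call that posts the registration form");
    }

    private System.Net.Mail.MailMessage InterceptedConfirmationEmail()
    {
        throw new System.NotImplementedException("wraps the EmailSpy access");
    }
}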
Now I can go the full Red-Green-Refactor way. I stuff all the code into my action method until all my tests pass. At this moment, my system actually works: I can register a user! But my code is ugly: it's a quick-and-dirty solution written just to make my tests pass. I want to make it better. So I refactor it.
It's at the refactoring phase that unit tests can provide real value. Yes, we all know that unit testing can lead us to a much better design. But sometimes it is enough to do it mentally. For example, "how would I unit test my controller?" Oh yes, I should refactor the email-related code into a separate class and use dependency injection. Probably extract an interface like IMessagingService and implement it as EmailService. Whatever. But I actually write a unit test only if I can think of a good name for it. EmailServiceTester totally won't do. TheConfirmationMessageShouldBeActuallySentByEmail is more like it (and the corresponding unit test for the controller is ShouldSendTheConfirmationMessage -- note that it doesn't mention "email"). But in this particular case, it's probably not worth it.
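For what it's worth, the refactoring itself is small. Here's a sketch of what it might produce; only the IMessagingService and EmailService names come from the text above, while the member signature and the message contents are my assumptions.

// The controller no longer knows about email; it only asks "something" to send a confirmation message.
public interface IMessagingService
{
    void SendConfirmationMessage(string address, string confirmationLink);
}

// The email-specific details, extracted from the controller, live here --
// and this is the class that TheConfirmationMessageShouldBeActuallySentByEmail would cover.
public class EmailService : IMessagingService
{
    public void SendConfirmationMessage(string address, string confirmationLink)
    {
        var message = new System.Net.Mail.MailMessage("noreply@example.com", address)
        {
            Subject = "Please confirm your registration",
            Body = "Follow this link to confirm your account: " + confirmationLink
        };
        new System.Net.Mail.SmtpClient().Send(message);
    }
}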
There is another situation: a particular feature produces some output depending on various inputs, and it's not a yes/no situation like the previous case. Take searching, for example. You start with an integration test, like before, and make it work. For one of the several search form parameters, you verify that the search produces the correct results, which appear in a grid on the search results page. It would be wasteful to write an integration test for every search parameter. So, assume we already have it refactored into several units (note that I don't say "classes"): SearchParamsReader (this is actually the MVC Binder), TheThingThatGivesUsSearchResultsDependingOnTheSearchParams, and SearchResultsWriter (the one responsible for displaying the results). You make three subfolders in your Searching folder (which already contains the integration test), each responsible for the corresponding piece of functionality. Actually, it's probably worth it to make just one subfolder and test TheThingThat... Again, you don't put everything for testing TheThing... into one huge class; you create several classes: SearchByKeyword, SearchByMinMaxPrice, etc. This is unit, not integration, testing, and yet it corresponds to user requirements and documents the system behavior. I could refactor TheThing... into several classes, I could rename it, and the tests won't break.
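Here's a sketch of what one of those unit-level fixtures might look like. The searcher's constructor and Find method, the Product type, and the sample data are all assumptions; only the fixture and test naming follow the scheme described above.

using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

[TestFixture]
public class SearchByKeyword
{
    private List<Product> _results;

    [TestFixtureSetUp]
    public void FixtureSetup()
    {
        // Talk to the unit directly -- no controllers, no HTTP requests.
        var searcher = new TheThingThatGivesUsSearchResultsDependingOnTheSearchParams(new[]
        {
            new Product { Name = "Red bicycle" },
            new Product { Name = "Blue teapot" }
        });

        _results = searcher.Find(new SearchParams { Keyword = "bicycle" }).ToList();
    }

    [Test]
    public void ShouldReturnOnlyProductsMatchingTheKeyword()
    {
        Assert.AreEqual(1, _results.Count);
        Assert.AreEqual("Red bicycle", _results[0].Name);
    }
}

The fixture is named after the behavior (searching by keyword), not after any of the classes involved, so splitting or renaming TheThing... later only touches the setup code, not the tests' names or meaning.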
Here's the recipe for happiness
- Make a folder corresponding to a feature, then a subfolder for a sub-feature, and so on, until you have a concrete action, like registration or search, or maybe a concrete context (there's a skeleton example after this list).
- Inside, create a fixture for each combination of context + action (like "submitting a duplicate username") and name it accordingly (WhenSubmittedExistingUsername).
- Put all preparation into the FixtureSetup method. You want it to be readable, so refactor all the nasty details into private methods and move them to the end.
- Each check should go to a separate Test method. Name them so that they match the requirements.
- More granular tests, if you need them, should go to subfolders.
- If you can't think of a decent test name (something that doesn't use the class/method names), it's probably not worth writing (but might be useful for driving your design).
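To close the loop, here's a bare skeleton of the recipe applied to the "duplicate username" case mentioned above. Everything application-specific is elided; the helper and the assertion hints are illustrative, only the fixture name comes from the list.

using NUnit.Framework;

// Lives in the Membership/Registration folder, next to WhenSubmittedValidRegistrationData.
[TestFixture]
public class WhenSubmittedExistingUsername
{
    [TestFixtureSetUp]
    public void FixtureSetup()
    {
        // All the Arrange/Act plumbing sits behind one readable call.
        SubmitRegistrationFormWithExistingUsername();
    }

    [Test]
    public void ShouldNotRegisterTheUser()
    {
        Assert.Inconclusive("check via the Membership API, as in the earlier fixture");
    }

    [Test]
    public void ShouldShowTheDuplicateUsernameError()
    {
        Assert.Inconclusive("check the rendered page or the validation message");
    }

    // The nasty details go to the end, per the recipe.
    private void SubmitRegistrationFormWithExistingUsername()
    {
        // Hypothetical: create the user first, then post the same username through the registration form.
    }
}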