08 Nov 2007, 13:15

Boo (2 posts)

Not sure if this is the right spot in the forum, but it seemed appropriate.

So let me start by saying I understand Mock objects; at least as much as the next guy. I think in theory they’re great, and I think in theory there is a lot of value in them, and I think for people in computer science courses it’s a nice exercise.

But…

I’ve yet to be sold on them. Why? They’re not pragmatic, and even though I’ve read countless material on them (including the PP book on unit testing), I’m still not sold. To be fair, I’ll give my reasons. Please keep in mind that I’m playing devil’s advocate here, and while I may not personally be 100% sold on them, I’m not saying they’re a bad idea or that nobody should use them; please see the summary at the bottom to see how I really feel.

  1. Increased complexity and maintenance costs.

In IT applications where the business is constantly changing, new features are always being added, and the code is almost constantly being refactored, the extra work of programming to interfaces slows things down. Every time you change a signature or add or remove a method, you have to update three places instead of one: the interface, the live implementation, and the mock object (see the sketch below).
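Just to make the ‘three places’ concrete, here’s a minimal, hypothetical C# sketch (the IOrderRepository names are invented for illustration). Adding a parameter to FindById means touching all three of these types, not just the one that does the real work:

```csharp
public class Order { public int Id { get; set; } }

public interface IOrderRepository
{
    Order FindById(int id);                // 1. change the signature here...
}

public class SqlOrderRepository : IOrderRepository
{
    public Order FindById(int id)          // 2. ...update the live implementation...
    {
        // real database access would live here
        return new Order { Id = id };
    }
}

public class MockOrderRepository : IOrderRepository
{
    public Order FindById(int id)          // 3. ...and remember to update the hand-rolled mock too.
    {
        return new Order { Id = id };      // canned data for tests
    }
}
```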

For developers reading the code, instead of ‘Go to Reference’, they now have to ‘Go to Reference’ (which takes them to the interface), then ‘Find All References’, and then pick the right object out of the results. If you’re wiring the mocks up declaratively with controllers or some other means, it becomes even more painful, because now you have to scan an XML file or something similar, or ‘Find All’ and weed through the results.

All these extra moving parts (controllers, interfaces) aren’t that bad in and of themselves, but when added up they’re huge, especially when you’re dealing with legacy applications. As Fowler says, ‘it’s just extra costs, pull it out’.

Interfaces are abstract by nature, and things that are abstract are generally not well understood.

And this gets one thinking: why aren’t the frameworks out there (for me, .NET) all wrapped in interfaces? Maybe interfaces aren’t the answer to everything.

  2. Not gaining a lot of bang for the buck.

So the big ‘sell’ is that you don’t have to test against the ‘real’ object. What’s the point, then? To save a few processor cycles? To avoid some DB connections or long-running processes? OK, sure, that’s great and I’ll agree with you up to a point…but what happens when somebody updates the live object while you’re testing against the mock? You keep seeing ‘green’, but when you push your code you get a production error. This is especially true in GUI tests; yes, GUI tests are a PIA, but I can’t guarantee and know with 100% certainty that a column in a DataGrid is invisible unless my regex fails to find it in the generated HTML, and mock objects for the GUI generally don’t test the HTML that ASP.NET renders at run time.
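Here’s a minimal, hypothetical sketch of that drift (IPriceService and its behaviour are invented for illustration): the live object’s behaviour changes, the hand-rolled mock doesn’t, and every unit test written against the mock stays green right up until production.

```csharp
using System;
using System.Collections.Generic;

public interface IPriceService
{
    decimal GetPrice(string sku);
}

// The live implementation was changed to reject unknown SKUs...
public class LivePriceService : IPriceService
{
    private readonly Dictionary<string, decimal> catalog =
        new Dictionary<string, decimal> { { "ABC-1", 9.99m } };

    public decimal GetPrice(string sku)
    {
        if (!catalog.ContainsKey(sku))
            throw new ArgumentException("Unknown SKU: " + sku);  // new behaviour
        return catalog[sku];
    }
}

// ...but the hand-rolled mock still happily returns a price for anything,
// so code tested only against the mock never sees the new exception.
public class MockPriceService : IPriceService
{
    public decimal GetPrice(string sku)
    {
        return 9.99m;  // canned answer, never throws
    }
}
```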

One argument against that is ‘whoever updates the live object should be updating the mock object’. Well, that’s great if they’re on the same team, but if they’re not aware of the mock, or not on the same team, or not even in the same company, that makes things a little difficult. And if whoever updates the live object doesn’t notify its users (because that never happens), doesn’t know who is using it in order to notify them, and doesn’t document the changes, how is someone supposed to know what to update and when, other than when it suddenly breaks in production while the unit test that uses the mock object still passes?

Another popular argument is that you should still have tests for the live objects, while the object you’re testing, which may use the live object, gets mock objects. So now you have two sets of tests for the same thing? How is that pragmatic? If you have tests for the live object at level 1, why would you want to use mocked level-1 objects when you’re testing level 2? If your level 2 tests are failing, you need to make sure your level 1 tests are passing; and if it turns out to be a problem in your level 1 object that you missed a test for, the mock object would have let you push that bug right into production, whereas a level 2 test that used the real level 1 object would have saved your butt (see the sketch below).
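A hypothetical sketch of that last point (all names invented, no test framework assumed): the level-1 object has a bug nobody wrote a level-1 test for; a level-2 test wired to the real object fails loudly and catches it, while the same test against a hard-coded mock would have stayed green.

```csharp
using System;

// Level 1: a real collaborator with a bug nobody wrote a level-1 test for.
public class TaxCalculator
{
    public decimal TaxFor(decimal amount)
    {
        // bug: should be amount * 0.08m, but a typo slipped in
        return amount * 0.8m;
    }
}

// Level 2: the object under test, which depends on level 1.
public class InvoiceService
{
    private readonly TaxCalculator calculator;
    public InvoiceService(TaxCalculator calculator) { this.calculator = calculator; }

    public decimal Total(decimal subtotal)
    {
        return subtotal + calculator.TaxFor(subtotal);
    }
}

public static class InvoiceServiceTests
{
    // With the real level-1 object, this check fails (expected 108, got 180) and exposes the bug.
    // A mock calculator hard-coded to return 8 would have kept it green.
    public static void TotalIncludesEightPercentTax()
    {
        var service = new InvoiceService(new TaxCalculator());
        decimal total = service.Total(100m);
        if (total != 108m)
            throw new Exception("Expected 108 but got " + total);
    }
}
```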

  3. Right tool for the right job.

I know there will be people who say that some of the situations I point out are problems in the organization, or a case of using mock objects where they don’t belong. I would agree with that 100%, but let me ask you this: in all your Googling, how often do you find a ‘When not to use mock objects’ section in the white paper/blog/wiki? Our industry culture tends to hop on the bandwagon; kids out of college with no real-world experience read about stuff like this, idolize Fowler and Beck (who, I agree, are pretty dang smart folks), and find that it fits exactly with the theoretical practices they’ve been working at for the last several years. They take this stuff to heart and neglect to get to that final paragraph (if one exists) of ‘here there be dragons’. How do you think we’ve ended up with so many applications that use patterns in an ‘anti-pattern’ way over the years? They look at the pretty pictures in Fowler’s or the GoF book, read the ‘common usages’ and BAM, that’s exactly what I need; let me implement it! Andy Hunt says mock objects are great, let me use them for everything! Let me put a controller on everything so I can change things declaratively!

  4. Case study.

An organization I worked with recently had two large projects executing simultaneously, both interface-based, declaratively at that, and built to the roof for countless what-if scenarios (changing database vendors, multiple GUIs (web and forms), heavy use of IoC (interfaces, of course), etc.). On the project I was assigned to, I immediately got rid of several assemblies that were just bloated with interfaces for all of this stuff. Why? There was a snowball’s chance in heck that the database vendor was going to change; and if by some chance it did, there would be much bigger problems to worry about at an organizational level than this program not being able to change a configuration file and be up and running. There was also a snowball’s chance in heck that they would simultaneously have a web and a forms version; the business had set a direction and wasn’t about to change anytime soon, and if it were, there again, there would be much larger organizational issues to contend with.

Just pulling the interfaces decreased the average bug-fix time by at least 20%, and the average ‘new feature’ time by at least 35%. These numbers may seem exaggerated, but without all those extra moving parts to contend with, there was less complexity. Less complexity made coders less loath to work on a bug. It’s well documented that as LOC goes up, the complexity and cost of an application grow disproportionately. Interfaces, mock objects, configuration files, and controllers aren’t code-free.

Another thing that happened is that I wrote several new unit tests covering our encapsulation of another system in the organization’s ‘web services’. These are painfully slow tests, but they’re thorough, they give me a chance to catch up on email and design documentation, and I don’t run them more than once or twice a day. Now, I could dramatically speed things up by using mock objects, but then a few weeks ago, when the web service was changed and we weren’t told (because that never happens, right?), we wouldn’t have been able to catch it before the change was pushed to production, and we would’ve had production errors. Similarly, had I used mock objects on a certain web page I built, we wouldn’t have caught an error caused by another developer’s bad merge of an ASP file, where the HTML output generated by ASP.NET changed.

Meanwhile, during all this, the other project has continued on its merry way with interfaces, declarative programming, and IoC/dependency injection all over the place. Amazingly enough, they’re months behind and their developers are miserable; they spend a lot of time regenerating interfaces, updating implementations in several places because of signature changes, and jumping through files in the GUI trying to find the implementation. The developers on the team where I removed those extra complexities, which were nothing more than speculative generality (Fowler, Refactoring), have been happier to come in to work and no longer loathe trying to figure out what the source code is supposed to do.

Summary:

Mock objects are a powerful tool, on the order of ‘laser-guided missile systems’…in the right hands they can be very effective at removing problems; in the wrong hands they’re a global disaster. And we (our culture) just preach the greatness, rarely if ever talking about the downsides. Anyone who is going to use mock objects needs to be well versed in both the rewards and the risks; it’s irresponsible of us to do otherwise.

16 Nov 2007, 16:04

Humberto Enrique (1 post)

Boo,

Very interesting article, though it came close to sounding a bit like a rant at first. Although I’m not really a mature/seasoned developer yet, your article made sense to me. I don’t know much about mock objects and all that, but I’ll slowly read up on them and get it. I did like your points about programming to “speculative generalities”, what-ifs, and whatnot.

Thanks.
