This comment responds to http://pragprog.com/magazines/2010-01/much-ado-about-nothing by Paolo Perrotta. In short, I argue that Ruby programs shouldn’t follow the article’s recommendation to use Black Holes, because Ruby lacks the static analysis needed to make up for the debugging techniques you lose.
I had been thinking just last Friday about making nil (the sole instance of NilClass) swallow every method sent to it — treating nils as “Black Holes”, in Perrotta’s terms — so this article was particularly interesting.
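For readers who haven’t seen the trick: here’s a minimal sketch of what the Black Hole patch looks like in Ruby. This is my own illustration, not Perrotta’s exact code — the idea is simply that NilClass catches unknown messages via method_missing and answers nil.

```ruby
# A minimal Black Hole sketch: nil swallows any message it doesn't
# recognise and returns nil instead of raising NoMethodError.
class NilClass
  def method_missing(name, *args, &block)
    nil
  end

  # Keep respond_to? consistent: nil now "responds" to everything.
  def respond_to_missing?(name, include_private = false)
    true
  end
end

nil.upcase       # => nil, where plain Ruby raises NoMethodError
nil.foo.bar.baz  # => nil; the calls chain through harmlessly
```

Because nil is a singleton, patching NilClass changes the behaviour of every nil in the running program — which is exactly what makes the technique both convenient and risky.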
I like that Smalltalk and Objective-C have this behaviour because, as the article mentions, it lets you skip the extra conditional logic you would otherwise write to avoid accidentally raising a NoMethodFound (sic) exception.
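Concretely, this is the kind of guard chain Black Holes eliminate. The user/address objects below are hypothetical examples of mine, not from the article:

```ruby
# Hypothetical domain objects to show the guard chain.
Address = Struct.new(:city)
User    = Struct.new(:address)

def find_user(found)
  found ? User.new(Address.new("Ottawa")) : nil
end

user = find_user(false) # the lookup failed, so user is nil

# Plain Ruby: every step needs a guard, or any nil in the
# chain raises NoMethodError.
city = user && user.address && user.address.city
# city is nil here, but only because of the explicit && guards.
# With nil as a Black Hole, `user.address.city` alone would
# return nil -- no guards needed.
```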
However, when I discussed this with some other folks at Shopify (where I work), Tobi Lütke pointed out that NoMethodErrors are an essential part of debugging your Ruby code — something Perrotta acknowledges in his conclusion:
bq. On the other hand, Black Holes can cause their own share of trouble. After all, NoMethodErrors are there for a reason: they help you spot bugs in your code. A Black Hole can mask those bugs and make it more difficult for you to notice when things go wrong.
Perrotta’s article nevertheless suggests giving Black Holes a shot and deciding for yourself whether they work, because “After all, there is at least one popular language (Objective C) where all null reference are Black Holes by default.”
Here’s my point of contention: a language like Objective-C can support this behaviour by default because it makes up for the lack of NoMethodErrors with static analysis that warns you about possible bugs in your code. Because Ruby doesn’t support that sort of thing yet, finding bugs becomes much more difficult, and that cost outweighs the benefits of the technique.
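To make the debugging cost concrete, here’s a sketch (my own, with a hypothetical find_order method) of how a Black Hole hides exactly the bug a NoMethodError would have flagged:

```ruby
# With NilClass patched to swallow unknown messages, a nil that
# should never have appeared slips through silently.
class NilClass
  def method_missing(name, *args)
    nil
  end
end

def find_order(id)
  nil # a bug: the order should have been found
end

order = find_order(42)
total = order.total_price
# Plain Ruby would raise NoMethodError right here, pointing
# straight at the bug. With the Black Hole, total is just nil
# and the failure surfaces far downstream -- or never.
```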
PS – Could future articles please feature an obvious way to respond to the author, even indirectly?