
Yes, more and better work can be done by fewer people - it's axiomatic in software. It just takes longer - and often looks like people sitting around thinking and experimenting.


I agree except for

"It just takes longer"

Which depends a lot on the circumstances. As an extreme example, a bunch of junior people might spend a few sprints developing a feature using TDD, XP and whatnot without getting it done, while a single expert might implement the feature simply and elegantly in a few days and in a lot less code.

My favorite example of this is a pair of blog artifacts: the first describes an attempt to write a Sudoku solver using TDD, and as a counterpoint there is Peter Norvig's simple and elegant Sudoku solver in Python (as previously discussed on HN):

https://news.ycombinator.com/item?id=3033446
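For flavor, here is a minimal backtracking sketch in that spirit - it is not Norvig's actual code (his adds constraint propagation), and the function names are mine, but it gives an idea of how little code a direct solution needs:

    # Minimal backtracking Sudoku solver sketch (not Norvig's actual code).
    # `grid` is a 9x9 list of lists of ints, with 0 meaning "empty".
    def solve(grid, pos=0):
        if pos == 81:
            return True                          # every cell is filled
        r, c = divmod(pos, 9)
        if grid[r][c]:                           # pre-filled cell, skip it
            return solve(grid, pos + 1)
        for d in range(1, 10):
            if valid(grid, r, c, d):
                grid[r][c] = d
                if solve(grid, pos + 1):
                    return True
                grid[r][c] = 0                   # undo and try the next digit
        return False

    def valid(grid, r, c, d):
        # d must not already appear in the row, column, or 3x3 box of (r, c)
        if d in grid[r] or any(row[c] == d for row in grid):
            return False
        br, bc = 3 * (r // 3), 3 * (c // 3)
        return all(grid[br + i][bc + j] != d for i in range(3) for j in range(3))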


As someone who hasn't really managed to get into TDD (despite a few attempts), do you think this is generally the case? I do feel that TDD encourages a sort of shotgun approach to development, where you just try things until something works, rather than thinking the problem through thoroughly. Any opinions from people who have done both? I personally find that fiddling with tests interrupts my flow of thought too much, so I would rather do it at the end.


I call this the rewrite fallacy. I honestly cannot think of a decent piece of work I did that did not involve two, three, or a dozen rewrites - either from scratch (rare) or just piecemeal replacements until I reached a design I was happy with, one that fit my domain and was appropriate.

Sometimes the rewrites were on the same project, sometimes they were on previous projects and I just looked like a genius coming in to do something I had already done before, but I am always surprised if I can get things right the first time (right enough is a professional's baseline, but actually right? Hardly ever).

So, no, I honestly think that thinking through a design, working out and testing ideas and architectures, is a necessary part of development - call it discovery, call it the first iteration, call it perfectionism, it's needed.

Tests should come as part of that: slowly, as you realise the approach you are choosing might actually be the right one, and then increasingly as you fill out the space.


Boy, are we diverging :)

In my experience TDD is kind of newspeak, in the sense that it does not help, at least in the first stages of design. What I have managed to glean as a valuable way of working is test-driven implementation, i.e. when implementing something akin to a simple data structure, I can test the interface functions as I write them, which verifies that the interface is not silly and that it works.
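A rough sketch of what I mean - the names here (RingBuffer, push, pop) are made up for illustration, but the point is that each test gets written right alongside the interface function it exercises:

    # Hypothetical example: a tiny ring buffer whose interface functions are
    # tested as they are written.
    class RingBuffer:
        def __init__(self, capacity):
            self._items = [None] * capacity
            self._head = self._size = 0
            self._capacity = capacity

        def push(self, item):
            if self._size == self._capacity:
                raise OverflowError("buffer full")
            self._items[(self._head + self._size) % self._capacity] = item
            self._size += 1

        def pop(self):
            if self._size == 0:
                raise IndexError("buffer empty")
            item = self._items[self._head]
            self._head = (self._head + 1) % self._capacity
            self._size -= 1
            return item

    # Tests written as each interface function lands - they check that the
    # interface is sensible and double as usage examples.
    def test_push_then_pop_preserves_fifo_order():
        buf = RingBuffer(capacity=2)
        buf.push("a")
        buf.push("b")
        assert buf.pop() == "a"
        assert buf.pop() == "b"

    def test_push_past_capacity_raises():
        buf = RingBuffer(capacity=1)
        buf.push("a")
        try:
            buf.push("b")
            assert False, "expected OverflowError"
        except OverflowError:
            pass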

Another place where implementing tests concurrently brings value is using the tests as a form of documentation.

I would not call TDD a design methodology, beyond the fact that implementing an interface (and verifying it works) is usually good practice while speccing it.

Tests bring lots of value to a codebase beyond the fact that it sounds impressive to have 95% coverage. From what I understand of the pathologies of software development, having TDD tagged as a methodology serves to let programmers implement tests in peace, politically shielded from the more gung-ho elements of the stakeholderkin.

Summary: IMO if production code has no tests, something is wrong. But I would say the most value tests bring is in verifying the algebra of the interface, protecting against breaking commits, and documenting the intended usage through examples. I would not say writing unit tests is a particularly powerful design methodology, but playing with code is, and calling what the developer is doing 'TDD' gives the developer the peace of mind to do exactly that.
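To make "verifying the algebra of the interface" concrete, here is one possible shape of such a test. It uses the hypothesis library as one example of tooling, and the encode/decode pair is just a stand-in for whatever interface is under test:

    # Property-based test stating a law the interface should obey:
    # decode(encode(x)) == x for supported payloads.
    import json
    from hypothesis import given, strategies as st

    def encode(obj):
        return json.dumps(obj)

    def decode(text):
        return json.loads(text)

    @given(st.dictionaries(st.text(), st.integers()))
    def test_decode_inverts_encode(payload):
        assert decode(encode(payload)) == payload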


Most of the advantages you list are about testing, not testing before coding, which is what I was curious about.



