We do a horrible job of assessing our options before we start, of explaining our choices to our colleagues (and bosses and clients), and of reacting to feedback as the project progresses.
Once upon a time, shortly after I got out of college, someone wrote an excellent paper on a timely topic: software programmer productivity. The time was 1986, the author was Fred Brooks and the paper was "No Silver Bullet — Essence and Accidents of Software Engineering".
(For the CliffsNotes version, turn to Wikipedia: http://en.wikipedia.org/wiki/No_Silver_Bullet. The paper is worth a read on its own, but who has time?)
The summary in the Wikipedia page is very good, and I will be stealing from it liberally in some future rant about how badly managers tend to manage programmers and how badly programmers support their management. But you should read it now, before the rant really gets going.
This is my banal insight: so little of the fundamentals of creating software has changed in the decades since 1986 that sometimes I want to cry. So much has changed about the incidentals that sometimes I want to cry. After 30+ years in the business, I was hoping for more improvement, not more novelty.
And boy, do we have novelty. We love novelty. We worship novelty. Take web UIs as an easy example: we have so many ways to make web pages, my head spins.
- Want HTML with embedded code? Try PHP!
- Want code with embedded HTML? Use Perl/CGI!
- Want an abstract environment that separates HTML and code nearly completely? Ruby on Rails is for you.
- How about an end-to-end integrated code & HTML environment? Microsoft Visual Studio is just the thing.
- Want to try to side-step HTML and have some portability? Java, in theory, can do that.
I do not have a problem with more than one way to do something, especially if the ways have pros and cons. I am not an operating system bigot: I used VM/CMS for large batch jobs. I used VMS for file-oriented processing. I used Unix for distributed systems. I used MS-DOS for desktop apps. I used OS/2 and then Windows for snappier desktop apps. Each had its pros and cons and its appropriate uses.
The same was true for computer languages: I have used BASIC for matrix math, I have used C for file I/O, I have used Lisp for list processing, I have used PL/1 because IBM required that I use it.
Then somehow this idea of appropriateness faded away from computing: desktop=Windows, server=Unix. Then server=Windows Server for some, Unix for others. C++ or VB as the one-and-only programming language. Then other contenders for the one-and-only programming language.
We all understand that hammers are good for driving nails and screwdrivers are good for driving screws. It is clear that screws are better connectors in some situations and nails in others. Who would hire a carpenter who only used one or the other?
But we hire "Drupal Shops" and "C/Windows Shops." We hire "Unix guys" or "Windows guys." We pretend that there is a single best tool for all jobs--and then argue pointlessly about which tool that might be.
Consider this statement from the creator of the Ruby programming language: (full text here: http://en.wikipedia.org/wiki/Ruby_%28programming_language%29)
Matsumoto has said that Ruby is designed for programmer productivity and fun, following the principles of good user interface design. At a Google Tech Talk in 2008, Matsumoto further stated, "I hope to see Ruby help every programmer in the world to be productive, and to enjoy programming, and to be happy. That is the primary purpose of Ruby language." He stresses that systems design needs to emphasize human, rather than computer, needs.
A single tool to help every programmer in the world to be productive and happy? To me, that seems insane and it seems to reveal a worldview which I cannot support: the idea that there is a single best tool for all people for all problems in all environments. What hogwash.
I applaud the goal of creating good tools and making programmers in general more productive, but I reject the notion that this job can ever be done with a single stroke.