This may not always be about programming, but it is mostly aimed that way right now, with Part One. There’ll probably be a Part Two before long.
There are a lot of programmers out there who just don’t know that they suck. Maybe this will help enlighten them a bit.
- Arbitrary Limits: When you create a system, you set limits. That’s a given. Systems require bounds and definitions so that they can operate on things that fit within them, and in the case of programming, those things are data. If your system’s limits are set arbitrarily, however, chances are good that you’ve screwed up somewhere. This applies not only to the system’s design itself, but also to the way you go about designing it. If you use an arbitrary number to limit the length of functions, subroutines, and methods, for instance (such as the common assertion that a function shouldn’t be more than twenty lines of code), you’re missing the point: rules of thumb are guides to good practice, not hard and fast rules, and the real expert knows when to violate them.
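To make that concrete, here’s a hypothetical sketch (the function and its checks are my own invention, not anything from a real codebase): a flat sequence of independent validations that brushes right up against a twenty-line limit. Chopping it into five two-line helpers to satisfy the rule would only scatter logic that reads fine in one place.

```python
def password_problems(pw):
    """Return a list of reasons a password is weak.

    One flat run of independent checks. It approaches the arbitrary
    twenty-line limit, but each check reads on its own; splitting it
    into tiny helpers would add indirection without adding clarity.
    """
    problems = []
    if len(pw) < 12:
        problems.append("shorter than 12 characters")
    if not any(c.islower() for c in pw):
        problems.append("no lowercase letter")
    if not any(c.isupper() for c in pw):
        problems.append("no uppercase letter")
    if not any(c.isdigit() for c in pw):
        problems.append("no digit")
    if not any(not c.isalnum() for c in pw):
        problems.append("no punctuation")
    if pw.lower() in ("password", "letmein", "qwerty"):
        problems.append("a well-known weak password")
    return problems
```

Whether a function like this should be split is exactly the kind of judgment call the twenty-line rule can’t make for you.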
- Productivity Metrics: Anything a corporate middle-manager uses to measure your productivity is more likely to be a sign you’re doing something wrong than a sign you’re doing something right. I still occasionally see job postings for programmers indicating that you should be able to produce foo number of lines of code per day, when productivity should be measured by the solid application functionality that is created, not by how many lines of code were generated, which may or may not actually accomplish anything. As E. W. Dijkstra put it, “if we wish to count lines of code, we should not regard them as ‘lines produced’ but as ‘lines spent’: the current conventional wisdom is so foolish as to book that count on the wrong side of the ledger.” It’s not a universal truth that you should always “spend” as few lines as possible to achieve functionality, but it’s close enough to serve as a useful counter-principle to the assumption that a programmer who generates more lines of code is necessarily better. All else being equal, use fewer lines of code to achieve your ends; measuring productivity by lines of code is a pretty good sign you’re screwing up.
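A toy illustration of “lines spent” (both functions are invented for this example): the two do exactly the same job, but a lines-produced metric would rank the first as roughly three times as “productive” as the second.

```python
def sum_even_squares_verbose(numbers):
    # The "productive" version: many lines produced, nothing extra done.
    total = 0
    for n in numbers:
        if n % 2 == 0:
            square = n * n
            total = total + square
    return total

def sum_even_squares(numbers):
    # Same functionality, far fewer lines spent.
    return sum(n * n for n in numbers if n % 2 == 0)
```

Neither version is wrong, but only a ledger that books lines on the wrong side would call the first one better.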
- Buzzword Compliance: If you require buzzwords to describe the benefits of what you’ve done, something has gone seriously awry. If you’re a programmer, ask yourself this: do you need terms like “enterprise”, “object orientation”, and “resource management” to describe the value of your work? Buzzword compliance might indicate that you can make an awful lot of money from your work, but it is close enough to irrelevant as a measure of that work’s quality that, when you think about it away from the pressures of clients and bosses, you should measure that quality without using buzzwords at all.
When someone’s a beginner, it’s usually a good idea to stick to rules of thumb pretty much religiously, of course. That means those arbitrary limits should be adhered to as much as you can manage. The reason is that most people, before they become expert enough to know why these rules of thumb exist and when they should be violated, would just screw things up if they tried to get creative. That’s why I, for one, try to avoid violating rules of thumb like the one about having only a single return from a function, subroutine, or method: I don’t yet truly grok the reasoning behind it well enough to violate it and be sure it’s a good idea.
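For anyone unfamiliar with that particular rule of thumb, here’s a minimal sketch (the classifier is a made-up example, not the rule’s canonical form). The first version has a single exit point, so every path funnels through one `return`; the second uses early returns as guard clauses. Deciding which reads better is precisely the judgment call the rule is meant to spare a beginner from making.

```python
def classify_single_return(age):
    # Single-return style: one exit point shows everything the
    # function can hand back in one place.
    if age < 0:
        category = "invalid"
    elif age < 18:
        category = "minor"
    else:
        category = "adult"
    return category

def classify_early_return(age):
    # Early-return style: each case exits as soon as it is decided.
    if age < 0:
        return "invalid"
    if age < 18:
        return "minor"
    return "adult"
```

Both produce identical results; the rule of thumb is about readability and maintenance, not behavior.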
I’m too lazy to do more today, so I’ll call Part One done.