Can just anyone learn to program or is that skill limited to a few of us?
This question, or some variant of it, has been on my mind for most of my career. Early on, I thought programming required inherent talent, so only some people were capable of learning to program. As I gained more experience, I began to believe that programming was a skill that most people could master with sufficient incentive and desire. During the dot-com boom, I saw a large number of people get into programming because the salaries were good. After dealing with code written by some of them, I realized that (monetary) incentive and desire weren't enough. Some of these people couldn't program to save their lives, despite the fact that they were paid to do just that.
A related question is the inherent difficulty of programming. According to many people, programming is easy. After all, if you go to the bookstore, you will see books like Teach Yourself Programming with Java in 24 Hours or Teach Yourself Ruby in 21 Days. Some people look at the titles of books like these and ask themselves "How hard can it be?" I've seen many managers who seem to be of the opinion that they can hire any kid off the street and teach them to program.
On the other hand, many of us know that programming is harder than some would like to believe. (Why do we resist the idea that programming might be hard?). I've seen entry-level programmers struggle with basic concepts. I know from personal experience that most of the people I talk to have no grasp of what I do.
So here we have two different ways to look at the same issue: either programming is easy enough that anyone off the street can be taught to do it, or it is hard enough that even people paid to do it sometimes struggle with the basics.
It seems that every time I come back to this issue, I have a different view based on my current experience. Recently, I stumbled across some research that suggests there may be an answer to the second question above. A summary of the research (Coding Horror: Separating Programming Sheep from Non-Programming Goats) gives you a taste of the issue. Papers from the actual researcher fill in the details. It seems that many people find certain basic programming concepts impossible to master.
Although that is a strong argument, more research is needed to settle the issue once and for all. But does that really mean that a large portion of the population is incapable of any level of programming?
To answer that, I'm going to do something dangerous and argue from analogy. When I wrote Programming and Writing, I pointed out that there were interesting parallels between programming and writing. So, I'm going to use most people's understanding of writing to make a case for most people being able to program, after a fashion.
Most people who have been through a public or private education in a developed country learn the basics of reading and writing. This education doesn't work for everyone and there is quite a bit of debate about why. For the purposes of this analogy, I'm going to ignore those people. Even after school, most of us continue to use our reading skills on signs, documents, movie credits, and such. But, what about writing? How many people could write a critical essay, or a short story, or a novel after they finish school? I would dare say that most of us could not. Some might harbor a Great American Novel somewhere in their skulls, but few ever really begin writing it.
I see most programming as much like the problem of writing that essay or novel. Both require knowledge, some inherent talent, and a degree of skill. And, like writing that essay or novel, not everyone has the necessary skill and talent to write a significant program. But, that's not the end of the story. Most of us still use writing to a small extent even though we'll never be a Shakespeare, or even a romance novelist. We write a quick note to tell a family member where we are going to be tonight. We write grocery lists. We do Christmas cards. We write letters or email to friends and family. This is still writing.
Is there an equivalent in programming? I believe there is. Unlike many computer users, I am a big fan of the command line. Point and click are fine for some things, but give me a command line and I'm much happier. The reason is a little discovery I made around two decades ago, working in DOS. I wrote my first batch file. I stuck 2 or 3 commands in a text file and was able to run them as one command. Now, I had been writing code for several years before that point. Even then, this wasn't a new concept. Heck, even I had seen it several times before. But at that point, I recognized a different kind of programming than I had been doing in C, Fortran, or Basic. This was making the computer work for me instead of the other way around.
Later, when I first worked on a Unix system, I found an entire culture and system built around this concept. But, that still didn't take away the basic cool part. I could build a little text file that would tell the computer to run multiple commands at once instead of having to type them myself. This is something I don't get from a GUI: I can't compose multiple commands together very easily.
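A modern equivalent of that first batch file might look like the sketch below. The file names and commands are my own invented example, not the actual script from those DOS days:

```shell
#!/bin/sh
# Three commands bundled into one runnable file -- the same small
# discovery as that first DOS batch file. (All names are illustrative.)

mkdir -p notes                        # a directory worth archiving
echo "pick up milk" > notes/list.txt  # something to put in it
stamp=$(date +%Y%m%d)                 # today's date, e.g. 20240101
tar -czf "notes-$stamp.tar.gz" notes  # bundle the directory, dated
echo "Archived notes as notes-$stamp.tar.gz"
```

Save it, mark it executable, and several commands become one. That composition is exactly the part a point-and-click interface makes awkward.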
Back to the question at hand. This little batch file or shell script approach is not real programming, any more than writing a grocery or to-do list is real writing. But, I suspect that most everyone could learn to do this level of programming with a little help. With a little more help, I'm sure most people could do a bit more. They might never be able to write a full accounting system, or a word processor, or even a web application. But, given a little training they could probably teach the computer a few tricks.
I ran across an unusual article a couple of months ago: Right Tool for the Job is a Myth, from Alex Bunardzic's Ethical Software blog. My first thought was that the author must be out of his mind. After some reflection, I have a somewhat different view. Now I just think he is mistaken. I would definitely suggest reading his article and the associated comments so that you understand his viewpoint. I wouldn't want to misquote anyone.
The author maintains that the phrase the right tool for the job is a meaningless statement because in the software world there are not as many choices as in the real world. Honestly, I don't know what to say about that. Most of my programming career has been spent narrowing an infinite number of possible solutions down to a few that might possibly work. Since I can't get even a basic grip on his general statement, let's look at some of the specific examples.
For example, when implementing a software system that is going to be distributed across a network of computers, it is pretty darn impossible to choose the right tool for the job — because there is only one tool to use, and that tool is TCP/IP.
This statement can be easily refuted by pointing out that UDP is still in wide use on the Internet. Several distributed systems depend on this alternative to TCP, including DNS, VoIP, and TFTP. Obviously, there are multiple choices for the right tool in the area of distributed systems. Moreover, if you are working with small devices that do not support a TCP stack, you might end up using something completely different like the protocols from the ZigBee specification.
Each of these communications protocols has different strengths and weaknesses. Part of what makes programming interesting is the ability to look at these various choices and decide which advantages you want and which trade-offs you can live with. To a small extent, Bunardzic is right. More and more devices are getting TCP stacks, so it is becoming a viable option more often than it once was. But, that does not mean that it is the only choice.
What choices of tools do we have at our disposal if we wish to implement a relational database? No choices. There is only one tool — SQL (Structured Query Language).
The comments pick this one apart pretty well. The main argument is that the question is narrowly defined to specify basically one answer. If we change the question to one that a developer is more likely to encounter (how do we store the data our application needs?), the answer is less clear. If you only have a handful of key/value pairs, a property file or INI file is probably a better choice than a relational database. If you need to store gigabytes of floating point information, raw binary files in a directory may be the best solution. For other problems, a CSV file might be exactly what you need.
In most of these situations, SQL would not be an option, because the data is not relational.
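To make one of those alternatives concrete, here is a minimal sketch of the flat-file approach. The file name app.conf and its keys are invented for illustration:

```shell
#!/bin/sh
# A handful of settings stored as key=value lines -- no database,
# no SQL, just a flat file and two standard tools.
# (The file name and keys are made up for this example.)

cat > app.conf <<'EOF'
host=example.com
port=8080
retries=3
EOF

# Look up one value: find the line, keep everything after the '='.
port=$(grep '^port=' app.conf | cut -d= -f2)
echo "Connecting to port $port"
```

For a few settings like these, the flat file is simpler to create, read, and debug than any relational schema would be.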
The last example solves the Business Logic problem with Object Oriented Programming. Although OOP is the current default programming paradigm, a rules-based expert system may still be a better solution in some domains. I've seen a number of environments where a pipeline of simple procedural pieces did as good and clean a job as any OO design. In many cases, these alternatives will not be best, but that does not mean that they do not exist.
The worst problem with this article is that it misses the point of most of the right tool for the job comments I've seen over the years. At least in my experience, this suggestion was normally used when someone was really misusing a technology, and certain kinds of decisions almost invite this kind of comment.
In almost any application, there is the possibility that someone will choose a suboptimal technology. In many of those cases, it is just a matter of ignorance of the alternatives. Education and experience are the only solutions.
I'd like to close this with a quote from Larry Wall that captures the essence of this problem.
Doing linear scans over an associative array is like trying to club someone to death with a loaded Uzi.
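For anyone who has not met the idiom, the joke can be made concrete with awk, whose arrays are associative. The grocery items and prices below are my own invention:

```shell
#!/bin/sh
# Larry Wall's point, made concrete: an associative array already does
# fast lookup by key, so scanning it element by element re-does the
# work the data structure exists to avoid. (The items are illustrative.)

awk 'BEGIN {
    price["milk"]  = 2
    price["bread"] = 3
    price["eggs"]  = 4

    # The loaded Uzi, used as a club: a linear scan over every key.
    for (item in price)
        if (item == "eggs")
            print "eggs cost", price[item], "(found the slow way)"

    # The loaded Uzi, fired: a direct keyed lookup.
    print "eggs cost", price["eggs"], "(found the fast way)"
}'
```

Both lines print the same answer; the difference is that the second asks the array directly while the first inspects every entry to find one.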