Which programming language/tool/other technology to learn?

When I was an undergrad studying computer science, I remember walking with a couple of friends and fellow students through one of the buildings on campus one night. As it happened, it was the building where the computer science department was located. We were (probably) heading to one of the computer labs, and the chair of the department had just come out of the elevator and was on his way out of the building. We said hello, and he took the moment to share a bit of advice, which was to learn Java. That was it, fairly simple. I think at the time we looked at each other, more or less shrugged our shoulders, and got on with things.

Somehow I’ve remembered that bit of advice from our department chair ever since (although I certainly didn’t act on it at the time in any meaningful way). That was nearly twenty years ago, sometime in the mid-1990s, and at the time Java was still a pretty new language; none of our regular computer science courses was using it. With a few exceptions, most of the programming I was doing in my classes as an undergrad was in C. My introductory programming class in college used a language called Modula-2, which probably very few remember now (and I believe that was the last semester they taught that class with Modula-2, switching to C afterwards). There was also a course on programming language design that introduced us to Scheme (a close cousin of LISP). The point, though, was that the department chair recognized that Java was going to be an important language and was worth investing the time to learn.

When getting started in this business there is an incentive to try to learn a lot of different languages and development tools. After all, the more one knows, the more valuable one should be on the job market. Or so we might be inclined to think (a point I’ll return to later). But there are any number of things one might spend time learning, and only a subset of those will prove to actually be useful. This is a classic problem, of course: how do you decide what to focus on and what to set aside? A couple of years ago, when I was teaching, a student asked me more or less this same question. I think the problem has become much more challenging since I was an undergrad. There are simply more languages and applications in active use now, we have more choices for databases, and mobile app development didn’t even exist ten years ago.

Trying to decompose this problem is worth doing, I think. For any given language, tool, or other bit of technology, we might pose a few questions:

How widely used is it now?

A widely used technology can be comforting to get into, since you can expect more resources to help you get up to speed, as well as a broader community to tap into. If we are talking about programming languages, you might look to see which ones are in greater demand. The Jobs Tractor Language Trends – February 2013 report, for instance, shows Java and PHP among the most sought after right now. But a narrow market segment isn’t necessarily bad either. In some niche areas a less widely used language or technology might be the dominant one. Also, new technologies have to start out somewhere and can sometimes take a while to find their audience, which leads to the second point.

Potential for growth?

Besides current popularity, another variable is what kind of growth in demand might be expected for people who know Technology X.

Don’t confuse how widely used something is with demand — they are related but not the same thing. There’s still a market for COBOL and Fortran developers, for instance.

Open vs. Proprietary

Right now there is a division between open technologies and closed, proprietary ones. With proprietary technologies you can expect your investment cost to go up, if only because of the need to acquire the necessary software (and licenses).

Getting into mobile app development is a good case study here. Becoming an iPhone developer, for example, requires a certain up-front investment: you need a) a Mac to run Xcode and the whole development environment, b) an iPhone, and c) membership in Apple’s iPhone developer program. However, that’s a popular platform to develop for (at the risk of understatement), so, yeah, it’s a pretty compelling investment. It should be said, given this example, that becoming an Android developer isn’t free either, but you can get by with a less expensive computer running Linux, and you can get Java for free.

Is it something you are interested in?

I think there’s something to be said for pursuing things you actually are interested in versus things you think will be good to know but are otherwise not that into. You’ll generally do better at things you are intrinsically inclined towards. At the end of the day, I think this is the factor to weigh most heavily.

Avoid spreading yourself too thin

Finally, to get back to the point about trying to ‘learn everything’: at a certain point I think it is important to recognize that you can’t possibly do that, and certainly not all at the same time. There is likely an added bonus that comes from having experience with a variety of tools (seeing how different languages handle the same or similar problems can be insightful, for instance), but that kind of knowledge comes over time. Over the span of a career your primary tools are going to evolve anyway.
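As a toy illustration of that last point (a sketch of my own, not anything from those old courses): summing the numbers 1 through n reads naturally as a loop in Java or C, while Scheme nudges you toward a recursive definition. Even a small exercise like this shows how a language shapes the way you think about a problem.

    // Toy example: the same problem, expressed two ways in Java.
    public class SumDemo {

        // The idiomatic C/Java approach: accumulate with a loop.
        static int sumIterative(int n) {
            int total = 0;
            for (int i = 1; i <= n; i++) {
                total += i;
            }
            return total;
        }

        // The shape Scheme pushes you toward: a recursive definition.
        // Scheme equivalent: (define (sum n) (if (= n 0) 0 (+ n (sum (- n 1)))))
        static int sumRecursive(int n) {
            return (n == 0) ? 0 : n + sumRecursive(n - 1);
        }

        public static void main(String[] args) {
            System.out.println(sumIterative(10)); // 55
            System.out.println(sumRecursive(10)); // 55
        }
    }

Neither version is “right”; the value is in noticing that each language makes one of them feel like the obvious choice.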

The list above is just an attempt to sketch out a way to think about this problem somewhat systematically; I certainly don’t claim to have any definitive answers here. What I can say is that, at least for myself, I’ve decided there are certain areas I don’t see myself investing time in, so that I can focus on others. I’m less inclined to get into .NET and Windows development at this point in my career, for instance, since that would be a pretty significant shift from my current skill set, and quite frankly I’m not as interested in it. This is certainly not to disparage .NET and Windows development; it’s just a choice of where to spend resources.