Other fields certainly are not immune to the declining value of knowledge capital. Ask any older auto mechanic how much his knowledge of carburetors is worth these days, just as one example.
Any profession that works with continually advancing technology is going to see this effect. Granted, it does seem more pronounced in the computer field, yet there are some areas where you don't see it so much. If you are a "C" programmer, everything you've learned over the decades is still relevant and useful. Unix system administration is another example. Yes, things change, but a lot of the old knowledge, e.g. tools like sed and awk, is still plenty relevant.
Also the author doesn't acknowledge that a lot of knowledge transcends language. Knowledge of data structures and algorithms carries over as language fads come and go, and there's a lot of core knowledge you don't lose in the transition from one to the next.
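To make that concrete, here's a minimal sketch of a binary search in C. The function name and setup are just illustrative, not from the article; the point is that this is the same algorithm you'd write in Java, Python, or whatever comes next:

    #include <stddef.h>

    /* Binary search over a sorted int array. A toy example; the point
     * is that the algorithm, not the syntax, is the durable knowledge. */
    static int binary_search(const int *a, size_t n, int key)
    {
        size_t lo = 0, hi = n;                /* half-open range [lo, hi) */
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;  /* avoids (lo + hi) overflow */
            if (a[mid] < key)
                lo = mid + 1;
            else if (a[mid] > key)
                hi = mid;
            else
                return (int)mid;              /* index of a match */
        }
        return -1;                            /* not found */
    }

Learn that once and the transition costs between languages are mostly spelling.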
I've been in the software profession for over 20 years, and what I've noticed is that the stuff that fades away quickly is the stuff that sucked from the get-go. COBOL? Sucked, now mostly gone. Mainframes? Sucked, now mostly gone. Visual BASIC? Sucked, now mostly gone. C? Dangerous, but powerful... still around. Unix? Pretty fun, very flexible, still around. SQL? Still one of the best ways to work with data, still around.
I'd respond to the author if I could but the article is from 2007 and comments are closed.
The author tends to cite jobs in law and finance as better routes of work. I'd disagree. I only graduated recently (2010), but I went to a T[op]14 school. I've since worked at two small law firms (personal injury and foreclosure) and also abroad as a financial analyst for a real estate division of a conglomerate in Asia. I'll try to keep my points short.
1. While law and finance don't necessarily suffer from the same knowledge turnover rate as programming, they do suffer typecast-inducing problems. Lateral movement among subfields is extremely difficult. This becomes a bigger problem if the first field you land in is meant to be a stepping stone.
2. Prestige is bullshit. I've associated with V5 lawyers and Bulge Bracket bankers. Very few of them are not socially awkward. You may hear rumors about "models and bottles," but rest assured, it's mostly just bottles.
3. Foreignization is a growing concern for law, and automation has been a huge concern for finance for a while. There are various workarounds to the rules regarding unauthorized practice of law. As for finance, there are a few small documentaries lamenting the fall of the floor trader. Guess who's been replacing them?
4. Management roles are nearly ubiquitous. That said, you do get escalating responsibilities in law. The problem is, will you last the grind leading up to them? A good friend of mine once messaged me while I was working abroad. It was 2 a.m. in NY at the time. He told me his entire week had been spent indenting, printing out, and placing into a folder a table of contents for a case. Among my friends, he's at the best law firm out of all of us.
5. Conditions in law and finance are part of the golden handcuffs you graduate with. They're good, no doubt, but they come at the price of lots of hours. Those who work in law are at the mercy of clients. The same could be said about finance in some areas.
I responded to this not to say that a career in computer programming doesn't suck. It might. But the grass is always greener on the other side, and from where I'm standing, my lawn is looking pretty brown. I responded because I disagree with the underlying message: that you are better off getting a law or finance job through a law degree or MBA. Degrees aren't measures of success, and they aren't golden tickets to it either. There are pros and cons to every career path. You just have to figure out where, for you personally, the former will outweigh the latter.
TL;DR: law and finance suck too. But in the end it all comes down to personal preference.
> But in the end it all comes down to personal preference.
Yep, I have always chuckled when I see someone trying to decide what career field to go into based on how much money they imagine they will make. In any career where there is opportunity to make large sums of money, there are 10,000(,000,000) other people, as smart or smarter than you, scrambling to do the same thing. If you get into a profession to make money and not out of a passionate interest in the area, you are unlikely to rise to the top no matter how clever you consider yourself to be. It is one of the great paradoxes. Some of the smartest, most talented people I know in the field (and also very likely to be in the top earning bracket) are MUCH more motivated by passion and interest in the area than by the paycheck. Conversely, most of the people I have met who are motivated primarily by paycheck size are, honestly, a lot less talented than they believe themselves to be, and not really the 'in-demand' types with the leverage to get the greater paydays they seek.
I disagree with many of the points here. First, it sounds like this guy is on the Microsoft Treadmill (TM). It's a widely known fact that MS routinely retires technologies and platforms at a breakneck pace (I mean, really? You're going to retire a whole programming language -- VB 6.0?).
Well, I work in C and C++ on linux/unix platforms. The C programming language has been around for decades and will continue to be around for decades to come.
He keeps saying that old knowledge is worthless. This isn't entirely true, as I believe that this knowledge builds up into a kind of intuition. Intuition gained over several years allows seasoned programmers to very quickly pick up on new concepts, programming languages or platforms.
I still can't get over the fact that MS retires programming languages. It's almost as if they want to squeeze more money out of their customers, so they cook up some new technology that is the must have. Oh, that's what they're really doing? Oh, ok.
It's a good point. In videogame development, C has been a standard for a very long time. Wolfenstein 3D was written in C; 20 years later, Modern Warfare is still being written in C (albeit a bit modified). Most other game development is done in C++, also not exactly a brand new technology.
Had you invested correctly in the right languages, you'd be doing just fine today.
"So what advantage does a 60-year-old .NET programmer have over a 27-year-old .NET programmer when they both have, at most, 5 years of experience doing .NET programming? Absolutely none."
Umm, I would say that 40 years of experience building software applications is a pretty huge advantage. I've been writing iPhone apps longer than most of the older guys I work with, but that doesn't mean that I know more about development than they do.
Languages and technologies may be temporary knowledge capital, but they're also one of the least valuable things a good developer will have.
As far as prestige and foreignization? It's amazing how much things can change in 5 years. Programming is (sorta) becoming cool thanks to things like The Social Network. Two of the three major operating systems are developed almost exclusively in the United States. All major programming languages are in English. Software development isn't going anywhere in the US.
> Umm, I would say that 40 years of experience building software applications is a pretty huge advantage.
I have to admit, I don't see a whole lot of gain after about five years. As everyone is saying, CS knowledge is timeless, but aside from the basics, it's really not that useful outside of research positions. Most of software development is making CRUD apps, integrating APIs, and other tedious but non-complex stuff like that.
Regarding prestige, most of the time when I'm at a non-techie party, when asked what I do, people have nooo idea what "software engineer" means. :-)
> Most of software development is making CRUD apps, integrating APIs, and other tedious but non-complex stuff like that.
Isn't that sort of like saying that most of the restaurant business is flipping burgers? Just because something is more common, that doesn't mean you won't be able to move up the food chain a little after 40 years of programming. Companies still have plenty of hard problems to solve in network performance, hard drive performance, UI architecture, cryptography....
If the vast majority of jobs are in burger flipping, and the limited number of elite chef positions are generally filled by a small subset of candidates (e.g. top schoolers) who were on the elite chef track from the get go, then yes, it's exactly like saying that.
You don't really "work your way up" from years and years of CRUD apps and API gluing to hard CS. In fact, if you get a CS degree and don't go straight into hard CS, you will forget most everything you learned in a few years, not to mention miss out on years of focused practice.
I agree with that. I'm personally stuck in one of these positions, and it's a black hole that's really hard to get out of. Once you've learned most of what there is to learn about implementing CRUD apps and gluing them together, you need to get out as soon as possible, or a couple of years down the line you'll simply not be qualified for any other position in CS.
That, or you can write code in whatever spare time you have left from your 8-10 hours a day job. Based on what's said around HN, people putting in 14 hours a day coding are not that uncommon these days, so perhaps it's something programmers had better get used to. Remember to blog, keep a rich GitHub portfolio, contribute to OSS, and run your own consulting company in the time left over from your full-time job.
My dad is a 60-year-old lawyer who has worked his tail off for 40 years straight, in a suit every day, spending his wealth to look as rich as his friends. I am looking forward to being unemployed at 60, after working 40 years, on a pile of money saved from a career spent on intellectually interesting work in comfortable clothes.
Well, a lot has changed since 2007. I'd argue that programmers are held in high regard now--witness all of the news articles in the past year about "software engineers taking over the world". Further, the work conditions might be better? Maybe it differs by where you're working, but every hip software company out there touts its awesome facilities and free food and free everything.
> I'd argue that programmers are held in high regard now--witness all of the news articles in the past year about "software engineers taking over the world"
This doesn't prove much. The media you follow might run these articles, but most of the world still doesn't know much about programmers, let alone hold them in high regard. If you work in Silicon Valley, I'd say it's a bubble, outside of which people think very differently about ICT.
I have trouble believing this 60-year-old programmer exists. There were only 2,000 computers in existence in 1960. A 60-year-old programmer today who has spent a whole career in computing belongs to a very small group of people.
I'm going to wait and see how this really turns out.
> [P]eople born in this country have more rights to the money being created here than foreigners. Asian countries feel the same way about foreigners. Asian countries are, typically, a lot less open to foreign worker immigrants than is the U.S.
I would like to know why being born in a given country should entitle you to more or less opportunity than anyone else who wants to do business in that country.
I would settle for an explanation of how bad immigration policies can be justified by pointing to states with worse ones.
I fear you are confusing positive with normative claims, even if your highly doubtful positive claim is true. How do you make the citizens of a given state better off on average by preventing them from engaging in mutually voluntary associations with whomever they want?
When you read that San Jose is one of the fastest growing cities in the US, do you conclude that San Jose will soon be a desirable place to live and do business, or that undeserving foreigners from other states are "taking San Joseans' jobs?"
The fact that the OP is confusing a positive claim: "states often exhibit little or no regard for the people outside their boundaries" with a normative claim "states _ought_ to act that way" is plainly evident.
That cities and countries are two different things is also obvious, but you need to offer reasons to believe that they are morally distinguishable with respect to economic growth and immigration.
Your analogy doesn't hold. San Jose is not a country, and it has no legal right to prevent US citizens from entering its borders. If (e.g.) more people enter than its hospitals or roads can comfortably handle, then it can appeal to the Federal government for assistance. The US cannot.
I don't think the original statement is entirely true either - I can think of plenty of countries which don't promote the well-being of their citizens - but I would class those as broken countries, rather than trying to say that the notion of a country is broken. Certainly most western countries exist to benefit their citizens.
The analogy holds precisely for the reason that San Jose does not consider itself beleaguered by the influx of immigrants who want to live there. The fact that you could entertain thought experiments in which the net consequences of immigration are negative (e.g., a giant flashmob of 50 million people immigrates simultaneously) does not suffice to show that San Jose's hospitals and roads are overwhelmed in the actual world.
In the actual world, San Jose is not in fact appealing to the federal government for assistance; it is reaping the benefits that attend a growing metropolitan population and tax base. On average, San Joseans are better off when more people become San Joseans, for all the same reasons that New York is a more desirable place to live (as measured by the demand for housing) than Wichita.
I never claimed that every state takes its citizens' interests into account. I claimed that states often take little or no consideration of the interests of non-citizens. In absolute terms, for example, malaria is a much more pressing human affair than corn subsidies. But corn subsidies primarily affect people within the US, whereas malaria primarily affects people outside the US, so corn subsidies dominate malaria in the contest for the attention of the US government.
There are plenty of cases where immigration can be negative, particularly in the short term, so controlling the amount of immigration makes sense.
In the case of corn subsidies you can argue that the political process has been derailed in favour of special interest groups. If you follow the chain from corn -> HFCS -> diabetes and heart disease, it's not even acting in the citizens' best interests.
Similarly, it's also trade issues that make malaria worse than it has to be. Screens for windows would make it much less likely to spread - but the material is too expensive for most of the people who really need it.
I don't mean to distract with the examples of corn subsidies and malaria. I don't think it's terribly difficult to find a plethora of examples in which a state's concern for rather trivial internal affairs dwarfs its concern with globally momentous affairs that happen not to affect its citizens. The fact that the spread of malaria could be, but has not been, mitigated by relatively cheap counter-measures just goes to show.
I am not claiming that immigration is an unalloyed good, nor that there are no caveats to consider. I am claiming that freedom of travel is a human right, and that this freedom is largely compossible with the flourishing of the new states into which immigrants move. I just cannot locate any moral claim I have against people who want to move to the part of the world circumscribed by US borders, nor any moral obligation they would have to recognize one. Where is the argument? I can certainly understand that citizens in certain industries would prefer that immigrants with similar skill-sets not immigrate, but protection from competition is not a human right.
What makes someone a citizen? In the US, citizenship is a birthright or comes via a long, tedious bureaucratic naturalization process. Other countries have different definitions of citizen.
IMHO citizenship by birthright is a pretty absurd notion, but it's largely the default for historical reasons tied to war and conscription. Originally, governments only granted you citizenship in exchange for the ability to send you to war to protect what were often economic interests.
The world would be far more efficient and interesting if the "market" for citizenships were far more liquid and people could easily choose their government the way they already can within a country by moving from state to state and city to city.
A career in doing stuff you don't like sucks. If you like programming, then you're not going to mind spending time to brush up on your skills. You just see it as shiny new technology & toys to mess around with :)
Anecdotally: a relative of mine, now in his 60s, has been a mostly unemployed programmer since the 1970s. He has yet to figure out how object-oriented programming works -- it's very hard, in his words -- although I don't think he complains much about it on the internet.
I didn't get very defensive here; you and I probably understand (even more so considering it's 2012 and there's a lot of different stuff now) that a lot of these things are not 100% certain... but you gotta admit a lot of them are true of the profession, and yes, they will catch a lot of people (and us, maybe) with their pants down...