Ask HN: Will programming continue to be a lucrative profession in the future?
51 points by thefutureholds on July 3, 2015 | 84 comments
Being a programmer right now is, in general, great. The job market is on fire and the industry is rapidly changing. Supposedly, the rate at which new jobs are created for computer scientists even outpaces the rate at which newly graduated CS majors enter the workforce.

Do you expect this trend to hold?

If so, how crazy will it get? The rumors posted on here of $300k salaries for non-famous engineers already seem unbelievable to me.

If not, what will cause it to end? The only doomsday scenario for developers in the first world that I can think of is a rise in extremely competent dev shops in developing countries.

(I realize that no one can predict the future, but I'm curious about the HN sentiment here.)



The estimate is that 80% of all effort in software goes to maintenance of existing systems, of which 21% goes into bug fixing: https://en.wikipedia.org/wiki/Software_maintenance

Every new system a programmer successfully brings online creates a future need for four programmers to maintain it.
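
The 4:1 figure is just the arithmetic of the 80/20 split. A quick sanity check (a sketch; the percentages themselves are the Wikipedia estimate quoted above, not measurements):

```python
# Hypothetical split of total software effort, per the estimate above (percent).
maintenance_pct = 80                   # effort spent maintaining existing systems
new_dev_pct = 100 - maintenance_pct    # 20: effort spent building new systems

# For every unit of effort spent writing a system, how much maintenance follows?
ratio = maintenance_pct / new_dev_pct
print(ratio)  # -> 4.0: one programmer building implies four maintaining
```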

Just to keep the existing body of software afloat, the existing body of programmers is dramatically insufficient. We have known this problem for decades. You will find lots of old papers raising exactly this issue.

Concerning developing countries, to an important extent they have already been absorbed into the global workforce. Certainly India is already supplying its best and its brightest. Only a limited percentage of the workforce is capable of working as a programmer. Furthermore, developing countries increasingly consume their own supply of software services. Therefore, third-world supply may already be largely exhausted.

With the world increasingly dependent on software (has it ever been increasingly dependent on lawyers?), to stop hiring programmers pretty much means scrapping systems on which organizations depend. This happens all the time already, but the net effect is still to add new systems.


If you're referring to this line: "over 80% of maintenance effort is used for non-corrective actions," it doesn't say that 80% of software development effort goes to maintenance at all. It says that, of the X% that is maintenance effort, over 80% is not spent on bugs. Then it gives a figure consistent with that number, 21% for bug fixing, so 21% + 80% ~= 100%.

It doesn't attempt to explain what X is at all. It could be 10% or 90%. However, in my experience, and according to others' views [1], 60-90% of coding time being maintenance sounds legit.

note: modifying = 5 * writing new code (20%). Understanding is another matter, but also important: [1] http://blog.codinghorror.com/when-understanding-means-rewrit...


The world is increasingly dependent on lawyers, the demand for which increases with the size and complexity of the economy. More economic activity means more agreements to be negotiated, disputes to be resolved, and regulations to be enforced. Moreover, this is high-end legal work, and many new lawyers will enjoy long, lucrative careers. It's not clear to me that I should advise my children to embark on a short, low-pay, low-status career in legacy software maintenance.


It depends on where you live and what factors you include. Where I live, if you just look at money, you'd probably be best off going into construction. You may make less than me, but due to how our taxes work, it'd take me decades to make in salary what the construction guy saves by (a) entering the labor force years earlier and (b) using his skills, network and resources to build his house for a fraction of what it'd cost me.


Who knows what the future will hold, but in the US, there are more new lawyers graduating each year than there are law jobs. Software has taken over many of the jobs the entry level lawyers used to perform.

Law (in the US) requires an extra, very expensive, 3-year degree, and many law school graduates are never able to find jobs as lawyers.

The market will probably eventually correct itself, but the days of law school as a sure-fire way to a high paying job are probably over.


The "more new lawyers graduating each year than there are law jobs" factoid is both stale and irrelevant. Most of those new lawyers had admissions scores well below the 50th percentile and are graduating from third and fourth tier law schools. Who cares about them? I doubt many would have had bright futures in programming, either.

Relatively modest undergraduate performance can get you into a law school like Northwestern's, where 87% of the class of 2013 "found work and reported a salary", and at least 43% of the class had a starting salary of over $160k, a very respectable ending salary for a programmer, especially outside of SFBA (http://www.lstscorereports.com/schools/northwestern/sals/201...).


Almost 50% of new law school graduates can't find jobs. Of course it's likely that the unemployed 50% is the bottom 50% of applicants.

Look at the distribution of lawyer income. There is a large group bunched near the bottom making 40k-60k a year, and a smaller group at the top making above 160k. You need to do really well to be in the 160k group.

>Relatively modest undergraduate performance can get you into a law school like Northwestern's

The median LSAT score for Northwestern is 168, which is right at the 96th percentile, so only about 4% of people taking the LSAT will score a 168 or higher. Even Northwestern's bottom-quartile score is just below the 90th percentile. Their median undergrad GPA is 3.75. How are those numbers relatively modest?

So yes, someone who scored in the 96th percentile on the LSAT and had a 3.75 GPA in undergrad has a decent chance of spending 3 years at a top tier law school where they have a 43% chance of making over $160k a year upon graduating.

Northwestern also costs about $300k to attend.

>87% of the class of 2013 "found work and reported a salary"

I don't think that's really saying all that much. It doesn't say they're working in jobs requiring a law degree. Of course 87% are working at some kind of job--they owe $300k in student loans. Another way of looking at it: 13% of graduates from a top-tier law school are unemployed with $300k in debt.

No one is arguing that lawyers from top tier law schools can't make a decent salary, but there are only a few thousand slots open in the top law schools each year. If you're in the top few percent of law school applicants and you think you'd enjoy practicing law, then by all means go to law school.

But looking at the averages: the median salary for a software developer is about $93k, and the median salary for a lawyer is $113k (from the Bureau of Labor Statistics). Total cost for law school is over $150k on average, and the opportunity cost of not working as a software developer for 3 years is much more than that. Add in interest on student loans (and forgone returns on potential savings) and it will take over two decades before the average lawyer pulls ahead of the average software developer.
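
The "over two decades" claim follows from simple arithmetic on the median figures quoted above, ignoring interest (a sketch; real outcomes vary widely by school and market):

```python
# Median salaries and costs quoted above (USD).
dev_salary = 93_000
lawyer_salary = 113_000
law_school_cost = 150_000
school_years = 3

# The lawyer starts in a hole: tuition plus three years of forgone dev wages.
deficit = law_school_cost + school_years * dev_salary   # 429,000

# Each working year, the lawyer closes the gap by the salary difference.
annual_gain = lawyer_salary - dev_salary                # 20,000

years_to_break_even = deficit / annual_gain
print(years_to_break_even)  # -> 21.45 working years after graduation
```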

Add to that the fact that software developer jobs are expected to grow at a significantly higher rate than lawyers, and that lawyers constantly place near the bottom on job satisfaction surveys.

By the way, I initially planned to go to law school, but every lawyer I talked to was so discouraging that they eventually talked me out of it. A few of them were very successful family friends, but they absolutely hated their jobs, and they warned me that there are much easier ways of making money.


If you're a programmer who can pass algorithm interviews, you have a great chance of scoring near the 90th percentile the first time you attempt the LSAT, and the 96th percentile with practice. Scoring much below the 80th percentile on the LSAT is very poor. In Canada, few students are admitted to any law school with scores that low.

Say you spend $300k on law school. Over a 30 year career, you only have to make an average of $10k extra per year to break even.

The "Jobs Data" tab offers more details: "79.2% of graduates were known to be employed in long-term, full-time legal jobs", "93% graduates were employed in long-term jobs", etc.

The number of software jobs is expected to grow, but is the growth going to be in jobs you really want, or will they all be for 23-year-old coding bootcamp grads?

Again, who cares about the nationwide averages? The 50th percentile Northwestern law grad makes $160k right out of school, and is on track to make several times that as a law firm partner, or somewhat less as in-house counsel. My impression is that most programmers struggle to hit $160k any time in their careers, at least outside of SFBA.

When someone describes the downsides of their job, I take it with a grain of salt. Often, it's a case of "the grass is always greener". Sometimes, members of high-status professions want to downplay their success. In any case, most of the lawyers I've talked to say they enjoy their work (though they do work much longer and less predictable hours than programmers).


>If you're a programmer who can pass algorithm interviews, you have a great chance of scoring near the 90th percentile the first time you attempt the LSAT, and the 96th percentile with practice.

That's probably true. But again, there are only a few thousand slots available each year at top law schools, so for the vast majority of programmers this can't work. Just a few hundred each year taking your advice would change the equation.

>Say you spend $300k on law school. Over a 30 year career, you only have to make an average of $10k extra per year to break even.

That's true, but the cost is more than $300k. The average programmer makes $93k a year; since the would-be lawyer works 3 fewer years because of law school, that's $279k in lost wages + $150k for law school.

Sure the lawyer will likely eventually pull ahead, but extra money near retirement is worth less than money early on. If the programmer invests the extra money early on, the lawyer may never actually pull ahead.
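
The time-value point can be sketched with a toy simulation. All figures and the 5% annual return are illustrative assumptions, not data from the thread, and both careers unrealistically bank their whole salary (only the comparison matters):

```python
# Toy comparison: programmer saves early; lawyer starts in compounding debt.
dev_salary, lawyer_salary = 93_000, 113_000
law_school_cost, school_years = 150_000, 3
rate = 0.05  # assumed annual return on savings (and interest on debt)

def wealth_after(years, salary, start_balance=0.0, start_year=0):
    """Balance if the salary were banked at `rate`, starting in `start_year`."""
    balance = start_balance
    for year in range(years):
        balance *= 1 + rate          # existing balance (or debt) compounds
        if year >= start_year:
            balance += salary        # end-of-year deposit once working
    return balance

horizon = 30
dev = wealth_after(horizon, dev_salary)
# Lawyer: borrows tuition up front and earns nothing during the 3 school years.
lawyer = wealth_after(horizon, lawyer_salary,
                      start_balance=-law_school_cost, start_year=school_years)
print(dev > lawyer)  # -> True: under these assumptions the dev leads at year 30
```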

>"The "Jobs Data" tab offers more details: "79.2% of graduates were known to be employed in long-term, full-time legal jobs"

Legal jobs doesn't mean working as an attorney, or jobs requiring a law degree. It could mean $15 an hour paralegal work, so that statistic isn't useful.

>The number of software jobs are expected to grow, but is the growth going to be in jobs you really want, or will they all be for 23-year-old coding bootcamp grads?

That's possible, but the new jobs for lawyers could be just as bad. From the Bureau of Labor Statistics "Some recent law school graduates who have been unable to find permanent positions are turning to the growing number of temporary staffing firms that place attorneys in short-term jobs."

Software has been eating into jobs that were traditionally done by lawyers, and it will continue to do so.

On top of this, lawyers are limited to practicing in states where they have passed the bar exam, meaning their ability to move to find jobs is much more limited.

>Again, who cares about the nationwide averages? The 50th percentile Northwestern law grad makes $160k right out of school, and is on track to make several times that as a law firm partner, or somewhat less as in-house counsel.

And they admit about 200 new students per year. So yes, if you can get into Northwestern and you like law, then it's a good decision.

>(though they do work much longer and less predictable hours than programmers).

That's a huge caveat. The average programmer could have been the average lawyer instead, worked more hours each week at a higher-stress job so that he can break even in 20 years, and spent the last 10-20 years of his career making a bit more money.

If you like law and can get into a good school, then practice law. But I hardly think the extra debt, stress, and hours worked make it worth it for purely economic reasons.

>When someone describes the downsides of their job, I take it with a grain of salt. Often, it's a case of "the grass is always greener".

This would be the case for both programmers and lawyers, but job satisfaction surveys show that lawyers consistently rank near the bottom below programmers.


Programmers of the world, you are at the forefront of job creation!


I agree that it's impossible to predict the future. There may be some discontinuity that we haven't foreseen. That said, it hasn't happened yet.

I think the run of the mill work will get more commoditized. Building web sites is already pretty cheap. Programmers in Europe make a fraction of what their counterparts in the US make.

But the fact remains, programming is still more of an art than a science, despite all the effort that has gone into making it a repeatable technical discipline. As smcquaid said, software is hard. Many new products require invention on the spot, albeit using more well understood techniques as the years go on. But as the scope of techniques continues to expand, so does the problem space within which they must be applied. Software is being used to streamline more and more of our world.

As much as we try to make it repeatable, it is still an art. Agile has gone a long way towards this goal, which is one reason we see run of the mill work being commoditized. Many people are certified in Scrum. But that will never completely solve the problem. As long as there are new challenges to be solved, those who do it well will be compensated accordingly. There's a reason good doctors and lawyers still make a good living, centuries into the advent of their professions.


Medicine and law require licenses to practice, as well as an exclusive pathway to the main game via lengthy tenure in supporting roles. This is in part to protect their clients and patients. I want to know that the guy operating on me has relevant training and experience. But it also acts as a barrier to entry, keeping wages high in each industry.

Programming requires you to have a free account on Github and some demonstrated ability.

Keep your game up and your network current.


When I started programming in the '90s I thought "this is it. We are so far advanced, there's nothing left in the computing field except making slicker UIs. Hardware will get faster but computer languages are at their pinnacle." Of course I was entirely wrong, but I did believe as a result salaries for software engineers would begin to decline as more people entered the field.

Now in 2015 I believe we are only at the very beginning - a humble start. There will be things during our lifetime that change the computing landscape. Biological circuitry, quantum computers, virtual reality to name a few. Advanced technology will increase the demand for capable software engineers to solve problems we can't even dream of today.

However I don't think $300K salaries are going to be the norm for engineers solving for 140 characters or how to deliver television shows on demand.


Why not? These are the services for which there is demand.


I believe demand for engineers to solve these problems will wane as the solutions become codified into various open source projects.

Think of how difficult a problem that involved terabytes of data was to solve before the advent of Hadoop and now consider how easy it is with solutions like Spark.

Overall this is a good trend. I'm not saying that $300K engineer salaries are going away; I'm saying high salaries will pursue the engineers who are solving unique and demanding problems.


Law of Headlines says "no".

My experience also says no. The two issues I'm facing are:

1. Due to technology churn, after a couple years of experience, your experience loses its market value faster than you can get new experience.

2. At 40, I'm starting to feel age discrimination. When you go on an interview and everyone else is <25, you see that you "aren't a good cultural fit". Younger programmers have started talking down to me like my experience is irrelevant. Then they ask me to debug their code for them.

As a programmer, you can make good money from 25-35. After that, it's starting to look like it's over.


Technology churn: I have been sailing on the concepts of http://www.tcl.tk/doc/scripting.html for the last 10 years. It is true that not all IT employers or clients reason at that level of thinking. But then again, that is a good thing (tm). It means we do not need to waste time discussing it, and we can all save time. To cut a long story short, I do not experience any technology churn at all.

Age discrimination: in online recruitment, nobody asks anybody else's age. What for, anyway? Nobody ever asks me for a resume either. A github account is more than enough. I am also over 40 and I have never made more money than today. I certainly did not make more money when I was 35.

You see, I have always instinctively felt that I needed to stay away from certain corporate situations and practices. I have always found them absolutely imbecile, annoying, constraining, and ultimately useless. What you are complaining about are issues that only occur in a corporate cocktail of idiocies, of which the ones you have mentioned are just two. If you have always evolved in that soup of stupidity, you should not be surprised that you now suddenly get hit by that kind of thing. It was just an accident waiting to happen.


I see a lot of companies with websites that say they "value diversity" and yet have no one person in any team photo over the age of 40, which suggests their definition of "diversity" covers a singularly narrow range of characteristics.

This may contribute to the lack of women in software development: women tend not to be quite as stupid as men with regard to probability, and recognize that a profession where they are obsolete at 40 is not a good bet. Most of us won't make enough to retire on before 40, so only an idiot would go into the business if that really is what we face as we age.

"Software development: it's not a career, it's a lottery ticket!"


I think this will change when there are so many programmers above 40 that companies can't afford to turn them down. Due to age pyramids I can't really imagine total age discrimination to persist in the future.


It heavily depends on the company; Facebook may be all 20-somethings, but Microsoft is much more balanced. As one data point, I'm 43, joined Microsoft 2 years ago, have an offer from another big tech company.


Here in Canada you can have the opposite issue: I was just fired because a couple of 35+ developers said I was not a culture fit, LOL, despite my bringing a project out of development hell and giving the product a doable, realistic feature set.


This scares me. I'm thirty-two and therefore not very far from peaking, salary included, unless I move into a management function.


I'm the youngest senior where I work at 36. Our leads are approaching or past 50. This may be a localized problem.


It is localized: the workforce is aging and the population is not growing fast enough to replace those aging out. For the most part I have seen the ratio of young faces to old ones shrink. It seems my generation (40) is really the generation that grew up coding, and therefore the age bounds are being pushed upward with us. I know in certain markets that is not true, but in others it is.


I'm going to go with: in 50 years, if you can't code on some level, you will be seen as mildly illiterate. Coding will be core to nearly all jobs. There will still be people writing software as software engineers, but around the edges it will be specialists in a discipline writing code to extend the core software to do what they need.

Really, it's just taking how many people use Excel today and scaling that up, helped along by simpler languages, more predictable API conventions (that is, they all tend to work the same way), and the general maturity of software as an idea (it's not old by any standard), along with rising social expectations of what you need to be able to do. Today it's using Excel; in a generation or two it will be basic coding.


If you're saying some jobs that today require "Excel" will require "Excel with VBA", sure. But I'm skeptical that programming will become some sort of new literacy. It seems to me that programming has narrower applicability than, say, high school math. Still, most non-programmers seem to forget their high school math, and they get by just fine in careers where numeracy would be useful from time to time, but is just not essential. No one considers them illiterate.


> most non-programmers seem to forget their high school math

I wrote a book to fix that: http://noBSgui.de/to/MATHandPHYSICS/

This book is like calling `apt-get install hs-math mech calc`.


I was not very interested in math in school or university (I suppose due to the lack of context and focus).

Now after working a few years as a web developer, a goal of mine is to revisit and study math and physics.

I've bookmarked your "No-Bullshit" Guide. Thanks for sharing.


Coding is getting much, much easier. Building an application that might have taken a team of 10 a year in the 1980s now takes a team of 2 just a few months. There's lots of reasons for this, but largely it's because programming languages are higher level, frameworks make the hard things trivial (especially UI and networking), and patching broken code is far easier so you can ship a buggy product without customers getting as annoyed. With SaaS things are even easier - essentially there's a single installation to patch and you can log every error rather than having to wait for feedback.

These changes will continue. Software will get easier and easier to build. What takes time today, say algorithm optimisation, will take a fraction of the time in a decade, because we'll have better tools for doing it and hardware will be fast enough that it won't matter as much. The fact that it'll be a less skilled job will exert downward pressure on wages.

Conversely though, the amount of software needed will continue to increase, so demand will keep wages up.

The question is which force will win out in the end.


> Coding is getting much, much easier.

I don't know about that one. If you want to build a compiler or a scripting engine, you will still be dealing with one variation or another of lex and yacc, and then get bitten by the intricacies and gotchas of writing a compilable LALR(1) grammar. You should also have a reasonable command of C (or C++), but that is rather easy in comparison. Building a compiler is as hard now as in the 1970s, when they first started using automated tools for it.
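
As a small illustration of the kind of care even a trivial grammar demands, here is a hand-rolled precedence-climbing expression evaluator. It is a Python sketch standing in for the C/lex/yacc toolchain the comment actually describes; getting the binding powers wrong here is the hand-rolled analogue of a shift/reduce conflict in an LALR(1) grammar:

```python
import operator
import re

# Each operator: (precedence, implementation). Higher precedence binds tighter.
OPS = {"+": (1, operator.add), "-": (1, operator.sub),
       "*": (2, operator.mul), "/": (2, operator.truediv)}

def tokenize(src):
    """Split an arithmetic expression into int and operator tokens."""
    return [int(t) if t.isdigit() else t for t in re.findall(r"\d+|[+\-*/]", src)]

def parse(tokens, pos=0, min_prec=1):
    """Precedence climbing: each recursion handles tighter-binding operators.

    Returns (value, next_position). Assumes a well-formed expression.
    """
    lhs, pos = tokens[pos], pos + 1            # an expression starts with a number
    while pos < len(tokens) and OPS[tokens[pos]][0] >= min_prec:
        prec, fn = OPS[tokens[pos]]
        rhs, pos = parse(tokens, pos + 1, prec + 1)  # right side: tighter ops only
        lhs = fn(lhs, rhs)                     # left-associative fold
    return lhs, pos

def evaluate(src):
    value, _ = parse(tokenize(src))
    return value

print(evaluate("2+3*4-6/2"))  # -> 11.0 (precedence respected: 2 + 12 - 3)
```

Even this toy must pin down precedence and associativity explicitly; a real compiler front end faces the same decisions across a far larger grammar.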

> Building an application that might have taken a team of 10 a year in the 1980s now takes a team of 2 just a few months.

Yes, if you are always building a variation on the same database application, it should indeed become easier after a while. That is the essence of a "framework". You are always building the same application, with just a few variations here or there. That will indeed give the wrong impression that building software is getting easier and easier. Building such frameworks, however, is not becoming easier, and we continuously have truly new applications to build, not just variations on the same one.


To use a car analogy you're suggesting "the automotive industry still finds designing a car difficult because look at Formula 1 racing!", while ignoring everything that makes designing and building cars much easier/quicker/safer/cheaper.

There'll always be edge cases where generalisations break down but looking at them doesn't give you any insight to the bigger picture.


I think you're always going to see that. When a system is pioneered, it is very, very, very hard to get any kind of traction.

To extend the car analogy: Karl Benz spent more than 3 years trying to find an engine configuration that worked and did not blow up. Then he spent years more getting an engine of the correct size to provide the correct amount of force for his horseless carriage, plus things like the transmission, gearshifters, and so on. Designing the first automobile, start to finish, took one man, going through 3 companies because they wouldn't let him, from 1871 to 1885, 14 years total. And keep in mind that the 1885 version was extremely unreliable and hard to control. It got out of control during a demonstration and crashed into a wall 3 months after its construction. It took until late 1888 for a "usable" model to get onto the market (and this is using a flexible definition of the word usable: his wife made 3 design changes on her first trip with the car, one because the brakes didn't work downhill, another to the ignition, and something about the fuel line).

But two things have changed today. Building a working car is much easier; anybody can do it in a week. Building a car that people might want to buy: 2-3 years at least. Building a car with a new engine (the only way you're going to really improve performance), or any large difference (e.g. hybrid-electric): 6-10 years at least. Building a fundamentally different car like the Tesla took barely less time than Karl Benz needed: 2000 (when real work on the Tzero started) to 2009 (working prototype presented) to late 2012, when the first Model S rolled off the assembly line. This happens because the standards have gone up a lot.

Yes, writing software is easier. But software that qualifies as "good" is held to much higher standards, which makes things take longer. I would argue that these two forces are in competition. Some years the tools win, and you can write software faster. Other years people introduce the web and the "standard" 5 middleware layers (client-side JavaScript, load balancing, web server, business logic server, database server) and it takes a lot longer. Other years people demand that a local bakery's webpage be "scalable" and it takes a LOT longer.

If the car industry is a good example, the time and effort required to write software will effectively not go down by more than 20-30%, unless you compromise on quality.


It's not getting easier, it's just that the tools are getting more powerful.

Programming in 2015 is at least as complex as programming in 1990, in fact I would say it is more so, with the huge variety of frameworks, tools, APIs, etc you have to deal with.

It's just that all those tools and APIs allow you to produce many times more functionality per coding hour.


It already is a bad profession. All professions are bad, if your goal is to make money.

If your goal is to maximize money in your pocket, standard employment is a bad model.


What will happen to the value of a programmer over time?

Do you think opportunities to make an economic impact with software are increasing, decreasing, or staying the same? (I'd say they are increasing, as the world depends more and more on software.)

If you want to earn high wages, you should probably aim for skills that are (1) in high demand, (2) in short supply, and (3) not easily replaced by substitutes. So find a niche where there will be real demand and supply is limited (because it's hard to master: think "programming" + electrical or mechanical engineering, or computer vision, or...).


Depends how far into the future you're asking. 5 years, 10 years, 50 years? My two cents is that everything is cyclical. It's easy now to forget about the 2007 real estate crash, the 1999-2001 dotcom bust, the recession in the late 80's and early 90's. We're roughly due for another economic downturn, but the cause is anyone's guess and the software industry is not immune to external market forces. Nothing stays up forever, and nothing stays down forever either (in economics and business).


That remains to be seen. Software has been around far less time than the boom/bust cycle.


...and yet it's already been through one iteration, two if you count the 1983 video games crash (video games are software), and more and more learned heads are starting to look uneasily at the idea that it's on the upslope of a third.


No. Give it another few years and the fact that it's a red-hot job market will catch up and we'll end up like lawyers. The perception of high salaries and the reality of a lack thereof.


I think it will be more than a few years. It's basically been like this for at least the last 30 years (as long as I've been in the industry). Demand outpaces supply.

Schools see programming as a money-making industry for them, so they mint new programmers as fast as they can (regardless of ability). People in other industries move over to programming because there is such demand that you can even get a job without any qualifications. The end result is that even while demand is outstripping supply, the vast majority of the supply is underqualified to do the job. Employers burned by underqualified employees agree to spend more and more money to hopefully find the needles in the haystack. If they collude to agree not to pinch each other's needles, they get their hands slapped ;-).

Even if there is a world-wide economic downturn (like we just experienced), it plays into the hands of high salaries because under performing companies dump under performing employees on the market making it even harder to sort the wheat from the chaff.

It's a bit like professional sports. There is an endless supply of people willing to play sports for a living. The competition to get the best of the best drives salaries up for everyone who gets a job. I was surprised to learn that one of the Japanese soccer players that I follow who has been sitting on the bench of a poorly performing 2nd division European team for 2 years is making more than $1 million a year.

However... although I don't see salaries going down for a very long time, I do see it getting progressively harder to get a job. Perhaps, as you say, like the lawyers. I think businesses are going to want to get value for their money and similar to the sports industry I can see a much more laissez faire attitude towards employees. Produce and you can stay (maybe). If we don't like you -- for any reason -- we'll dump you for the next guy.

Maybe I should become an agent...


You say this like programming has never been a red-hot job market in the past. Also, a lot of lawyers are still highly paid, even newer ones. Being a lawyer, like being a programmer, is no guarantee that you will be making 6 figures, but you're a lot more likely to in either of those jobs than in most others.


As a lawyer turned programmer, I've written a bit about this here: http://www.williamha.com/economics-of-software-development-v...

I also did an interview on Above the Law about how some people in the legal profession may be better off in software: http://abovethelaw.com/2015/06/should-you-leave-law-and-lear...

And for the record, the majority of new lawyers in the United States are paid as much as, if not less than, a junior software developer these days.


I'm a developer, my wife is a lawyer though not in the US.

I think the assumption that you call out yourself (the 64-year-old developer) is the main difference over a lifetime of work. How many developers are going to find much work past 45-50 unless they are already well positioned as independent consultants or have a name brand? That's 15-20 years more earning potential for the lawyer.

The other thing is the startup x-factor. Because so many startups fail, studies have shown that developers who go the startup career route actually make less on average than their counterparts at larger established companies. Very few people have the drive to make partner at a large firm, but even fewer are lucky enough to be an early employee at a company that makes their equity worth more than a year's salary.


I know I'm going against the normal view of things here, but as I near 50, I don't see any difficulty with people my age finding work. The group I work with would kill to find someone good with 30-40 years of experience. They are just really hard to find -- either they are already happy where they are, are a consultant, are a manager/executive, or they aren't very good.

The problem is this: if you have more than 20 years of experience and want to continue being an employee programmer, you only have a few choices. You can advertise yourself as a senior programmer and try to command a senior salary, or you can advertise yourself as a more junior programmer and ask for a low salary. If you have the ability to do the senior position, then there is no problem. If you do not... who the heck is going to hire someone with 20 years of experience who is still performing like a junior? Even if they are cheap, there is value to the employer in imagining that the employee will develop over time. Someone who flatlined 15 years ago is not someone you want on your team.

The moral of the story is: if you want to be a programmer for your whole career, you need to stay on the front edge of the industry. You have to work your butt off day in and day out, not only to stay current but to always improve yourself. There will never be a day when you have "made it". You will have to justify the senior position that you are going to occupy and show that you are better than the up-and-coming young wolves behind you. If you are not prepared to do that, then you are better off not being a programmer.


The thing is not everyone is capable of being a senior developer or in a leadership role regardless of how much they know. I'd go so far as to say most aren't.


> How many developers are going to be finding much work over 45-50 unless they are already well positioned as an independent consultant or have a name brand?

Why exactly is this? And why would anyone go into a profession where they will be unable to find work when they are halfway through, at the peak of their powers? In every other intellectual enterprise -- doctors and lawyers particularly -- practitioners hit their peak earning years just as software developers (supposedly) become unemployable.

This makes little to no sense. It isn't as if good older developers become magically incapable of learning new languages or frameworks. It isn't as if they become less reliable than a 20-something just out of school. It isn't as if they suddenly forget 25 years of history that lets them make more accurate effort estimates than anyone else.

So where does this perception that developers must be young come from, and why does anyone go into a profession that by definition (apparently) is going to require them to change fields mid-career?


I'm not promoting it or saying I agree with it (which I don't); I'm saying it is a reality for a lot of older developers.

I suspect it is that people think they are less flexible and less willing to work unsociable hours, though.


The lawyers who find positions may get paid well, but the issue isn't just salaries - the legal market was completely swamped about 5 years ago, around the same time many of my friends were graduating law school. A good number of them never managed to find positions with firms, and many either returned to grad school or are (funnily enough) now programming.

edit: a word


I've seen this a couple of times already. It always shakes out, with the "real programmers" always having jobs and increasing salaries.


Forecasting the future is obviously impossible, but we can approach your question in a structured way.

First of all, allow me to question your premise. Software engineering is quite a lucrative job in the US in general, and in Silicon Valley more specifically. But in many European countries it commands only an average salary (with variations due to seniority, obviously). In Eastern European countries, India, and China it is for sure a well-paid job, but not nearly as lucrative as it is in the US.

Having clarified this point: the drivers of such high salaries are resource scarcity and high demand.

On the demand side, there are a few relevant uncertainties: basically, whether the economy will sustain the growth of the software industry (in other words, that there is no tech bubble) and, on a longer timeline, whether evolution in programming technique will outpace the growing demand for software or not.

On the supply side, the main question is whether there will be immigration reform and how it will impact the tech sector. Furthermore, more people are training in coding now than before, both in universities and after university through bootcamps and similar institutions. So the critical uncertainty is whether the newly trained people are numerous enough and good enough.

My humble opinion, given all these uncertainties and their probable resolution in the long term, is that sooner or later "software salaries" will have to adjust downward. The only questions are when and how fast: will it happen in the immediate or the far future, and will it be a hard landing or a smooth erosion over time?


There are a few things working against the prospect of it being lucrative and a few working for it. I think it's a safe field for 5 - 10 years, beyond that it starts becoming a bit murky. Here are three potential reasons:

1. Influx of supply. This has been steadily happening for, well, the entirety of the life of the profession, but the prospect of big money at a lower barrier to entry is making this feasible for a lot more people. Money attracts, which draws parallels to the rush into law in the past.

2. [ Feeding #1 ] Cascading loss of professions will move people into other parts of the labor market. As automation removes entire professions, those people will be forced into other markets and will begin to tighten competition and feed the supply influx further.

Which leads to the unfortunate ...

3. Increasing automation of developer tasks will reduce the need for humans in some programming/development work. Not entirely, obviously, and not in the near term, but a lot of the functions that require humans -- general problem-solving scripting, testing, simple goal-based programming -- will begin to fall to automated agents. On a similar note, the convergence of technologies and philosophies will remove a lot of the dissonance between stacks, devices, and platforms.


I think there will always be jobs where you get paid to write code.

If you narrow your definition, it'll look different. The jobs may not always be in the Bay, they may not be using the coolest front-end framework or functional language du jour, and you may not get paid to go to four conferences a year. It might be maintaining custom billing software in .NET in Boise, but in the end it'll be programming, and reasonably lucrative.


The whole tone of this question smacks of "gold fever". I'm guessing you're thinking about entering the field, so I'm going to tell you what someone told me during the original dot-com bubble:

Only become a programmer if you truly enjoy it.

If you're getting into programming solely because you think it's "on fire", you're going to have a bad time. It is always possible at any given moment for various reasons that VCs could collectively pull back on tech, or some major employers could downsize, and flood the market with excess talent. No one can predict this shit. There was a time post-bubble when many programmers could hardly give away their skills let alone make big bucks. Not long later there was a time when it was a foregone conclusion that all software development was going to India. Now here we are talking like "is this money wagon going to go to infinity?" and I say stop. Just stop. Do it because it suits you. Do it because you like the work. Please don't do it for "the money" which may or may not deliver for you, ever.


I believe the statement to be true. The rate at which new jobs are created for computer scientists even outpaces the rate at which newly graduated CS majors enter the workforce.

Meaning demand is going to increase at an increasing rate, while supply will grow at a constant rate.

In the future, while technology may replace many things, it will never replace entertainers, engineers, designers, or creative people in general.

The killer fact to me is that software is hard. It is something that cannot be seen. Imagine trying to diagnose traffic problems when you can't see the cars or the roads or the traffic signals. Software is so hard that even if you build a system that accomplishes the main objective, it can be extremely hard to modify the system in the future if the quality is low. Top talent will always be in demand. Over time a few expensive talented programmers will always create a better product than hundreds of cheap mediocre ones.


Eventually, sure, the trend will die down. But given that the demand has been generally increasing for decades, I don't see a reason to think that it will die down until decades from now.

Look around and you see terrible software everywhere. The medical industry, the legal industry, the insurance industry, small businesses, government entities, etc., etc. Almost all of them have terrible websites, terrible internal tools, terrible uptimes, terrible everything. Billions of man-hours are spent on tasks that could be better done by software. There's good money in solving those problems; the problem is that there are only so many good engineers in the world, and the hot technology companies snatch up almost all of them.

They say that software is eating the world, and it's true, but so far the world is only 5% eaten. There's plenty of work left to do.


I compare it to the engineer of the industrial revolution, and it will continue to be the same until there is a fundamental shift in the way we transfer instructions to computers. The biggest threat to the profession of software development is AI: when something can infer what a non-technical person means, it will spell the end of the dedicated developer. Then, much like the telegraph operator, the profession will disappear almost overnight. That being said, a lot of professions will be gone just as quickly.

Until then, I would like to offer some perspective from someone who has been in the industry for some time and has seen the tail end of the home computing revolution, the birth of the web, and all the iterations up to the modern day.

I had this discussion with a buddy who left software dev after the .com bust and the first wave of offshoring of development jobs. He, having come from manufacturing, was certain that offshoring spelled the end for US developers.

Personally, I had a different outlook, one that we should all remember when we sit at the negotiation table: it is estimated that 10-20% of the population is mentally competent enough to write software, and I would say that is a pretty good estimation. Probably another 5-10% border on competency, so at best 30% of the population even has the capacity to code. We know the old 1-in-10 rule: that for every 10 programmers you hire, 1 will be really good. So that puts us at, what, 3% of the population being exceptionally good developers? Probably another 6% being above average, probably in the range of 10-15% being merely competent, and the remainder costing companies more money than they are worth in time lost from the good developers carrying them.

So if you think about it, somewhere around 9% of the world population is truly capable of good software development. And that is why, despite the .com bust, the waves of offshoring, and whatever will come before AI, the developer is one of the few bastions of the American middle class. It is by its nature self-limiting as far as competition goes.
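The back-of-the-envelope arithmetic in the two paragraphs above can be checked directly. Every input here is the commenter's rough guess (the 20%/10% upper bounds and the assumed 2-in-10 "above average" share), not measured data:

```python
# All inputs are the comment's rough estimates, not measurements.
competent  = 0.20   # upper bound of "10-20% mentally competent"
borderline = 0.10   # upper bound of "another 5-10% border on competency"
capacity   = competent + borderline           # "at best 30% ... capacity to code"

really_good   = capacity * 0.10   # the "1 in 10" rule -> ~3%
above_average = capacity * 0.20   # assumed 2-in-10    -> ~6%
truly_capable = really_good + above_average   # -> ~9%

print(f"capacity: {capacity:.0%}, exceptional: {really_good:.0%}, "
      f"above average: {above_average:.0%}, truly capable: {truly_capable:.0%}")
```

Which reproduces the comment's ~9% figure, under its own assumptions.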


Nope, in fact you are going to see salaries start to retreat as more people get into the industry.

The field that will collapse the hardest is going to be data science. I speak to this as a data scientist; most of those I meet don't have the knowledge to perform their duties. Eventually our salaries will decrease to those of traditional office workers/middle managers.

What is an engineer to do in this situation? The answer is to specialize, or to gain exclusive access. Specialization is obvious; exclusive access means things like clearances, certifications, and networks. Of course, this omits paths such as entrepreneurship.


> Nope, in fact you are going to see salaries start to retreat as more people get into the industry.

That is what they said 15 years ago. Getting into the industry is not hard; it is staying in that is hard. You need a respectable amount of talent and willpower for that. There is an incredible amount of tourism going on in our industry. Put out an advert for a programmer and you will understand why recruiting programmers is so exceedingly costly:

http://blog.codinghorror.com/why-cant-programmers-program/

Like me, the author is having trouble with the fact that 199 out of 200 applicants for every programming job can't write code at all. I repeat: they can't write any code whatsoever.
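For readers unfamiliar with it, the FizzBuzz screen the linked post describes is roughly the following (a minimal sketch; the exact wording of the exercise varies from interviewer to interviewer):

```python
# Classic FizzBuzz: for 1..100, print "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, "FizzBuzz" for multiples of both,
# and the number itself otherwise.
def fizzbuzz(n: int) -> str:
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print(", ".join(fizzbuzz(i) for i in range(1, 16)))
# → 1, 2, Fizz, 4, Buzz, Fizz, 7, 8, Fizz, Buzz, 11, Fizz, 13, 14, FizzBuzz
```

The point of the screen is not the puzzle itself; it's that, per the post, a startling share of applicants cannot produce even this.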


Is this actually the origin of FizzBuzz? I was under the impression it had been around at least since the '90s.


>>Nope, in fact you are going to see salaries start to retreat as more people get into the industry.

This would only happen if the growth of supply (of labor) is greater than the growth of demand (for software).

Which has simply not been the case. Just the opposite, in fact.

As software "eats the world," we see an ever increasing demand for programming. This is likely to remain the case for the foreseeable future.


Coding is a valuable skill because computers can do certain tasks far more efficiently than humans can, and in ways that add massive value to our lives. Every engineering industry I know has been transformed by tools created by programmers. Whether you are solving matrices or simulating aerodynamics, the cost of doing certain tasks and prototyping has reduced to almost zero thanks to computers.

And, as far as we can currently tell, there's a massive amount of untapped potential. Whether you're programming trading engines or social media, the right program, even a small program, has massive transformative power.

Even more significant: if a programmer writes a piece of code that makes a 0.01% improvement to a product, it's virtually free to distribute that improvement to hundreds of millions of people. A programmer may not even need to be very good to add millions of dollars of value to a company if they are put in the right position.
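To make that leverage concrete, here is a toy calculation; the revenue figure is invented for the example and does not come from the comment:

```python
# Hypothetical company: $10B/year revenue (made-up figure).
# A 0.01% improvement, distributed at near-zero marginal cost, is worth:
annual_revenue = 10_000_000_000
improvement    = 0.0001          # 0.01%
added_value    = annual_revenue * improvement
print(f"${added_value:,.0f} per year")   # about $1,000,000
```

One engineer-year of salary, recouped by a rounding-error-sized improvement, which is why the marginal programmer can be worth so much at scale.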

As time goes forward, we are seeing substantial developments to the coding practice. Languages are getting better, compilers are getting better, theory is getting better, test suites are getting better. All of this results in programmers being that much more useful when put in the right situation. Iterating on programs is also very cheap. Rather than needing to build an entire new plane engine, you just recompile your code and throw it at the test suite again.

Will it stop? Probably not in the next few decades. As simulations and other tools get better, we may see other engineering industries catching up, but right now programming is unique in how accessible it is, how quickly you can iterate on a design, and how easily you can distribute improvements/products to millions and even billions of people.

Even if the supply of programmers expands dramatically, I can't imagine running out of things to throw them at. There's always another feature, or another product idea, or some other project that would take a substantial amount of programming resources to complete.

High salary programmers are here to stay. I don't know where the ceiling is, but as the tools continue to improve, so will the justification of the salary. Even if we are in a bubble right now, the bubble bursting will still leave programmers in a very comfortable position.


I believe that people who can work as part of a team to solve difficult problems using computational tools will continue to be highly paid.

I doubt whether people will continue paying high salaries for hacking up CRUD apps.

Concretely, I expect the peak salaries for people who program for a living to continue to increase more or less monotonically. However, I also expect the average salary in the field to fluctuate, and I believe that 40 years from now, these past few years will stand out on an inflation-adjusted graph as "the good years".


No.

- Risk of loosening of immigration restrictions

- Large and increasing supply of skilled workers from East and South-East Asia, who are willing to work for lower wages.

- Skill based work is hard to defend and retain in the presence of significant competition.

- Tech companies may create and make widely available training programs to become a software engineer, also increasing supply. (I don't understand why this hasn't become the main purpose of MOOCs).

Chamath Palihapitiya called programming the "blue collar work of the 21st century".


MOOCs have been shown to be ineffective; most people drop out, and if you learn from a MOOC it's unlikely you will cover most of the concepts. I saw this when I worked as a tech recruiter for a year: there were tonnes and tonnes of iOS "developers" who learned from coding shops and MOOCs, and they never make it past FizzBuzz, or build anything more sophisticated than a Twitter API call app.


We've seen the code that comes from companies and societies that treat it as blue-collar work. Bloated and unmaintainable. Quantity over quality.


All these factors don't matter if demand increases as it has done during the entire prior history of programming.


The job market in CS is cyclical, with a roughly 20 year cycle; we are now at the point where supply will start to catch up with demand; 'everybody' will study CS, and in 4-5 years we will have a glut of beginners and bad programmers.

But, having experienced the bust of 2001-2002, I expect that, even at the bust, it will not be that bad. Also, programming trains you to think mathematically and analytically, so it is easy to move to a slightly different function.


The best place to be will be in a position to leverage automation and artificial intelligence as a multiplier of your labor and income potential, instead of competing with or going against those things.

The ability to run massive Internet services with very small teams will continue to increase. The spoils will perpetually increase for those teams. Ride that trend, rather than having it ride you.


And then people make lots of money maintaining COBOL and Java code. Or securing custom PHP applications...


I think in the future there will be programs that write 50% of the code for you in some way, and (if DARPA gets this right) you may have code that lasts a lot longer and does not need to be rewritten. So just as programmers are making jobs in other professions irrelevant now, they will likely make the jobs of other programmers irrelevant in the future.


I don't really understand the thing about computers writing code and this resulting in less demand for programmers. Don't computers already write a lot of code in the form of compilers, frameworks, preprocessors etc.? Seems that these technologies only make demand go up.

If anything, code generation is the worst option out of any form of code reuse and results in code that is entirely unmaintainable. Code generation is often a sign of a language that handles abstraction badly, like Java.

I can imagine some form of AI that, instead of being trained to recognize images etc., is trained to present an interface. However, this will likely be deathly slow and require custom code for anything custom anyway.


A friend of mine told me how powerful SharePoint (a M$ product) is for automating workflows, information sharing, reports, and so on. He's technical, but definitely not a programmer, and he's able to run/script all the business processes in his company.

So what you're saying makes sense: with the right tools (frameworks, GUIs, DSLs), many of today's "coding jobs" might go away.


People like Chris Granger are making programming more accessible:

http://www.chris-granger.com/2015/01/26/coding-is-not-the-ne...


The necessity of the human practice of software development doesn't end until the onset of general artificial intelligence. In the time between then and now it will increasingly be the hammer that is used for humanity's nails.


As long as technology exists, there will always be a need for programmers. We can't progress without new technology.


xyz will be a lucrative profession if and only if 1. xyz is gated 2. xyz is intrinsically hard 3. fewer people attempt to get into xyz over time 4. xyz yields higher wages per hour relative to other professions.

I don't see programming fitting any of these axes, let alone all of them.


I don't think the presented axes are correct (#3 I'd argue isn't necessary, though there may be a different supply factor that is; and #4 is itself equivalent to being a currently lucrative profession, not a separate necessary consideration). And programming certainly currently meets #4, and arguably meets #2, at least for some significant subfields of programming.


In the far future? No, of course not. It depends when computers can program themselves (aka hard AI or, if you will, the "singularity"), which is either 10 years, 20 years, or 100 years from now. It is difficult to predict when programming will end as a profession, but it will be one of the last ones to go if you even believe in the end of work.


I think the hype right now is overblown.


Yes, the wages will go down, but so will everyone else's, even more.

Computers, AI, and automation are eating all jobs; it makes sense to me that by the time programmers are redundant, everyone else will be long screwed.

PS: IT enrolments are down, not up, ATM.


Yes.


I'm 34 years old, and I don't think this question matters at all. As someone who's been professionally programming since well before YC existed, I fall into a strange category: I'm really young.

To anyone even close to my age or older? Be very worried. To everyone else? Be worried about the non-programmers. If you can hack it at hacking, then you have nothing to fear in any economic environment.

Any individual might see a downturn because the overall economy sucks, or because the dominant culture doesn't recognize them as fully human. We live in far stranger times than that. Don't worry about the upper $300k tier, rather worry about the bottom falling out of the $60k tier market. (That's still triple the poverty line.)

If you are worried about anything else, you should literally die.


> To anyone even close to my age or older? Be very worried.

Huh? Why do you say this?


I don't know. I got downvoted on this post a lot.

Technology jobs are far less secure for the humans older than me with more experience. Ageism wasn't the only reason I posted this, but I'm guessing nobody that downvoted me was older than me. (eager to be wrong)



