
> In 2014, the company decided it could do without many of its testers. Mary Jo Foley reported that "a good chunk" were being laid off. Microsoft didn't need to bother with traditional methods of testing code. Waterfall was out. Agile was in.

An average software dev today is expected to do the work and have the skillset that used to take a half dozen people or more.

There were of course even more roles in the prehistory, but even just thinking back to the 2000s, I can count at least: RDB design and management; planning and specification work; interfacing with the customer; testing; the merging of UI and backend engineering into "full stack"; the merging of coding, operations and admin into "devops"… I'm pretty sure that the only reason devs aren't yet expected to do their own sales is that the sales department is a profit center and, as such, sacrosanct.





As someone who worked in the film industry for 15 years, this is why I get wary of anyone telling me a new tool “will just make things easier.” All it does is raise the expectations of what I am supposed to do, and my role ends up expanding every six months without extra compensation.

I don’t know a single specialized camera operator anymore. Literally every shooter I know is also a competent editor, which I do think is neat and makes us better Cam Ops, but it also means people expect everyone to shoot and edit. Also capture excellent sound. Don’t forget the set has to look good. Make sure you’ve got a good Rolodex of locations ready to go as well.

We don’t need to keep every individual role just because it’s traditionally been there, but in a lot of industries we’ve clearly gone in the wrong direction. And with something like QA/QC I could see that being a huge problem, because the payoff is not obvious, so upper management is going to want you to get something out the door no matter what state it is in.


Even early in its history, Microsoft was famous for merging these together into a single role: "developer". I remember reading (but can't find now) an article about how IBM had all these fancy roles like designer, architect, tester and the lowly programmer, and Microsoft's approach of integrating them is what allowed them to succeed over early competitors.

Remember Steve Ballmer chanting "Developers, developers, developers" (in about 2000)? That's why.

I'm not saying I totally agree (although I think I do at least a bit), just that this is hardly new.


The “developers” mentioned in the monkey boy song were actually third-party developers. Ballmer wasn’t talking about Microsoft’s internal teams or nomenclature.

https://www.windowscentral.com/microsoft/the-real-story-behi...


Try the Developers remix. Total earworm.

https://youtu.be/ug4c2mqlE_0?si=qtqu7tOC7Xpw67aN


I see your remix and raise you: get on your feet! https://m.youtube.com/watch?v=edN4o8F9_P4

Exactly the clip I meant! :-)

> Even early in its history, Microsoft was famous for merging these together into a single role: "developer".

No. Microsoft was famous for having a role called Software Development Engineer in Test.

> Remember Steve Ballmer chanting "Developers, developers, developers" (in about 2000)? That's why.

No. Ballmer's chant was about 3rd party developers.


> An average software dev today is expected to do the work and have the skillset that used to take a half dozen people or more.

I think that depended (and still depends) a lot on the organization and the nature of the product.

I distinctly remember doing backend and some frontend development, requirements specification, database design, customer interfacing and even a bit of ops, all on the same job and with the same title in the 00's. That was in a small-to-medium company and my clients were on the small side so the projects might not have even had half a dozen people to begin with.

Larger organizations and more enterprisey projects would have had more specialized and limited roles: customer/specs people, possibly frontend and backend devs, DBAs, testing people, and those in charge of ops and environments. In my experience, that's still more or less true in enterprisey development today.

I think a part of the problem is that while new technologies have emerged and reduced the need to manually work with some older or underlying technologies, they haven't replaced previous skills.

Containers have reduced the amount of work needed to deal with deployments and environments but they haven't removed the need to know servers or operating systems. Cluster management can reduce the amount of manual work on setting up containers but it doesn't remove the need to know the underlying container engine. So now you need to know Linux servers and containers and k8s and whatnot just in order to manage a local backend development setup. At the same time, frameworks have made a lot of frontend work more manageable but they haven't made JavaScript or other underlying stuff disappear.
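
To illustrate, here's a minimal sketch using the Python Docker SDK (docker-py), assuming Docker Engine and the SDK are installed locally; the image, port, and container name are only examples. Even a scripted one-container local setup quietly assumes you know what's inside the Linux image, which port the database listens on, and how its configuration works:

    # Minimal sketch: spin up a throwaway Postgres for local backend development.
    # Assumes Docker Engine is running and the docker-py package is installed.
    import docker

    client = docker.from_env()  # connects to the local Docker daemon

    # Even this single call leans on knowing the base image, the port Postgres
    # listens on, and how the official image is configured via env vars.
    db = client.containers.run(
        "postgres:16",                                  # a Debian-based Linux image
        detach=True,
        environment={"POSTGRES_PASSWORD": "dev-only"},  # illustrative value only
        ports={"5432/tcp": 5432},                       # container port -> host port
        name="local-dev-db",                            # hypothetical name
    )
    print(db.name, db.status)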

Thus the scope of what being a fully-versed full-stack developer entails has grown.


No doubt, but the breadth of required knowledge today is vast.

Sure, we were "webmasters", but there is a huge difference between tinkering with some PHP, MySQL, HTML, and Apache, and being an expert on the latest cloud offerings, security practices, etc. One could spend six months in analysis paralysis these days without writing a line of code.


I don't think that is real - I don't believe every company could afford to have DBA, Dev, QA, Business Analyst, Ops, etc. as fully separate FTEs.

Only the biggest companies could have that. If you have a single application to run, there isn't enough work for a full-time DBA; in a big company with multiple projects you can most likely have a DBA department that handles dozens of databases and the running infra. Same with Ops: you can have SRE or Ops people doing your infra if you have dozens of applications to run.

The problem is that having separate QA/DBA/Dev/Ops departments kept breaking down, because people would "do their stuff" and throw problems over the fence. So everything would go to shit, and we have seen that in big companies.

The other thing is - I have read about multiple companies trying "to be professional" and burning money on exactly this kind of separation of roles, but in reality you simply cannot afford a full-time hire, let alone a whole department, for DBA or QA or Ops or even plain Dev - unless you are basically swimming in money.


My guess is that this change has its roots in the move from physical media delivery of software to internet delivery.

My instinct is that there is some general principle that relates “friction” and “quality”, although I’m not sure I have the vocabulary to describe it.

I.e. where there is a barrier to entry, quality of results tends to improve.

I also see this in the ease of publishing to social media, the “old music is better” bias (time has sorted the wheat from the chaff), and so on.

Perhaps there’s a well known description of this phenomenon somewhere already…


> My guess is that this change has its roots in the move from physical media delivery of software to internet delivery.

> My instinct is that there is some general principle that relates “friction” and “quality”, although I’m not sure I have the vocabulary to describe it.

I think the principle is: the greater the impact of a mistake, the more effort you'll put in (up front) to avoid one. The more friction, the greater the impact.

When software was distributed on physical media and users had no internet you basically had only one (1) chance to get it right. Buggy software would be buggy effectively forever. So (for instance) video game companies had insane QA testing of every release. After QA, it'd get burned onto an expensive cartridge and it'd be done. People would pay the 2025 equivalent of $100+ for it, and they'd be unhappy (to say the least) if it didn't even work.

Once users had internet and patching became possible, that slipped a little, then more. Eventually managers realized you could even get away with shipping an incomplete, non-working product. You'd just fix it later in a patch.

Now, with software being delivered over the internet, often as SaaS, everything is in a constant state of flux. Sure, they can fix bugs (if they choose to), but as they fix bugs they're constantly introducing new ones (and maybe even removing features you use).


I don't mind doing some of these tasks, but if I could go my entire life without speaking to another customer, or even their engineering team, I would die happy.

It's simple supply and demand. If the average dev can do that, then that's what will be demanded. If the average dev can't do that, then there's no use demanding it since there's no one to fill that opening (at that price point).

The software dev supply market is absolutely saturated.


Like a reverse Henry Ford (assembly line)

Some (many?) startups hire for impact and are looking for people (ICs!) who can "move metrics", so not that far from sales.


