Hacker News
The market is mostly being flooded with web developers (reddit.com)
52 points by synergy20 on April 14, 2023 | 48 comments


I do suspect it's true that there are more web developers (frontend or fullstack) laid off than any other category, but that is likely a result of much higher demand, given that A) web had a decade+ head start on mobile and B) the app store gold rush had already started to taper off years ago, leaving much higher friction and thus lower demand for true native apps (including the resulting strategy shift towards web views/react-native to hedge on development costs).

Overall this means native might be slightly less saturated, but I think it's a real risk for junior developers to conclude that this means it's a better career path than web. Web is still where the greatest demand is. Other specialties like embedded have potential but aren't at the same volume. And of course there will always be demand for a smaller number of maintainers of legacy systems (e.g. COBOL/mainframe) that provide extremely high ROI but are generally undercompensated due to lack of competition. Ultimately, after a multi-decade run-up like this, the market will be saturated with people who are just chasing the paycheck but aren't actually suited to the job. If you have the natural curiosity and attention to detail needed to succeed as a software engineer, just pick something you're interested in and hunker down to give the dead weight time to wash out. It happened in 2000; it'll happen again.


There is no flood of skilled developers... There is a flood perhaps of outdated devs, novices, people who don't know how to negotiate or apply themselves, and phonies... Reports like this are pure sensationalism... There are still far too many disciplines and contexts within development to put the career of "being a developer" all into one bag.


Yes, there absolutely is a flood of skilled developers. In fact, there are so many of them that they're expected to solve two Leetcode hards in a 45-minute interview to get a job now. Look at how many people got laid off in the last year; a lot of them are skilled.


There is that endless spiral of tools 'n frameworks, driving developers to anoint the Next Best Dev Tool Ever, that must be considered. If you're at the front of that wave, you're good to go, but the detritus left behind has to be comprehended by others for some time after.

I start seeing those pictures of kids sorting mountains of western e-waste whenever someone declares they want to get into dev/ops.


How broad are we considering?

Isn't that like 90% of the software market?

Every role I've had and damn near every listing I've seen in the past few years is some CRUD "backend" gig, or some JS-of-the-week "front-end engineer." I consider all of these "web developer". Even the ops stuff is infra in service of CRUD.

Seriously, where are the jobs writing native code, where the end user runs an actual native executable on their machine that doesn't use HTTP to get state or whatever? Video games? People implementing ML stuff? Data science?


Look for companies that require transformation of some data from one form to another, from one place to another. These jobs don’t require any coding of interfaces or even thinking about users. They are jobs where you write code that creates competitive advantage for the company.

These will be advertised as backend gigs, but they are not the kind of backend gigs where you are just writing code to help serve the frontend. You are writing code to make things possible that weren’t possible before. This is what it means to truly work in tech, not just web dev.


Web devs are the comedians of software development in that they’re the lowest form: not because the job is easy, but because bootcamps (which were/are all web dev bootcamps) cranked them out, and if someone tells me they’re a web dev, it’s more likely than with any other specialization that they are, we’ll just say, “junior level”.

I used to be a web dev, and got tired of what you’re describing: things like the CRUD app of the week in the JS framework of the year… it’s all problems that have already been solved. Meanwhile I find data engineering and data science to be actually-interesting.


> I used to be a web dev, and got tired of what you’re describing: things like the CRUD app of the week in the JS framework of the year… it’s all problems that have already been solved. Meanwhile I find data engineering and data science to be actually-interesting.

Yep I asked the question as I'm in the same boat. The churn is unpleasant and the work isn't particularly exciting in my experience.


Mobile is pretty big.

But yeah I agree.

* AI/Data Science/Research

* Backend/Database

* Desktop

* Embedded

* Games/Graphics/VR

* Mobile

* Web


I work in embedded and even embedded stuff nowadays often touches web-app kind of work through IoT.


I mean, that's what happens when you signal for two decades that web offers the highest salary for the lowest upfront investment. Furthermore, companies being able to do their compute in the cloud just means more HTTP traffic.


Also lots of online DIY tutorials, so many frameworks, etc. The learning got too easy, the tools got too good. For the best, really. It was getting to a point where the flourish mattered more than the content.


Don't agree with this statement.

I have multiple projects in JavaScript and am now doing React for the second time, and conditional logic is tough for me to swallow; what's easy in JavaScript is harder in React due to the abstraction.
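For what it's worth, the friction is usually that JSX only embeds expressions, while plain JavaScript lets you branch with ordinary statements. A minimal sketch of the difference, using template strings to stand in for JSX elements (this is illustrative, not real React code):

```javascript
// Plain JavaScript: branching can be an ordinary `if` statement.
function greetingJs(isLoggedIn) {
  if (isLoggedIn) {
    return "Welcome back!";
  }
  return "Please sign in.";
}

// JSX only accepts *expressions* inside `{ }`, so the same branch
// is usually squeezed into a ternary. A template string stands in
// for the JSX element here.
function greetingJsx(isLoggedIn) {
  return `<p>${isLoggedIn ? "Welcome back!" : "Please sign in."}</p>`;
}

console.log(greetingJs(true));
console.log(greetingJsx(false));
```

Same logic either way; React just forces the expression form in markup, which can feel like an extra layer of abstraction.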


I take it you haven’t seen our open-source commoditized commercial trash fire?


> highest paying salary for the lowest upfront investment

Where does this notion come from? Web is a goddamn mess and anyone doing it right is doing the lord's work. Everyone else gets fired as they should.


So, everybody should get fired?


I don’t think that’s true. iOS developers are paid better and learning Swift + UIKit is a simpler target than learning HTML + CSS + JS + framework.


It depends a lot on the developers' initial situation, I think.

Developing for iOS before getting hired usually means paying for both an Apple machine and an iOS device. If you already had both, that's a non-issue; if you were on Android + Windows, that's a tougher gap to cross.

> Swift + UIKit vs HTML + CSS + JS + framework

It's mostly personal preference, but I think Swift is tougher for beginners than Objective-C was, and requires more upfront concepts to be fluent. I'd call it a wash compared to HTML + CSS (which are not simple by any means, but there's much better tooling to understand how they work).


I imagine there is much less demand for iOS developers than general web developers though, so maybe the ease of entry makes the calculation different.


Most "web developers" couldn't develop an application from start to finish to save their lives, and of those who manage to find jobs, you can wait a decade and they still won't be able to do it. It's painful attempting to interview a lot of them because they think that skinning Wordpress should earn them six figures.


As a new developer I've already seen this with colleagues of mine, who couldn't answer the most basic questions, but this is also true for DevOps.


I should say that, on the flip side, one of the most talented devs I worked with received a finance degree from his university and then went into development because he didn't want to manage people or spend every day cold-calling prospective clients. He had been at the employer several years before me, and while I definitely understood product and design better due to my own background, that guy was fantastic at being able to drill down on hard topics and come up with functional solutions that scaled. I was a fierce advocate of his for awards and promotions until he left to go somewhere else where he wouldn't be siloed into the hard jobs.


Is there any evidence provided to support (or refute) this claim?


None, but based on what's taught at coding bootcamps nowadays it isn't far-fetched.


It’s funny that we call them bootcamps and then expect anything to come out of them besides soldiers.


I think "mercenaries" would be the more appropriate analogy.


I think with mercenaries there is an expectation of skill while with bootcamps you expect the soldiers to be green as hell.


The evidence is pretty clearly anecdotal, if you read the actual comment and not just the title.


I'm aware - I read the post and the top X comments. I'm wondering if there was anything of any substance anywhere in the thread.

I'm also interested to hear if anyone here has any data. Hence the question.


My apologies, I'm used to comments asking for evidence being a form of accusation.


Knees be jerkin'


There's something to mobile development, for sure. I personally would rather be online, building things atop standards & protocols, & I hope the web continues to eat native apps' lunch forever & ever more & more; it's universal & competent & safe & user-empowering in vastly deeper ways. But I definitely also respect mobile devs & it's definitely an area. It's real tech work.

It's just super super super sad, but so many really good, really important, really high-expertise bits of computing are just trash-fire worlds. Firmware developers should be gods, should be paid incredibly well. They know real shit, facing obscene & intricate challenges. But below the bottom of most sw devs' stacks lies some incredibly shitty code made by incredibly underpaid labor, from orgs & people absolutely desperate just to get the hardware up and call it a day.

You do not want these jobs. The whole computer industry has kind of adapted to expecting the people who make chips to just have gobsmackingly poor setups & standards. Quality at the sw base is disregarded. Software/firmware is just a fourth fiddle to these orgs.

This is far from universally true. But you want to play in places that make money. Firmware is so important, so real, such a base of it all, but it's not a profit center. It's also just a tiny segment, minuscule. The last 8 years have seen hardware companies gobbling each other up like crazy, intense intense intense consolidation, and there are just so few jobs out there to make the hardware go. That should put a premium on quality & effort & trying, should make it a competitive advantage, but it's so rarely true. Such a sad shame.

The hyperscalers, as is usual for their lot, have basically stepped in once again with a patronage model to pay good people to de-shittify & commodify a bunch of the lower levels. Chromebooks have some wild-ass Embedded Controller tech that gets passed around as a less-tragic off-the-shelf replacement for like 50% of PC system building, built with some overall intent & purpose (&, as is Google's way, utterly uncaring about existing compounded expectations & blank-slated).

OpenCompute is a more visible, broader hyperscaler effort, where like 30% of the work is just building sensible firmware standards & implementations that are competent & agreeable & don't suck. They just got tired of endless proprietary shovelware & were like: we'll do it.

It is shocking how confined computing has become. The web was like a tiny niche for ages. Until node.js (2009) almost no one took JavaScript seriously; everyone had other shit they did & maybe dabbled a little in js (I did use some JScript at my first job 2005+, because I was doing a ton of js anyways, but it was raw & rare).

There's a ton of wild-lands-esque data-science shit going on & those folks could definitely use some help, but it doesn't seem like being the CS person who helps the org is a regular enough thing.


21 ampersands, in case you're wondering.


And two ands.


Obviously written by chatgpt.


His answer is human-generated; gpt cannot be so true yet.


/s

(To be clear, it was in fact not written by gpt)


I could tell just by how long it was. ChatGPT always seems to spit out too much text.


And my axe


The problem with firmware is that it should be created/improved by the users of the device, but the manufacturers neither publish the source code nor document the hardware sufficiently for anyone else to write better firmware from scratch.

The hardware manufacturers don't care, because they aren't going to lose a large proportion of sales from the small number of their customers who even know what firmware is. But those few, large organizations with their own resources or firmware programmers who happen to own the device, would fix the problems. For everyone. Only they're not given what they need to do it by the OEM. Meanwhile the OEM fails to do it themselves.

What we need is for large customers to figure this out and start demanding open source firmware from hardware vendors.


The sky is falling... etc. It is true, however, that AI is going to reduce a team of 10 to 1, regardless of what field they're in. One day soon we will see companies run by an AI, employing humans for 1:1 sales only, until we become more used to being sold stuff by bots. We're witnessing the birth of a new age, and people are talking about web developers.


I’ve been flirting with being a human liaison for autonomous organizations for a decade

Ten years ago, once we got the fabs going for bitcoin miners, we noticed that machines could make other machines, acquire a fungible digital resource, and transact with other agents, human and machine alike. The window of opportunity always turned out to be too small in the mining world, but that theoretical piece of the puzzle was suddenly solved.

AI that starts making money on its own accounts is going to come next. Someone could do this with Llama hooked up to a node. Maybe start by moderating discord servers for a few hundred dollars of crypto per month, just like children do.


Yes. I'm just surprised at the downvotes. Hmm. We're talking about jobs, right? Everyone's job is very much at risk unless people adapt to a P2P economy with humans and AIs in it, trading with and employing each other.


I take that back, I'm not surprised. The HN crowd is mostly very pedestrian.


Everyone here aspires to work for an ad conglomerate that's busy destabilizing democracies while they "own" the compose button in an app. Of course it's very pedestrian.


When someone asks my profession and I reply that I'm a software developer, the follow-up question is often whether I'm frontend or backend. But neither applies to me.

Btw, to me, a full stack developer would be someone who works on the application and the OS (kernel and WM) and the compiler and even the hardware :-)


Does that mean I can hire someone to do all the shit CSS wrangling I’m currently stuck with?


huh, that explains why I can't get hired doing that

I remade my resume like that but it seems that was a mistake

okay, back to specialized stacks



