
Hiring of juniors is basically dead these days, it has been like this for about 10 years, and I hate it. I remember when I was a junior in 2014 there were actually startups that would hire cohorts of juniors (like 10 at a time, fresh out of a CS degree, with almost no applied coding experience) and train them up to senior over a few years; a small number would stay, the rest would go elsewhere, and the company would hire its next batch of juniors. Now no one does this, everyone wants a senior no matter how simple the task. This has pushed everyone in the industry to stuff their resume, so you end up in a situation where companies are looking for 10 years of experience in ecosystems that are only 5 years old.

That said, back in the early 00s there was much more of a culture where everyone was expected to be self-taught and doing real web dev probably before they even got to college, so by the time they graduated they were in reality quite senior. This was true for me and a lot of my friends, but I feel like these days there are many CS grads who haven't done a lot of applied stuff. To be fair, though, this was a way easier task in the early 00s: if you knew JS/HTML/CSS/SQL, C++, and maybe some .NET language, that was pretty much it, you could do everything (there were virtually no frameworks). Now there are thousands of frameworks and languages and ecosystems, and you could spend 5+ years learning any one of them. It is no longer possible for one person to learn all of tech; people are much more specialized these days.

But I agree that eventually someone is going to have to start hiring juniors again or there will be no seniors.

I recently read an article about the US having a relatively weak occupational training system.

By contrast, CH and GER are known for very robust and regulated apprenticeship programs, meaning you start working at a much earlier age (16) and attend vocational school at the same time for about 4 years. This path is then supported with all kinds of educational stepping stones later down the line.

There are many software developers who went that route in CH, for example, starting with an application development apprenticeship, then getting into technical college in their mid-20s, and so on.

I think this model has a lot of advantages. University is for kids who like school and the academic approach to learning. Apprenticeships plus further education, or an autodidactic path, cast a much broader net, where you learn practical skills much earlier.

Both paths have their advantages and disadvantages. In summary, I think the academic path provides deeper CS knowledge, which can be a force multiplier, while the apprenticeship path leads to earlier high productivity and pragmatism.

My opinion is that having both as strongly supported paths creates more opportunities for people and strengthens the economy as a whole.


I know about this system, but I am not convinced it can work in such a dynamic field as software. When tools change all the time, you need strong fundamentals to stay afloat - which is what universities provide.

Vocational training focused on immediate market fit is great for companies that want to extract maximal immediate value from labour at minimal cost, but in the longer term it is not good for the engineers themselves.


A formal apprenticeship still includes academic training, either one or two days a week at college or longer blocks spread throughout the year. I can't speak for software engineers, but the mechanical engineers I know who have finished a German apprenticeship have a very rigorous theoretical background.

I actually think it would work fairly well even if it weren't regulated.

E.g. a company like Google (or similar) could probably offer better on-the-job vocational training than going to uni would, as far as teaching anyone programming goes.


do most people know country codes to the degree that they know CH is Switzerland? as feedback, i found this added an unnecessary extra layer of opacity to this comment

I like how the smallest Eurocentrism is greeted with the wagging finger to be inclusive on hackernews, while the expectation is that 50 state acronyms are well understood by any reader from Lazio, Lorraine, or Thuringia ;)

Whoops, definitely read that as China until your comment.

I feel like hiring juniors still exists, because I still hear about loads of "boring" small startups that do the "we hire juniors and seniors" thing.

Juniors cuz they're cheap and motivated (and potentially grow into very good team members!), seniors to handle the trickier stuff. Mid-level people don't get picked up cuz there's enough junior pipeline that the cost-benefit doesn't work out.

Thing is, these companies tend to have college pipelines for juniors: internships and the like. It would be really painful not to have internships lined up at "your local startup" in this day and age.

My impression is that a lot of the junior dev pipeline is in smaller places that don't show up in job boards in the same way as the rest of it. Maybe that's dried up too, but I have my doubts.

You still need somebody to work the robot, even if the robot is "doing the coding"!


I don't think it's been dead for 10 years. I'd place it at maybe 3? I teach at a mid-ranked university, and the majority of my fresh-out-of-college students were getting good-to-great entry-level offers just a few years ago. The top 5-10% were getting well into six-figure offers from FAANG or similar companies. But the entry-level job market really tanked in mid-2022 when all the big tech companies did rounds of layoffs, and it's been much harder since then.

This was pretty much the height of the "bootcamp" era, and while some bootcamps were still great at teaching solid programming skills, many mediocre coders saw them as a quick way to make money off people who didn't know better. In my opinion, companies noticed more and more that they were hiring people who could barely code in a real software environment but had focused training in resume padding and interview prep, and that many college graduates weren't any better off. So they ramped up to far more rigorous testing, live coding projects, and more and more rounds of interviews just to weed out the people who had been taught how to BS their way into a tech job. Now the interview process has become so cumbersome that it is more worthwhile to filter out nearly every applicant immediately.

We should factor in the hiring sprees that really distorted the market from 2020 to 2022.

2020-2022 was not "normal hiring", it was much higher than normal.

Even the large, conservative, boring enterprise where I work was hiring about 5 new developers a week in those years. I know because I did the "Welcome to our department" technical onboarding workshop and saw everyone coming in the door every week.

Before 2020 I ran that workshop once a month or so? Probably 10 times a year at most.

So of course then the layoffs came when ZIRP ended, as everyone had to adjust back to real world conditions and grapple with the fact that the team sizes they had now were not sustainable (or necessary?) when free money ended.

Couple that with "efficiencies" from AI and generally more conservative scope and ambition by a lot of companies in these uncertain times, and the market looks drastically worse than it did in 2020-2022.

But if you compare the market today to 2018 (for example) instead, it's still worse, but definitely not by as much.

Lastly, hiring is also very broken today due to AI, with candidates using it to flood companies with applications and then companies using AI to auto-filter that incoming flood (and likely losing out on good candidates in the process) simply because they can't handle the volume.

I have a lot of sympathy for folks coming right out of university trying to land their first good job. I'm sure it's the toughest it's been in a generation...


It started with the end of ZIRP and LLMs finished the job.

"Money isn't free anymore, let's set ours on fire / give it to NVDA"

More like, "Money isn't free anymore, so let's try and automate away some more labor". It's very much not setting the money on fire from business POV - it's the opposite actually. It's also a constant in history - the more expensive human labor gets, the more effort is put into eliminating the need for it (whether by automating jobs away or redesigning the product or the entire business to not need them in the first place).

> Hiring of juniors is basically dead these days and it has been like this for about 10 years and I hate it

We still have a large funnel of interns who end up becoming junior devs and then progressing normally. I don't know the exact ratio of interns who actually get hired as full-time employees; it's definitely low, but I think this is more a function of most of them not actually being any good.


Bottom-of-the-barrel consultancy shops will hire as cheap as possible, e.g. some liberal arts major whose only coding experience is a 2-week 'bootcamp'.

They will sell them as 'junior SE' on the first 2 projects, 'medior SE' on the next and then they'll be 'senior SE' within 18 months tops of leaving bootcamp.

The projects by these shops are nearly always troubled, requiring many reworks (customer-paid as change requests) before getting into production, if they ever do.

They are seldom hired to do a project for a client twice, but it's a lemons market and there are plenty of fish in the sea.

So what happens with these shops is that their SEs rely even more than average on AI assistants and deliver a v0.1 faster, but with 10x more code and holes than before, taking even longer to get into production. But since the lemons and the fish haven't changed, they'll still hire, now even cheaper, straight out of 'prompt engineering bootcamp'.


Some of this relates to a culture of job-hopping. It seems uncommon these days to stick around at a company for many years.

If your next hire is only going to stay for 1-2 years, it doesn’t make sense to hire a junior team member and invest in their growth.


That sadly makes sense. I’m in a position lately to influence hiring decisions and I’m noticing a similar bias in myself.

As a job hopper myself, I can't fault others for doing it, though. I never hopped for the money; I just got bored or felt isolated in my role. But a nice consequence is that my salary increased appreciably, as opposed to that of colleagues/friends who stuck with the same company.


I've often given developers I mentor the advice they should "zig-zag" to grow their career and get varied experiences rather than stay in one place too long, but my advice was 2-3 years at each place at minimum.

I think anything less than that, and you haven't had time to really learn an ecosystem, and more importantly you might not have had a chance to live with the consequences of your technical decisions (i.e. supporting something in prod that you built).

I know plenty of people who started somewhere, left for a while, and then came back and ended up in a position higher than they would have gotten if they had stayed put and tried to climb internally.

And yes, agreed that moving around will 100% grow your comp faster than staying put (in most cases).


I mean I wish I could stay, but companies are greedy and refuse to give out decent raises or promotions regardless of your contributions. The only real way to make more money is to hop between jobs, all the while these companies are making record profits year after year.

Like right now I've been at my current co for 3 years. At the start I was getting decent raises after big projects. I now have increasingly more responsibility: I'm doing hiring, I'm doing mentorship, I'm doing huge feature work, and I have to waste half my time talking to braindead stakeholders. And what do I get for that? Absolutely jackshit. I'm getting paid the same as when I had a quarter of the responsibility and work, yet the company is boasting about making ever more money as it lays off entire teams of people.

Why on earth would I be loyal at this point? It's clear they don't give the slightest inkling of a shit about me or anyone else who doesn't have "Head of" or "Chief" prepended to their title.


That's a self-inflicted wound on the part of the companies though, with them offering relatively shit pay for people who stick around compared to people who switch jobs.

You get what you optimize for, really.


> But at the same time, to be fair, this was a way easier task in the early 00s

The best junior I've hired was a big contributor to an open source library we were starting to use.

I think there's still lots of opportunity for honing your skills, and showing them off, outside of school.


> The best junior I've hired was a big contributor to an open source library we were starting to use.

From my experience no one cares. You're lucky if a recruiter even looks at your CV, not to mention your GitHub profile.


This is why I've managed to help startups I have worked at make better hires; I look at the code candidates have written, and poke through the commits on GitHub.

As a candidate, you highlight it yourself!

Agreed. One of my mentors early on was a self-taught engineer, and honestly I'd trust him a lot more than some of the engineers with degrees.

Hmm, I thought Matz had a story like this, but AI tells me it's probably apocryphal: a Ruby developer applies for a Ruby job that wants more years of experience than Ruby has existed, and he's the one who wrote Ruby.

Oh well, I know that it happens. I saw it in 2010 with "cloud" when it was basically still just EC2, S3, RDS, and whatever the not-haproxy-but-haproxy load balancer was called, ELB. Job postings were asking for half a decade or more of experience. I always get the feeling there's some requirement that they post jobs publicly but only hire internally, but I have no way to prove that; I have heard others say this, though.


I recall that story as well. I think it might have been in a Ruby Weekly email, if you subscribe to that.

We have had a lot of success hiring right out of college over the last 10 years.

The place where I work hires an ungodly number of juniors and fresh grads (because a lot of them drop out and quit before they do any meaningful work). We're talking people who are completely unproductive and unusable for any sort of commercial project. We then spend at least a year or two giving them a salary while they do toy projects and get trained, literally what I remember doing in 1st/2nd year of college with group projects and pet assignments, complete with grades, feedback, etc. Even after all of that, we still have to "train" them with hand-holding on actual project work before they are a net positive.

Sooner or later someone will realize that they can just forego all that wasted training effort and hire someone who is already productive. There is always a small percentage who are amazing, and they get pushed through to projects very quickly. Which is a shame, because they then watch their fellow cohort sit around doing pet projects and receiving a salary, while they slog through a real project with deadlines, stress, and the risk of failing.

This is entirely a combination of two things: the quality of grads coming out of college/university, and pressures coming from the market. Colleges have been pushing entirely unqualified students, some barely literate, into the marketplace, and what we're seeing is a response to that. Couple that with the pressures companies are facing, and you can see why none of them want to take on the risk of training and up-skilling people just to find the actual good employees, who are a small percentage.

Of course, in my company's particular country and context, government regulations make it impossible to fire someone, and there is huge pressure to keep up DEI quotas despite no actual good DEI candidates being available, so we have a mess. Day to day is glorified babysitting: people not knowing what to do, dealing with their "feelings" (usually feelings of inadequacy and sometimes snobbish entitlement), all while still trying to complete a project.


Why does your company even hire those people? They seem like a net negative as far as your profit-and-loss is concerned?

I mean even compared to just not hiring any juniors.


The problem is getting hired. With all the resources available today, learning programming is easy compared to the pre-LLM, pre-Stack Overflow, pre-Google days. I dare say an autodidact from the original dot-com boom, transported to today, would be fine as far as being useful to a company goes. You don't need to know every frontend framework, every possible backend, and be a Linux devops god all at once. Sure, there's more stuff today than in the 00s, but no team uses all three of those in full simultaneously, so what you practically have to know isn't too much for a motivated individual to learn.

The problem is getting hired. If seniors are having problems getting callbacks for interviews right now, then a young kid with a thin resume isn't going to get hired, no matter how senior their skills are in reality.


One problem I’ve seen is that junior now means “hustler who is faking it” or “code boot camp grad who doesn’t really understand anything.” If I ask for a “junior” I get someone who googles and asks ChatGPT.

High salaries in programming have attracted a lot of people who have no passion for the craft. To get good you have to have that, and it takes years of either school or passionate autodidactic learning, or both.



