
I have indeed lived life wrong. I work in HPC as a Systems Engineer (right now, in 2025, with graduate degrees in engineering, and 25 years of systems admin / engineering experience) and do not make what this person made in 2017, much less in 2025, OR 2-5x that amount for that matter (total dream salary, geez)... at one time I was the data center manager and teaching CS classes, at the same time, working 80 hours a week.

How the heck do these people secure these high paying jobs? There is some club, and I am not in it. Sorry to rant, but that 1FTE salary is huge.


Take a look at salary reports from places like https://www.levels.fyi/2024

If you think $124k a year is high compensation for someone with 17 years of experience in Portland, your compensation expectations are way off.


Wow, I read your informative link. Where are these jobs? I went through a round of interviews last year for Sr. positions, across a number of locations in the U.S., and quite frankly, the average salary for the positions I interviewed for was $80k less than most of those in the list, and $230k less than the SWE manager in the list.


The page lists the locations and the businesses where these jobs are based. Unless you live on the coast (or end up in Denver/Austin), you're going to have a harder time reaching these salary numbers.


> in Portland

Corvallis


"Graduate degrees" listed as a reason.

Yes, designing chips is hard; it takes a lot of knowledge, the same way medical doctors need to go through all that schooling... designing a tiny chip packed with transistors, running software that does amazing things, is very difficult.

My Ph.D. is in computer engineering, specifically VLSI and chip design. This was from a few years ago. I _probably_ should have gone into industry, I mean, after all, it is what I went to school for and wanted to do. However, the starting salary for a chip designer (Intel / AMD / HP / IBM) was literally less than I was making at a side job (I worked my way through my Ph.D) as an IT sysadmin. Not only that, people that I knew well that graduated before me would call me up and tell me it was worse than hell itself. 80 hour weeks? Completely normal, outside of the 2 hours of commute time. Barely make rent because you live in California? Check. Pages / Calls all hours of the day outside of work? Check. 80 hours? You mean 100 hours a week leading up to a release, right? Check.

Looking back on it, it seems this was "the challenging part," and if you made it past it (something like 5 years in) things calmed down for a chip designer and you moved into a more "modest" 60-80 hours a week role with less pressure and somewhat of a pay increase.

Yes, how do you attract talent under those conditions? It is not flashy work, takes a lot of schooling and the rewards are low. At least medical doctors can kind of look forward to "well, I can make _real_ money doing this", and have the satisfaction of "I helped a lot of people".


These truths mostly also apply when answering the eternal* question:

  Why is everybody outside music, movies, crypto & pizza struggling to attract talent?

* Snow Crash (1992) might turn out not to be so precisely prescient due to upcoming dedollarization, AI democratization/bubble burst (the exact option depends on your personality type), & the solid-state battery boom:

>When it gets down to it — talking trade balances here — once we've brain-drained all our technology into other countries, once things have evened out, they're making cars in Bolivia and microwave ovens in Tadzhikistan and selling them here — once our edge in natural resources has been made irrelevant by giant Hong Kong ships and dirigibles that can ship North Dakota all the way to New Zealand for a nickel — once the Invisible Hand has taken away all those historical inequities and smeared them out into a broad global layer of what a Pakistani brickmaker would consider to be prosperity — y'know what? There's only four things we do better than anyone else:

music

movies

microcode (software)

high-speed pizza delivery


I have never done any chip design work, but I have seen some hobbyists getting into HDL without a degree. It's definitely not professional level, but I suspect they are hireable material at least. This leads me to ask two questions:

1) If chip design (or X, anything) is so vital, so important to national security, why do universities insist that a degree in X include a lot of unrelated courses? You can argue that universities are not just for employment (yeah, as if most people go to university just for fun), but in the name of God, I really hate that my university forced me to go through all those BS elective courses just to reach 120 credits. If you ask me, it's just a money grab.

2) Why can't students go straight to a fab or wherever after their bachelor's and do their master's THERE? Isn't industry a much better place to do that? Actually, why doesn't the industry simply hire high school students and go from there? Companies used to do that in the 50s/60s. I don't know if they still do, but I think it's rare.


These are horrendously difficult questions, though a partial answer to 2) is that labor (with "Baumol" training costs factored in) was so cheap up till the early 60s that high schoolers were easily competitive with college grads.

https://en.wikipedia.org/wiki/Baumol_effect#Education

Easier question to answer:

  Why is it so hard to scale payouts to craftspeople (designers, writers, musicians, YouTubers, even actors)?

Even easier question to answer:

  Why is the exchange rate of social status to USD so inelastic?


Ah, thanks for the link. I do hope that labor gets a bigger share of the revenue. I kinda love the idea of "cooperative corporations" where shares are more or less evenly distributed among the employees (there are ofc differences between levels, and I'd imagine a large amount of the stock is "frozen"). I think it would be a lot more harmonious for all employees (even the managers).


Note the original paper by Baumol and Bowen:

"On the Performing Arts: The Anatomy of Their Economic Problems" (1965)

https://sci-hub.ru/https://www.jstor.org/stable/1816292

>It is largely for this reason that performing arts organizations in financial difficulty have often managed to shift part of their financial burden back to the performers--and to the managements, who also are generally very poorly paid by commercial standards.

Imho the paper continues to function after you patch it with your favorite calling, design-driven industry or organizational structure :)


Are you asking why college isn’t a vocational school or technical school? Or are you wondering why the USA doesn’t have apprenticeships in the way Germany and others have?

A college isn’t really meant to be a narrowly tracked vocational school - but it’s fair to ask why there aren’t more vocational schools for high-tech fields


For number 1, there really does need to be some wider reform, but I fear that won't happen until the whole student loan paradigm crashes and burns. I had a couple of fully paid semesters that were 100% electives I had no interest in. I would have preferred to graduate a year earlier (and thus, a year richer) or take courses that were actually relevant. The problem is that universities talk out of both sides of their mouth -- on one hand raking in huge amounts of cash like a corporation, feasting off guaranteed loans from 18-year-olds, while on the other espousing some nobler concept of education/enlightenment.


I know of a bunch of people, some who only have high school degrees, who are entirely self-taught and are doing tinytapeout (https://tinytapeout.com/) chip designs. Yeah, that's not nearly at the skill or scale of designing CPUs for Intel/AMD/NVIDIA/Apple/Marvell/etc but it's still chip design!

Your concerns about horribly long hours and lower than IT/software pay are the most concerning part to me. But, if there's really a shortage of engineers who know how to do chip design, hopefully the market will take care of that via supply/demand at least once things get really out of whack.


Every time I read a post like this I'm thankful the market all but forced me into software.


It's always crazy to me that you can build crappy websites in the most flexible environment imaginable and make way more than those doing the actually deep and challenging work required for those websites to run in the first place.


Fundamentally, in programming the programmers have the means to compete with their employers if they want to. That shifts power a lot.


Looks like some kind of gatekeeping ritual to rationalize why upper management salaries are in the millions. I think we can see this in other industries too.

You hire hundreds of interns and entry level workers to let them fight in the bloodbath for 100h a week. Pay peanuts. Let them do all the work.

The ones who survive get a bit bigger salaries. Those who still persist in upper level bloodbaths are upgraded into millionaires. And paying them millions looks acceptable as it is so hard to reach the top.

While you clearly could share all those millions across entry-level salaries and paid internships, not have 100h weeks, and have a healthy industry.


Thanks for posting, I thought the same thing... my (useless data point of one) results showed 100% accuracy except the last four, which I thought "wow, I am just guessing now, can literally not see a difference".


No, OP is correct. I was teaching CS at a uni two years ago... files, directories, the filesystem hierarchy, but yes, even just a file is a strange concept to them.

It is not an insurmountable hurdle, but it is interesting in the sense that things like git, programming, etc., all deal with files and filesystem hierarchies, and the students have never seen this, so it is one more thing to add to the (ever growing) list of things they need to know before we jump in.


That's just crazy to me. I'm not saying anyone is lying, just that I am in disbelief.

I taught some cybersec classes maybe 4-5 years ago and while students definitely struggled with some (what I would consider) "basic" stuff like CLI, variables, loops, etc... no one had an issue with directions like "copy this file to here", "extract the files to there", "set up this directory and point this tool to it", stuff like that.


People have had trouble with hierarchical file systems since day one. I distinctly remember being the 20 something Gen-Xer tasked with teaching boomers computers, and a large percentage just never understood why you'd want to put a folder inside a folder inside a folder. They would never do that in their filing cabinet, after all! Or why you would want to put a folder anywhere else besides the desktop since they would lose it. These people have had desktops that look like this[1] since the 90s.

1: https://www.reddit.com/r/mildlyinfuriating/comments/auu67x/p...


>> be able to understand and empathize with the various (and often opposing) groups involved in a topic

Interestingly, I have seen Elon (DOGE) and others outside of politics (that mega-church leader) telling the public (dare I say, their followers) that one of the main problems with America is empathy, and that we need to _stop_ empathizing with others.


Interesting. From what I've seen, the lack of empathy is the root of most of the political problems in the US.

If people put the welfare of others first, for example, taxpayer-funded universal healthcare wouldn't even be something that was debated; it would be implemented with as much fuss as we have over painting lane markers on streets. But Americans seem to care less for their fellow Americans than people in most other countries do.

How would removing what little empathy that there is improve matters?


To them, removing empathy allows doing “what needs to be done,” like sending undesirables to a desolate work camp in a foreign country without any legal recourse.

See also: “the sin of empathy.” https://www.reddit.com/r/SaltLakeCity/comments/1i942hf/ogden...

Peel apart the layers and at the root of it all is white male supremacy — by any means necessary.


I am not sure, and it may be this person's culture/background, but I do know that at a college/uni, your advisors/reviewers would tell you not to do the adjective/drama stuff, as it adds no real value to a scientific/technical paper.

e.g. potentially-devastating, hugely disruptive, special critical, greatly reducing, valuable milestone, almost completely, ambitious pragmatic, most or nearly all, existing practical.


I have not actually heard that argument. It has always been noted that Benz invented the car, and Ford invented the assembly line, for cars.


Yeah, as an American I never was taught that Ford invented cars. He invented cheap cars.


Oldsmobile invented the assembly line for cars: https://en.wikipedia.org/wiki/Ransom_E._Olds#Assembly_line


He was the first to mass produce cars. But given that cars and mass production already existed it was probably only a matter of time before someone decided to mass produce cars.


>> Connectors are actually extremely difficult to make.

While the points you listed are valid, we have been making connectors that overcome them for decades, in some cases approaching the century mark.

>> I'm not surprised at all that they are running into issues here, these cards are pulling 500+ watts. That is a LOT of current.

Nonsense. I used to work at an industrial power generation company. 500W is _nothing_. At 12VDC, that is 41.66A of current. A few small, well-made pins and wires can handle that. It should not be a big deal to overcome. We have overcome it in cars (which undergo _extreme_ temperature and environmental changes, in mere minutes and hours, daily, for years), space stations (geez), appliances, and thousands of other industrial applications that you do not see (robots, cranes, elevators, equipment in fields and farmlands, equipment in mines, equipment misused by people)... and those systems fail less frequently than Nvidia connectors. But your comment would lead one to think that building a connector with twelve pins on it to handle a whopping (I am joking) 500W (not much, really; I have had connectors in equipment that needed to handle 1,000,000 watts of power, OUTDOORS, IN THE RAIN, and be taken apart and put back together DAILY) is an insurmountable task.
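A quick back-of-the-envelope version of that arithmetic, as a sketch (the supply-pin count and per-pin rating below are illustrative assumptions, not datasheet values):

  # Rough check of the current through a 500 W, 12 V GPU power connector.
  # Assumption: 6 supply pins share the load, each rated around 9 A.
  power_w = 500.0      # board power discussed above
  voltage_v = 12.0     # nominal supply rail
  supply_pins = 6      # assumed 12 V pins on a 16-pin connector
  pin_rating_a = 9.0   # assumed per-pin rating

  total_a = power_w / voltage_v        # ~41.7 A, as noted above
  per_pin_a = total_a / supply_pins    # ~6.9 A if the load shares evenly

  print(f"total: {total_a:.2f} A, per pin: {per_pin_a:.2f} A, "
        f"headroom per pin: {pin_rating_a - per_pin_a:.2f} A")

The point of the sketch: the total current is real but the per-pin share is modest, as long as the pins actually share the load evenly, which is exactly what a poorly made connector fails to guarantee.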


One word: cost.

Look up how much industrial/automotive connectors cost, and you'll see the huge difference in quality.


Those GPUs aren’t particularly cheap, even a $100 connector and cable wouldn’t be a huge deal breaker for a $2000-3000 device if it means it’s reliable and won’t start a fire (that’ll cost way more than $3100)


Yes, cheap connectors exist and there is a market for them, like everything "cheap". But why "defend" a trillion-dollar company, on a product that was never marketed as "cheap" and actually comes with a hefty price tag, for skimping on something that is 0.01% of their BoM cost? If you sell for a premium price, you had better make sure your product is premium.


I've bought cars that cost me less than a nVidia card (and they were running).


Which new cars cost less than $1,000-$2,000?


They didn't say new cars.


Then what's the point of such an arbitrary comparison? It's normal that plenty of commodities that were expensive when new have been devalued by age and can cost less on the used market than today's top-of-the-line, BRAND NEW, cutting-edge GPU, which will itself be worthless on the used market in 10-20 years, and so on.


Presumably, the point is that a working car is more complicated & cheaper (in this case) than the graphics card, while the graphics card maker can't figure out how to make a connector.

I read it as a kind of funny comment making a broader point (and a bit of a jab at nVidia), not a rigorous comparison. I think you might be taking it a bit more seriously than was intended.


An old legacy car is definitely not more complicated than designing and manufacturing cutting-edge silicon made for high-performance compute.

The price difference is just the free market supply and demand at work.

People and businesses pay more for the latest Nvidia GPUs than for an old car because for their use case it's worth it, they can't get a better GPU from anywhere else because they're complex to design and manufacture en-masse and nobody else than Nvidia + TSMC can do it right now.

People pay less for an old beater car than for Nvidia GPUs because it's not worth more: there are plenty of better options out there in terms of cars, and cars are interchangeable commodities that are easy and cheap to design and manufacture at scale at this point, but there are no better or easier options to replace what Nvidia is selling.

Comparing a top GPU with old cars is like comparing apples to monkeys; it makes no sense and doesn't prove any point.


>An old legacy car is definitely not more complicated than designing and manufacturing a cutting edge silicon made for high performance compute.

A car is more complicated than a connector, at least.

Anyways, the rest of your comment is again taking a humorous one-liner way too seriously. Thanks for the econ lesson though, I guess. I liked the part where you explained to me the basics of supply and demand like I am in 5th grade.


>A car is more complicated than a connector, at least.

The connectors on a new car cost more than the connectors on a new GPU part for part.

>I liked the part where you explained to me the basics of supply and demand like I am in 5th grade.

You'd be surprised about the state of HN understanding of how basic things in the world work.


Used objects and imports from economically isolated places are traded at meme value; that doesn't count.


That would be relevant if the margins on GPUs weren’t astronomical.


No, not for a connector for 500W on a $2000 GPU from one of the world's biggest companies. They can do better.


Well surely they can take that cost out of the $5090 people are paying for these cards.


They could use a common XT90 or something similar. You find high-amperage connectors on all the RC LiPo batteries, and they are cheap enough; you find them on $100 products (batteries).

I regularly work with 100amp+ at 12v. It’s obvious the connector NVidia is using is atrocious and we all know it.


Nvidia is clearing 4 figures on each 5090. They can afford another few dollars on connectors.


There is something strangely magical about the pictures of the city streets at night. Lack of trash? How clean the roads are? The lighting? I have been out at night in plenty of places (not Japan), but it never looks like this. At first I thought it was their camera, but I think it is just Japan at night?


I used to use Sublime and wanted to purchase a license, but it is only good for the version you are on. Since I use Gentoo and it upgrades frequently, my license would require me to hold the package back, and eventually that will break. How do others deal with this? Just keep purchasing licenses?


I used ST2 for like five years in its unregistered state. (I rationalized that by telling myself that nothing I used it for made me money, but I still feel a little gross about it.) In practice it was no problem.

I paid for an ST3 license, then I paid again for an ST4 license. Whenever ST5 comes out, if they ask, I'll pay for that too.


That is kind of why so many software companies switched to subscription models.

