Hacker News | fooker's comments

Coming soon to a programming ecosystem near you:

LLM(eat apples in fruitbasket)

vs

foreach (apple in fruitbasket) apple.Eat()

Your comment can be repeated almost word for word here.


Not at all. I'm comparing two different syntaxes that can reliably compile to the same machine code. A syntax that produces non-deterministic results is a completely different matter.

> non-deterministic

You can control the amount of non-determinism.

And also, it is interesting that you think modern compilers are deterministic.


While an LLM with a fixed seed is technically deterministic, it’s still unpredictable.

> And also, it is interesting that you think modern compilers are deterministic.

A compiler, unless it has a bug, will always produce output matching the specification of the language.


Please do not engage in bad-faith arguments centered around pedantry. You know very well what I mean.

I know what you mean, and it was correct a year or more ago. Now, you are wrong.

AI is reliable enough not to mess up this translation now, especially if you configure it right (top-p and temperature parameters).

This abstraction lifting is absolutely happening in front of our eyes now. For the exact same reason the C for loop is less readable.

The difference is that you don't yet store the prompts, just the generated code. That difference is not going to last much longer. Storing prompts and contexts along with generated code is likely how we are going to be doing software engineering for a few decades, before whatever the next leap in technology works out to be.
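The sampling knobs mentioned above (top-p, temperature, seed) can be made concrete with a toy sampler. This is a from-scratch sketch, not any particular vendor's API, but the parameters behave the same way: temperature near zero collapses to greedy (deterministic) decoding, and a fixed seed makes stochastic sampling repeatable.

```python
import math
import random

def sample(logits, temperature=1.0, top_p=1.0, seed=None):
    """Sample a token index from logits. temperature -> 0 approaches greedy
    (deterministic) decoding; top_p < 1 truncates the tail of the
    distribution (nucleus sampling); a fixed seed makes draws repeatable."""
    if temperature <= 1e-6:
        # Greedy decoding: fully deterministic, always the argmax token.
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = random.Random(seed)
    # Softmax with temperature (subtract max for numerical stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Nucleus (top-p) truncation: keep the smallest prefix of tokens,
    # sorted by probability, whose cumulative mass reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Draw from the renormalized truncated distribution.
    mass = sum(probs[i] for i in kept)
    r = rng.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]

logits = [2.0, 1.0, 0.1]
# temperature 0 is greedy and always returns the argmax index
print(sample(logits, temperature=0.0))
# a fixed seed makes stochastic sampling repeatable across calls
print(sample(logits, seed=42) == sample(logits, seed=42))
```

Note the trade-off the thread is circling: you can dial the randomness to zero, but the output is still exquisitely sensitive to the exact wording of the input, which a compiler is not.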


You could just lock the seed to get "deterministic" behaviour, but you are missing the point of programming languages completely. Programming languages are a set of rules that ~guarantee predictable behaviour down to the bit level. If you were to try to recreate that with LLMs, you run into two problems. First, your LLM is now a programming language where you have to put exact, specific inputs in to get the correct outputs, except you don't have a specification for the inputs and you don't know what the correct outputs even look like, because you aren't a software engineer. Second, even with a locked seed, the LLM is still going to output different code based on the exact order of letters in your prompt, and any change in that will change the output.

Compilers can execute a variety of optimizations on your code that change the output, but in the end they are still bound to hard rules, and you can learn the rules when you get output that does not match your expectations from the input. And if there is a bug in the compiler's execution of the rules, you can fix it; that is not possible with an LLM.

This talk about replacing software engineering by people who have no idea what software engineering is gets unbelievably tedious. The advent of Javascript did nothing to replace software engineers, it just created an entirely new class of developer. It lowered the barrier to entry and allowed anybody to write inefficient, bloated, buggy, and insecure programs. Our hardware is advanced enough that for many trivial applications there is sufficient overhead for inefficient and bloated programs to exist and be "good enough" (although they are causing untold damage in the real world with security breach after security breach). However, lowering the barrier to entry does not replace the existing engineers. You still need a real software engineer to develop novel applications that use the hardware efficiently. The Duchies of Javascript and Python are simply a new country founded adjacent to, and depending upon, the Software Engineer Kingdom. Now a new duchy is being founded, one that lowers the barrier to entry further to make even more inefficient, even more bloated, even more buggy, and even more insecure programs easier than ever. For some use cases, these programs will be good enough. But they will never replace the use cases that require serious engineering, just as Javascript never did.


> Programming languages are a set of rules that ~guarantee predictable behaviour down to the bit level.

No, that’s your interpretation of what a programming language is, based on nostalgia and wishful thinking.

If you get a programming system that sacrifices these and works better, people are going to use those.

FYI I am a compiler developer who has contributed to about three widely used compilers, I assure you I understand what programming languages and compilers are.


> If you get a programming system that sacrifices these and works better, people are going to use those.

Sure. That's a very large "if", though, one that evaluates to false and will continue evaluating to false for the foreseeable future. I am told over and over that I am being replaced and yet I have not seen one singular example of a real-world application of vibe coding that replaces existing software engineering. For starters, where is the vibe-coded replacement for Clang? For Linux? For IDEs? For browsers? I don't mean a concept of an idea of a browser, of course. To make the claim that the new programming system works better than the old system, it must produce something that is actually superior to what people currently use. LLMs are clearly completely and totally incapable of this. What they are capable of is making inferior software with a lower barrier to entry. Superior software is completely off the table, and it is why we don't see any existing software being replaced at scale, even though existing software absolutely has flaws and room for improvement that a "better programming system" would be able to seize upon if it existed.

That you can confidently assert that I am wrong, while having zero real-world evidence of superior software engineering produced by the new system, is indicative of a certain level of cultish thinking that is overtaking the world. Over and over and over again, people keep making these grand claims promising the world is changed, and yet there is no tangible presence of this in reality. You simply demand belief that it will change, as though LLMs are a new religion.


You are wrong about modern AI not being able to produce deterministic output for the for loop we are talking about.

You are right that modern AI is not producing superior software engineering.

You should read the chain of comments again and try to understand the non-deterministic leaps of logic you have made :)


> Storing prompts and contexts along with generated code is likely how we are going to be doing software engineering for a few decades

You specifically asserted that prompts would be the future of software engineering. If my memory is not mistaken, the comment was later edited to add the hedge "likely", which was not there when originally written?


> If my memory is not mistaken

Your memory is as impeccable as your logic :)

> asserted that prompts would be the future of software engineering.

That’s exactly not what I asserted. Can you find the critical difference?


We'll have to leave it here then. I am fairly confident you edited the comment after the fact, but I cannot prove it, so further discussion is fruitless.

> It's not a technical problem

How do you do encryption?


A probable implementation is that you bootstrap the initial key exchange using web PKI (if you want to talk to Alice@example.com then your client makes a TLS connection to example.com and asks for Alice's public key) and thereafter you use something like the Signal ratchet thing.
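The "Signal ratchet thing" can be sketched as a symmetric hash ratchet. This is a toy illustration, not the real Double Ratchet (which also mixes fresh Diffie-Hellman outputs into the chain for post-compromise security), and the bootstrap secret here is a stand-in for whatever the web-PKI key exchange above would produce:

```python
import hmac
import hashlib

def ratchet_step(chain_key: bytes) -> tuple:
    """Advance a symmetric ratchet: derive the next chain key and a one-time
    message key from the current chain key. Distinct constant labels keep
    the two derivations independent, as in the Signal chain-key KDF."""
    next_chain_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    message_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return next_chain_key, message_key

# Both sides start from a shared secret (bootstrapped e.g. via a public key
# fetched over TLS from the provider, as described above) and ratchet in step.
alice = bob = hashlib.sha256(b"shared secret from initial key exchange").digest()
for _ in range(3):
    alice, alice_key = ratchet_step(alice)
    bob, bob_key = ratchet_step(bob)
    assert alice_key == bob_key  # both sides derive the same per-message key
```

Because each step overwrites the chain key, compromising the current state does not reveal earlier message keys, which is the forward-secrecy property the ratchet exists to provide.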

That technical problem is significant and unsolved. I don't think it would work without some major new standards either.

Serving 2+ billion daily users is a technical challenge, at least.

No amount of what ifs and buts is going to change the fact that the tech is now mature enough to make software in hours that previously needed man-years.

And it is getting better at a pretty rapid pace.


At least it’s not among the thousand AI note transcribing startups that have sprung up in the last two years.

I don't know whether it's something to do with the recommendation algorithm, but my LinkedIn is constantly full of those, and my career has nothing to do with this at all.


"Create a product around a problem you know well," they said, and I'm thinking, as I'm typing/writing into my notes/journal: "what could I build that would have hype and is close to something I know?"

Probably every note taker has had the thought "What if I did it better?" at one point or another.


> it's still a strict upgrade over AVX2

If you benchmark it, it will be slower about half the time.


I do not know what benchmarks you have in mind.

All the benchmarks that I have ever seen published about AVX-512 vs. AVX2 on AMD Zen 4 have shown better performance for AVX-512. Very frequently, the AVX-512 performance was not a little better but much better, despite the fact that on Zen 4 both program versions use exactly the same execution resources (but AVX-512 programs have fewer instructions, which avoids front-end bottlenecks).


For the simplest cases it will be about the same speed as AVX2, but if you're trying to do anything fancy, the extra registers and instructions are a godsend.

Well, try it out for a realistic program.

It makes for nice-looking code, yes. But it is often slower (for various reasons that are well understood by now).


Please mention some of those reasons.

I'll be so happy to buy an EOL H100!

But no, there are none to be found; it is a four-year-old, two-generations-old machine at this point, and you can't buy one used at a rate cheaper than new.


Well, demand is so high currently that it's likely this cycle doesn't exist yet for fast cards.

For servers, I've seen the slightly used equipment sold in bulk to a bidder, who may have a single large client buy all of it.

Then, around the time the second cycle comes around, it's split up into lots and a bunch ends up at places like eBay.


Yeah, looking at the 60-day moving average on computeprices.com, H100s have actually gone UP in cost recently, at least to rent.

A lot of demand out there for sure.


Not sure why this "GPUs obsolete after 3 years" gets thrown around all the time. Sounds completely nonsensical.

Especially since AWS still has p4 instances running 6-year-old A100s. Clearly, even for hyperscalers, these have a useful life longer than 3 years.

I agree that there is a lot of hyperbole thrown around here, and it's possible to keep using some hardware for a long time or to sell it and recover some cost, but my experience planning compute at large companies is that spending money on hardware and upgrading can often save money long term.

Even assuming your compute demands stay fixed, it's possible that a future generation of accelerator will be sufficiently more power/cooling efficient for your workload that upgrading is a positive return on investment, more so when you take into account that you can start depreciating them again.

If your compute demands aren't fixed you have to work around limited floor space/electricity/cooling capacity/network capacity/backup generators/etc and so moving to the next generation is required to meet demand without extremely expensive (and often slow) infrastructure projects.


Sure, but I don't think most people here are objecting to the obvious "3 years is enough for enterprise GPUs to become totally obsolete for cutting-edge workloads" point. They're just objecting to the rather bizarre notion that the hardware itself might physically break in that timeframe. Now, it would be one thing if that notion was supported by actual reliability studies drawn from that same environment - like we see for the Backblaze HDD lifecycle analyses. But instead we're just getting these weird rumors.

I agree that it is a strange notion that would require some evidence, and I do see it in some other threads, but looking up the chain of parent comments it seems people are discussing economic usefulness, so that is what I'm responding to.

A toy example: NeoCloud Inc builds a new datacenter full of the new H800 GPUs. It rents out a rack of them for $10/minute while paying $6/minute for electricity, interest, loan repayment, rent and staff.

Two years later, the H900 is released for a similar price but performs twice as many TFLOPs/Watt. Now any datacenter using H900s can offer the same performance as NeoCloud Inc at $5/minute, taking all their customers.

[all costs reduced to $/minute to make a point]
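Spelling out the arithmetic behind that toy example (the cost-halving is a deliberate simplification: power and interest would roughly halve per unit of compute, while rent and staff would not actually scale with TFLOPs/Watt):

```python
# Hypothetical numbers from the toy example above, all in $/minute per rack.
neocloud_revenue = 10.0   # what NeoCloud Inc charges
neocloud_cost = 6.0       # electricity, interest, loan repayment, rent, staff
neocloud_margin = neocloud_revenue - neocloud_cost  # 4.0/min while it lasts

# Simplification: treat cost as dominated by power, so twice the TFLOPs/Watt
# roughly halves the cost of delivering the same compute with H900s.
h900_cost = neocloud_cost / 2           # 3.0/min for equivalent performance
h900_price = 5.0                        # the undercutting price in the example
h900_margin = h900_price - h900_cost    # 2.0/min: profitable while undercutting

# NeoCloud cannot match $5/minute: that price is below its own cost floor,
# so its H800 fleet is economically obsolete even though it still works.
assert h900_price < neocloud_cost
```

This is why the payback window matters so much: all the margin has to be earned before a more efficient generation resets the market price.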


It really depends on how long `NeoCloud` takes to recoup their capital expenditure on the H800s.

Current estimates are about 1.5-2 years, which not-so-suspiciously coincides with your toy example.


It's because they run 24/7 in a challenging environment. They will start dying at some point and if you aren't replacing them you will have a big problem when they all die en masse at the same time.

These things are like cars; they don't last forever and break down with usage. Yes, they can last 7 years in your home computer when you run them 1% of the time. They won't last that long in a data center where they are running 90% of the time.


A makeshift cryptomining rig is absolutely a "challenging environment" and most GPUs by far that went through that are just fine. The idea that the hardware might just die after 3 years' usage is bonkers.

Crypto miners undervolt GPUs for efficiency, and in general crypto mining is extremely lightweight on GPUs compared to AI training or inference at scale.

With good enough cooling they can run indefinitely! The vast majority of failures are either at the beginning due to defects or at the end due to cooling. The idea that hardware with no moving parts (except the HVAC) is somehow unreliable comes out of thin air!

Economically obsolete, not physically obsolete; I suspect this is in line with standard depreciation.

There are plenty on eBay? But at the end of your comment you say "a rate cheaper than new", so maybe you mean you'd love to buy a discounted one. They do seem to be available used.

> so maybe you mean you’d love to buy a discounted one

Yes. I'd expect 4 year old hardware used constantly in a datacenter to cost less than when it was new!

(And just in case you did not look carefully: most of the eBay listings are scams. The actual products pictured in them are A100 workstation GPUs.)


American foreign policy has been remarkably consistent over the last several decades.

Sometimes it’s the carrot and sometimes it’s the stick but the policies remain the same.


> Some quibbles

Two wars.


One war, in 1962, i.e. 64 years ago. Meanwhile, the US supported the genocidal regime of Pakistan as it killed Hindus and many more in what is now Bangladesh, sent billions more to Pakistan that were then used to fund terrorist activities in India, and some more recently under Covfefe.


"The Defence Ministry of India reported: 88 killed and 163 wounded on the Indian side, while 340 killed and 450 wounded on the Chinese side, during the two incidents.[6][7]

According to Chinese claims, the number of soldiers killed was 32 on the Chinese side and 65 on the Indian side in Nathu La incident; and 36 Indian soldiers and an 'unknown' number of Chinese were killed in the Cho La incident.[8] "

War might be overstating it a bit, "incident" might be more appropriate, but we can round up in the spirit of comity.

So adding it all up, the Chinese had 1-2 small foreign wars per decade in the 50s-70s, zero since 1979. It still doesn't justify the phrasing "threatening all their neighbors" in 2025, aside from Taiwan specifically.

In the case of the line of control with India, it's reached the point where they're having ritualized fistfights at high altitude for pride, that's just comical. It's not threatening.


Personnel die in those "fistfights", so not comical, but also not a war.

> And let’s be real, “car enthusiasts” are going to disappear in one or two generations.

Not sure if you have realized this, but we have a pretty decent number of horse enthusiasts now.


Sure, but compared to an era when horses were used as a practical form of transport the number is effectively zero. Horses are a novelty that wealthy people play with. ICE cars will go the same way.

I think we are going to be driving hybrids in large numbers for several decades until there's a new battery technology that triples energy density over the state of the art now.

It could be invented sooner, of course; technology prediction never really works.


Apart from Toyotas, hybrids are kind of unpopular precisely because they're a compromise. Not many people who do make the switch to EV go back.

Additional tipping points will come when cities start banning combustion engines on emissions grounds. Then gas stations start closing. After a while you get the reverse condition to EV range anxiety: having to drive further and further out of your way to fill up. Maybe you get a script-flipping service, an EV comes to the few remaining unconverted combustion vehicles with a small bowser of fuel.


But that is far in the future, long after my recently purchased ICE needs replacing.

The 1970s suggests that when gas stations start closing they may close a lot faster than people expect. Gas has a weird margin and is often something of a "loss leader" for convenience stores (or now full supermarkets). On the supply side it is capital-heavy and demand-driven in interesting ways that respond to demand drops with a "ratchet effect": many types of temporary shutdowns become permanent shutdowns, more expensive to restart than investors will have appetite for.

Gas logistics is fascinating with how many possible places exist for disruption (in the negative sense more than the tech sense) to cause domino chains.

Things have the possibility to get "very interesting" in a unique shutdown spiral. I don't know how soon we'll see it, or if we'll see it, but if it happens I don't think it is "far" in the future, even as the 1970s starts to feel too far out of cultural memory to use as an allegory.


This is nonsense. Hybrids are outselling EVs in the US.

Hybrid adoption in the US is soaring. It's doubled in just a few years. They're hugely popular in the US precisely because they're NOT a compromise.


They are hugely popular in the US because there are more of them for sale and they have a lot of momentum from Toyota getting lost in the Hydrogen distraction.

From the perspective of a BEV with modern range, hybrids have terrible all-electric range (if they have true electric range at all) and worse maintenance schedules/cost of ownership. That's the compromise: you give up the weight budget for batteries good enough for pure electric range, and take on the higher cost of ownership of heavy moving parts that you don't need on trips below electric range.


>and worse maintenance schedules/cost of ownership

Sorry, but this is simply incorrect. The Toyota Prius has the most reliable powertrain on the American market in most studies. This is the decades-old Toyota hybrid planetary-gearset engine and eCVT. It has fewer moving wear parts than an ICE engine, a generous warranty, absorbs brake wear, etc. It's pretty unambiguous at this point, so I'm not sure where you're sourcing your facts from (vibes?).

Given this fact wasn't understood, there isn't much more content to engage with. Modern hybrids are popular because they're very good and side-step all of the myriad problems with electric vehicles.


More reliable than the average ICE, statistically, I will grant you. Though I think that has more to do with Toyota dominating US auto manufacturing reliability statistics in general than with the particulars of hybrids over ICE. The comparison, however, was to a full ("uncompromised") BEV. A full BEV has far fewer moving parts than an ICE or hybrid. (Especially because good electric motors barely constitute "moving parts" when compared to an internal combustion engine, thanks to magnets.)

It doesn't matter that it "has fewer moving parts". It is the most reliable powertrain on the market, full stop, including full electric powertrains.

https://www.consumerreports.org/cars/car-reliability-owner-s...


Almost everyone I know with an EV also has another car right now.

Solid-state lithium batteries, which are starting to roll out now, are a big enough jump. They have 50-80% more capacity, which gives enough range for a day of driving. They charge faster, so they can charge during regular stops. And they aren't flammable.

Cars don't need triple the energy density; 600mi range and 15min charge is plenty. There is the cost factor, but batteries have gotten cheap enough that it isn't an issue. People talk about a lithium shortage, but it hasn't shown up.
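As a rough sanity check on how the claimed capacity bump maps to range (the 350-mile baseline is an assumption for illustration, not any specific car's spec, and this ignores weight and efficiency differences):

```python
# Assumed baseline: a typical current long-range EV at ~350 miles.
base_range_miles = 350

# Apply the claimed 50-80% capacity increase for solid-state cells
# directly to range.
low = base_range_miles * 1.5   # lower end of the claimed improvement
high = base_range_miles * 1.8  # upper end of the claimed improvement
print(f"{low:.0f}-{high:.0f} miles")
```

The upper end lands right around the 600-mile figure mentioned above, which is why the 50-80% number is the interesting threshold rather than a full tripling of energy density.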


> 600mi range and 15min charge is plenty.

Agreed.

So, it would be a matter of how fast the solid-state batteries you speak of make their way into actual cars. Right now, there aren't any.

And battery tech has always over-promised and under-delivered.


The smartest direction for someone manufacturing for the US market right now is probably EREVs, aka serial hybrids: pure electric drivetrain, onboard gas generator. Only use gas when you're taking the kids to Grandma's, or when some obstacle prevents you from charging normally. Ford seems to be going in this direction, possibly a little late.

I agree, this is likely the technology for the next few decades. Maybe with some supercapacitors or something like that thrown in.

Electric motors are far more efficient at producing mechanical energy than an engine+transmission combo is. But at the same time, batteries suck.


"If I asked what people wanted they'd have asked for a longer range horse." Henry Ford, probably.

Maybe you’re the kind of person who believes the glass is always full if you can make the glass arbitrarily small.

If I’m a glass enthusiast or glass-filling-liquid enthusiast, sure if the alternative is those things not existing!

This is the main difference between backpacking in the US vs backpacking in India/Nepal/Bhutan.

You just pack clothes; no matter how remote your destination is, there's going to be food and shelter available every 6-8 hours.


"Backpacking" in the US is conceptually and vernacularly different from trekking, not to belabor something you probably already know and aren't claiming. The guesthouses in these countries were also government-sponsored or owned outright, in my experience. There's an economic benefit in providing employment for the caretakers, and of course for foreign tourism and even local travelers.

Maybe highway rest-stops are the closest analog for the US but even many of those have been shuttered by governments driven to parsimony.


Not just guesthouses though, it’s pretty easy to find a place to sleep in small villages.

The word for it is ‘home-stay’, there are a few houses in every village that are set up to accommodate guests for a very reasonable amount of money.

And these villages are pretty much everywhere.

I have been lost in the Himalayas, and it was not that much work to walk down the river to a village.


Isn't that due to a different relationship to travel? Many people travel on foot, villagers passing through from one area to another, perhaps for the market, so it makes sense that there will be more opportunities for "walk-in" accommodations. In the US the expectation would be someone flying or driving long distances, or perhaps taking a bus, but not walking to sell produce at the regional market. And foreign travelers to the US are often people of some financial means, or are operating in specialized systems geared towards immigrants, like some of the "Mexican" coach services in some states.

I did find, like it seems you did, that I loved traveling through Nepal and the accommodations you've described. Remarkable and tough people living hard lives with resilient cheer.


Yes, walking is perhaps the primary factor here.

I have heard of people trying this sort of backpacking in Europe and South America but it seems to have involved a lot more planning than what you can get away with in the Himalayas.


That's actually true in India, because India has a huge population. So no matter where you are in India, you are never too far from civilization. Some people have backpacked across India with just a few essentials.

Nepal and Bhutan have smaller populations, so the density of humans vs nature is lower there.

But if you stick close to the hiking trails in this vast & beautiful subcontinent, you can backpack easily from one rest stop to another.

And after some hours of tough trekking, when you encounter the warm welcome, bonhomie, food & drinks, music & laughter from the natives/locals, it is an amazing experience, worthy to be cherished. Faith in humanity - restored!


Nepal, Bhutan and most of the Indian states with the Himalayan mountains have a very similar population density.

Yeah, true.

And those Himalayan mountains are breathtakingly beautiful!


>there’s going to be food and shelter available every 6-8hours

In Nepal? That sounds like a risky assumption to make.


Yes

Of course if you go completely off-trail for days all bets are off.

