Not at all. I'm comparing two different syntaxes that can reliably compile to the same machine code. A syntax that produces non-deterministic results is a completely different matter.
I know what you mean, and it was correct a year or more ago. Now, you are wrong.
AI is now reliable enough not to mess up this translation, especially if you configure it right (top-p and temperature parameters).
This abstraction lifting is absolutely happening in front of our eyes now. For the exact same reason the C for loop is less readable.
The difference is that you don't yet store the prompts, just the generated code. That difference is not going to last much longer. Storing prompts and contexts along with generated code is likely how we are going to be doing software engineering for a few decades, until whatever the next leap in technology turns out to be.
You could just lock the seed to get "deterministic" behaviour, but you are missing the point of programming languages completely. Programming languages are a set of rules that ~guarantee predictable behaviour down to the bit level. If you were to try to recreate that with LLMs, you run into two problems.

One: your LLM is now a programming language where you have to put exact, specific inputs in to get the correct outputs, except you have no specification for the inputs and you don't know what the correct outputs even look like, because you aren't a software engineer.

Two: even with a locked seed, the LLM will still output different code based on the exact order of letters in your prompt, and any change in that will change the output. Compilers can execute a variety of optimizations on your code that change the output, but in the end they are still bound to hard rules, and you can learn those rules when the output does not match your expectations. And if there is a bug in the compiler's execution of the rules, you can fix it; that is not possible with an LLM.
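The determinism asymmetry is easy to demonstrate with a toy stand-in for a model (a sketch, not a real LLM: the "logits" here are just hash bytes, chosen so that any change to the prompt perturbs the whole distribution):

```python
import hashlib
import random

VOCAB = ["for", "while", "i", "range", "print"]

def toy_next_token_logits(prompt: str) -> list[float]:
    # Stand-in for a real model: derive pseudo-logits from a hash of the
    # prompt, so a one-character change shifts the whole distribution.
    digest = hashlib.sha256(prompt.encode()).digest()
    return [digest[i] / 255.0 for i in range(len(VOCAB))]

def generate(prompt: str, n: int, temperature: float, seed: int) -> list[str]:
    rng = random.Random(seed)  # the "locked seed"
    text, out = prompt, []
    for _ in range(n):
        logits = toy_next_token_logits(text)
        if temperature == 0.0:
            token = VOCAB[logits.index(max(logits))]  # greedy: deterministic
        else:
            token = rng.choices(VOCAB, weights=[l / temperature for l in logits])[0]
        out.append(token)
        text += " " + token
    return out

a = generate("sum the list", n=5, temperature=0.0, seed=1)
b = generate("sum the list", n=5, temperature=0.0, seed=1)
c = generate("sum the list.", n=5, temperature=0.0, seed=1)  # one extra character
assert a == b  # locked settings reproduce the run bit-for-bit...
# ...but c will almost certainly diverge from a, because every subsequent
# step conditions on the perturbed prompt.
```

Locking seed and temperature buys reproducibility, not a specification: the mapping from prompt to output remains opaque, which is the point above.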
This talk about replacing software engineering by people who have no idea what software engineering is gets unbelievably tedious. The advent of Javascript did nothing to replace software engineers, it just created an entirely new class of developer. It lowered the barrier to entry and allowed anybody to write inefficient, bloated, buggy, and insecure programs. Our hardware is advanced enough that for many trivial applications there is sufficient overhead for inefficient and bloated programs to exist and be "good enough" (although they are causing untold damage in the real world with security breach after security breach). However, lowering the barrier to entry does not replace the existing engineers. You still need a real software engineer to develop novel applications that use the hardware efficiently. The Duchies of Javascript and Python are simply a new country founded adjacent to, and depending upon, the Software Engineer Kingdom. Now a new duchy is being founded, one that lowers the barrier to entry further to make even more inefficient, even more bloated, even more buggy, and even more insecure programs easier than ever. For some use cases, these programs will be good enough. But they will never replace the use cases that require serious engineering, just as Javascript never did.
> Programming languages are a set of rules that ~guarantee predictable behaviour down to the bit level.
No, that’s your interpretation of what a programming language is, based on nostalgia and wishful thinking.
If you get a programming system that sacrifices these and works better, people are going to use those.
FYI I am a compiler developer who has contributed to about three widely used compilers, I assure you I understand what programming languages and compilers are.
> If you get a programming system that sacrifices these and works better, people are going to use those.
Sure. That's a very large "if", though, one that evaluates to false and will continue evaluating to false for the foreseeable future. I am told over and over that I am being replaced and yet I have not seen one singular example of a real-world application of vibe coding that replaces existing software engineering. For starters, where is the vibe-coded replacement for Clang? For Linux? For IDEs? For browsers? I don't mean a concept of an idea of a browser, of course. To make the claim that the new programming system works better than the old system, it must produce something that is actually superior to what people currently use. LLMs are clearly completely and totally incapable of this. What they are capable of is making inferior software with a lower barrier to entry. Superior software is completely off the table, and it is why we don't see any existing software being replaced at scale, even though existing software absolutely has flaws and room for improvement that a "better programming system" would be able to seize upon if it existed.
That you can confidently assert that I am wrong, while having zero real-world evidence of superior software engineering produced by the new system, is indicative of a certain level of cultish thinking that is overtaking the world. Over and over and over again, people keep making these grand claims promising the world is changed, and yet there is no tangible presence of this in reality. You simply demand belief that it will change, as though LLMs are a new religion.
> Storing prompts and contexts along with generated code is likely how we are going to be doing software engineering for a few decades
You specifically asserted that prompts would be the future of software engineering. If my memory is not mistaken, the comment was later edited to hedge with "likely", which was not there when it was originally written?
We'll have to leave it here then. I am fairly confident you edited the comment after the fact, but I cannot prove it, so further discussion is fruitless.
A probable implementation is that you bootstrap the initial key exchange using web PKI (if you want to talk to Alice@example.com then your client makes a TLS connection to example.com and asks for Alice's public key) and thereafter you use something like the Signal ratchet thing.
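A minimal sketch of that flow, with a plain dict standing in for the web-PKI-authenticated lookup against example.com (the address and key bytes are hypothetical), and a simplified hash-chain ratchet rather than the full Signal double ratchet, which also mixes in fresh Diffie-Hellman outputs:

```python
import hashlib
import hmac

# Stand-in for "connect to example.com over TLS and ask for Alice's key".
# In a real client this would be an HTTPS request whose authenticity
# rests on web PKI; here it is just a lookup table.
KEY_DIRECTORY = {"alice@example.com": b"alice-public-key-bytes"}

def bootstrap(address: str) -> bytes:
    domain = address.split("@", 1)[1]  # the host the client would dial
    assert domain == "example.com"
    return KEY_DIRECTORY[address]

def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    # One symmetric ratchet step: derive a per-message key and advance
    # the chain. Old chain keys can be deleted, giving forward secrecy.
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain

# Both sides derive the same initial chain from the bootstrapped key
# material, then ratchet forward once per message.
chain = hashlib.sha256(bootstrap("alice@example.com")).digest()
k1, chain = ratchet_step(chain)
k2, chain = ratchet_step(chain)
assert k1 != k2  # each message gets a fresh key
```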
No amount of what-ifs and buts is going to change the fact that the tech is now mature enough to build software in hours that previously needed man-years.
At least it’s not among the thousand AI note transcribing startups that have sprung up in the last two years.
I don’t know if it’s something to do with the recommendation algorithm, but my LinkedIn is constantly full of those, and my career has nothing to do with this at all.
"Create a product around a problem you know well," they said, and I'm thinking, as I'm typing into my notes/journal: "what could I build that would have hype and is close to something I know?"
Probably every note taker has had the thought "What if I did it better?" at one point or another.
I do not know what benchmarks you have in mind.
All the benchmarks that I have ever seen published about AVX-512 vs. AVX2 on AMD Zen 4 have shown better performance on AVX-512. Very frequently, the AVX-512 performance was not a little better, but much better, despite the fact that on Zen 4 both program versions use exactly the same execution resources (but AVX-512 programs have fewer instructions, which avoids front-end bottlenecks).
For the simplest cases it will be about the same speed as AVX2, but if you're trying to do anything fancy, the extra registers and instructions are a godsend.
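The front-end point above is just arithmetic: on Zen 4 both paths retire through the same 256-bit units (AVX-512 ops are double-pumped), but an AVX-512 instruction covers twice as many elements, so the loop issues half as many instructions for the decoder to chew through. A back-of-envelope sketch:

```python
# Streaming over N 32-bit floats, one vector op per register-width chunk.
N = 1_000_000
lanes_avx2 = 256 // 32    # 8 floats per 256-bit ymm register
lanes_avx512 = 512 // 32  # 16 floats per 512-bit zmm register

ops_avx2 = N // lanes_avx2      # 125_000 vector instructions to issue
ops_avx512 = N // lanes_avx512  # 62_500 vector instructions to issue
assert ops_avx2 == 2 * ops_avx512  # half the front-end work per element
```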
I agree that there is a lot of hyperbole thrown around here, and it's possible to keep using some hardware for a long time or to sell it and recover some cost. But my experience planning compute at large companies is that spending money on hardware upgrades can often save money long term.
Even assuming your compute demands stay fixed, it's possible that a future generation of accelerator will be sufficiently more power- and cooling-efficient for your workload that upgrading is a positive return on investment, more so when you take into account that you can start depreciating the new hardware again.
If your compute demands aren't fixed, you have to work around limited floor space, electricity, cooling capacity, network capacity, backup generators, etc., and so moving to the next generation is required to meet demand without extremely expensive (and often slow) infrastructure projects.
Sure, but I don't think most people here are objecting to the obvious "3 years is enough for enterprise GPUs to become totally obsolete for cutting-edge workloads" point. They're just objecting to the rather bizarre notion that the hardware itself might physically break in that timeframe. Now, it would be one thing if that notion was supported by actual reliability studies drawn from that same environment - like we see for the Backblaze HDD lifecycle analyses. But instead we're just getting these weird rumors.
I agree that is a strange notion that would require some evidence, and I do see it in some other threads. But reading up through the parent comments, people seem to be discussing economic usefulness, so that is what I'm responding to.
A toy example: NeoCloud Inc builds a new datacenter full of the new H800 GPUs. It rents out a rack of them for $10/minute while paying $6/minute for electricity, interest, loan repayment, rent and staff.
Two years later, the H900 is released for a similar price but performs twice as many TFLOPs/watt. Now any datacenter using H900s can offer the same performance as NeoCloud Inc at $5/minute, taking all their customers.
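Rough numbers for the toy example (the cost split between electricity and fixed costs is my assumption; the comment only gives the $6/minute total):

```python
# NeoCloud with H800s: $10/min revenue against $6/min total costs.
rent_h800 = 10.0
margin_h800 = rent_h800 - 6.0  # $4/min profit

# Suppose, hypothetically, $4 of the $6 is electricity and $2 is fixed
# (interest, loan repayment, rent, staff). H900 doubles TFLOPs/watt, so
# electricity per unit of compute roughly halves.
electricity, fixed = 4.0, 2.0
cost_h900 = electricity / 2 + fixed  # $4/min
rent_h900 = 5.0                      # undercuts NeoCloud's $10/min
margin_h900 = rent_h900 - cost_h900  # $1/min, still profitable

assert margin_h900 > 0 and rent_h900 < rent_h800
```

The point survives other splits: as long as a meaningful share of cost scales with power, the newer part can undercut while the old fleet's fixed costs keep running.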
It's because they run 24/7 in a challenging environment. They will start dying at some point and if you aren't replacing them you will have a big problem when they all die en masse at the same time.
These things are like cars, they don't last forever and break down with usage. Yes, they can last 7 years in your home computer when you run it 1% of the time. They won't last that long in a data center where they are running 90% of the time.
A makeshift cryptomining rig is absolutely a "challenging environment" and most GPUs by far that went through that are just fine. The idea that the hardware might just die after 3 years' usage is bonkers.
Crypto miners undervolt GPUs for efficiency, and in general crypto mining is extremely lightweight on GPUs compared to AI training or inference at scale.
With good enough cooling they can run indefinitely! The vast majority of failures are either at the beginning due to defects or at the end due to cooling. The idea that hardware with no moving parts (except the HVAC) is somehow unreliable is coming out of thin air!
There’s plenty on eBay? But at the end of your comment you say “a rate cheaper than new” so maybe you mean you’d love to buy a discounted one. But they do seem to be available used.
One War. In 1962, i.e. 64 years ago. Meanwhile, the US supported the genocidal regime of Pakistan as it killed Hindus and many others in what is now Bangladesh, and sent billions more to Pakistan, which were then used to fund terrorist activities in India, plus some more recently under Covfefe.
"The Defence Ministry of India reported: 88 killed and 163 wounded on the Indian side, while 340 killed and 450 wounded on the Chinese side, during the two incidents.[6][7]
According to Chinese claims, the number of soldiers killed was 32 on the Chinese side and 65 on the Indian side in Nathu La incident; and 36 Indian soldiers and an 'unknown' number of Chinese were killed in the Cho La incident.[8] "
War might be overstating it a bit, "incident" might be more appropriate, but we can round up in the spirit of comity.
So adding it all up, the Chinese had 1-2 small foreign wars per decade in the 50s-70s, zero since 1979. It still doesn't justify the phrasing "threatening all their neighbors" in 2025, aside from Taiwan specifically.
In the case of the line of control with India, it's reached the point where they're having ritualized fistfights at high altitude for pride, that's just comical. It's not threatening.
Sure, but compared to an era when horses were used as a practical form of transport the number is effectively zero. Horses are a novelty that wealthy people play with. ICE cars will go the same way.
I think we are going to be driving hybrids in large numbers for several decades until there's a new battery technology that triples energy density over the state of the art now.
It could be invented sooner, of course; technology prediction never really works.
Apart from Toyotas, hybrids are kind of unpopular precisely because they're a compromise. Not many people who do make the switch to EV go back.
Additional tipping points will come when cities start banning combustion engines on emissions grounds. Then gas stations start closing. After a while you get the reverse condition to EV range anxiety: having to drive further and further out of your way to fill up. Maybe you get a script-flipping service, an EV comes to the few remaining unconverted combustion vehicles with a small bowser of fuel.
The 1970s suggests that when gas stations start closing they may close a lot faster than people expect. Gas has a weird margin and is often something of a "loss leader" to convenience stores (or now full supermarkets). On the supply side it is capital-heavy and demand-driven in interesting ways that will respond to demand drops in terms of a "ratchet effect", in which many types of temporary shutdowns will result in permanent shutdowns that will be more expensive to restart than there should be investment interest.
Gas logistics is fascinating with how many possible places exist for disruption (in the negative sense more than the tech sense) to cause domino chains.
Things have the possibility to get "very interesting" in a unique shutdown spiral. I don't know how soon we'll see it, or if we'll see it, but if it happens I don't think it is "far" in the future, even as the 1970s starts to feel too far out of cultural memory to use as an allegory.
They are hugely popular in the US because there are more of them for sale and they have a lot of momentum from Toyota getting lost in the Hydrogen distraction.
From the perspective of a BEV with modern range, hybrids have terrible all-electric range (if they have true electric range at all) and worse maintenance schedules/cost of ownership. That's the compromise: less weight devoted to good batteries for pure electric range, and higher cost of ownership for heavy moving parts that you don't need on trips below electric range.
>and worse maintenance schedules/cost of ownership
Sorry, but this is simply incorrect. The Toyota Prius has the most reliable powertrain on the American market in most studies. This is the decades-old Toyota hybrid planetary-gearset engine and eCVT. It has fewer moving wear parts than an ICE engine, a generous warranty, absorbs brake wear, etc. It's pretty unambiguous at this point, so I'm not sure where you're sourcing your facts from (vibes?).
Given this fact wasn't understood, there isn't much more content to engage with. Modern hybrids are popular because they're very good and side-step all of the myriad problems with electric vehicles.
More reliable than the average ICE, statistically, I will grant you. Though I think that has more to do with Toyota dominating US auto reliability statistics in general than with any particulars of hybrids over ICE. The comparison, however, was to a full ("uncompromised") BEV. A full BEV has far fewer moving parts than an ICE or hybrid. (Especially because good electric motors barely constitute "moving parts" when compared to an internal combustion engine, thanks to magnets.)
Solid-state lithium batteries, which are starting to roll out now, are a big enough jump. They have 50-80% more capacity, which gives enough range for a day of driving. They charge faster, so they can charge during regular stops. And they aren't flammable.
Cars don't need triple energy density; 600 mi range and a 15-minute charge is plenty. There is a cost factor, but batteries have gotten cheap enough that it isn't an issue. People talk about a lithium shortage, but it hasn't shown up.
The smartest direction for someone manufacturing for the US market right now is probably EREVs aka serial hybrids - pure electric drivetrain, onboard gas generator. Only use gas when you're taking the kids to Grandma's, or when some obstacle prevents you from charging normally. Ford seems to be going in this direction, possibly a little late.
"Backpacking" in the US is conceptually and vernacularly different from trekking, not to argue something you probably know already and aren't claiming. The guesthouses in these countries were also government-sponsored or owned outright, in my experience. There's an economic benefit to providing employment for the caretakers and of course for foreign tourism and even local travelers.
Maybe highway rest-stops are the closest analog for the US but even many of those have been shuttered by governments driven to parsimony.
Isn't that due to a different relationship to travel? With many traveling on foot, villagers passing through from one area to another, perhaps for the market, it makes sense that there would be more opportunities for "walk-in" accommodations. In the US the expectation would be someone flying or driving long distances, or perhaps taking a bus, but not walking to sell produce at the regional market. And foreign travelers to the US are often people of some financial means, or are operating in specialized systems geared towards immigrants, like some of the "mexican" coach services in some states.
I did find, like it seems you did, that I loved traveling through Nepal and the accommodations you've described. Remarkable and tough people living hard lives with resilient cheer.
I have heard of people trying this sort of backpacking in Europe and South America but it seems to have involved a lot more planning than what you can get away with in the Himalayas.
That's actually true in India, because India has a huge population. So no matter where you are in India, you are never too far from civilization. Some people have backpacked across India with just a few essentials.
Nepal and Bhutan have smaller populations, so the density of humans relative to nature is lower there.
But if you stick close to the hiking trails in this vast & beautiful subcontinent, you can backpack easily from one rest stop to another.
And after some hours of tough trekking, when you encounter the warm welcome, bonhomie, food and drinks, music and laughter of the locals, it is an amazing experience, worth cherishing. Faith in humanity - restored!
LLM(eat apples in fruitbasket)
vs
foreach (apple in fruitbasket) apple.Eat()
Your comment can be repeated almost word for word here.