Maybe you don’t. But most other people do and so does Intel. It’s not good business to have a poor perf per watt chip in 2024. Everything from phones, laptops, to servers care very much about perf per watt except the very hardcore DIY niche that you might belong to.
Do you have experience with a modern high-wattage CPU during summer? Yes, a good cooler can make it work. But where does that heat end up? It gets blown out of the case and first heats your legs (depending on the desk, position relative to the wall, etc.), and then the entire room. It can be very noticeable, and not in a good way.
I have 2 saved profiles in my BIOS: one where the CPU is allowed to consume as much current as it wants, which I use from mid October to mid May, and one where the CPU is capped at 65W for the rest of the year.
I do something similar with my GPU, 75% cap in the summer, 105% cap in the winter.
I actually have a similar desktop next to my legs, only it has the Xeon version with twice the cores. It's an absolute PITA in the summer with no A/C. It's also quite noisy when the temperature in the room reaches 27 °C.
None of those are a problem on a desktop PC, only on shitty laptops.
My post specifically states the perf per watt advantage on laptops, phones, any small device, servers. I also mentioned this advantage being less on hardcore DIY computers.
Do you realise how big the PC gaming sector is these days? High-performing desktop chips are not for the hardcore DIY enthusiast market anymore. There are now millions of gamers buying off-the-shelf PCs with the highest-spec components as standard.
The DIY desktop market is smaller than ever. You can see this in the number of discrete GPU sales, which has drastically declined over the last 20 years[0], save for a few crypto booms.
Gaming laptops are now more popular than gaming desktops.[1]
Looks like gaming laptop vs gaming desktop sales are roughly the same:
"In terms of market share, according to a 2020 report by Statista, Notebooks / laptops accounted for 46.8% of the global personal computer market, while desktop PC made up 40.6% of the market. The remaining market share was made up of other devices such as tablets and workstations."
I'm upvoting this to counter the downvotes because, unfortunately, normal people don't know.
Specifically, normal people don't know what "watt" is. Seriously. There is a reason electrician is a skilled profession. Most of us here do know watts and the like, so it's easy to forget that normal people aren't like us.
Normal people understand watts.
Because they know what an electric heater is.
And using 2x to 3x more electricity means more heat in their room.
Also many countries have smart electricity meters with in home units which tell them exactly how many watts are currently being consumed and how much that costs them.
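The arithmetic those in-home units are doing is simple enough to sketch; here's a minimal version, assuming an illustrative electricity price of 0.30 per kWh (plug in your own tariff):

```python
# Rough cost arithmetic for a constant power draw.
# Every watt consumed ends up as heat in the room, so the same
# numbers describe both the electricity bill and the heating effect.
# The 0.30/kWh price is an illustrative assumption, not a real tariff.

def running_cost(watts: float, hours: float, price_per_kwh: float = 0.30) -> float:
    """Cost of running a constant load for the given hours."""
    kwh = watts * hours / 1000.0
    return kwh * price_per_kwh

# A CPU capped at 65 W vs. one averaging 200 W, 8 hours a day for 30 days:
capped = running_cost(65, 8 * 30)
uncapped = running_cost(200, 8 * 30)
print(f"capped: {capped:.2f}, uncapped: {uncapped:.2f}")  # 4.68 vs 14.40
```

The same 3x ratio shows up in the heat dumped into the room, which is the point being made above.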
I’m going to push back on this with a simple example. Go to your local hardware store and check out the electric space heater section. There will be a wide variety of units rated for small rooms, medium rooms, and large rooms, based on square footage. The heaters will have a variety of form factors and physical dimensions. Many of them will have mentions of “eco” and “efficiency”. Every single one of them, and I mean literally, will be a 1500W heater (or whatever your region’s equivalent maximum load per plug is which may vary in 240v countries). Exact same wattage, all 100% efficient because their only job is to produce heat, with wildly different dimensions and text on the box. Customers will swear up and down about the difference between these units.
I had to do this after my gas costs went well above my electric costs. Maybe you are in a country/area where your hardware store doesn’t supply a variety of heaters, but at my local store, no two models were the same wattage.
Do they? Even I have problems with this nowadays, because they write 60W, but it's an LED, and it acts like a 60W bulb without actually being 60W. "60W" is more like branding.
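The "60W" on an LED box is a light-output equivalence claim, not the actual draw. A back-of-the-envelope sketch, using typical ballpark efficacy figures (these are assumptions, not from any spec sheet):

```python
# "60 W equivalent" refers to light output (lumens), not power drawn.
# Efficacy values below are rough typical figures, assumed for illustration.

INCANDESCENT_LM_PER_W = 14   # ~efficacy of a classic incandescent bulb
LED_LM_PER_W = 90            # ~efficacy of a modern LED bulb

lumens = 60 * INCANDESCENT_LM_PER_W   # light a real 60 W incandescent gives
led_watts = lumens / LED_LM_PER_W     # power an LED needs for the same light
print(f"{lumens} lm; the LED draws about {led_watts:.1f} W")
```

So the "60W" LED is actually pulling somewhere around 9 W, which is exactly why the label is confusing as a statement about power.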