Hacker News | hurfdurf's comments

No idea if the link was pointing to the same message, but there is this:

"However, that post was subsequently removed as well because that image contained PII on a professional Rainbow Six Siege player who had been targeted by Bad Actors who wanted to steal his account."

https://nitter.net/vxunderground/status/2005754035928797672


UK and US seem to have death + 70 years.

Don't need to guess, the company name is mentioned in the second line of the linked posting.

My point is: they could have left it out. There are not that many manufacturers of insulin pumps and there is only one that the title could have conceivably applied to.

Dupe from just a couple of hours ago, which quickly fell off the frontpage?

https://news.ycombinator.com/item?id=46389444

397 points 9 hours ago | 349 comments


Interestingly, there was no pushback in the prior thread on Rob's environmental claims. This leads me to believe most HNers took them at face value.

Umm... are they not correct?

The energy demands of existing and planned data centres are quite alarming.

The enormous quantity of quickly depreciating hardware is freaking out finance people; the waste aspect of that is alarming too.

What is your "push back"?


Happy to provide. I will say that literally all of these sources are already available in this HN thread, but it's hard to find them and many of the comments are downvoted. So here you go:

This link has a great overview of why generative AI is not really a big deal in environmental terms: https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

GenAI is dramatically lower impact on the environment than, say, streaming video is. But you don't see anywhere near the level of environmental vitriol for streaming video as for AI, which is much less costly.

The European average is 56 grams of CO2 emissions per hour of video streaming. For comparison: driving 100 meters emits about 22 grams of CO2.

https://www.ndc-garbe.com/data-center-how-much-energy-does-a...

80 percent of the electricity consumption on the Internet is caused by streaming services

Telekom needs the equivalent of 91 watt-hours for a gigabyte of data transmission.

An hour of video streaming in 4K quality needs more than three times as much energy as an HD stream, according to the Borderstep Institute. On a 65-inch TV, it causes 610 grams of CO2 per hour.

https://www.handelsblatt.com/unternehmen/it-medien/netflix-d...
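As a quick sanity check on those figures (the gram values are taken from the quoted sources above; the arithmetic is mine):

```python
# Cited figures, in grams of CO2 (from the sources quoted above,
# not independently verified).
streaming_avg_g_per_h = 56   # European average, per hour of streaming
driving_g_per_100m = 22      # per 100 meters driven
tv_4k_g_per_h = 610          # 4K stream on a 65-inch TV, per hour

# One hour of average streaming equals roughly this many meters of driving:
equiv_driving_m = streaming_avg_g_per_h / driving_g_per_100m * 100
print(round(equiv_driving_m))  # ~255 meters

# The big-TV 4K figure relative to the European average:
print(round(tv_4k_g_per_h / streaming_avg_g_per_h, 1))  # ~10.9x
```

So by these numbers, an hour of average streaming is comparable to driving about a quarter of a kilometer.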

Here is another helpful link with calculations going over similar things: https://nationalcentreforai.jiscinvolve.org/wp/2025/05/02/ar...


I won't ask about or speak to the overall message of your first link, I'm interested to digest it further for my own benefit. This is striking to me though:

    Throughout this post I’ll assume the average ChatGPT query uses 0.3 Wh of energy, about the same as a Google search used in 2009.
Obviously that's roughly one kilowatt for one second. I distinctly recall Google proudly proclaiming at the bottom of the page that its search took only x milliseconds. Was I using tens to hundreds of kW every time I searched something? Or did most of the energy usage come during indexing/scraping? Or is there another explanation?
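One way to reconcile the numbers, as a sketch: the milliseconds Google displayed were wall-clock latency for a query fanned out across many machines in parallel, so the per-query energy is summed over all of them. Dividing total energy by latency gives a large but meaningless "power" figure (the 0.2 s latency below is a made-up example):

```python
# 0.3 Wh expressed in joules, and the misleading power figure you get
# by dividing that energy by a single page-load latency.
energy_wh = 0.3
energy_j = energy_wh * 3600   # 1 Wh = 3600 J
print(energy_j)               # ~1080 J, i.e. ~1 kW sustained for ~1 second

latency_s = 0.2               # hypothetical "0.2 seconds" shown on the page
print(energy_j / latency_s)   # ~5400 W -- but that energy is spread across
                              # many machines working in parallel, not the
                              # draw of one server for 0.2 seconds
```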

One thing I would always keep in mind is that transmission of information is more expensive than localized computation on the information.

Streaming 4k video is several orders of magnitude more bandwidth intensive than UTF8 text at human rates. The fact that inference is so much more expensive than an amortized encoding of a video might actually wash out in the end.
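Rough numbers for that bandwidth comparison (the 25 Mbit/s 4K bitrate and the reading speed are my assumptions, not from any source in this thread):

```python
import math

# Assumed rates -- illustrative, not measured.
video_4k_bps = 25_000_000   # ~25 Mbit/s, a common 4K streaming recommendation
words_per_min = 250         # brisk human reading speed
bytes_per_word = 6          # ~5 ASCII letters plus a space

# Text consumed at human reading pace, in bits per second:
text_bps = words_per_min * bytes_per_word * 8 / 60
print(text_bps)  # 200.0 bit/s

# How many orders of magnitude apart the two rates are:
print(round(math.log10(video_4k_bps / text_bps)))  # ~5
```

About five orders of magnitude, so "several orders of magnitude" holds up under these assumptions.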


I think you miss the point. I looked at that Substack, and it is way off point.

It is the training of models, is it not, that requires huge quantities of electricity, and that is already driving up prices for consumers.

OpenAI (that name is Orwellian) wants 25 GW over five years, if memory serves. That is not for powering ChatGPT queries.

Also, the huge waste of gazillions of dollars spent on computer gear (in data centres) that will probably depreciate to zero in less than five years.

This is a useful technology, but a whole lot of greed heads are riding it to their doom. I hope they do not take us on their ride


It's still a small portion of total energy use, and it is already powering a lot of things. And to compare other uses accurately, you'd also have to consider the cost of creating those things (like producing a full TV show, or car manufacturing, etc.).

> It's still a small portion of total energy use, and it is already powering a lot of things.

    data centers consumed about 4.4% of total U.S. electricity in 2023 and are expected to consume approximately 6.7 to 12% of total U.S. electricity by 2028.
https://www.energy.gov/articles/doe-releases-new-report-eval...
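To turn those percentages into absolute figures (the ~4,000 TWh total for annual U.S. electricity consumption is my assumption for the sketch, not a number from the DOE report):

```python
# Assumed total annual U.S. electricity consumption, in TWh.
us_total_twh = 4000

share_2023 = 0.044                         # 4.4% in 2023 (DOE figure)
share_2028_low, share_2028_high = 0.067, 0.12  # 6.7-12% projected for 2028

print(us_total_twh * share_2023)           # ~176 TWh in 2023
print(us_total_twh * share_2028_low,
      us_total_twh * share_2028_high)      # ~268 to ~480 TWh by 2028
```

On that assumption, the projected 2028 range is roughly two to three times the 2023 figure.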

> 397 points, 349 comments

Probably hit the flamewar filter.



That's saying something a bit different, though yeah I agree.

And btw, I have "auto-correct" disabled and this stupid bug still happens. Which is to say, yes, I agree, Apple is user hostile.


So just another case of getting proper support only if you make a big enough splash on social media and news outlets.

>The Cupertino, California-based tech giant collected 85% of operating profit and 48% of revenue from smartphone sales over the course of the year.

https://www.bloomberg.com/news/articles/2023-02-03/iphone-gr...

>However, it outperformed a struggling smartphone market in terms of shipment, revenue and operating profit growth, in turn achieving its highest-ever shares of 18%, 48% and 85% in these metrics respectively, in 2022.

https://counterpointresearch.com/en/insights/2022-global-sma...

Almost three-year-old numbers, but not all that much should have changed since then.


Yes, but that was the case in past years too, where they kept the lead.


I like your "glass half full" way of thinking.


>you can re-enable this by making a purchase on the Microsoft Store.

That is just a software decoder running on the CPU. HP and Dell are effectively killing any way to hardware-decode (and encode?) HEVC on these models, which is something you want on a power- and heat-limited device such as a laptop.


I understood the article to say that it was a driver that enabled the hardware and not a software decoder.

