
Yeah it does not make a whole lot of sense, as the useful lifespan of the GPUs is 4-6 years. Sooo what happens when you need to upgrade or repair?

This is a question that analysts don't even ask on earnings calls for companies with lowly earthbound datacenters full of the same GPUs.

The stock moves based on the same promise that's already unchecked without this new "in space" suffix:

We'll build datacenters using money we don't have yet, fill them with GPUs we haven't secured or even sourced, power them with infrastructure that can't be built in the promised time, and profit on their inference time over an ever-increasing (on paper) lifespan.


> This is a question that analysts don't even ask

On the contrary, data centers continue to pop up deploying thousands of GPUs specifically because the numbers work out.

The H100 launched at $30k per GPU and rented for $2.50/hr. It's been 3 years since launch, and the rental price is still around $2.50/hr.

During these 3 years, it has brought in $65k in revenue.
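Back-of-the-envelope (assuming 100% utilization at that flat rate, and ignoring power, cooling, and idle time):

    # Rough revenue math for one H100 over 3 years of rental.
    # Assumes 100% utilization and a flat $2.50/hr rate.
    price = 30_000            # launch price, USD
    rate = 2.50               # rental rate, USD/hr
    hours = 3 * 365 * 24      # three years of wall-clock hours
    revenue = rate * hours    # = $65,700
    print(f"revenue ${revenue:,.0f}, payback {revenue / price:.1f}x the sticker price")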


They worked out because there was an excess of energy and water to handle it.

We will see how the maths works out given there is a 19 GW shortage of power, a 7-year lead time for Siemens power turbines, and 3-5 years for transformers.

Raw commodity prices are shooting up, there isn't enough education to cover nuclear and SMEs, and the RoI is already underwater.


My cynical take is that it'll work out just fine for the data centers, but the neighbouring communities won't care for the constant rolling blackouts.

Okay, but even in that case the hardware suffers significant under-utilisation, which massively hits RoI. (I think I read they only achieve 30% utilisation in this scenario.)

Why would that be the case if we assume the grid prioritizes the data centers?

That is not a correct assumption. https://ig.ft.com/ai-power/

Reports in Northern Virginia and Texas are stating existing data centres are being capped at 30% to prevent residential brownouts.


That article appears to be stuck behind a paywall, so I can't speak to it.

That's good for now, but considering the federal push to prevent states from creating AI regulations, and the overall technological oligopoly we have going on, I wonder if, in the near future, their energy requirements might get prioritized. Again, cynical. Possibly making up scenarios. I'm just concerned when more and more centers pop up in communities with fewer protections.


Beyond GPUs themselves, you also have other costs such as data centers, servers and networking, electricity, staff and interest payments.

I think building and operating data center infrastructure is a high-risk, low-margin business.
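A rough sketch of why (every number below is an illustrative assumption, not a sourced figure). The interesting part is how the margin flips sign with utilization, which ties back to the 30% cap mentioned upthread:

    # Illustrative per-GPU annual P&L; all inputs are assumed placeholders.
    def annual_margin(utilization):
        revenue = 2.50 * utilization * 8760   # $2.50/hr rental
        power = 0.7 * 1.5 * 8760 * 0.08       # 700 W GPU, ~1.5x PUE, $0.08/kWh
        depreciation = 30_000 / 5             # assumed 5-year straight line
        other = 2_000                         # staff, networking, interest (guess)
        return revenue - power - depreciation - other

    print(annual_margin(0.70))  # ~ +$6,600 per GPU -- works
    print(annual_margin(0.30))  # ~ -$2,200 per GPU -- underwater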


They can run these things at 100% utilization for 3 years straight? And not burn them out? That's impressive.

Not really. GPUs are stateless, so their bounded lifetime, regardless of how much you use them, is essentially the lifetime of the shittiest capacitor on the board. Modulo a design or manufacturing defect, I'd expect a usable lifetime of at least 10 years, well beyond the manufacturer's desire to support the drivers for it (i.e. the software should "fail" first).

The silicon itself does wear out. Dopant migration or something, I'm not an expert. Three years is probably too low, but they do die. GPUs dying during training runs was a major engineering problem that had to be tackled to build LLMs.

> GPUs dying during training runs was a major engineering problem that had to be tackled to build LLMs.

The scale there is a little bit different. If you're training an LLM with 10,000 tightly-coupled GPUs where one failure could kill the entire job, then your mean time to failure drops by that factor of 10,000. What is a trivial risk in a single-GPU home setup would become a daily occurrence at that scale.
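A quick sketch of that scaling, assuming independent failures (the single-GPU MTTF below is an assumed figure for illustration):

    # With N tightly coupled GPUs where any single failure kills the job,
    # the job's mean time to failure is roughly the single-GPU MTTF / N.
    single_gpu_mttf_years = 5   # assumed for illustration
    n_gpus = 10_000
    cluster_mttf_hours = single_gpu_mttf_years * 8760 / n_gpus
    print(f"expect a failure roughly every {cluster_mttf_hours:.1f} hours")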


I don't see anything impressive here?

> the useful lifespan of the GPUs is 4-6 years. Sooo what happens when you need to upgrade or repair?

Average life of a Starlink satellite is around 4-5 years.


Starlink, yes, at 480 km LEO. But the article says "put AI satellites into deep space". Also, if you think about it, LEO orbits have dark periods, so not great.

A better orbit might be Sun-synchronous (SSO), which is around 705 km, still not "deep space" but reachable for maintenance or a short-life deorbit if that's the plan. https://science.nasa.gov/earth/earth-observatory/catalog-of-...

And of course there are the Lagrange points, which have no reason to deorbit; just keep using the old ones and adding newer ones.


damn. at this point it's not even about a pretense of progress, just a fetish for a very dirty space

They re-enter and burn up entirely. Old Starlinks don't stay in space.

So they pollute the upper atmosphere instead!

It's essentially a military network (which is why other power spheres want their own) and a way to feed money into SpaceX.

Same that happens with Starlink satellites that are obsolete or exhausted their fuel - they burn up in the atmosphere.

With zero energy cost it will run until it stops working or runs out of fuel, which I'm guessing is 5-7 years.

5 to 7 months, given they want 100 kW per ton and the magical mystery-sauce shielding is going to do shit all.

> Sooo what happens when you need to upgrade or repair?

The satellite deorbits and you launch the next one.


so, instead of recycling as many components as possible (a lot of these GPUs have valuable materials inside), you simply burn them up.

I'm guessing the next argument in the chain will be that we can mine materials from asteroids and such?


Such a waste of resources

not to mention that radiation hardening of chips has a big impact on cost and performance

You could immersion cool them and get radiation resistance as a bonus.

Yes, because launching them immersed in something that will greatly increase the launch weight will help...

A "fully and rapidly reusable" Starship would bring the cost of launch down orders of magnitude, perhaps to a level where it makes sense to send up satellites to repair/refuel other satellites.

A lot of these accounts seem anecdotal. I have a clean copy of Win 11 IoT LTSC running on my laptop and it runs well. The desktop management, included Hyper-V, WSL2, and awesome RDP make it a great platform to get work done. Most problems people encounter with Windows have to do with driver maturity. And in the case of a megacorp-managed machine, it's all the "security" BS they put on there that slows you down to a crawl. Once you get stable drivers, I find Windows 11, with WSL as my shell, to be quite nice.

Well yes, it is anecdotal. After all, it's my personal experience, which is, by definition, an anecdote. At what point did I suggest the exact types of bullshit Win11 exposes me to are exactly the same as what everyone else experiences?

Man that tech was cool and did you a solid.

Many techs went to work for the phone companies for a reason.

You can achieve the same thing with electronic voting. Just because it's electronic does not mean you do away with the "layers".

Are these assumptions wrong? If I 1) execute the AI as an isolated user, 2) behind a whitelist-only outbound and inbound firewall, 3) on an overlay file mount,

am I pretty much good to go, in the sense that it can't do something I don't want it to do?
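For what it's worth, a minimal sketch of point 1 on Linux (everything here is an assumption: a pre-created low-privilege "aiagent" account, the firewall whitelist and the overlay mount configured separately as root, and a hypothetical agent.py entry point):

    # Run the agent as an isolated, unprivileged user (Python 3.9+, POSIX;
    # the parent process needs root to drop privileges). The firewall and
    # overlayfs pieces are assumed to be set up separately.
    import subprocess

    subprocess.run(
        ["python3", "agent.py"],             # hypothetical entry point
        user="aiagent",                      # unprivileged account, no sudo
        cwd="/mnt/agent-overlay",            # writable upper layer of the overlay
        env={"HOME": "/mnt/agent-overlay"},  # keep it out of your real $HOME
        check=False,
    )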


Agreed. So much easier with a self-hosted runner. Just get out of your own way and do it. Use cases like caching etc. are also much more efficient on a self-hosted runner.


You could restrict the SSH port by IP as well.


He is absolutely right. The soap opera effect totally ruins the look of most movies. I still use a good old 1080p plasma on default settings. It always looks good.


It's funny, people complain about this, but I actually like smooth panning scenes over juddery ones that give me a headache trying to watch them. I go so far as to use software on my computer called SVP 4 that does this, but in a much better GPU-accelerated implementation. I'm never sure why people think smoothness means cheapness, except that they were conditioned to it.


Drives me insane when people say they can't tell the difference while watching with motion smoothing on. I feel for the filmmakers.


The soap opera effect drives me nuts. I just about can't watch something when it's on. It makes a multimillion dollar movie look like it was slapped together in an afternoon.


I watched the most recent Avatar and it was some HFR variant that had this effect turned up. It definitely dampens the experience. There's something about that slightly fuzzed movement that just makes things on screen look better.


from what I heard, the action scenes are shot in 48 fps and the others in 24 fps, or something along those lines. You might be talking about that?


My parents’ new TV adds a Snapchat-like filter to everything. Made Glenn Close look young instead of the old woman she’s supposed to be in Knives Out.

Turning it off was shocking. So much better. And it was buried several levels deep in a weirdly named setting.


Nope, nope, I can't watch 24-30 Hz without my eyes bleeding during camera pans.


The “non-exclusive” thing may come back to bite them. If another big player comes in to license the tech and gets “different” tech than Nvidia did, it opens up lawsuits. Also, this seems like it’s just a bet on time. The technology the head engineer invented will be replicated. But I guess that will take a while, and the money machine will print billions in margin while the dust settles.


I'm pretty sure Nvidia overpaid so that Groq can charge the same absurd price to the second customer, to whom the company's IP is worth maybe a billion or two.


As your parents age, you should convince them to transfer their assets into a trust where they still maintain control but withdrawals etc. can optionally require approval by a spouse or other family member. The trust has many other benefits but is especially good against fraud, as it can disassociate the holder’s identity from the assets and impose specific conditions for withdrawal. It can also provide a clean transfer of ownership in the event of a death, etc. I am sorry this happened to you; it is becoming more common in the US too. And all of these “companies” seem to establish bank accounts and addresses in Delaware…

