This is a question that analysts don't even ask on earnings calls for companies with lowly earthbound datacenters full of the same GPUs.
The stock moves based on the same promise that's already unchecked without this new "in space" suffix:
We'll build datacenters using money we don't have yet, fill them with GPUs we haven't secured or even sourced, power them with infrastructure that can't be built in the promised time, and profit on their inference time over an ever-increasing (on paper) lifespan.
My cynical take is that it'll work out just fine for the data centers, but the neighbouring communities won't care for the constant rolling blackouts.
Okay, but even in that case the hardware suffers significant under-utilisation, which massively hits RoI. (I think I read they only achieve 30% utilisation in this scenario.)
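A rough back-of-the-envelope of why utilisation hits RoI so hard (the prices and lifetime below are made-up but plausible assumptions, not figures from the article): the hardware cost is fixed, so the effective cost of each useful GPU-hour scales as 1/utilisation.

```python
# Rough RoI sensitivity to utilisation. Hardware cost is fixed, so the
# effective cost per *useful* GPU-hour scales as 1 / utilisation.
# All numbers are illustrative assumptions, not sourced figures.

gpu_cost = 30_000            # assumed purchase price per GPU, USD
lifetime_hours = 5 * 8_760   # assumed 5-year service life

for utilisation in (0.9, 0.3):
    useful_hours = lifetime_hours * utilisation
    print(f"{utilisation:.0%} utilisation: "
          f"${gpu_cost / useful_hours:.2f} per useful GPU-hour")

# Going from 90% down to 30% utilisation triples the effective
# hardware cost of every GPU-hour actually delivered.
```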
That article appears to be stuck behind a paywall, so I can't speak to it.
That's good for now, but considering the federal push to prevent states from creating AI regulations, and the overall technological oligopoly we have going on, I wonder if, in the near future, their energy requirements might get prioritized. Again, cynical. Possibly making up scenarios. I'm just concerned when more and more centers pop up in communities with fewer protections.
Not really. GPUs are stateless, so your bounded lifetime, regardless of how much you use them, is essentially the lifetime of the shittiest capacitor on the board. Modulo a design or manufacturing defect, I'd expect a usable lifetime of at least 10 years, well beyond the manufacturer's desire to support the drivers for it (i.e. the software should "fail" first).
The silicon itself does wear out. Dopant migration or something; I'm not an expert. Three years is probably too low, but they do die. GPUs dying during training runs was a major engineering problem that had to be tackled to build LLMs.
> GPUs dying during training runs was a major engineering problem that had to be tackled to build LLMs.
The scale there is a little bit different. If you're training an LLM on 10,000 tightly-coupled GPUs where one failure can kill the entire job, then the cluster's mean time to first failure drops by that factor of 10,000. What is a trivial risk in a single-GPU home setup becomes a daily occurrence at that scale.
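To put rough numbers on it (both figures below are illustrative assumptions, not measured reliability data):

```python
# Back-of-the-envelope: how per-GPU reliability scales to a tightly
# coupled training cluster. Both numbers are illustrative assumptions.

per_gpu_mtbf_hours = 5 * 365 * 24   # assume one GPU fails every ~5 years
num_gpus = 10_000                   # GPUs that must all stay up at once

# With independent failures, the cluster's mean time to *first* failure
# is roughly the per-GPU MTBF divided by the number of GPUs.
cluster_mttf_hours = per_gpu_mtbf_hours / num_gpus

print(f"One GPU: a failure every ~{per_gpu_mtbf_hours / 8_760:.0f} years")
print(f"{num_gpus:,} GPUs: a job-killing failure every "
      f"~{cluster_mttf_hours:.1f} hours")
# -> roughly one failure every ~4.4 hours, i.e. several per day
```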
Starlink, yes, at 480 km LEO. But the article says "put AI satellites into deep space". Also, if you think about it, LEO orbits have dark periods, so not great.
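The dark-period point is easy to sketch. Assuming a circular orbit at that ~480 km altitude, a simple cylindrical Earth shadow, and the worst-case orbit orientation (orbit plane aligned with the sun), the geometry works out to:

```python
import math

# Worst-case eclipse time for a circular orbit at Starlink-like altitude.
# Assumes a cylindrical Earth shadow and an orbit plane aligned with the
# sun (beta angle 0); real eclipse times vary with orbit geometry.

MU = 398_600.0       # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6_371.0    # mean Earth radius, km
altitude = 480.0     # km, the LEO altitude mentioned above

r = R_EARTH + altitude
period_min = 2 * math.pi * math.sqrt(r**3 / MU) / 60

# The satellite is shadowed over an arc of half-angle arcsin(R_earth / r)
# centred on the anti-sun point, so the eclipsed fraction of the orbit
# is that half-angle divided by pi.
eclipse_fraction = math.asin(R_EARTH / r) / math.pi

print(f"Orbital period: ~{period_min:.0f} min")
print(f"Dark for up to ~{eclipse_fraction * period_min:.0f} min per orbit "
      f"({eclipse_fraction:.0%} of the time)")
# -> roughly 36 dark minutes out of a ~94-minute orbit
```

So in the worst case the panels are dark for well over a third of every orbit, which is a lot of dead time for hardware whose economics depend on constant utilisation.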
A "fully and rapidly reusable" Starship would bring the cost of launch down orders of magnitude, perhaps to a level where it makes sense to send up satellites to repair/refuel other satellites.
A lot of these accounts seem anecdotal. I have a clean copy of Win 11 IoT LTSC running on my laptop and it runs well. The desktop management, including Hyper-V, WSL2, and awesome RDP, makes it a great platform to get work done. Most problems people encounter with Windows have to do with driver maturity. And in the case of a megacorp-managed machine, it's all the "security" BS they put on there that slows you down to a crawl. Once you get stable drivers, I find Windows 11, with WSL as my shell, to be quite nice.
Well yes, it is anecdotal. After all, it's my personal experience, which is, by definition, an anecdote. At what point did I suggest that the types of bullshit Win11 exposes me to are exactly the same as what everyone else experiences?
Agreed. It's so much easier with a self-hosted runner. Just get out of your own way and do it. Use cases like caching are also much more efficient on a self-hosted runner.
He is absolutely right. The soap opera effect totally ruins the look of most movies. I still use a good old 1080p plasma at its default settings. It always looks good.
It's funny, people complain about this, but I actually like smooth panning scenes over juddery ones that give me a headache trying to watch them. I go so far as to use software on my computer called SVP 4 that does this with a way better GPU-accelerated implementation. I'm never sure why people think smoothness means cheapness, except that they were conditioned to it.
The soap opera effect drives me nuts. I just about can't watch something when it's on. It makes a multimillion dollar movie look like it was slapped together in an afternoon.
I watched the most recent Avatar and it was some HFR (high frame rate) variant that had this effect turned up. It definitely dampens the experience. There's something about that slightly fuzzed movement that just makes things on screen look better.
The "non-exclusive" thing may come back to bite them. If another big player comes in to license the tech and gets "different" tech than Nvidia did, it opens up lawsuits. Also, this seems like it's just a bet on time: what the head engineer who invented this technology built will eventually be replicated. But I guess that will take a while, and the high-margin money machine will print billions while the dust settles.
I'm pretty sure Nvidia overpaid so that Groq can charge the same absurd price to the second customer, to whom the company's IP is worth maybe a billion or two.
As your parents age, you should convince them to transfer their assets into a trust where they still maintain control, but withdrawals etc. can optionally require approval by a spouse or another family member. The trust has many other benefits, but it is especially good against fraud, as it can disassociate the holder's identity from the assets and impose specific conditions for withdrawal. It can also provide a clean transfer of ownership in the event of a death. I am sorry this happened to you; it is becoming more common in the US too. And all of these "companies" seem to establish bank accounts and addresses in Delaware…