Assuming you have internet during setup (loading drivers via USB can be a pain) and are fine with creating a throwaway account just to not use it, sure, but that's precisely the kind of thing people don't like about setup requiring an account. Getting rid of the preinstalled packages is a bit of a game of Russian roulette for the install: you never know when the next incremental update will break something because the machine doesn't match the baseline, or because something else was built on the assumption certain crap would be there.
If you're going to go through all of this, there are easier ways to just work around the install and crapware limitations. They are also still annoying, but at least easier.
If you mean editing, ProRes is a better fit. If you mean final export, software always beats hardware encoders in terms of quality. If you mean mass h.264 transcoding, though, a Mac workstation is probably not the right place.
I read it as them talking about the idea of having kids still go play with actual tin cans and a string in this day and age, rather than it being something only old people could have done.
0 of course, but wasn't 3G all shut down in the US in 2022 to open up the airwaves?
One of those HN myths that comes from only being willing to Google (or ChatGPT) information, rather than encountering it in the real world.
3G still exists in rural and remote areas that no major carrier wants to serve, at least as of April 2025, the last time I did a round of real-world web testing. Next round is in September. Maybe with 5G in the cities, some hand-me-down 4G equipment has made it to the places where I test.
Thank you for proving my point, that people on HN falsely think they know more than others because they can Google a link, even though what's happening on the ground is entirely different.
Reality ≠ policy papers, press releases, or web links.
I guess what I'm getting from this thread is that there is 3G service out there in the wild. However, in locations where 4G and 5G are available, 3G has been phased out.
This doesn’t jibe with my experience trying to make phone calls on rural highways, where it seems there is no signal whatsoever more often than not.
I suppose this could be because AT&T/Verizon/T-Mobile, which used to have 2G in those areas (since discontinued, as was the 900 MHz analog voice band), have moved on and left swathes of the US without signal, whereas certain areas (the commenter omits an example) were never served by the major telecoms and have "evolved" their tech more slowly, so 3G is not decommissioned in those places. In that sense, yes, there is no contradiction. It still feels like we've gone backwards, since there are places where I used to be able to make a phone call that are now considered remote areas where satellite SOS is your only way to reach someone.
The big-3 have nationwide coverage (well, at least 2 of them).
But even beside that, AFAICT USCellular shut down 3G in January 2024, Appalachian Wireless in December 2022, Cellcom in December 2023, and C Spire sometime in 2022.
I'm interested to know where exactly public 3G still exists in the USA.
> I'm interested to know where exactly public 3G still exists in the USA.
I gotchu, but I wanna be clear that it is all just fringe/regional operators (which is what the claim was originally about anyway, not about major telcos).
I found a couple with user reports claiming 3G support is still active in random pockets of Wyoming/Colorado/etc. (but no confirmation on the official websites), and one with confirmation on the official website.
The one with the official confirmation is Union Wireless[0], with UMTS being a stand-in for 3G (color-coded in grey on their coverage map; mostly southern Wyoming plus parts of Colorado, Utah, and Idaho).
I agree with your overall point though. Functionally, 3G is dead in the US. But factually, there are a few holdout fringe remote areas that still have it.
The 1.1.1.1 referred to above is Cloudflare's main resolver; 1.1.1.2 and 1.1.1.3 are for those intentionally looking for malware and content blocking.
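If you want to point a script at the filtering resolvers directly, here's a rough Python sketch. It assumes the third-party dnspython package, and example.com is just a placeholder domain:

    import dns.resolver  # third-party: pip install dnspython

    # Skip the system resolver config and query Cloudflare's
    # malware-blocking resolver (1.1.1.2) directly.
    # Use 1.1.1.3 for malware + adult-content blocking.
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["1.1.1.2"]

    answer = resolver.resolve("example.com", "A")
    for record in answer:
        # Blocked domains typically come back as 0.0.0.0
        print(record.address)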
Paris is consistently somewhere in the top 10 cities worldwide by number of tourists per year, and this is an extremely important factor for the city. Even if Le Monde were writing this in French, the impacts to/from tourism would be relevant to the article.
By "free windows" do you just mean an unactivated copy of Windows? That doesn't prevent the user from configuring their preference in the browser itself.
If you bought a big-ass server for your home 10 years ago, it probably wouldn't even have had a GPU/AI accelerator at all. If it did, it would have been something with wimpy compute and VRAM, because you needed the video encoder/decoder for security cameras or the like.
I'm not sure that really gives confidence that hardware has slowed down enough to invest in it for decades. Single-core CPU performance has, but that's not really what new things are using.
It really just depends on whether the hardware is "good enough" for whatever its purpose is. If the hardware today can locally run whatever models your security cameras need, it's likely it will still be "good enough" in 10 years.
Of course, similar to a 10 year old car or appliance, you will be missing any new features or bells and whistles that have become available in the meantime.
I agree; it's important to recognize that there are lots of use cases where computers have long since reached "good enough" and aren't really going obsolete anymore for those use cases.
My NAS is about 13 years old, and the network switches it connects through are even older. While 2.5GbE now exists, I have no need to throw out my "good enough" equipment and replace it with something marginally faster or more power efficient. I don't even really need to expand the storage of that NAS anytime soon: my music collection could never come close to filling it, my movie/TV collection isn't growing much anymore due to the shift to streaming, and the volume of other stuff I need to back up from my other computers just isn't growing much over the years.
Decades is a long time for hardware, but "years" seems reasonable soon. The commercial models are "good enough" for a lot of things now, so if that performance makes its way into the on-device space at "home appliance"-level cost (<$5k at the start, basically), I'd expect a lot of stuff to start popping up there. In offices too.
Like the PC in the 80s starting to eat up "get a mainframe" or "rent time on a mainframe" uses.
You're kind of undermining your own point. Ten years later, the only thing you'd need to upgrade for your home server might be the GPU, because a new use case emerged. Okay? Spend $500-$1,000 on an eGPU. Problem solved. Will that eGPU setup last another ten years? If all it's doing is processing security video and routing claw-like tasks, then yes.
Not sure I follow why. That the server from 10 years ago would be completely unfit for purpose now doesn't imply the one you buy today will be the right hardware 10 years from now. Unless you can somehow guarantee that we've reached, in just these last few years, the final set of new requirements we will ever have, the GPUs you buy today will probably be just as irrelevant to the new requirements a decade from now.
Of course, one can always upgrade components piecewise as requirements change, but I don't see why you need to invest in a big-ass server to do that. It'd be cheaper to go the route everyone has gone for decades at this point: upgrade with normal-sized stuff as needed, and don't try to make an up-front multi-decade home investment out of it.
On the flip side, if you intentionally plan to lock in the capabilities to the kinds of things one can run today, and therefore know you'll never need to upgrade, then you can get whatever sized system makes sense for today's needs. You just need to be really sure you won't be interested in "the next big thing" when it comes, too.
Yeah, but how long do mainframes last? Think of the COBOL systems used in government. No reason to update them; they worked forever. Their job is discrete and they performed it well enough that intense updating wasn't a requirement.
You also need to ask: how much do mainframes cost? They were engineered for backwards compatibility and reliability, with built-in redundancy you don't find in consumer hardware.
AI models are changing every other day. I have to rebuild llama.cpp from source regularly. We are nowhere close to a personal "AI mainframe."
Just tried it on my Mac, and sadly it doesn't seem like it. I'm still on Sequoia, so possibly it does this on Tahoe, but that seems unlikely. That's a shame.
It'd be nice if someone on the Safari team added this, though, to match Chrome and Firefox!
Distance is usually the wrong measure in space. Something like delta-v gives a much better sense of scale: once you manage to get something to orbit, the rest is actually a lot closer than it would seem from the ground.
Not to say the effort somehow becomes peanuts, cheap, or easy... but the jump in delta-v needed to go from "100 km vertical ascent" to "hit the Moon 350,000 km away" is more like a ~6-7x increase than a 3,500x one. If the Moon were instead 700,000 km away, the factor would still be ~6-7x.
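Rough back-of-the-envelope for where the ~6-7x comes from; the numbers below are ballpark assumptions, not mission figures:

    import math

    g = 9.81  # m/s^2

    # Ideal delta-v for a 100 km vertical hop (ignoring drag/gravity losses):
    hop_dv = math.sqrt(2 * g * 100_000)   # ~1,400 m/s

    # Rough budget to reach the Moon: ~9,400 m/s to LEO (including
    # typical gravity/drag losses) + ~3,100 m/s trans-lunar injection.
    moon_dv = 9_400 + 3_100               # ~12,500 m/s

    # ~9x against the ideal hop; closer to ~6-7x once the hop also
    # pays its own gravity/drag losses.
    print(moon_dv / hop_dv)

Note that the distance to the Moon never appears in the budget: past orbit, delta-v barely grows with distance.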
Everything you've said is correct, but delta-v scales logarithmically with fuel load: you need fuel to carry the new fuel. So for the purpose of discussing altitude (a valid way to look at getting to the Moon), the size of the rocket and the fuel expended do in fact grow much closer to linearly.
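That's the Tsiolkovsky rocket equation at work; a quick sketch, where the exhaust velocity and the two delta-v budgets are illustrative assumptions:

    import math

    def mass_ratio(delta_v, v_exhaust):
        # Tsiolkovsky: delta_v = v_e * ln(m0/mf)  =>  m0/mf = exp(delta_v/v_e)
        return math.exp(delta_v / v_exhaust)

    V_E = 3_000.0  # m/s, roughly kerolox-class effective exhaust velocity

    for dv in (1_500, 12_500):  # "100 km hop" vs. "reach the Moon" budgets
        print(dv, mass_ratio(dv, V_E))
    # ~1.6x full/empty mass ratio for the hop vs. ~65x for the Moon shot
    # (before staging): fuel mass blows up far faster than delta-v does.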
What I actually started with was comparing Electron to the current bos.space rocket and seeing that the relationship was nowhere near linear. The above is the largest component of why that I could think of, but there is always more than one thing going on.