
The crux of the problem is that bandwidth is currently oversold, since it sits unused most of the time.

Let's compare my session here on HN with how it would look if I were streaming my web browser from somewhere else.

Currently I pull down a few KB of ASCII every time I load up a comment thread, and then spend maybe 20-30 minutes reading it, during which time no bandwidth is used.

If I were streaming this from a browser in the cloud, I would be pulling down orders of magnitude more data: scrolling down the page could easily be a few MB versus no data at all.
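As a rough back-of-the-envelope check (every constant below is an assumed ballpark figure, not a measurement), the gap looks something like this:

    # Back-of-the-envelope comparison for a 30-minute reading session:
    # local rendering vs. streaming a remote browser's video output.
    # All constants are assumed ballpark figures.

    LOCAL_PAGE_KB = 50            # assumed: HTML + comments for one thread
    SESSION_MINUTES = 30
    STREAM_IDLE_KBPS = 100        # assumed keep-alive bitrate while idle
    STREAM_SCROLL_MBPS = 5        # assumed bitrate while the page is moving
    SCROLL_SECONDS = 60           # assumed total time spent scrolling

    local_kb = LOCAL_PAGE_KB

    idle_seconds = SESSION_MINUTES * 60 - SCROLL_SECONDS
    stream_kb = (idle_seconds * STREAM_IDLE_KBPS / 8
                 + SCROLL_SECONDS * STREAM_SCROLL_MBPS * 1000 / 8)

    print(f"local:    {local_kb:,.0f} KB")
    print(f"streamed: {stream_kb:,.0f} KB (~{stream_kb / local_kb:,.0f}x more)")

Even with a generous idle bitrate, the streamed session lands in the tens of MB for something that is a few KB of text today.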

You could have hardware "basestations" as you suggest which would be an interesting idea. Of course locating these geographically would be a difficult task. You might end up with rural areas where it would be impossible to do any computing at all because no basestation was in range, or you would end up with urban areas where there would be massive demand.

You would also have to account for what happens if somebody decides to have a big party or event somewhere, and the local basestation computer suddenly goes from servicing 10-100 clients to servicing 1000+.




> You could have hardware "basestations" as you suggest which would be an interesting idea. Of course locating these geographically would be a difficult task. You might end up with rural areas where it would be impossible to do any computing at all because no basestation was in range, or you would end up with urban areas where there would be massive demand.

Rural areas would probably fall back to mobile networks and real datacenters, possibly with degraded service. Urban areas would just be adjusted based on trends: places with higher demand would get more powerful boxes, the same as scaling any other servers, really.
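A minimal sizing sketch of that "more powerful boxes where demand is higher" idea (per-box capacity and the utilisation target are made-up numbers):

    import math

    # Size a neighborhood basestation from its observed peak demand,
    # keeping headroom so it is never planned to run flat out.
    def boxes_needed(peak_clients, sessions_per_box=50, target_utilisation=0.7):
        effective = sessions_per_box * target_utilisation   # usable sessions per box
        return max(1, math.ceil(peak_clients / effective))

    print(boxes_needed(peak_clients=40))    # quiet rural site  -> 2
    print(boxes_needed(peak_clients=600))   # busy urban site   -> 18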

> You would also have to account for what happens if somebody decides to have a big party or event somewhere, and the local basestation computer suddenly goes from servicing 10-100 clients to servicing 1000+.

Assuming a decent infrastructure, you'd only need to have enough wireless bandwidth; computing resources wouldn't necessarily have to be tied to local devices. So a party would just mean that more machines from the neighborhood would help via wired connections (I'm assuming basestations in the same neighborhood could speak to each other using local routers, without having to do a round trip to the ISP datacenter).

Of course, this still means that machines shouldn't be running at 100% capacity, but that's just common sense.
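A toy version of that spill-over logic, under the same assumptions (greedy placement, made-up capacities, neighbours reachable over the wired local network):

    from dataclasses import dataclass

    @dataclass
    class Station:
        name: str
        capacity: int   # sessions this box can host
        load: int = 0   # sessions currently assigned

        @property
        def spare(self):
            return self.capacity - self.load

    def place_sessions(new_sessions, local, neighbours):
        """Fill the local box first, then the neighbours with the most spare
        capacity; whatever is left has to go to the ISP datacenter."""
        for station in [local] + sorted(neighbours, key=lambda s: s.spare, reverse=True):
            take = min(new_sessions, station.spare)
            station.load += take
            new_sessions -= take
            if new_sessions == 0:
                break
        return new_sessions   # overflow that still needs the datacenter

    local = Station("party-block", capacity=100, load=80)
    neighbours = [Station("north", 100, 30), Station("south", 100, 90)]
    print(place_sessions(1000, local, neighbours))   # 900 sessions still overflow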


It's certainly an interesting concept with some merit. However, it would change the game from "damn, I can't get any signal, no internet for me, but I can still work offline" to "no signal, I literally can't do any computing at all!"

So what I could see happening is people carrying around "personal compute servers" with them for just such an event, or at least having one in their home/office.

Also bear in mind that CPUs/GPUs appear to be getting pretty cheap as well as pretty small, and you would need one anyway to do the video decompression, so having something on your device capable of handling the basic UI etc. isn't much of a stretch.

Perhaps you end up with three tiers of processing, i.e. your device, your local node, and "the cloud" (i.e. a proper datacenter). The issue is making all of this transparent from a usability perspective.
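To make that concrete, here's a rough sketch of such a tier picker (the thresholds and reachability flags are placeholders; the hard part is hiding this decision from the user):

    from enum import Enum

    class Tier(Enum):
        DEVICE = "device"
        LOCAL_NODE = "local basestation"
        CLOUD = "datacenter"

    def choose_tier(task_flops, node_reachable, cloud_reachable,
                    device_budget_flops=1e9):
        """Prefer the nearest tier that can handle the work."""
        if task_flops <= device_budget_flops:
            return Tier.DEVICE          # basic UI, video decode, etc.
        if node_reachable:
            return Tier.LOCAL_NODE      # heavier work, still low latency
        if cloud_reachable:
            return Tier.CLOUD           # fall back to a proper datacenter
        return Tier.DEVICE              # degrade gracefully when offline

    print(choose_tier(5e8, node_reachable=True, cloud_reachable=True))    # DEVICE
    print(choose_tier(1e12, node_reachable=False, cloud_reachable=True))  # CLOUD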


Oh, I don't think we'll ever return to actual dumb terminals - I think the difference will be that some services will be available and others won't. Or some will just be scheduled for when you get near a basestation.
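Something like a deferred queue would cover that "scheduled until you get near a basestation" case (the handoff function is a stand-in):

    pending = []

    def run_remotely(task):
        # Stand-in for handing the task off to a basestation.
        print(f"running {task!r} on the basestation")

    def submit(task, station_in_range):
        if station_in_range:
            run_remotely(task)
        else:
            pending.append(task)   # the service is unavailable, not the whole computer

    def on_station_found():
        while pending:
            run_remotely(pending.pop(0))

    submit("render heavy page", station_in_range=False)
    on_station_found()   # later, once a basestation is in range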



