Physical limits are a major component of latency. Until we figure out a way to transmit information instantly with infinite bandwidth, local storage will always be a significant factor.
edit: And given trends in solid-state drives, local storage will be cheaper and more energy efficient than a wireless network for a long time.
If it takes an hour to copy a terabyte of data to a tape, put it in a car, drive to a location and read the tape, you just transported 8,796,093,022,208 bits in 3600 seconds, or about 2,443,359,173 bits/second (roughly 2.4 Gbit/s).
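A quick sketch of that arithmetic, assuming "terabyte" here means 2^40 bytes (a tebibyte), which is what the quoted bit count corresponds to:

```python
# Sketch of the arithmetic above, assuming "terabyte" means 2^40 bytes (TiB),
# which matches the quoted figure of 8,796,093,022,208 bits.

bits = 2 ** 40 * 8          # 1 TiB expressed in bits
seconds = 3600              # one hour: copy, drive, and read

bandwidth = bits / seconds  # bits per second
print(f"{bandwidth:,.0f} bit/s ~= {bandwidth / 1e9:.2f} Gbit/s")
# -> 2,443,359,173 bit/s ~= 2.44 Gbit/s
```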
edit: I always understood the quote as "the [potential] bandwidth"- the bandwidth that can be available using the station wagon as physical infrastructure.
Unless I'm wrong, they've measured both: latency is 1 hour (the time it takes a packet to go from source to destination), and bandwidth is the number they quoted.
Latency is 1 hour, but to get bandwidth you need to divide not by 1 hour but by the time between successive station wagon departures. For example, if you only have one station wagon and driving back after copying the tapes takes a further 30 minutes, then the denominator is 5400 seconds, and the sustained bandwidth is about a third lower than the number quoted (put another way, the quoted figure overstates it by 50%).
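A rough sketch of that difference, reusing the figures above (the 30-minute return drive is the hypothetical from this comment):

```python
# With a single wagon, sustained bandwidth is data per *departure interval*,
# not per one-way trip.

bits = 2 ** 40 * 8            # 1 TiB in bits

one_way = 3600                # copy + drive + read, as in the original example
cycle = 3600 + 1800           # add the hypothetical 30-minute return drive

print(f"quoted:    {bits / one_way:,.0f} bit/s")
print(f"sustained: {bits / cycle:,.0f} bit/s")
print(f"ratio:     {one_way / cycle:.2f}")   # 0.67 -> about a third lower
```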
Latency is greater than 1 hour, if it takes an hour to copy the data to the tapes. You've got to get them to the destination and copy them off after that.
>If it takes an hour to copy a terabyte of data to a tape, put it in a car, drive to a location and read the tape
That's 1 hr latency, unless I'm mistaken. Latency doesn't include disk operations on the client side, does it? Is latency time between a signal being sent and received, or between sent, received, and acknowledgement sent/received?
I.e., is latency time(client->SYN->server->SYNACK->client->ACK->server), or time(client->SYN->server)?
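For what it's worth, most practical measurements are round-trip, since measuring one-way latency would require synchronized clocks at both ends. A minimal Python sketch, timing a TCP connect to a placeholder host; connect() returns once the SYN/SYN-ACK exchange completes, so this approximates one RTT rather than one-way latency:

```python
# Minimal sketch: socket.create_connection() returns once the SYN / SYN-ACK
# exchange completes (the final ACK is sent but not waited on), so the timing
# below is roughly one round trip, not one-way latency.
# The host and port are placeholders.

import socket
import time

host, port = "example.com", 80   # hypothetical target

start = time.monotonic()
sock = socket.create_connection((host, port), timeout=5)
rtt = time.monotonic() - start
sock.close()

print(f"TCP connect RTT: {rtt * 1000:.1f} ms")
```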
>edit: And given trends in solid-state drives, local storage will be cheaper and more energy efficient than a wireless network for a long time.
At some point, though, you get diminishing returns. If all you're doing is playing mp3s at 192 kbit/s, existing wireless is fine and future wireless will be more than adequate. There's no need to add the expense of putting local storage on a mobile device if you can achieve the same end cheaper with cloud hosting.
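As a back-of-the-envelope check of how modest that rate is (the per-hour figure follows directly from 192 kbit/s):

```python
# Rough arithmetic on the 192 kbit/s figure above.

stream_kbit_s = 192
bytes_per_hour = stream_kbit_s * 1000 / 8 * 3600

print(f"{bytes_per_hour / 1e6:.1f} MB per hour of audio")
# -> 86.4 MB per hour, a small fraction of what existing wireless links sustain
```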
Now, this is all academic, as we're talking about a radio, and I don't think radios are likely to be around by then. Whether or not a device includes local storage will be an economic decision, given its intended usage and the market conditions of the time, which nobody can predict with great certainty.
I don't think it's unreasonable to expect ubiquitous wireless connectivity of one kind or another, and cloud storage. As for what form the device might take, who knows? It would have been pretty hard to predict smartphones in 1990; I expect further consolidation of devices, but have no idea what they'll look like.
I'd agree that caching is important, but when, realistically, would you not have any sort of wireless access in 20 years? Driving? Many cars these days have hard drives; it's easy to see that becoming more prevalent as storage gets even cheaper. The people designing mobile devices aren't worried about a single farmer in the middle of Saskatchewan. Cell phone providers didn't even bother with the whole of Saskatchewan for a number of years.
If you can get a cell signal, you can get streaming media, or at least you will be able to. I don't see that being much of a problem.
I live in Australia and already try to live mostly in the cloud. I have more than enough mobile data, at over 5 Mbit/s anywhere I happen to be, to live completely on cloud-based technologies, if such technologies existed for all my requirements.
Living in NYC and traveling almost exclusively by subway makes me appreciate my local storage. In time there will be wireless underground, but for now it's necessary to keep music and podcasts in local storage. AT&T's 2 GB monthly cap also reinforces the need for local storage.
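A rough sketch of that constraint, borrowing the 192 kbit/s rate from earlier in the thread and assuming the cap means 2 decimal gigabytes:

```python
# How far a 2 GB monthly cap stretches when streaming at 192 kbit/s.

cap_bytes = 2e9                       # 2 GB cap (decimal gigabytes assumed)
rate_bytes_s = 192_000 / 8            # 192 kbit/s -> 24 kB/s

hours = cap_bytes / rate_bytes_s / 3600
print(f"~{hours:.0f} hours of streaming per month")   # roughly 23 hours
```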